Science.gov

Sample records for seismic vulnerability analysis

  1. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. The paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes reported in different seismic intensity scales, are used to verify the regional parameters of mathematical models that simulate physical and economic vulnerability for building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate the expected damage states of buildings and constructions for earthquakes according to the OSR-97B maps (return period T=1,000 years) within large cities and towns, the settlements were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites; the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of the oil pipeline systems located in the highly active seismic zones in

  2. Integrating Social impacts on Health and Health-Care Systems in Systemic Seismic Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.

    2012-04-01

This paper presents a new method for modeling health impacts caused by earthquake damage, which allows key social impacts on individual health and on health-care systems to be integrated and implemented in quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and the severity of injuries based on empirical data correlating building damage with casualties. The expected number of injured people (sorted by priority of emergency treatment) is combined with the post-earthquake reduction in functionality of health-care facilities such as hospitals to estimate the impact on health-care systems. The aim here is to extend these models through a combined engineering and social science approach. Although social vulnerability is recognized as a key component of the consequences of disasters, social vulnerability as such is seldom linked to the formal, quantitative seismic loss estimates of injured people that directly determine the demand on emergency health-care services. Yet there is a consensus that the factors affecting the vulnerability and post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation and employment, and that these factors can further aggravate health impacts. Similarly, there are different social influences on the performance of health-care systems after an earthquake, at the individual as well as the institutional level. To link social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and the health-care systems has been developed. We identified and tested appropriate social indicators for individual health impacts and for health-care impacts based on a literature review, using available European statistical data. The results will be used to
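
The coupling of casualty estimates with reduced hospital functionality described above can be sketched as a simple supply-demand balance. The triage classes, bed counts and functionality fraction below are illustrative assumptions, not values from the paper:

```python
# Sketch: post-earthquake treatment demand vs. reduced health-care capacity.
# Triage classes and all figures are illustrative, not a calibrated model.

def treatment_deficit(injured_by_severity, beds_total, functionality):
    """Return unmet demand for emergency treatment.

    injured_by_severity: dict of triage class -> number of injured
    beds_total: pre-event emergency capacity (beds)
    functionality: fraction of capacity surviving the earthquake (0..1)
    """
    demand = sum(injured_by_severity.values())
    capacity = beds_total * functionality
    return max(0.0, demand - capacity)

# Hypothetical scenario: 1,360 injured, 1,000 beds, 60% of capacity left.
deficit = treatment_deficit({"T1": 120, "T2": 340, "T3": 900},
                            beds_total=1000, functionality=0.6)
print(deficit)  # 760.0
```

A social-vulnerability layer would then scale the injury counts per demographic group before this balance is computed.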

  3. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

In the context of an explosive increase in the value of damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems, highly vulnerable to environmental change and responsible for important transformations of space, producing dysfunctions reflected in the state of natural variables (Parker and Mitchell, 1995; the OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamics affecting urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of seismic hazard, by using measurable environmental, socio-economic, and physical variables in the framework of a spatial multi-criteria analysis (SMCA). The capital city of Romania was chosen for this approach based on its high vulnerability, due to explosive urban development and the advanced state of degradation of its buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field show that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006; etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs from natural disasters. Applying the SMCA method reveals a circular pattern, signaling as hot spots the historic centre of Bucharest (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic

  4. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

Earthquakes represent major natural hazards that regularly impact the built environment in seismically prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes have prompted the need to assess the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty of capacity and demand in the fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given as sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
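
The conversion from damage-state fragilities to a vulnerability function can be illustrated with a minimal sketch: lognormal fragilities in terms of spectral acceleration are combined with assumed loss ratios to give an expected loss ratio at each intensity level. The medians, dispersions and loss ratios below are hypothetical, not the study's calibrated values:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(im, theta, beta):
    """Lognormal fragility: P(damage state >= ds | IM = im)."""
    return norm_cdf(math.log(im / theta) / beta)

def vulnerability(im, damage_states):
    """Expected loss ratio at intensity im.

    damage_states: list of (theta, beta, loss_ratio), ordered from
    slight to complete; medians theta in the same units as im.
    """
    p_exceed = [fragility(im, th, b) for th, b, _ in damage_states] + [0.0]
    # P(exactly ds_i) = P(>= ds_i) - P(>= ds_{i+1}); weight by loss ratio.
    return sum((p_exceed[i] - p_exceed[i + 1]) * damage_states[i][2]
               for i in range(len(damage_states)))

# Hypothetical damage states for a stone masonry class (Sa in g):
ds = [(0.10, 0.6, 0.05), (0.20, 0.6, 0.25), (0.35, 0.6, 0.60), (0.55, 0.6, 1.00)]
print(round(vulnerability(0.3, ds), 3))
```

The vulnerability curve is monotonically increasing in intensity, which the test below checks.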

  5. Effect of beta on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    SciTech Connect

    Feng Qinghai; Yuan Wancheng

    2010-05-21

In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of the structural parameters and the randomness of the seismic input should be considered. First, the distributions of structural capacity and seismic demand are obtained from IDA and pushover analyses; second, the vulnerability of the bridge is obtained using an artificial neural network (ANN) and Monte Carlo (MC) simulation, and a vulnerability curve for the bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, and a parametric analysis of the effect of beta is performed. The curve reflects the overall bridge vulnerability from the viewpoint of total probability, and large values of beta are suggested in order to reduce the discreteness.
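
The role of beta can be illustrated with a minimal lognormal fragility sketch, where beta acts as the combined log-standard deviation of capacity and demand; all numbers are illustrative:

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility: P(demand >= capacity | IM = im).
    beta is the combined log-standard deviation of capacity and demand."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

# Same median, increasing beta: the curve flattens, i.e. exceedance
# probability rises below the median and falls above it.
for beta in (0.3, 0.5, 0.8):
    print(beta,
          round(fragility(0.5, 1.0, beta), 3),   # below the median
          round(fragility(2.0, 1.0, beta), 3))   # above the median
```

At the median intensity the exceedance probability is 0.5 regardless of beta; beta only controls the spread (discreteness) of the curve.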

  6. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

Although the extreme nature of earthquakes has been known for millennia due to the devastation many of them have caused, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in the close vicinity of a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides etc.), the second type occurs mainly in economically less-developed countries, where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction, and interpretation to assist in disaster risk analysis. Advanced disaster-risk communication skills should be developed to link scientists, emergency management authorities, and the public. Natural, social, economic, and political reasons leading to earthquake disasters will be discussed.

  7. A Methodology for Assessing the Seismic Vulnerability of Highway Systems

    SciTech Connect

    Cirianni, Francis; Leonardi, Giovanni; Scopelliti, Francesco

    2008-07-08

Modern society is totally dependent on a complex and articulated infrastructure network of vital importance for the existence of the urban settlements scattered across the territory. These infrastructure systems, usually referred to as lifelines, are entrusted with numerous services and functions indispensable to normal urban and human activity. Lifeline systems represent an essential element in all urbanized areas subject to seismic risk. It is important that, in these zones, they are planned according to appropriate criteria based on two fundamental assumptions: a) selection of the best territorial localization, avoiding, within limits, the most dangerous places; b) application of construction technologies aimed at reducing vulnerability. It is therefore indispensable that any modern process of seismic risk assessment give due consideration to the study of networks, integrated with the traditional analyses of buildings. The present paper moves in this direction, devoting particular attention to one kind of lifeline, the highway system, and proposing a methodology of analysis aimed at assessing the seismic vulnerability of the system.

  8. Evaluation Of The Seismic Vulnerability of Fortified Structures

    SciTech Connect

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-08

In the paper, a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, starting from rather coarse information about the state, consistency and history of the building stock considered. The procedure proves to be effective and able to produce reliable results despite the poor initial data.

  9. Seismic Vulnerability and Performance Level of confined brick walls

    SciTech Connect

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-08

Interest among engineers and designers in displacement- and behavior-based design methods (performance-based design) has increased, given the importance of designing structures to resist dynamic loads such as earthquakes and the difficulty of predicting the nonlinear behavior of elements caused by the nonlinear properties of construction materials. Economically speaking, the ease of construction and the accessibility of masonry materials have caused an enormous increase in masonry structures in villages, towns and cities. On the other hand, it is necessary to study the behavior and seismic vulnerability of these kinds of structures, since Iran is located on the Alpide earthquake belt. Different environmental, economic, social and cultural conditions, as well as the construction materials available, have led to different kinds of structures. In this study, several confined walls were modeled in software and subjected to dynamic analysis with accelerograms appropriate to the geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis appear satisfactory when compared with the values in ATC-40, FEMA documents and Iran's Standard No. 2800.

  10. Remote sensing techniques applied to seismic vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

Advances in remote sensing and photogrammetry techniques have increased the accuracy and resolution of records of the earth's surface, expanding the range of possible applications of these data. In this research, we have used such data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information, to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights of different periods, using orthorectified images in both the visible and the infrared spectrum. The analysis is further completed using LiDAR data. From the combination of these data, it has been possible to delineate building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different dates, segment the images, classify the LiDAR data, and use the infrared data to remove vegetation or to compute roof surfaces with height, tilt and spectral fingerprint. The accuracy of our results has been validated with ground-truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca
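
The vegetation-removal step mentioned above typically relies on the NDVI computed from the red and near-infrared bands; a minimal per-pixel sketch, with illustrative reflectance values, is:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.
    nir, red: reflectance (or DN) in the near-infrared and red bands."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

# Vegetated pixels give high NDVI; bare roofs and asphalt stay near zero,
# so thresholding NDVI yields a vegetation mask to remove before
# computing roof surfaces.
print(ndvi(0.45, 0.08))   # vegetation: strongly positive
print(ndvi(0.20, 0.18))   # built surface: near zero
```

In practice this is applied band-wise over the whole orthorectified raster rather than pixel by pixel.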

  11. Evaluation of socio-spatial vulnerability of citydwellers and analysis of risk perception: industrial and seismic risks in Mulhouse

    NASA Astrophysics Data System (ADS)

    Glatron, S.; Beck, E.

    2008-10-01

Social vulnerability has been studied for years through sociological, psychological and economic approaches. Our proposition focuses on the perception and cognitive representations of risks by city dwellers living in a medium-sized urban area, namely Mulhouse (France). Perception, being part of the social vulnerability and resilience of a society to disasters, influences the potential damage; for example, it leads to adequate or inadequate behaviour in an emergency. As geographers, we assume that the spatial relationship to danger or hazard can be an important factor of vulnerability, and the spatial dimension is a challenging question both for better knowledge and for operational reasons (e.g. management of preventive information). We interviewed 491 people, inhabitants and workers, regularly distributed within the urban area, to better understand their opinions on hazards and security measures. We designed and mapped a vulnerability index on the basis of their answers. The results show that social vulnerability depends on the type of hazard, and that distance to the source of danger influences vulnerability, especially for hazards with a precise location (industrial, for example). Moreover, the effectiveness of the information campaigns is doubtful, as people living close to hazardous industries (the target of specific preventive information) are surprisingly more vulnerable and less aware of industrial risk.

  12. Highway bridge seismic design: Summary of FHWA/MCEER project on seismic vulnerability of new highway construction

    NASA Astrophysics Data System (ADS)

    Friedland, Ian M.; Buckle, Ian G.; Lee, George C.

    2002-06-01

    The Federal Highway Administration (FHWA) sponsored a large, multi-year project conducted by the Multidisciplinary Center for Earthquake Engineering Research (MCEER) titled “Seismic Vulnerability of New Highway Construction” (MCEER Project 112), which was completed in 1998. MCEER coordinated the work of many researchers, who performed studies on the seismic design and vulnerability analysis of highway bridges, tunnels, and retaining structures. Extensive research was conducted to provide revisions and improvements to current design and detailing approaches and national design specifications for highway bridges. The program included both analytical and experimental studies, and addressed seismic hazard exposure and ground motion input for the U.S. highway system; foundation design and soil behavior; structural importance, analysis, and response; structural design issues and details; and structural design criteria.

  13. Key geophysical indicators of seismic vulnerability in Kingston, Jamaica

    NASA Astrophysics Data System (ADS)

    Brown, L. A.; Hornbach, M. J.; Salazar, W.; Kennedy, M.

    2012-12-01

Kingston, the major city and hub of all commercial and industrial activity in Jamaica, has a history of moderate seismic activity; however, two significant (>Mw 6) earthquakes (1692 and 1907) caused major devastation, resulting in thousands of casualties. Both the 1692 and 1907 events also triggered widespread liquefaction and tsunamis within Kingston Harbor. Kingston remains vulnerable to such earthquakes today because the city sits on 200-m to 600-m thick alluvial fan deposits adjacent to the Enriquillo-Plantain Garden Fault Zone (EPGFZ), the dominant east-west trending fault through Jamaica and the same fault system that ruptured in the Haiti 2010 earthquake. Recent GPS results suggest the potential for a Mw 7-7.5 earthquake near Kingston along the EPGFZ. Whether active strands of the EPGFZ extend through downtown Kingston remains unclear; however, recent sonar mapping in Kingston Harbor shows evidence of active faulting, with offshore faults connecting to proposed active on-land fault systems that extend through populated areas of the city. Seismic chirp reflections also show evidence of multiple recent (Holocene) submarine slide deposits in the harbor that may be associated with historic tsunamis. Using recently acquired chirp data and sediment cores, we are currently studying the recurrence interval of earthquake events. We also recently performed a microtremor survey to identify areas prone to earthquake-induced ground shaking throughout the city of Kingston and St. Andrew parish. Data were collected at 200 points with a lateral spacing of 500 metres between points. Our analysis shows significant variations in the fundamental frequency across the city, and the results clearly indicate areas of potential amplification, with areas surrounding Kingston Harbor (much of which has been built on reclaimed land) showing the highest potential for ground amplification.
The microtremor analysis suggests several high-density urban areas as well as key
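
Microtremor surveys of this kind commonly estimate the fundamental site frequency from the horizontal-to-vertical (H/V) spectral ratio (Nakamura's technique); a minimal sketch, on illustrative precomputed amplitude spectra rather than field data, is:

```python
def hv_peak(freqs, h_spec, v_spec):
    """Fundamental site frequency from a microtremor H/V spectral ratio:
    the frequency where |H|/|V| peaks, with the peak amplitude as a
    rough proxy for site amplification."""
    ratios = [h / v for h, v in zip(h_spec, v_spec)]
    i = max(range(len(ratios)), key=ratios.__getitem__)
    return freqs[i], ratios[i]

# Illustrative spectra (not measured data): a soft-soil site amplifying
# horizontal motion near 1 Hz.
freqs  = [0.5, 1.0, 2.0, 4.0, 8.0]
h_spec = [1.0, 4.0, 1.5, 1.0, 0.8]
v_spec = [1.0, 1.0, 1.0, 1.0, 1.0]
print(hv_peak(freqs, h_spec, v_spec))  # -> (1.0, 4.0)
```

Real processing would first window the records, compute smoothed Fourier amplitude spectra and average the two horizontal components.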

  14. Coping with seismic vulnerability: small manufacturing firms in western Athens.

    PubMed

    Sapountzaki, Kalliopi

    2005-06-01

    This paper attempts to contribute to international discourse on the responsibility of macro structures (economic and political) and private agencies for the production and distribution of vulnerability. It does so by focusing on an individual economic entity, small manufacturing firms (SMFs), in a specific location, western Athens, Greece. By evaluating the losses that SMFs sustained in the earthquake of 7 September 1999, the paper points to variations in vulnerability levels among such firms and highlights the 'sources' of vulnerability they confront. Furthermore, the SMF recovery cycle is systematically monitored in parallel with relevant public policies and state reactions to private recovery methods. The analysis illustrates processes that externalise recovery costs, alter the relationship between physical and socio-economic vulnerability and shift the vulnerability load from macro structures to individual agencies or vice versa. It is based on two methodological approaches: the division of vulnerability into three constituent components (exposure, resistance and resilience); and the conceptual split between producers and carriers of vulnerability.

  15. Rapid Assessment of Seismic Vulnerability in Palestinian Refugee Camps

    NASA Astrophysics Data System (ADS)

    Al-Dabbeek, Jalal N.; El-Kelani, Radwan J.

Studies of historical and recorded earthquakes in Palestine demonstrate that damaging earthquakes occur frequently along the Dead Sea Transform: the earthquake of 11 July 1927 (ML 6.2) and the earthquake of 11 February 2004 (ML 5.2). In order to reduce the seismic vulnerability of buildings and the losses of lives, property and infrastructure, an attempt was made to estimate the percentage of damage grades and losses at selected refugee camps: Al Ama`ri, Balata and Dhaishe. The vulnerability classes of the building structures were assessed according to the European Macroseismic Scale 1998 (EMS-98) and Federal Emergency Management Agency (FEMA) guidelines. The rapid assessment showed that very heavy structural and non-structural damage will occur in the common buildings of the investigated refugee camps (many buildings will suffer damage of grades 4 and 5). The poor quality of the buildings in terms of design and construction, the lack of uniformity, the absence of spaces between buildings and the limited width of roads will definitely increase the seismic vulnerability under the influence of moderate-to-strong (M 6-7) earthquakes in the future.

  16. Seismic vulnerability and risk assessment of Kolkata City, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

    2015-06-01

The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the eastern and northeastern region of India. In order to classify the seismic risk zones of Kolkata, we combined seismic hazard exposures with the vulnerability components, namely land use/land cover, population density, building typology, age and height. We microzoned the seismic hazard of the city by integrating seismological, geological and geotechnical themes in GIS, which in turn were integrated with the vulnerability components in a logic-tree framework to estimate both the socioeconomic and structural risk of the city. In both risk maps, three broad zones have been demarcated as "severe", "high" and "moderate"; a risk-free zone in the city is termed "low". The damage distribution in the city due to the 1934 Bihar-Nepal earthquake of Mw = 8.1 matches the demarcated risk regime well. The design horizontal seismic coefficients for the city have been worked out for all fundamental periods, indicating suitability for "A", "B" and "C" types of structures. The cumulative damage probabilities in terms of "none", "slight", "moderate", "extensive" and "complete" have also been assessed for the four predominant model building types, viz. RM2L, RM2M, URML and URMM, for each seismic structural risk zone in the city. Both the seismic hazard and risk maps are expected to play vital roles in earthquake disaster mitigation and management for the city of Kolkata.

  17. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Tóth, László; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Spötl, Christoph; Bednárik, Martin; Brimich, Ladislav; Hegymegi, Erika; Novák, Attila

    2016-04-01

Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test probabilistic seismic hazard assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals at least as long as a seismic cycle. Long-term information can in principle be gained from intact stalagmites in natural caves. These formations have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration never exceeded a certain critical value within that period. Here we present such a stalagmite-based case study from the Little Carpathians of Slovakia. A specially shaped, intact and vulnerable stalagmite (IVSTM) in the Plavecká priepast cave was examined in 2013. This IVSTM is suitable for estimating the upper limit of the horizontal peak ground acceleration generated by pre-historic earthquakes. The approach used in our study yields significant new constraints on the seismic hazard, as tectonic structures close to the Plavecká priepast cave have not generated strong paleoearthquakes in the last few thousand years. The particular importance of this study stems from the seismic hazard to two nearby capitals: Vienna and Bratislava.

  18. Seismic vulnerability study Los Alamos Meson Physics Facility (LAMPF)

    SciTech Connect

    Salmon, M.; Goen, L.K.

    1995-12-01

The Los Alamos Meson Physics Facility (LAMPF), located at TA-53 of Los Alamos National Laboratory (LANL), features an 800 MeV proton accelerator used for nuclear physics and materials science research. As part of the implementation of DOE Order 5480.25 and in preparation for DOE Order 5480.28, a seismic vulnerability study of the structures, systems, and components (SSCs) supporting the beam line from the accelerator building through to the ends of the various beam stops at LAMPF has been performed. The study was accomplished using the SQUG GIP methodology to assess the capability of the various SSCs to resist an evaluation basis earthquake. The evaluation basis earthquake was selected from site-specific seismic hazard studies. The goals for the study were as follows: (1) identify SSCs which are vulnerable to seismic loads; and (2) ensure that those SSCs screened during the evaluation met the performance goals required for DOE Order 5480.28. The first goal was obtained by applying the SQUG GIP methodology to those SSCs represented in the experience data base. For those SSCs not represented in the data base, information was gathered and a significant amount of engineering judgment applied to determine whether to screen the SSC or to classify it as an outlier. To assure the performance goals required by DOE Order 5480.28 are met, modifications to the SQUG GIP methodology proposed by Salmon and Kennedy were used. The results of this study are presented in this paper.

  19. Regional tsunami vulnerability analysis through ASTER imagery

    NASA Astrophysics Data System (ADS)

    Dall'Osso, Filippo; Cavalletti, Alessandra; Immordino, Francesco; Gonella, Marco

    2010-05-01

Analysis of vulnerability to natural hazards is a key issue for prevention measures within ICZM. Knowledge of susceptibility to damage, and of how it is distributed along the coast, allows possible prevention and mitigation actions to be optimized. The present study focuses on the tsunami vulnerability of a large extension of coastline: the entire western coast of Thailand. The work is a follow-up of the CRATER project (Coastal Risk Analysis for Tsunamis and Environmental Remediation), carried out in the aftermath of the 26 December 2004 tsunami event. Vulnerability is analyzed considering an inundation scenario given by a tsunami of seismic origin causing a maximum run-up of 25 m. An innovative methodology has been developed and applied here, based on the combined use of ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) satellite imagery, SRTM v-3 (Shuttle Radar Topography Mission, version 3) DEMs and GIS. The vulnerability level has been calculated by combining information on coastal geomorphology, land use, topography and distance from the shoreline. Land use has been extracted from the ASTER images through a multi-spectral analysis (a pixel-based, supervised classification process) of ASTER bands 1 to 9, plus one band for the NDVI (Normalized Difference Vegetation Index). Coastal geomorphology has been obtained through photo-interpretation. The results have been organized in a set of vector vulnerability maps with a horizontal resolution of 90 m. The proposed methodology has the great advantage of being repeatable for any vulnerability analysis at small-to-medium scale (i.e. at regional/national level) with a moderate investment in terms of cost and human resources.
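
The combination of geomorphology, land use, topography and shoreline distance into a per-cell vulnerability level can be sketched as a weighted overlay; the factor weights and scores below are illustrative assumptions, not the CRATER calibration:

```python
# Sketch of a weighted multi-factor vulnerability score per 90 m cell.
# Factor scores (1 = low, 5 = high susceptibility) and weights are
# illustrative assumptions only.

WEIGHTS = {"geomorphology": 0.3, "land_use": 0.3, "elevation": 0.25, "distance": 0.15}

def cell_vulnerability(scores):
    """scores: dict factor -> score in [1, 5]; returns the weighted score."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A low-lying, heavily used beach cell close to the shoreline:
low_lying_beach = {"geomorphology": 5, "land_use": 4, "elevation": 5, "distance": 5}
print(cell_vulnerability(low_lying_beach))
```

In a GIS this sum is evaluated per raster cell and the result reclassified into the discrete vulnerability levels shown on the maps.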

  20. Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Not Available

    1988-01-01

    The purpose of Probabilistic Seismic Hazard Analysis (PSHA) is to evaluate the hazard of seismic ground motion at a site by considering all possible earthquakes in the area, estimating the associated shaking at the site, and calculating the probabilities of these occurrences. The Panel on Seismic Hazard Analysis is charged with assessment of the capabilities, limitations, and future trends of PSHA in the context of alternatives. The report identifies and discusses key issues of PSHA and is addressed to decision makers with a modest scientific and technical background and to the scientific and technical community. 37 refs., 19 figs.
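
The PSHA computation summarized above can be sketched via the total-probability theorem: the annual rate of exceeding a ground-motion level is the sum over sources of each source's activity rate times the probability that the source produces shaking above that level. The source rates, median ground motions and dispersion below are hypothetical:

```python
import math

def norm_sf(z):
    """Standard normal survival function, 1 - CDF."""
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

def exceedance_rate(x, sources):
    """Annual rate of IM > x, summed over earthquake scenarios.

    sources: list of (nu, median_im, sigma_ln) where nu is the annual
    activity rate and the ground motion is lognormal about median_im.
    """
    return sum(nu * norm_sf((math.log(x) - math.log(med)) / sig)
               for nu, med, sig in sources)

sources = [(0.05, 0.10, 0.6),   # frequent, moderate shaking
           (0.002, 0.40, 0.6)]  # rare, strong shaking
rate = exceedance_rate(0.2, sources)    # /yr for PGA > 0.2 g
p50 = 1.0 - math.exp(-rate * 50.0)      # Poisson probability in 50 years
print(rate, p50)
```

Repeating this over a grid of levels x traces out the hazard curve for the site.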

  1. Sanitary Vulnerability of a Territorial System in High Seismic Areas

    NASA Astrophysics Data System (ADS)

    Teramo, A.; Termini, D.; de Domenico, D.; Marino, A.; Marullo, A.; Saccà, C.; Teramo, M.

    2009-12-01

    An evaluation procedure for the sanitary vulnerability of a territorial system falling within a high seismic risk area, related to the casualty treatment capability of hospitals after an earthquake, is proposed. The study aims to highlight hospital criticalities for the arrangement of a prevention policy, on the basis of territorial, demographic and sanitary analyses of a given area. This is the first step of a procedure for reading the territorial context within a damage scenario, addressed to verifying the preparedness level of the territorial system for a sanitary emergency attributable both to natural and to anthropic disasters. The results of the surveys carried out are shown, at different scales, for several sample areas of the Messina Province (Italy), evaluating the consistency of the damage scenario with the number of casualties, medical doctors and available beds for the implementation of an emergency sanitary circuit.

  2. Rapid assessment for seismic vulnerability of low and medium rise infilled RC frame buildings

    NASA Astrophysics Data System (ADS)

    Al-Nimry, Hanan; Resheidat, Musa; Qeran, Saddam

    2015-06-01

    An indexing method for rapid evaluation of the seismic vulnerability of infilled RC frame buildings in Jordan is proposed. The method aims at identifying low and medium rise residential buildings as safe or in need of further detailed evaluation. Following a rapid visual screening, the building is assigned a Basic Capacity Index (BCI); five performance modifiers are identified and multiplied by the BCI to arrive at the Capacity Index (CI) of the building. A Capacity Index lower than a limit CI value indicates that the screened building could experience moderate earthquake damage, whereas a higher value implies that minor damage, if any, would take place. To establish the basic evaluation parameters, forty RC frame buildings were selected, designed and analyzed using static nonlinear analysis, incorporating the effect of infill walls. Effects of seismicity, local site conditions, horizontal irregularities (setbacks and re-entrant corners), vertical irregularities (soft story at ground floor level) and overhangs on the seismic performance of local buildings were examined. Assessment forms were designed and used to evaluate and rank 112 sample buildings. About 40% of the surveyed buildings were found to be in need of detailed evaluation to better define their seismic vulnerabilities.
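    The screening logic described above reduces to a product of modifiers compared against a threshold. A hedged sketch with entirely hypothetical BCI, modifier and limit values (the paper's calibrated numbers are not reproduced here):

```python
def capacity_index(bci, modifiers):
    """CI = BCI multiplied by the performance modifiers (hypothetical values)."""
    ci = bci
    for m in modifiers:
        ci *= m
    return ci

def needs_detailed_evaluation(ci, ci_limit):
    """A CI below the limit flags the building for detailed evaluation."""
    return ci < ci_limit

# Illustrative numbers only -- e.g. penalties for a soft story and overhangs.
ci = capacity_index(bci=1.2, modifiers=[0.9, 1.0, 0.85, 1.0, 0.95])
print(ci, needs_detailed_evaluation(ci, ci_limit=1.0))
```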

  3. Fault zone regulation, seismic hazard, and social vulnerability in Los Angeles, California: Hazard or urban amenity?

    NASA Astrophysics Data System (ADS)

    Toké, Nathan A.; Boone, Christopher G.; Arrowsmith, J. Ramón

    2014-09-01

    Public perception and regulation of environmental hazards are important factors in the development and configuration of cities. Throughout California, probabilistic seismic hazard mapping and geologic investigations of active faults have spatially quantified earthquake hazard. In Los Angeles, these analyses have informed earthquake engineering, public awareness, the insurance industry, and the government regulation of developments near faults. Understanding the impact of natural hazards regulation on the social and built geography of cities is vital for informing future science and policy directions. We constructed a relative social vulnerability index classification for Los Angeles to examine the social condition within regions of significant seismic hazard, including areas regulated as Alquist-Priolo (AP) Act earthquake fault zones. Despite hazard disclosures, social vulnerability is lowest within AP regulatory zones and vulnerability increases with distance from them. Because the AP Act requires building setbacks from active faults, newer developments in these zones are bisected by parks. Parcel-level analysis demonstrates that homes adjacent to these fault zone parks are the most valuable in their neighborhoods. At a broad scale, a Landsat-based normalized difference vegetation index shows that greenness near AP zones is greater than the rest of the metropolitan area. In the parks-poor city of Los Angeles, fault zone regulation has contributed to the construction of park space within areas of earthquake hazard, thus transforming zones of natural hazard into amenities, attracting populations of relatively high social status, and demonstrating that the distribution of social vulnerability is sometimes more strongly tied to amenities than hazards.

  4. Integrated Estimation of Seismic Physical Vulnerability of Tehran Using Rule Based Granular Computing

    NASA Astrophysics Data System (ADS)

    Sheikhian, H.; Delavar, M. R.; Stein, A.

    2015-08-01

    Tehran, the capital of Iran, is surrounded by the North Tehran fault, the Mosha fault and the Rey fault. This exposes the city to possibly huge earthquakes followed by dramatic human loss and physical damage, in particular as it contains a large number of non-standard constructions and aged buildings. Estimation of the likely consequences of an earthquake facilitates mitigation of these losses. Mitigation of the earthquake fatalities may be achieved by promoting awareness of earthquake vulnerability and implementation of seismic vulnerability reduction measures. In this research, granular computing using generality and absolute support for rule extraction is applied. It uses coverage and entropy for rule prioritization. These rules are combined to form a granule tree that shows the order and relation of the extracted rules. In this way the seismic physical vulnerability is assessed, integrating the effects of the three major known faults. Effective parameters considered in the physical seismic vulnerability assessment are slope, seismic intensity, and height and age of the buildings. Experts were asked to predict seismic vulnerability for 100 randomly selected samples among more than 3000 statistical units in Tehran. The integrated experts' points of view serve as input into granular computing. Non-redundant covering rules preserve the consistency in the model, which resulted in 84% accuracy in the seismic vulnerability assessment based on the validation of the predicted test data against the expected vulnerability degree. The study concluded that granular computing is a useful method to assess the effects of earthquakes in an earthquake-prone area.
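    Rule prioritization by entropy, as used above, prefers rules whose covered units share a single vulnerability class. A small illustrative sketch; the rule conditions and class labels below are invented for the example:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of the class labels covered by a rule."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical rules mapped to the vulnerability classes of the units they cover.
rules = {
    "slope>30 & age>40": ["high", "high", "high", "high"],
    "height<3 & intensity=VII": ["high", "medium", "high", "low"],
}
# Lower entropy = purer rule = higher priority in the granule tree.
ranked = sorted(rules, key=lambda r: entropy(rules[r]))
print(ranked[0])  # the pure rule comes first (entropy 0.0)
```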

  5. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    NASA Astrophysics Data System (ADS)

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-01

    Earthquake is a destructive phenomenon that trembles different parts of the earth yearly and causes much destruction. Iran is one of the quake-prone (high seismicity) parts of the world and has suffered great financial damage and loss of life every year; schools are among the most important places to be protected during such crises. There was no special surveillance on the design and construction of school buildings in Tehran until the late 70's, and as Tehran sits on faults, instability of such buildings may cause irrecoverable financial damage and especially loss of life; preventing this is therefore an urgent need. For this purpose, some of the schools built during 67-78, mostly with steel braced frame structures, were selected. First, by evaluating the selected samples, gathering information and conducting a visual survey, the prepared questionnaires were filled out. With the use of the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and used. Then, for quantitative evaluation, using 3D computer models and nonlinear static analysis methods, a number of buildings selected from the qualitative evaluation were reevaluated, and finally the real behavior of the structures under earthquakes was studied with the nonlinear dynamic analysis method. The results of the qualitative and quantitative evaluations were compared and a proper pattern for seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  6. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    SciTech Connect

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-08

    Earthquake is a destructive phenomenon that trembles different parts of the earth yearly and causes much destruction. Iran is one of the quake-prone (high seismicity) parts of the world and has suffered great financial damage and loss of life every year; schools are among the most important places to be protected during such crises. There was no special surveillance on the design and construction of school buildings in Tehran until the late 70's, and as Tehran sits on faults, instability of such buildings may cause irrecoverable financial damage and especially loss of life; preventing this is therefore an urgent need. For this purpose, some of the schools built during 67-78, mostly with steel braced frame structures, were selected. First, by evaluating the selected samples, gathering information and conducting a visual survey, the prepared questionnaires were filled out. With the use of the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and used. Then, for quantitative evaluation, using 3D computer models and nonlinear static analysis methods, a number of buildings selected from the qualitative evaluation were reevaluated, and finally the real behavior of the structures under earthquakes was studied with the nonlinear dynamic analysis method. The results of the qualitative and quantitative evaluations were compared and a proper pattern for seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  7. Use of expert judgment elicitation to estimate seismic vulnerability of selected building types

    USGS Publications Warehouse

    Jaiswal, K.S.; Aspinall, W.; Perkins, D.; Wald, D.; Porter, K.A.

    2012-01-01

    Pooling engineering input on earthquake building vulnerability through an expert judgment elicitation process requires careful deliberation. This article provides an overview of expert judgment procedures including the Delphi approach and the Cooke performance-based method to estimate the seismic vulnerability of a building category.
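    Both the Delphi and the Cooke approaches mentioned above ultimately pool individual judgments; Cooke's method weights experts by calibration performance. A toy linear-pool sketch, where the estimates and weights are hypothetical and the real method's scoring of calibration and information from seed questions is omitted:

```python
def cooke_pool(estimates, weights):
    """Performance-weighted linear pool of expert estimates (sketch)."""
    total = sum(weights.values())
    return sum(estimates[e] * w / total for e, w in weights.items())

# Hypothetical: each expert's estimate of P(collapse | MMI VIII) for a building
# class, and a calibration-based weight from seed questions (both invented).
est = {"A": 0.10, "B": 0.25, "C": 0.15}
w = {"A": 0.6, "B": 0.1, "C": 0.3}
print(cooke_pool(est, w))  # pulled toward the better-calibrated expert A
```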

  8. Seismic Vulnerability Assessment Rest House Building TA-16-41

    SciTech Connect

    Cuesta, Isabel; Salmon, Michael W.

    2003-10-01

    The purpose of this report is to present the results of the evaluation completed on the Rest House Facility (TA-16-4111) in support of hazard analysis for a Documented Safety Assessment (DSA). The Rest House facility has been evaluated to verify the structural response to seismic, wind, and snow loads in support of the DynEx DSA. The structural analyses consider the structure and the following systems and/or components inside the facility, as requested by facility management: cranes, lightning protection system, and fire protection system. The facility has been assigned to Natural Phenomena Hazards (NPH) Performance Category (PC)-3. The facility structure was evaluated to PC-3 criteria because it serves to confine hazardous material, and in the event of an accident, the facility cannot fail or collapse. Seismic-induced failure of the cranes, lightning, and fire-protection systems according to DOE-STD-1021-93 (Ref. 1) “may result in adverse release consequences greater than safety-class Structures, Systems, and Components (SSC) Evaluation Guideline limits but much less than those associated with PC-4 SSC.” Therefore, these items will be evaluated to PC-3 criteria as well. This report presents the results of those analyses and suggests recommendations to improve the seismic capacity of the systems and components cited above.

  9. Metadata for selecting or submitting generic seismic vulnerability functions via GEM's vulnerability database

    USGS Publications Warehouse

    Jaiswal, Kishor

    2013-01-01

    This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit the vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to provide users guidance on limitations and applicability by providing the associated modeling assumptions and applicability of each vulnerability or fragility function.

  10. A S.M.A.R.T. system for the seismic vulnerability mitigation of Cultural Heritages

    NASA Astrophysics Data System (ADS)

    Montuori, Antonio; Costanzo, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Minasi, Mario; Falcone, Sergio; La Piana, Carmelo; Stramondo, Salvatore; Casula, Giuseppe; Giovanna Bianchi, Maria; Fabrizia Buongiorno, Maria; Musacchio, Massimo; Doumaz, Fawzi; Ilaria Pannaccione Apa, Maria

    2016-04-01

    Both the assessment and the mitigation of seismic vulnerability connected to cultural heritage monitoring are non-trivial issues, based on the knowledge of structural and environmental factors potentially impacting the cultural heritage. A holistic approach could be suitable to provide effective monitoring of cultural heritage within its surroundings at different spatial and temporal scales. On the one hand, the analysis of the geometrical and structural properties of monuments is important to assess their state of conservation, their response to external stresses as well as anomalies related to natural and/or anthropogenic phenomena (e.g. the aging of materials, seismic stresses, vibrational modes). On the other hand, the investigation of the surrounding area is relevant to assess environmental properties and natural phenomena (e.g. landslides, earthquakes, subsidence, seismic response) as well as their related impacts on the monuments. Within such a framework, a multi-disciplinary system has been developed and is presented here for the monitoring of cultural heritage for seismic vulnerability assessment and mitigation purposes. It merges geophysical investigations and modeling, in situ measurements and multi-platform remote sensing sensors for the non-destructive and non-invasive multi-scale monitoring of historic buildings in a seismic-prone area. In detail, the system provides: a) the long-term, regional-scale analysis of the buildings' environment through the integration of seismogenic analysis, airborne magnetic surveys, space-borne Synthetic Aperture Radar (SAR) and multi-spectral sensors. They allow describing the sub-surface fault systems, the surface deformation processes and the land use mapping of the regional-scale area on an annual temporal span; b) the short-term, basin-scale analysis of the building's neighborhood through geological setting and geotechnical surveys, airborne Light Detection and Ranging (LiDAR) and ground-based SAR sensors. They

  11. Seismic Data Analysis Center

    NASA Astrophysics Data System (ADS)

    1983-01-01

    The effort required to operate and maintain the Seismic Data Analysis Center during the fiscal year of 1981 is described. Statistics concerning the operational effectiveness and the utilization of the systems at the Center are also given. The major activities associated with maintaining the operating systems, providing data services, and performing maintenance are discussed. The development effort and improvements made to the systems supporting the geophysical research include capabilities added to the Regional Event Location System and the Automatic Association program. Other tasks reported include the result of implementing a front end processor (called an intelligent line interface) to do real time signal detection, the effects of altering the configuration of the detection systems, and the status of software developed to do interactive discrimination. A computer study was performed to determine a preferred system to accomplish the on-line data recording and support the data services activity.

  12. Seismic vulnerability: theory and application to Algerian buildings

    NASA Astrophysics Data System (ADS)

    Mebarki, Ahmed; Boukri, Mehdi; Laribi, Abderrahmane; Farsi, Mohammed; Belazougui, Mohamed; Kharchi, Fattoum

    2014-04-01

    results to the observed damages. For pre-earthquake analysis, the methodology widely used around the world relies on the prior calibration of the seismic response of the structures under given expected scenarios. As the structural response is governed by the constitutive materials and structural typology as well as by the seismic input and soil conditions, the damage prediction depends intimately on the accuracy of the so-called fragility curve and response spectrum established for each type of structure (RC framed structures, confined or unconfined masonry, etc.) and soil (hard rock, soft soil, etc.). In the present study, the adaptation to Algerian buildings concerns the specific soil conditions as well as the structural dynamic response. The theoretical prediction of the expected damages is helpful for the calibration of the methodology. Thousands (~3,700) of real structures and the damages caused by the earthquake (Boumerdes, Algeria: Mw = 6.8, May 21, 2003) are considered for the a posteriori calibration and validation process. The theoretical predictions show the importance of the elastic response spectrum, the local soil conditions, and the structural typology. Although the observed and predicted categories of damage are close, it appears that the existing form used for visual damage inspection would still require further improvements, in order to allow easy evaluation and identification of the damage level. These methods, coupled with databases and GIS tools, could be helpful for local and technical authorities during the post-earthquake evaluation process: real-time information on the damage extent at urban or regional scales as well as the extent of losses and the resources required for reconstruction, evacuation, strengthening, etc.
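    Fragility curves of the kind calibrated above are typically lognormal in a ground-motion intensity measure. A minimal sketch; the median capacity theta and dispersion beta below are placeholders, not the Algerian calibration:

```python
import math

def fragility(sa, theta, beta):
    """Lognormal fragility: P(damage >= state | Sa) = Phi(ln(Sa/theta) / beta)."""
    z = math.log(sa / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical median capacity theta (in g) and dispersion beta for one
# building class; at Sa = theta the exceedance probability is exactly 50%.
for sa in (0.1, 0.3, 0.6):
    print(f"Sa={sa}g -> P(damage)={fragility(sa, theta=0.3, beta=0.5):.2f}")
```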

  13. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Bednárik, Martin; Brimich, Ladislav

    2015-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises the important issue for society of how to react to the natural hazard: potential damages are huge, but infrastructure costs for addressing these hazards are huge as well. Furthermore, seismic hazard is only one of the many hazards facing society. Societal means need to be distributed in a reasonable manner - to assure that all of these hazards (natural as well as societal) are addressed appropriately. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test PSHA models is to compare with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context should be information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle - if that exists. Such information would be very valuable, even if it concerned only a single site, namely that of a particularly sensitive infrastructure. Such a request may seem hopeless - but it is not. Long-term information can in principle be gained from intact stalagmites in natural caves. These have survived all earthquakes that have occurred over thousands of years - depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We are focusing here on case studies in Austria, which has moderate seismicity, but a well-documented history of major earthquake-induced damage, e.g., Villach in 1348 and 1690, Vienna in 1590, Leoben in 1794, and Innsbruck in 1551, 1572, and 1589. Seismic intensities have reached levels up to 10. It is clearly important to know which "worst-case" damages to expect.
We have identified sets of particularly sensitive stalagmites in the general vicinity of two major cities in

  14. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and to assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of the seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design-basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  15. Comparative Application of Capacity Models for Seismic Vulnerability Evaluation of Existing RC Structures

    SciTech Connect

    Faella, C.; Lima, C.; Martinelli, E.; Nigro, E.

    2008-07-08

    Seismic vulnerability assessment of existing buildings is one of the most common tasks in which structural engineers are currently engaged. Since it is often a preliminary step in approaching the issue of how to retrofit structures not designed and detailed for seismic loads, it plays a key role in the successful choice of the most suitable strengthening technique. In this framework, the basic information for both seismic assessment and retrofitting relates to the formulation of capacity models for structural members. Plenty of proposals, often contradictory from a quantitative standpoint, are currently available in the technical and scientific literature for defining structural capacity in terms of forces and displacements, possibly with reference to different parameters representing the seismic response. The present paper briefly reviews some of the models for the capacity of RC members and compares them with reference to two case studies assumed as representative of a wide class of existing buildings.

  16. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing RIGID Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is invoked by setting parameter values in the Bulk Data.

  17. Unauthorised development and seismic hazard vulnerability: a study of squatters and engineers in Istanbul, Turkey.

    PubMed

    Green, Rebekah A

    2008-09-01

    Many cities in developing nations have experienced an influx of poor migrants in search of work. This population influx has often been accommodated through land squatting, irregular construction and unauthorised housing. For the urban poor, this has resulted in immediate affordable housing; however, this housing frequently has long-term vulnerability to natural hazards. This article examines the ways in which squatters in Istanbul, Turkey, understand the seismic vulnerability of their unauthorised housing. Distrust of professional engineers and contractors has led Istanbul squatters to believe that self-built housing will not only be less costly but also safer than commercially built housing. The impact of residents' risk perceptions on their vulnerability to natural hazards is examined through a comparison of social attitudes regarding safe housing and the quality of unauthorised construction. This comparison highlights how squatters' risk perceptions necessitate innovative means of reducing vulnerability in unauthorised neighbourhoods of developing cities.

  18. Generalized seismic analysis

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1993-01-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts have been made to cope with this problem over the years, and they usually have ended up with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at a base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points. He would then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity have to sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128. This method was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  19. Application of PRA to HEMP vulnerability analysis

    SciTech Connect

    Mensing, R.W.

    1985-09-01

    Vulnerability analyses of large systems, e.g., control and communication centers, aircraft, ships, are subject to many uncertainties. A basic source of uncertainty is the random variation inherent in the physical world. Thus, vulnerability is appropriately described by an estimate of the probability of survival (or failure). A second source of uncertainty that also needs to be recognized is the uncertainty associated with the analysis or estimation process itself. This uncertainty, often called modeling uncertainty, has many contributors. There are the approximations introduced by using mathematical models to describe reality. Also, the appropriate values of the model parameters are derived from several sources, e.g., based on experimental or test data, based on expert judgment and opinion. In any case, these values are subject to uncertainty. This uncertainty must be considered in the description of vulnerability. Thus, the estimate of the probability of survival is not a single value but a range of values. Probabilistic risk analysis (PRA) is a methodology which deals with these uncertainty issues. This report discusses the application of PRA to HEMP vulnerability analyses. Vulnerability analysis and PRA are briefly outlined and the need to distinguish between random variation and modeling uncertainty is discussed. Then a sequence of steps appropriate for applying PRA to vulnerability problems is outlined. Finally, methods for handling modeling uncertainty are identified and discussed.
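    The distinction drawn above between random variation and modeling uncertainty is often implemented as a two-loop (nested) Monte Carlo: the outer loop samples the uncertain model parameters, the inner loop samples the physical randomness, and the result is a range of survival probabilities rather than a single value. A toy sketch with invented distributions:

```python
import random
import statistics

random.seed(42)

def survival_probability(threshold_mean, n_inner=2000):
    """Inner loop (random variation): component strength vs. a fixed stress."""
    stress = 1.0
    survive = sum(1 for _ in range(n_inner)
                  if random.gauss(threshold_mean, 0.3) > stress)
    return survive / n_inner

# Outer loop (modeling uncertainty): the mean strength itself is uncertain.
estimates = [survival_probability(random.uniform(1.1, 1.5)) for _ in range(200)]
lo, hi = min(estimates), max(estimates)
print(f"survival probability: {statistics.median(estimates):.2f} "
      f"(range {lo:.2f}-{hi:.2f})")  # a band of values, not a point estimate
```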

  20. Uncertainty Management in Seismic Vulnerability Assessment Using Granular Computing Based on Covering of Universe

    NASA Astrophysics Data System (ADS)

    Khamespanah, F.; Delavar, M. R.; Zare, M.

    2013-05-01

    Earthquake is an abrupt displacement of the earth's crust caused by the release of strain accumulated along faults or by volcanic eruptions. As a recurring natural cataclysm, earthquakes have always been a matter of concern in Tehran, the capital of Iran, a city lying on a number of known and unknown faults. Earthquakes can cause severe physical, psychological and financial damage. Consequently, procedures should be developed to assist in modelling the potential casualties and their spatial uncertainty. One of these procedures is the production of seismic vulnerability maps that support preventive measures to mitigate the physical and financial losses of future earthquakes. Since vulnerability assessment is a multi-criteria decision-making problem depending on several parameters and experts' judgments, it is undoubtedly characterized by intrinsic uncertainties. In this study, a Granular Computing (GrC) model based on a covering of the universe is used to handle the spatial uncertainty. Granular computing concentrates on a general theory and methodology for problem solving and information processing by assuming multiple levels of granularity. Basic elements in granular computing are subsets, classes, and clusters of a universe, called granules. In this research GrC is used for extracting classification rules based on seismic vulnerability with minimum entropy to handle uncertainty related to earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition of the universe was employed. That model has some limitations in defining similarity between elements of the universe and in defining granules: similarity between elements is defined by an equivalence relation, under which two objects are similar with respect to some attributes only if, for each attribute, their values are equal. In this research a general relation for defining similarity between
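    The partition-model limitation described above (two units fall into the same granule only when their values agree on every attribute) can be made concrete with a small sketch; the statistical units and attribute values are hypothetical:

```python
from collections import defaultdict

def partition_granules(objects, attributes):
    """Partition-model granulation: objects are grouped as similar only when
    their values agree on *every* listed attribute (an equivalence relation)."""
    granules = defaultdict(list)
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        granules[key].append(name)
    return list(granules.values())

# Hypothetical statistical units with vulnerability-related attributes.
units = {
    "u1": {"slope": "steep", "age": "old"},
    "u2": {"slope": "steep", "age": "old"},
    "u3": {"slope": "steep", "age": "new"},  # one attribute differs -> new granule
}
print(partition_granules(units, ["slope", "age"]))  # [['u1', 'u2'], ['u3']]
```

    A covering-based model relaxes this strict equality, allowing an element to belong to overlapping granules; that generalization is the subject of the abstract above.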

  1. Vulnerability

    NASA Technical Reports Server (NTRS)

    Taback, I.

    1979-01-01

    The discussion of vulnerability begins with a description of some of the electrical characteristics of fibers before defining how vulnerability calculations are done. The vulnerability results obtained to date are presented, and the discussion touches on post-exposure vulnerability. After a description of some shock hazard work now underway, the discussion leads into a description of the planned effort, and some preliminary conclusions are presented.

  2. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of the ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopts the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considers 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution, capable of providing detailed information about the location, value and vulnerability classification of the exposed elements, were selected for the evaluation. The results of this study were computed with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling the seismic risk from hazard in an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
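
    The two hazard levels quoted above follow the conventional Poisson conversion between exceedance probability and return period, T = -t / ln(1 - p); a minimal sketch:

```python
import math

def return_period(p_exceed, t_years):
    """Return period (years) for probability p_exceed of at least one
    exceedance in t_years, assuming Poissonian event occurrence."""
    annual_rate = -math.log(1.0 - p_exceed) / t_years
    return 1.0 / annual_rate

# The two hazard levels quoted in the abstract:
t475 = return_period(0.10, 50)   # ~475 years
t2475 = return_period(0.02, 50)  # ~2475 years
```

    The familiar "475-year" figure is thus a rounded consequence of choosing 10% in 50 years, not an independently chosen number.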

  3. Information systems vulnerability: A systems analysis perspective

    SciTech Connect

    Wyss, G.D.; Daniel, S.L.; Schriner, H.K.; Gaylor, T.R.

    1996-07-01

    Vulnerability analyses for information systems are complicated because the systems are often geographically distributed. Sandia National Laboratories has assembled an interdisciplinary team to explore the applicability of probabilistic logic modeling (PLM) techniques (including vulnerability and vital area analysis) to examine the risks associated with networked information systems. The authors have found that the reliability and failure modes of many network technologies can be effectively assessed using fault trees and other PLM methods. The results of these models are compatible with an expanded set of vital area analysis techniques that can model both physical locations and virtual (logical) locations to identify both categories of vital areas simultaneously. These results can also be used with optimization techniques to direct the analyst toward the most cost-effective security solution.
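
    The fault-tree side of the PLM toolkit reduces, for independent basic events, to combining probabilities through AND and OR gates; a minimal sketch (the outage tree and probabilities below are hypothetical, not from the Sandia models):

```python
def or_gate(probs):
    """P(at least one event occurs) = 1 - prod(1 - p_i), independent events."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """P(all events occur) = prod(p_i), independent events."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical network-outage tree:
# top = OR(router failure, AND(primary link down, backup link down))
p_top = or_gate([0.01, and_gate([0.05, 0.05])])
```

    Real fault trees add common-cause terms and non-independent events, but this gate arithmetic is the core that the vital-area analysis builds on.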

  4. Aircraft vulnerability analysis by modeling and simulation

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. ManPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet, not all the launched missiles hit their targets; the miss being either attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. 
A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  5. Seismic piping test and analysis

    SciTech Connect

    Not Available

    1980-09-01

    This report presents selected results to date of a dynamic testing and analysis program focusing on a piping system at Consolidated Edison Company of New York's Indian Point-1 Nuclear Generating Station. The goal of this research program is the development of more accurate and realistic models of piping systems subjected to seismic, hydraulic, operating, and other dynamic loads. The program seeks to identify piping system properties significant to dynamic response rather than seeking to simulate any particular form of excitation. The fundamental experimental approach is the excitation of piping/restraint devices/supports by a variety of dynamic test methods and the analysis of the resulting response to identify the characteristic dynamic properties of the system tested. The comparison of the identified dynamic properties to those predicted by alternative analytical approaches will support improvements in methods used in the dynamic analysis of piping, restraint devices, and supports.

  6. Analysis of the ambient seismic noise at Bulgarian seismic stations

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Nikolova, Svetlana

    2010-05-01

    Modernization of the Bulgarian National Seismological Network was performed over the course of a month in 2005. Broadband seismometers and 24-bit digital acquisition systems with a dynamic range above 132 dB (type DAS130-01, produced by RefTek Inc.) were installed at the seismic stations of the existing analog network. In the present study the ambient seismic noise at the Bulgarian National Digital Seismological Network (BNDSN) stations is evaluated. In order to compare the performance of the network against international standards, a detailed analysis of the seismic noise was performed using software and models applied in international practice. The method of McNamara and Buland was applied, and the software code PDFSA was used, to determine the power spectral density (PSD) of the background noise and to evaluate its probability density function (PDF). The levels of the ambient seismic noise were determined, and the full range of factors influencing the quality of the data and the performance of a seismic station was analyzed. The estimated PSD functions were compared against the two models for high (NHNM) and low (NLNM) noise that are widely used in seismological practice to assess the monitoring quality of seismic stations. The mode PDFs are used to prepare annual, seasonal, diurnal and frequency analyses of the noise levels at BNDSN stations. The annual analysis shows that the noise levels at the northern Bulgarian stations are higher than those at the central and southern stations for microseism periods (1 s-7 s). This is well observable at stations PRV and PSN, located near the Black Sea, and is also due to the different geological conditions at the seismic stations. For the periods of "cultural" noise, the power distribution depends on the type of noise sources and, as a rule, is related to human activities at or near the Earth's surface. Seismic stations MPE, VTS and MMB have the lowest mode noise levels, and the noisiest stations are PGB, PVL and JMB. The seasonal
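
    The McNamara-and-Buland approach estimates PSDs over many data segments and then builds a probability density of the resulting power levels. A simplified pure-NumPy sketch of the segment-PSD step is shown below; the Hann window, segment count, sample rate and synthetic white noise are illustrative assumptions, not the PDFSA implementation.

```python
import numpy as np

def segment_psds(x, fs, nseg):
    """One-sided PSDs of non-overlapping Hann-windowed segments,
    in the spirit of the McNamara & Buland PDF method."""
    seglen = len(x) // nseg
    win = np.hanning(seglen)
    norm = fs * (win ** 2).sum()      # window power normalization
    psds = []
    for i in range(nseg):
        seg = x[i * seglen:(i + 1) * seglen] * win
        spec = np.fft.rfft(seg)
        psds.append(2.0 * np.abs(spec) ** 2 / norm)
    freqs = np.fft.rfftfreq(seglen, d=1.0 / fs)
    return freqs, np.array(psds)

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)     # stand-in for station noise
freqs, psds = segment_psds(noise, fs=100.0, nseg=8)
psd_db = 10.0 * np.log10(psds.mean(axis=0) + 1e-30)   # dB, as in NHNM/NLNM plots
```

    In the full method, the per-segment PSDs (in dB) are binned per frequency to form the PDF whose mode is compared against the NHNM and NLNM reference curves.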

  7. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
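
    The search for low-effort attack paths over a weighted attack graph can be sketched with a standard shortest-path algorithm; the states, edges and effort weights below are hypothetical, not from the patent's attack templates.

```python
import heapq

def min_effort_path(graph, start, goal):
    """Dijkstra over an attack graph: nodes are attack states, edge
    weights model attacker effort; returns (total_effort, path)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float('inf'), []

# Hypothetical attack graph (state names and weights are made up):
g = {
    'outside':      [('dmz_shell', 3.0), ('phished_user', 1.0)],
    'dmz_shell':    [('db_admin', 4.0)],
    'phished_user': [('dmz_shell', 1.5), ('db_admin', 6.0)],
}
effort, path = min_effort_path(g, 'outside', 'db_admin')
```

    "Epsilon optimal paths" generalize this to all paths within epsilon of the optimum, which identifies the high-risk path family rather than a single route.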

  8. Unified Approach to Vulnerability Analysis of Web Applications

    NASA Astrophysics Data System (ADS)

    Le, H. T.; Loh, P. K. K.

    2008-11-01

    Web vulnerabilities in web-based applications may be detected, classified and documented. Several web scanners exist for vulnerabilities in web applications implemented via different technologies; however, none of them provides technology-independent, generic coverage of possible vulnerabilities. In this project, funded by Mindef Singapore, we propose a new approach to web application security and vulnerability analysis. The design addresses the categorization of scanner results with a generic data model and the design of a language-independent rule-based engine that detects, analyses and reports suspected vulnerabilities in web-based applications.
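
    A language-independent rule-based engine of the kind proposed can be sketched as a table of vulnerability classes, each with a pattern applied to a normalized request/response record; the rules below are simplified illustrations invented for this sketch, not the project's actual rule set.

```python
import re

# Hypothetical technology-independent rules: vulnerability class -> pattern.
RULES = [
    ('sql_injection',  re.compile(r"(?i)union\s+select|'\s*or\s+1=1")),
    ('xss',            re.compile(r"(?i)<script\b")),
    ('path_traversal', re.compile(r"\.\./")),
]

def scan(record):
    """Return the vulnerability classes whose rule matches the record."""
    return [name for name, pat in RULES if pat.search(record)]

findings = scan("GET /item?id=1' OR 1=1 --")
```

    Because the rules operate on normalized records rather than on framework-specific artifacts, the same engine can cover applications written in different server-side technologies.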

  9. Seismic hazard analysis at Rocky Flats Plant

    SciTech Connect

    McGuire, R.K.

    1993-10-01

    A probabilistic seismic hazard analysis is being conducted for the DOE Rocky Flats Plant, Jefferson County, Colorado. This is part of the overall review of the seismic exposure of facilities being conducted by DOE. The study has four major elements. (1) The historical seismicity in Colorado is being reviewed and synthesized to estimate historical rates of earthquake activity in the region of the site. (2) The geologic and tectonic evidence in Colorado and along the Front Range is being reviewed to determine appropriate seismic zones, potentially active faults, and constraints on fault slip rates. (3) Earthquake ground motion equations are being derived based on seismological knowledge of the earth's crust. Site-specific soil amplification factors are also being developed using on-site shear wave velocity measurements. (4) The probability of exceedance of various seismic ground motion levels is being calculated based on the inputs developed on tectonic sources, faults, ground motion, and soil amplification. Deterministic ground motion estimates are also being made. This study is a state-of-the-art analysis of seismic hazard. It incorporates uncertainties in the major aspects governing seismic hazard, and has a documented basis founded on solid data interpretations for the ranges of inputs used. The results will be a valid basis on which to evaluate plant structures, equipment, and components for seismic effects.
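
    The exceedance calculation in element (4) can be illustrated with a toy single-source hazard integral: the source's activity rate is combined with a magnitude distribution and a lognormal ground-motion model. The activity rate, magnitude probabilities and attenuation relation below are invented for illustration, not the Rocky Flats inputs.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def exceedance_rate(a, nu, mags, mag_probs, ln_pga_median, sigma):
    """Annual rate of PGA > a from one source: activity rate nu times
    the magnitude-weighted probability of exceedance, with lognormal
    ground-motion variability (log-std sigma)."""
    rate = 0.0
    for m, pm in zip(mags, mag_probs):
        z = (math.log(a) - ln_pga_median(m)) / sigma
        rate += nu * pm * (1.0 - phi(z))
    return rate

# Toy source: ln median PGA grows linearly with magnitude (hypothetical).
ln_med = lambda m: -4.0 + 0.8 * m
mags = [5.0, 6.0, 7.0]
probs = [0.7, 0.25, 0.05]
curve = [exceedance_rate(a, nu=0.2, mags=mags, mag_probs=probs,
                         ln_pga_median=ln_med, sigma=0.6)
         for a in (0.05, 0.1, 0.2, 0.4)]
```

    Summing such rates over all sources and converting rate to probability yields the hazard curve; the rate necessarily decreases as the ground-motion level rises.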

  10. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.

  11. a Novel Approach to Support Majority Voting in Spatial Group Mcdm Using Density Induced Owa Operator for Seismic Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    Being one of the most frightening disasters, earthquakes frequently cause huge damage to buildings, facilities and human beings. Although predicting the characteristics of an earthquake seems to be impossible, its loss and damage are predictable in advance. Seismic loss estimation models evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes, including the age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi-criteria decision-making models have been proposed based on a single expert. The main objective of this paper is to propose a model that facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is to provide a computational tool that measures the degree to which different experts support each other's opinions, and to make the decision with respect to this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that neglecting the experts who get lower degrees of support from the others enables the decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
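
    The majority-voting idea, measuring how strongly each expert is supported by the others and discarding poorly supported outliers before aggregating, can be sketched as follows. The support measure, threshold and expert scores are simplified assumptions for illustration, not the paper's density-induced OWA operator.

```python
def support(a, b):
    """Degree to which expert score vectors a and b support each other
    (1 = identical scores, 0 = maximally apart on a 0-1 scale)."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def majority_aggregate(experts, threshold=0.6):
    """Average the scores of experts whose mean support from the
    others reaches the threshold (a simple majority-voting filter)."""
    kept = []
    for i, e in enumerate(experts):
        others = [support(e, o) for j, o in enumerate(experts) if j != i]
        if sum(others) / len(others) >= threshold:
            kept.append(e)
    n = len(kept[0])
    return [sum(e[k] for e in kept) / len(kept) for k in range(n)]

# Three hypothetical experts scoring two zones; the third is an outlier.
experts = [[0.8, 0.3], [0.7, 0.4], [0.1, 0.9]]
vuln = majority_aggregate(experts)
```

    Here the outlier expert receives low support from the other two and is excluded, which is exactly the mechanism that steers the group decision away from extreme strategies.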

  12. Initial guidelines for probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.

    1994-10-01

    In the late 1980s, the methodology for performing probabilistic seismic hazard analysis (PSHA) was exercised extensively for eastern-U.S. nuclear power plant sites by the Electric Power Research Institute (EPRI) and Lawrence Livermore National Laboratory (LLNL) under NRC sponsorship. Unfortunately, the seismic-hazard-curve results of these two studies differed substantially for many of the eastern reactor sites, which has motivated all concerned to revisit the approaches taken. This project is that revisitation.

  13. Sensitivity Analysis of Ordered Weighted Averaging Operator in Earthquake Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.

    2013-09-01

    The main objective of this research is to find the extent to which the minimal-variability Ordered Weighted Averaging (OWA) model of seismic vulnerability assessment is sensitive to variation of the optimism degree. A variety of models have been proposed for seismic vulnerability assessment; to examine their efficiency, the stability of their results can be analysed. Seismic vulnerability assessment is done to estimate the probable losses in a future earthquake. Multi-Criteria Decision Making (MCDM) methods have been applied by a number of researchers to estimate the human, physical and financial losses in urban areas. The study area of this research is the Tehran Metropolitan Area (TMA), which has more than eight million inhabitants. This paper assumes that the North Tehran Fault (NTF) is activated and causes an earthquake in the TMA. 1996 census data are used to extract the attribute values for six effective criteria in seismic vulnerability assessment. The results demonstrate that the minimal-variability OWA model of Seismic Loss Estimation (SLE) is more stable where the aggregated seismic vulnerability degree has a lower value. Moreover, minimal-variability OWA is very sensitive to the optimism degree in the northern areas of Tehran. A number of statistical units in the southern areas of the city also show considerable sensitivity to the optimism degree, due to numerous non-standard buildings. In addition, the change of seismic vulnerability degree caused by variation of the optimism degree does not exceed 25% of the original value, which means that the overall accuracy of the model is acceptable.
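
    A common way to tie OWA weights to an optimism parameter is through a RIM quantifier Q(r) = r^alpha; it is used here as an illustrative stand-in for the paper's minimal-variability weights, and the criterion scores are invented.

```python
def owa_weights(n, alpha):
    """OWA weights from the RIM quantifier Q(r) = r**alpha;
    alpha < 1 is optimistic (favors high scores), alpha > 1 pessimistic."""
    q = lambda r: r ** alpha
    return [q(i / n) - q((i - 1) / n) for i in range(1, n + 1)]

def owa(values, alpha):
    """OWA aggregation: weights applied to the descending-sorted values."""
    w = owa_weights(len(values), alpha)
    ordered = sorted(values, reverse=True)
    return sum(wi * vi for wi, vi in zip(w, ordered))

scores = [0.9, 0.5, 0.2]                     # criterion scores for one zone
sweep = [owa(scores, a) for a in (0.5, 1.0, 2.0)]   # optimism sweep
```

    Sweeping alpha and recording how much the aggregated vulnerability degree moves is precisely the kind of sensitivity analysis the abstract describes; at alpha = 1 the operator reduces to the plain average.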

  14. Verification of the data on critical facilities inventory and vulnerability for seismic risk assessment taking into account possible accidents

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Aleksander

    2015-04-01

    The paper contains the results of a recent study by the Seismological Center of IGE, Russian Academy of Sciences, and the Extreme Situations Research Center within the Russian Academy of Sciences project "Theoretical and methodological basis for seismic risk assessment taking into account technological accidents at the local level; constructing the seismic risk maps for the Big Sochi City territory including the venue of Olympic Games facilities." A procedure for verifying critical-facility inventory and vulnerability data that makes use of space images and web technologies in social networks is presented. Numerical values of the criteria for accidents at fire- and chemically hazardous facilities triggered by strong earthquakes are obtained. The seismic risk maps for the Big Sochi City territory, including the Olympic Games venue, were constructed taking into account new data on critical facilities, obtained using panoramic photos of these facilities, high-resolution space images and web technologies. The obtained values of individual seismic risk taking into account secondary technological accidents exceed the values of seismic risk without secondary hazards (return period T = 500 years) by 0.5-1.0 x 10^-5 per year.

  15. Probabilistic seismic demand analysis of nonlinear structures

    NASA Astrophysics Data System (ADS)

    Shome, Nilesh

    Recent earthquakes in California have initiated improvements in current design philosophy, and at present the civil engineering community is working towards performance-based earthquake engineering of structures. The objective of this study is to develop efficient but accurate procedures for probabilistic analysis of the nonlinear seismic behavior of structures. The proposed procedures help the near-term development of seismic building assessments, which require an estimation of seismic demand at a given intensity level. We also develop procedures to estimate the probability of exceedance of any specified nonlinear response level due to future ground motions at a specific site. This is referred to as Probabilistic Seismic Demand Analysis (PSDA). The latter procedure prepares the way for the next-stage development of seismic assessment that considers the uncertainties in nonlinear response and capacity. The proposed procedures require structure-specific nonlinear analyses for a relatively small set of recorded accelerograms and (site-specific or USGS-map-like) seismic hazard analyses. We have addressed some of the important issues of nonlinear seismic demand analysis: the selection of records for structural analysis, the number of records to be used, the scaling of records, etc. Initially these issues are studied through nonlinear analysis of structures for a number of magnitude-distance bins of records. Subsequently we introduce regression analysis of response results against spectral acceleration, magnitude, duration, etc., which helps to resolve these issues more systematically. We illustrate the demand-hazard calculations through two major example problems: a 5-story and a 20-story SMRF building. Several simple but quite accurate closed-form solutions have also been proposed to expedite the demand-hazard calculations. We find that vector-valued (e.g., 2-D) PSDA estimates demand hazard more accurately.
This procedure, however, requires information about 2
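
    The regression of nonlinear response against spectral acceleration that PSDA relies on is conventionally done in log-log space, yielding a power-law demand model plus a lognormal dispersion. A sketch on synthetic data follows; the demand model, exponent and scatter are invented for illustration, not results from the thesis.

```python
import numpy as np

# Synthetic PSDA-style data: drift demand vs spectral acceleration,
# with lognormal scatter (values are illustrative).
rng = np.random.default_rng(1)
sa = np.linspace(0.2, 2.0, 40)                  # spectral acceleration (g)
drift = 0.01 * sa ** 1.2 * np.exp(rng.normal(0.0, 0.3, sa.size))

# Fit ln(demand) = ln(a) + b * ln(Sa) by least squares.
b, ln_a = np.polyfit(np.log(sa), np.log(drift), 1)

# Dispersion of the residuals: the record-to-record variability term
# that enters the closed-form demand-hazard solutions.
dispersion = np.std(np.log(drift) - (ln_a + b * np.log(sa)))
```

    The fitted exponent b and dispersion are the two quantities that, combined with the site hazard curve, give the closed-form demand-hazard estimates mentioned above.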

  16. Clustering analysis of seismicity and aftershock identification.

    PubMed

    Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry

    2008-07-01

    We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004)10.1103/PhysRevE.69.066106] based on the space-time-magnitude nearest-neighbor distance eta between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance eta has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of eta is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California.
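
    The nearest-neighbor distance of Baiesi and Paczuski combines the time separation, spatial separation and parent magnitude into a single scalar. A minimal sketch is given below; the b-value, fractal dimension and the toy three-event catalog are assumptions, and the Weibull-based split into clustered and nonclustered populations is omitted.

```python
import math

def nn_distance(parent, child, b=1.0, df=1.6):
    """Baiesi-Paczuski space-time-magnitude distance eta from an earlier
    event (parent) to a later event (child):
    eta = t * r**df * 10**(-b * m_parent), t in years, r in km."""
    t = child['time'] - parent['time']
    if t <= 0:
        return float('inf')          # only earlier events can be parents
    r = math.hypot(child['x'] - parent['x'], child['y'] - parent['y'])
    return t * max(r, 1e-3) ** df * 10.0 ** (-b * parent['mag'])

catalog = [
    {'time': 0.00, 'x': 0.0,  'y': 0.0,  'mag': 6.0},   # mainshock
    {'time': 0.01, 'x': 2.0,  'y': 0.0,  'mag': 3.5},   # nearby, soon after
    {'time': 0.90, 'x': 80.0, 'y': 60.0, 'mag': 3.5},   # far, much later
]
eta_aftershock = nn_distance(catalog[0], catalog[1])
eta_background = nn_distance(catalog[0], catalog[2])
```

    Aftershock-like pairs produce eta values orders of magnitude smaller than background pairs, which is what makes the bimodal eta distribution usable for nonparametric aftershock identification.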

  17. Space Station Program threat and vulnerability analysis

    NASA Technical Reports Server (NTRS)

    Van Meter, Steven D.; Veatch, John D.

    1987-01-01

    An examination has been made of the physical security of the Space Station Program at the Kennedy Space Center in a peacetime environment, in order to furnish facility personnel with threat/vulnerability information. A risk-management approach is used to prioritize threat-target combinations that are characterized in terms of 'insiders' and 'outsiders'. Potential targets were identified and analyzed with a view to their attractiveness to an adversary, as well as to the consequentiality of the resulting damage.

  18. Seismic Vulnerability Assessment Waste Characterization Reduction and Repackaging Building, TA-50-69

    SciTech Connect

    M. W. Sullivan; J. Ruminer; I. Cuesta

    2003-02-02

    This report presents the results of the seismic structural analyses completed on the Waste Characterization Reduction and Repackaging (WCRR) Building in support of ongoing safety analyses. WCRR is designated as TA-50-69 at Los Alamos National Laboratory, Los Alamos, New Mexico. The facility has been evaluated against Department of Energy (DOE) seismic criteria for Natural Phenomena Hazards (NPH) Performance Category II (PC 2). The seismic capacities of two subsystems within the WCRR building, the material handling glove box and the lift rack immediately adjacent to the Glove Box are also documented, and the results are presented.

  19. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  20. Seismic vulnerability of the Himalayan half-dressed rubble stone masonry structures, experimental and analytical studies

    NASA Astrophysics Data System (ADS)

    Ahmad, N.; Ali, Q.; Ashraf, M.; Alam, B.; Naeem, A.

    2012-11-01

    Half-dressed rubble stone (DS) masonry structures, as found in the Himalayan region, are investigated using experimental and analytical studies. The experimental study included a shake-table test of a one-third-scale structural model, representative of the DS masonry structures used for critical public facilities, e.g. school buildings, offices, health care units, etc. The aim of the experimental study was to understand the damage mechanism of the model, develop a damage scale towards deformation-based assessment, and retrieve the lateral force-deformation response of the model, besides its elastic dynamic properties, i.e. fundamental vibration period and elastic damping. The analytical study included fragility analysis of building prototypes using a fully probabilistic nonlinear dynamic method. The prototypes are designed as SDOF systems assigned a lateral force-deformation constitutive law (obtained experimentally). Uncertainties in the constitutive law, i.e. lateral stiffness, strength and deformation limits, are considered through random Monte Carlo simulation. Fifty prototype buildings are analyzed using a suite of ten natural accelerograms and an incremental dynamic analysis technique. Fragility and vulnerability functions are derived for the damageability assessment of structures, and for economic loss and casualty estimation during an earthquake given the ground shaking intensity, essential within the context of risk assessment of the existing stock aimed at risk mitigation and disaster risk reduction.
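
    Fragility functions of the kind derived here are commonly parameterized as lognormal curves in the shaking intensity; a sketch with invented parameters (a hypothetical median and dispersion, not the paper's fitted values):

```python
import math

def fragility(im, theta, beta):
    """Probability of reaching a damage state at intensity measure im,
    for a lognormal fragility with median theta and log-std beta."""
    z = (math.log(im) - math.log(theta)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical collapse fragility for a rubble-stone building class:
# median PGA 0.35 g, dispersion 0.45 (illustrative values only).
curve = [fragility(pga, theta=0.35, beta=0.45)
         for pga in (0.1, 0.2, 0.35, 0.5, 0.8)]
```

    By construction the curve passes through 0.5 at the median intensity; the Monte Carlo and incremental dynamic analyses described above are what supply the median and dispersion for each damage state.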

  1. Masonry Infilling Effect On Seismic Vulnerability and Performance Level of High Ductility RC Frames

    NASA Astrophysics Data System (ADS)

    Ghalehnovi, M.; Shahraki, H.

    2008-07-01

    In recent years, researchers have come to prefer behavior-based design of structures over force-based design for earthquake-resistant structures; this method is named performance-based design. The main goal of this method is the design of structural members for a certain performance or behavior. On the other hand, in most buildings the load-bearing frames are infilled with masonry materials, which leads to considerable changes in the mechanical properties of the frames. Usually, however, the effect of infill walls is ignored in nonlinear analysis of structures because of the complexity of the problem and the lack of a simple, logical solution. As a result, the lateral stiffness, strength, ductility and performance of the structure are computed with less accuracy. In this paper, using a smooth hysteretic model for the masonry infills, several high-ductility RC frames (4 and 8 stories, with 1, 2 and 3 spans) designed according to the Iranian code are considered. They have been analyzed by the nonlinear dynamic method in two states, with and without infills. Their performance has then been determined with the criteria of ATC 40 and compared with the performance recommended in the Iranian seismic code (standard No. 2800).

  2. Masonry Infilling Effect On Seismic Vulnerability and Performance Level of High Ductility RC Frames

    SciTech Connect

    Ghalehnovi, M.; Shahraki, H.

    2008-07-08

    In recent years, researchers have come to prefer behavior-based design of structures over force-based design for earthquake-resistant structures; this method is named performance-based design. The main goal of this method is the design of structural members for a certain performance or behavior. On the other hand, in most buildings the load-bearing frames are infilled with masonry materials, which leads to considerable changes in the mechanical properties of the frames. Usually, however, the effect of infill walls is ignored in nonlinear analysis of structures because of the complexity of the problem and the lack of a simple, logical solution. As a result, the lateral stiffness, strength, ductility and performance of the structure are computed with less accuracy. In this paper, using a smooth hysteretic model for the masonry infills, several high-ductility RC frames (4 and 8 stories, with 1, 2 and 3 spans) designed according to the Iranian code are considered. They have been analyzed by the nonlinear dynamic method in two states, with and without infills. Their performance has then been determined with the criteria of ATC 40 and compared with the performance recommended in the Iranian seismic code (standard No. 2800).

  3. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  4. Reservoir permeability from seismic attribute analysis

    SciTech Connect

    Silin, Dmitriy; Goloshubin, G.; Silin, D.; Vingalov, V.; Takkand, G.; Latfullin, M.

    2008-02-15

    For a porous fluid-saturated medium, Biot's poroelasticity theory predicts a movement of the pore fluid relative to the skeleton during seismic wave propagation through the medium. This phenomenon opens an opportunity for investigating the flow properties of hydrocarbon-saturated reservoirs. It is well known that relative fluid movement becomes negligible at seismic frequencies if the porous material is homogeneous and well cemented; in this case the theory predicts very small seismic wave velocity dispersion and attenuation. Based on Biot's theory, Helle et al. (2003) numerically demonstrated the substantial effects of heterogeneous permeability and saturation on both velocity and attenuation in rocks. Besides the fluid-flow effect, scattering effects (Gurevich et al., 1997) play a very important role in the case of finely layered porous rocks and heterogeneous fluid saturation. We have used both fluid-flow and scattering effects to derive a frequency-dependent seismic attribute that is proportional to fluid mobility, and have applied it to the analysis of reservoir permeability.

  5. GIS modeling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran using multi-criteria decision-making techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2015-03-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers) and categorizing them into four indicators of geotechnical, structural, social and physical distance to needed facilities and from dangerous ones will provide us with a better and more exact outcome. To this end, this paper uses the analytic hierarchy process to study the importance of criteria or alternatives and uses the geographical information system to study the vulnerability of Tehran to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, seismic vulnerability of different areas in Tehran is analyzed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of Tehran highly vulnerable, while 34, 14 and 27% are safe.
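
    As an illustration of the AHP weighting step, here is a minimal sketch (not the authors' code) that derives indicator weights from a pairwise comparison matrix using Saaty's principal-eigenvector method with a consistency check; the judgment values for the four indicators are invented for illustration:

```python
import numpy as np

def ahp_weights(pairwise, tol=1e-10):
    """Principal-eigenvector weights and consistency ratio for an AHP
    pairwise comparison matrix (Saaty's method)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    # Power iteration converges to the principal eigenvector of A.
    w = np.ones(n) / n
    for _ in range(1000):
        w_new = A @ w
        w_new /= w_new.sum()
        if np.abs(w_new - w).max() < tol:
            break
        w = w_new
    lam_max = (A @ w / w).mean()
    ci = (lam_max - n) / (n - 1)                       # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.9, 5: 1.12}[n]  # random index
    cr = ci / ri if ri else 0.0                        # CR < 0.1 is acceptable
    return w, cr

# Hypothetical pairwise judgments for the four indicators:
# geotechnical, structural, social, physical distance.
A = np.array([[1,   2,   3,   4],
              [1/2, 1,   2,   3],
              [1/3, 1/2, 1,   2],
              [1/4, 1/3, 1/2, 1]])
weights, cr = ahp_weights(A)
```

    The resulting weights can then drive a GIS weighted overlay of the indicator layers.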

  6. GIS modelling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran city using multi-criteria decision-making (MCDM) techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2014-09-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration (PGA) at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers), and categorizing them under four indicators of geotechnical, structural, social and physical distance to needed facilities and from dangerous ones, will provide a better and more exact outcome. To this end, this paper uses the analytic hierarchy process (AHP) to determine the importance of criteria and alternatives, and uses a geographical information system (GIS) to study the vulnerability of the Tehran metropolis to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, seismic vulnerability of different areas in Tehran city is analysed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of the Tehran area highly vulnerable, while 34, 14 and 27% are safe.

  7. Vulnerability Analysis Considerations for the Transportation of Special Nuclear Material

    SciTech Connect

    Nicholson, Lary G.; Purvis, James W.

    1999-07-21

    The vulnerability analysis methodology developed for fixed nuclear material sites has proven to be extremely effective in assessing associated transportation issues. The basic methods and techniques used are directly applicable to conducting a transportation vulnerability analysis. The purpose of this paper is to illustrate that the same physical protection elements (detection, delay, and response) are present, although the response force plays a dominant role in preventing the theft or sabotage of material. Transportation systems are continuously exposed to the general public, whereas a fixed site by its very nature restricts general public access.

  8. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites for the surroundings of Katerloch cave, Austria

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Kalmár, János

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises the important issue for society of how to react to the natural hazard: potential damages are huge, and infrastructure costs for addressing these hazards are huge as well. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test Probabilistic Seismic Hazard Assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Such information would be very valuable, even if it concerned only a single site. Long-term information can in principle be gained from intact stalagmites in natural karstic caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on a case study from the Katerloch cave close to the city of Graz, Austria. A specially shaped (candlestick style: high, slim, and more or less cylindrical), intact and vulnerable stalagmite (IVSTM) in the Katerloch cave was examined in 2013 and 2014. This IVSTM is suitable for estimating an upper limit for the horizontal peak ground acceleration generated by prehistoric earthquakes. For this cave, we have extensive information about ages (e.g., Boch et al., 2006, 2010). The approach used in our study yields significant new constraints on seismic hazard, as the intactness of the stalagmite suggests that tectonic structures close to Katerloch cave, in particular the Mur-Mürz fault, did not generate very strong paleoearthquakes in the last few thousand years.
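
    The "critical acceleration" logic can be sketched with elementary beam mechanics. Modeling the stalagmite as a vertical cantilever of uniform circular cross-section under quasi-static horizontal acceleration, the base bending stress reaches the tensile strength at a critical acceleration. The numbers below are illustrative assumptions, not measurements from Katerloch, and dynamic amplification near the stalagmite's natural frequency is ignored:

```python
import math

def critical_acceleration(height, radius, density, tensile_strength):
    """Quasi-static horizontal acceleration (m/s^2) at which the base
    bending stress of a uniform cylindrical cantilever reaches the
    tensile strength.  Inertial load per unit length: q = rho*A*a;
    base moment: M = q*H^2/2; stress: sigma = M/S with S = pi*R^3/4."""
    area = math.pi * radius**2
    section_modulus = math.pi * radius**3 / 4
    # Solve sigma_t = (rho*A*a*H^2/2) / S for a:
    return tensile_strength * section_modulus / (density * area * height**2 / 2)

# Illustrative values (assumed, not from the Katerloch study):
# 3 m tall, 5 cm radius, calcite density ~2700 kg/m^3, 1 MPa tensile strength.
a_crit = critical_acceleration(3.0, 0.05, 2700.0, 1.0e6)
# a_crit is on the order of 1 m/s^2 (~0.1 g) for these numbers.
```

    The critical value scales linearly with the tensile strength and radius, and inversely with the square of the height, which is why tall, slim, candlestick-style stalagmites are the most informative.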

  9. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute's World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.

  10. Seismic Vulnerability Evaluations Within The Structural And Functional Survey Activities Of The COM Bases In Italy

    SciTech Connect

    Zuccaro, G.; Cacace, F.; Albanese, V.; Mercuri, C.; Papa, F.; Pizza, A. G.; Sergio, S.; Severino, M.

    2008-07-08

    The paper describes technical and functional surveys of COM (Mixed Operative Centre) buildings. This activity started in 2005, with the contribution of both the Italian Civil Protection Department and the Regions involved. The project aims to evaluate the efficiency of COM buildings, checking not only structural, architectonic and functional characteristics but also paying attention to surrounding real estate vulnerability, road networks, railways, harbours, airports, morphological and hydro-geological characteristics of the area, hazardous activities, etc. The first survey was performed in eastern Sicily, before the European Civil Protection Exercise 'EUROSOT 2005'. Then, in 2006, a new survey campaign started in the Abruzzo, Molise, Calabria and Puglia Regions. The most important issue of the activity was the vulnerability assessment. This paper therefore deals with a more refined vulnerability evaluation technique, the SAVE methodology, developed in the first task of the SAVE project within the GNDT-DPC programme 2000-2002 (Zuccaro, 2005); the SAVE methodology has already been successfully employed in previous studies (e.g., the school buildings intervention programme at national scale; the list of strategic public buildings in Campania, Sicilia and Basilicata). In this paper, data elaborated by the SAVE methodology are compared with expert evaluations derived from direct inspections of COM buildings. This represents a useful exercise for the improvement of both the survey forms and the methodology for the quick assessment of vulnerability.

  11. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    NASA Astrophysics Data System (ADS)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must work with proper performance at all times, especially after disasters. Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Istanbul is a mega city with various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul, there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore the vulnerability of the Yenikapi region of Istanbul to tsunamis and other marine hazards is an important issue. In this study, a methodology of vulnerability analysis under tsunami attack is proposed, with applications to the Yenikapi region. In the study, the high resolution (1 m) GIS database of Istanbul Metropolitan Municipality (IMM) is used and analyzed by GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructures in the study area are obtained for tsunami numerical modeling of the study area. GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters from deterministically defined worst case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region fall under two different classifications: i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation

  12. Vulnerability Analysis and Evaluation of Urban Road System in Tianjin

    NASA Astrophysics Data System (ADS)

    Liu, Y. Q.; Wu, X.

    In recent years, with the development of the economy, road construction in our country has entered a period of rapid growth. The road transportation network has been expanding and the risk of disasters is increasing. In this paper we study the vulnerability of the urban road system in Tianjin. After analyzing many risk factors of urban road system security, including road construction, road traffic and the natural environment, we propose an evaluation index of vulnerability of the urban road system and establish the corresponding evaluation index system. Based on the results of analysis and comprehensive evaluation, appropriate improvement measures and suggestions are proposed which may reduce the vulnerability of the road system and improve its safety and reliability.

  13. Betweenness as a Tool of Vulnerability Analysis of Power System

    NASA Astrophysics Data System (ADS)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-06-01

    Complex network theory finds application in the analysis of power grids, as both share some common characteristics. Using this theory, critical elements in a power network can be identified. As the vulnerabilities of the elements of the network determine the vulnerability of the network as a whole, in this paper the vulnerability of each element is studied using two complex network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and the resulting conclusions are discussed.
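
    For the purely topological index, a minimal sketch of shortest-path betweenness centrality (Brandes' algorithm for unweighted graphs) is given below; the toy graph is illustrative, not the IEEE 57-bus system, and the extended-betweenness variant with electrical distances and PTDF weighting is not reproduced here:

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm: unnormalized shortest-path betweenness
    centrality for an undirected, unweighted graph {node: [neighbors]}."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        queue = deque([s])
        while queue:                      # BFS: count shortest paths
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                      # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: b / 2 for v, b in bc.items()}   # undirected: halve

# Toy 4-node chain a-b-c-d: the interior nodes carry all the traffic.
chain = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b', 'd'], 'd': ['c']}
bc = betweenness(chain)
```

    Nodes (or, with a line-graph transformation, branches) with the highest scores are candidate critical elements.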

  14. Overview of Seismic Hazard and Vulnerability of Ordinary Buildings in Belgium: Methodological Aspects and Study Cases

    SciTech Connect

    Barszez, Anne-Marie; Camelbeeck, Thierry; Plumier, Andre; Sabbe, Alain; Wilquin, Hugues

    2008-07-08

    Northwest Europe is a region in which damaging earthquakes occur. Assessing the risk of damage is useful, but it is not an easy task based on exact science. In this paper, we propose a general tool for a first-level assessment of seismic risk (rapid diagnosis). General methodological aspects are presented. For a given building, the risk is represented by a volume in a multi-dimensional space. This space is defined by axes representing the main parameters that influence the risk. We notably stress the importance of including a parameter that accounts for the specific value of cultural heritage. We then apply the proposed tool to analyze and compare methods of seismic risk assessment used in Belgium, which differ by the spatial scale of the studied area. Study cases for the whole Belgian territory and for parts of the cities of Liege and Mons (Belgium) also aim to give some sense of the overall risk in Belgium.

  15. Vulnerability analysis for complex networks using aggressive abstraction.

    SciTech Connect

    Colbaugh, Richard; Glass, Kristin L.

    2010-06-01

    Large, complex networks are ubiquitous in nature and society, and there is great interest in developing rigorous, scalable methods for identifying and characterizing their vulnerabilities. This paper presents an approach for analyzing the dynamics of complex networks in which the network of interest is first abstracted to a much simpler, but mathematically equivalent, representation, the required analysis is performed on the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit vulnerability-preserving, finite state abstractions, and develop efficient algorithms for computing these abstractions. We then propose a vulnerability analysis methodology which combines these finite state abstractions with formal analytics from theoretical computer science to yield a comprehensive vulnerability analysis process for networks of real-world scale and complexity. The potential of the proposed approach is illustrated with a case study involving a realistic electric power grid model and also with brief discussions of biological and social network examples.

  16. An approach to extend seismic vulnerability relationships for large diameter pipelines

    SciTech Connect

    Honegger, D.G.

    1995-12-31

    The most common approach to determining vulnerability is to rely solely upon damage data from past earthquakes as a predictor of future performance. Relying upon past damage data is not an option when data do not exist for a particular type of pipeline. An option discussed in this paper, and recently implemented for large-diameter water supply pipelines, relies upon engineering characterization of the relative strength of pipelines to extend existing damage data.

  17. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. Minimum

  18. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
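
    The structure of such a program is the classical Cornell-McGuire hazard integral: the annual exceedance rate is a rate-weighted double integral of the attenuation model's exceedance probability over the magnitude and distance distributions. The sketch below uses an invented attenuation relation and source geometry purely for illustration; it is not McGuire's program:

```python
import math

def exceedance_rate(a_target, nu=0.2, m_min=5.0, m_max=7.5, b=1.0,
                    r_min=10.0, r_max=100.0, n=200):
    """Annual rate of PGA > a_target (g) from one areal source.
    nu: rate of events with M >= m_min; truncated Gutenberg-Richter
    magnitudes; epicenters uniform on an annulus r_min..r_max (km);
    hypothetical lognormal attenuation ln PGA = -1.0 + 1.2*M - 1.4*ln R."""
    beta = b * math.log(10.0)
    dm = (m_max - m_min) / n
    dr = (r_max - r_min) / n
    # Normalization of the truncated exponential magnitude pdf.
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    total = 0.0
    for i in range(n):
        m = m_min + (i + 0.5) * dm
        f_m = beta * math.exp(-beta * (m - m_min)) / norm
        for j in range(n):
            r = r_min + (j + 0.5) * dr
            f_r = 2.0 * r / (r_max**2 - r_min**2)    # uniform areal density
            mu = -1.0 + 1.2 * m - 1.4 * math.log(r)  # hypothetical median
            z = (math.log(a_target) - mu) / 0.6      # sigma_lnPGA = 0.6
            p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
            total += f_m * f_r * p_exceed * dm * dr
    return nu * total

lam_low, lam_high = exceedance_rate(0.05), exceedance_rate(0.3)
```

    Summing such rates over all sources and inverting for the ground motion at a chosen return period yields the hazard curve.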

  19. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
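
    The variability analysis described here amounts to Monte Carlo sampling over weighted logic-tree branches and summarizing the hazard spread with a coefficient of variation. A toy sketch follows, with invented branch values and weights and a stand-in hazard function; it assumes nothing about the actual New Madrid consensus logic tree:

```python
import random
import statistics

# Hypothetical logic-tree branches: (value, weight). These numbers are
# invented for illustration; they are not the consensus NMSZ values.
branches = {
    "magnitude":  [(7.3, 0.2), (7.7, 0.5), (8.1, 0.3)],
    "recurrence": [(500.0, 0.6), (1000.0, 0.4)],         # years
    "gm_scale":   [(0.8, 0.3), (1.0, 0.4), (1.3, 0.3)],  # attenuation factor
}

def sample(pairs, rng):
    values, weights = zip(*pairs)
    return rng.choices(values, weights=weights, k=1)[0]

def hazard_proxy(mag, recur, scale):
    """Stand-in 'hazard' value; a real study runs a full PSHA here."""
    return scale * 10 ** (0.3 * mag) / recur

rng = random.Random(42)
samples = []
for _ in range(5000):
    m = sample(branches["magnitude"], rng)
    t = sample(branches["recurrence"], rng)
    s = sample(branches["gm_scale"], rng)
    samples.append(hazard_proxy(m, t, s))

cov = statistics.pstdev(samples) / statistics.mean(samples)
```

    Repeating this at every grid point produces a map of COV alongside the best-estimate hazard map.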

  20. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    NASA Astrophysics Data System (ADS)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Vulnerability assessment of coastal areas under extreme events has major importance for preparedness and development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly along the north coast of the Sea of Marmara in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for Haydarpasa port in Istanbul have been studied previously by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Sea of Marmara from 28.77°E to 28.89°E. The high resolution spatial dataset of Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructures in the district are collated and utilized for tsunami numerical modeling and the following vulnerability analysis. The tsunami parameters from deterministically defined worst case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability assessment parameters in the district according to vulnerability and resilience are defined and scored by implementation of a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and TVA results at every location in Bakirkoy district. The preliminary results are presented and discussed.
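
    Cell by cell, GIS-based TVA scoring of this kind reduces to a normalized weighted sum of vulnerability parameters. A minimal sketch with hypothetical parameters and weights follows; the actual SATREPS parameter set and weights are not reproduced here:

```python
def mcda_score(values, weights, bounds):
    """Weighted-sum MCDA score in [0, 1]: each raw parameter is min-max
    normalized against its (lo, hi) bounds, then combined with weights
    that sum to 1.  Higher score = more vulnerable."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    score = 0.0
    for name, w in weights.items():
        lo, hi = bounds[name]
        x = min(max(values[name], lo), hi)      # clamp to bounds
        score += w * (x - lo) / (hi - lo)
    return score

# Hypothetical parameters for one grid cell (illustrative only):
weights = {"flow_depth": 0.5, "building_density": 0.3, "shore_proximity": 0.2}
bounds  = {"flow_depth": (0.0, 5.0),        # m, from tsunami simulations
           "building_density": (0.0, 1.0),  # fraction of cell built up
           "shore_proximity": (0.0, 1000.0)}  # m, larger = nearer shore
cell = {"flow_depth": 2.0, "building_density": 0.6, "shore_proximity": 800.0}
score = mcda_score(cell, weights, bounds)
```

    Classifying the scores into bands (e.g., high/medium/low/safe) then yields the vulnerability map.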

  1. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    SciTech Connect

    Kersulis, Jonas; Hiskens, Ian; Chertkov, Michael; Backhaus, Scott N.; Bienstock, Daniel

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  2. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya form a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974; and the 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard analysis in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach to identify spatial differences in seismicity, which can form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data. The earthquake catalogue covers a time span from 1930 to 2007 and
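
    The clustering step can be sketched with a plain Lloyd's-algorithm k-means on epicentre coordinates. The synthetic points below are invented stand-ins for catalogue epicentres; a real application would work on the declustered, magnitude-homogeneous catalogue and would likely project coordinates before computing distances:

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Lloyd's algorithm with deterministic farthest-point seeding."""
    idx = [0]                                   # seed: first point, then
    for _ in range(k - 1):                      # repeatedly the point
        d = ((X[:, None] - X[idx][None]) ** 2).sum(-1).min(axis=1)
        idx.append(int(np.argmax(d)))           # farthest from current seeds
    centroids = X[idx].astype(float).copy()
    for _ in range(iters):
        labels = ((X[:, None] - centroids[None]) ** 2).sum(-1).argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):         # converged
            break
        centroids = new
    return labels, centroids

# Synthetic "epicentres" (lon, lat): two well-separated groups.
rng = np.random.default_rng(0)
north = rng.normal([71.0, 34.0], 0.3, size=(50, 2))
south = rng.normal([66.0, 27.0], 0.3, size=(50, 2))
X = np.vstack([north, south])
labels, centroids = kmeans(X, k=2)
```

    Choosing k (e.g., by silhouette or within-cluster variance criteria) then partitions the catalogue into candidate source regions.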

  3. Analytical and Experimental Assessment of Seismic Vulnerability of Beam-Column Joints without Transverse Reinforcement in Concrete Buildings

    NASA Astrophysics Data System (ADS)

    Hassan, Wael Mohammed

    Beam-column joints in concrete buildings are key components to ensure structural integrity of building performance under seismic loading. Earthquake reconnaissance has reported the substantial damage that can result from inadequate beam-column joints. In some cases, failure of older-type corner joints appears to have led to building collapse. Since the 1960s, many advances have been made to improve seismic performance of building components, including beam-column joints. New design and detailing approaches are expected to produce new construction that will perform satisfactorily during strong earthquake shaking. Much less attention has been focused on beam-column joints of older construction that may be seismically vulnerable. Concrete buildings constructed prior to developing details for ductility in the 1970s normally lack joint transverse reinforcement. The available literature concerning the performance of such joints is relatively limited, but concerns about performance exist. The current study aimed to improve understanding and assessment of seismic performance of unconfined exterior and corner beam-column joints in existing buildings. An extensive literature survey was performed, leading to development of a database of about a hundred tests. Study of the data enabled identification of the most important parameters and the effect of each parameter on the seismic performance. The available analytical models and guidelines for strength and deformability assessment of unconfined joints were surveyed and evaluated. In particular, the ASCE 41 existing building document proved to be substantially conservative in joint shear strength estimation. Upon identifying deficiencies in these models, two new joint shear strength models, a bond capacity model, and two axial capacity models designed and tailored specifically for unconfined beam-column joints were developed. The proposed models strongly correlated with previous test results.
In the laboratory testing phase of

  4. Seismic vulnerability evaluation of axially loaded steel built-up laced members II: evaluations

    NASA Astrophysics Data System (ADS)

    Lee, Kangmin; Bruneau, Michel

    2008-06-01

    The test results described in Part 1 of this paper (Lee and Bruneau, 2008) on twelve steel built-up laced members (BLMs) subjected to quasi-static loading are analyzed to provide better knowledge on their seismic behavior. Strength capacity of the BLM specimens is correlated with the strength predicted by the AISC LRFD Specifications. Assessments of hysteretic properties such as ductility capacity, energy dissipation capacity, and strength degradation after buckling of the specimen are performed. The compressive strength of BLMs is found to be relatively well predicted by the AISC LRFD Specifications. BLMs with smaller kl/r were ductile but failed to reach the target ductility of 3.0 before starting to fracture, while those with larger kl/r could meet the ductility demand in most cases. The normalized energy dissipation ratio E_C/E_T and the normalized compressive strength degradation C_r''/C_r of BLMs typically decrease as normalized displacements δ/δ_b,exp increase, and the ratios for specimens with larger kl/r dropped more rapidly than for specimens with smaller kl/r; similar trends were observed for the monolithic braces. The BLMs with a smaller slenderness ratio, kl/r, and width-to-thickness ratio, b/t, experienced a larger number of inelastic cycles than those with larger ratios.

  5. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Justin Coleman; Piyush Sabharwall

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic risk perspective the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for Light Water Reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE-4. Opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.

  6. Nonlinear Seismic Analysis of Morrow Point Dam

    SciTech Connect

    Noble, C R; Nuss, L K

    2004-02-20

    This research and development project was sponsored by the United States Bureau of Reclamation (USBR), which is best known for the dams, power plants, and canals it constructed in the 17 western states. The mission statement of the USBR's Dam Safety Office, located in Denver, Colorado, is ''to ensure Reclamation dams do not present unacceptable risk to people, property, and the environment.'' The Dam Safety Office does this by quickly identifying the dams which pose an increased threat to the public, and quickly completing the related analyses in order to make decisions that will safeguard the public and associated resources. The research study described in this report constitutes one element of USBR's research and development work to advance its computational and analysis capabilities for studying the response of dams to strong earthquake motions. This project focused on the seismic response of Morrow Point Dam, which is located 263 km southwest of Denver, Colorado.

  7. Analysis of four Brazilian seismic areas using a nonextensive approach

    NASA Astrophysics Data System (ADS)

    Scherrer, T. M.; França, G. S.; Silva, R.; de Freitas, D. B.; Vilar, C. S.

    2015-02-01

    We analyse four seismic areas in Brazil using a nonextensive model and data from the Brazilian Seismic Bulletin between 1720 and 2013. Two of those regions are contrasting zones, while the other two are dominated by seismically active faults. We notice that intraplate seismic zones present q-values similar to those of other fault zones, but the fit in the contrast areas yields higher values for this parameter. The results show that the nonextensive approach also fits robustly in the case of intraplate earthquakes, indicating that the Tsallis formalism is a powerful tool for the analysis of this phenomenon.

  8. A vulnerability analysis for a drought vulnerable catchment in South-Eastern Austria

    NASA Astrophysics Data System (ADS)

    Hohmann, Clara; Kirchengast, Gottfried; Birk, Steffen

    2016-04-01

    To detect uncertainties and thresholds in a drought-vulnerable region, we focus on a typical river catchment of the Austrian South-Eastern Alpine forelands with good data availability, the Raab valley. This mid-latitude region in the south-east of the Austrian state of Styria (˜ 47° N, ˜ 16° E) has exhibited a strong temperature increase over recent decades; the mean summer temperatures (June to August) in particular show a strong increase (˜ 0.7 °C per decade) over 1971-2015 (Kabas et al., Meteorol. Z. 20, 277-289, 2011; pers. comm., 2015). The Styrian Raab valley, with a catchment size of 986 km2, has already struggled with drought periods (e.g., the summers of 1992, 2001 and 2003). Thus, it is important to know what happens if warm and dry periods occur more frequently. We therefore analyze which sensitivities and related uncertainties exist, which thresholds might be crossed, and what the effects on the different components of the water balance equation are, in particular on runoff, soil moisture, groundwater recharge, and evapotranspiration. We use the mainly physics-based hydrological Water Flow and Balance Simulation Model (WaSiM), developed at ETH Zurich (Schulla, Diss., ETH Zurich, CH, 1997). The model is well established and widely used for hydrological modeling at a variety of spatial and temporal resolutions. We choose a model setup that is as simple as possible but as complex as necessary to perform sensitivity studies on uncertainties and thresholds in the context of climate change. In order to assess the model performance under a wide range of conditions, calibration and validation are performed with a split sample for dry and wet periods. With the calibrated and validated model we perform a low-flow vulnerability analysis ("stress test") with a focus on drought-related conditions. We therefore simulate changes in weather and climate (e.g., 20% and 50% less precipitation, 2 °C and 5 °C higher temperature), changes in land use and

  9. A Simple Model for Probabilistic Seismic Hazard Analysis of Induced Seismicity Associated With Deep Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Schlittenhardt, Joerg; Spies, Thomas; Kopera, Juergen; Morales Aviles, Wilhelm

    2014-05-01

    In the research project MAGS (Microseismic activity of geothermal systems), funded by the German Federal Ministry of Environment (BMU), a simple model was developed to determine seismic hazard as the probability of the exceedance of ground motion of a certain size. Such estimates of the annual frequency of exceedance of prescriptive limits of, e.g., seismic intensities or ground motions are needed for planning and licensing, and likewise for the development and operation of deep geothermal systems. To develop the proposed model, well-established probabilistic seismic hazard analysis (PSHA) methods for estimating the hazard from natural seismicity were adapted to the case of induced seismicity. Important differences between induced and natural seismicity had to be considered. These include the significantly smaller magnitudes, depths, and source-to-site distances of the seismic events and, hence, different ground motion prediction equations (GMPEs) that had to be incorporated to account for the seismic amplitude attenuation with distance, as well as differences in the stationarity of the underlying tectonic and induced processes. Appropriate GMPEs in terms of PGV (peak ground velocity) were tested and selected from the literature. The proposed model and its application to the induced seismicity observed during the circulation period (operation phase of the plant) at geothermal sites in Germany will be presented. Using GMPEs for PGV has the advantage of estimating hazard in terms of ground-motion velocities, which can be linked to engineering regulations (e.g., German DIN 4150) that give prescriptive standards for the effects of vibrations on buildings and people. It is thus possible to specify the probability of exceedance of such prescriptive standard values and to decide whether they can be accepted or not. On the other hand, hazard curves for induced and natural seismicity can be compared to study the impact at a site.
Preliminary
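    A minimal sketch of the hazard computation the project describes: a GMPE gives the median PGV for an induced event of magnitude M at distance R, a lognormal residual turns it into an exceedance probability, and summing over an event-rate model gives the annual frequency of exceeding a DIN 4150-style guide value. The GMPE coefficients, event rates, and distances below are invented for illustration; they are not those of the MAGS project.

```python
import math

def pgv_median(magnitude, distance_km):
    """Hypothetical GMPE for small induced events: median PGV in mm/s.
    Coefficients are illustrative, not fitted to any real data set."""
    return math.exp(-2.0 + 1.5 * magnitude - 1.3 * math.log(distance_km))

def prob_exceed(limit_mm_s, magnitude, distance_km, sigma_ln=0.7):
    """P(PGV > limit | M, R), assuming a lognormal residual about the GMPE."""
    z = (math.log(limit_mm_s) - math.log(pgv_median(magnitude, distance_km))) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Hypothetical induced-event rate model: events per year in each magnitude bin
rates = {1.0: 50.0, 1.5: 15.0, 2.0: 5.0, 2.5: 1.5}
limit = 5.0  # mm/s, a DIN 4150-style guide value used here for illustration
annual_rate = sum(n * prob_exceed(limit, m, distance_km=3.0) for m, n in rates.items())
print(f"annual rate of exceeding {limit} mm/s: {annual_rate:.3f}")
```

Comparing such a curve against the corresponding natural-seismicity hazard curve is exactly the site-impact comparison the abstract mentions.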

  10. An Analysis of the Mt. Meron Seismic Array

    SciTech Connect

    Pasyanos, M E; Ryall, F

    2008-01-10

    We have performed a quick analysis of the ability of the Mt. Meron seismic array to monitor regional seismic events in the Middle East. The Meron array is the only array currently operating in the Levant and the Arabian Peninsula and, as such, might be useful in contributing to event location, identification, and other analysis. Here, we provide a brief description of the array and a review of the travel time and array analysis done to assess its performance.

  11. A transferable approach towards rapid inventory data capturing for seismic vulnerability assessment using open-source geospatial technologies

    NASA Astrophysics Data System (ADS)

    Wieland, M.; Pittore, M.; Parolai, S.; Zschau, J.

    2012-04-01

    Geospatial technologies are increasingly being used in pre-disaster vulnerability assessment and post-disaster impact assessment for different types of hazards. The use of remote sensing data in particular has been strongly promoted in recent years due to its capability of providing up-to-date information over large areas at comparatively low cost, with increasingly high spatial, temporal and spectral resolution. Despite its clear potential, a purely remote-sensing-based approach is limited in that it only provides a bird's-eye view of the objects of interest. The additional use of omnidirectional imaging can provide the necessary street view, which furthermore allows for a rapid visual screening of a building's façade. In this context, we propose an integrated approach to rapid inventory data capturing for the assessment of the structural vulnerability of buildings in case of an earthquake. Globally available low-cost data sources are preferred and the tools are developed on an open-source basis to allow for a high degree of transferability and usability. On a neighbourhood scale, satellite images of medium spatial but high temporal and spectral resolution are analysed to outline areas of homogeneous urban structure. Following a proportional allocation scheme, representative sample areas are selected for each urban structure type for a more detailed analysis of the building stock with high-resolution image data. On a building-by-building scale, a ground-based rapid visual survey is performed using an omnidirectional imaging system driven around by car inside the identified sample areas. Processing of the acquired images allows for the extraction of vulnerability-related features of single buildings (e.g. building height, detection of soft storeys). An analysis of high-resolution satellite images provides further inventory features (e.g. footprint area, shape irregularity).
Since we are dealing with information coming from

  12. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  13. Do the Most Vulnerable People Live in the Worst Slums? A Spatial Analysis of Accra, Ghana

    PubMed Central

    Jankowska, Marta M.; Weeks, John R.; Engstrom, Ryan

    2011-01-01

    Slums are examples of localized communities within third world urban systems representing a range of vulnerabilities and adaptive capacities. This study examines vulnerability in relation to flooding, environmental degradation, social status, demographics, and health in the slums of Accra, Ghana by utilizing a place-based approach informed by fieldwork, remote sensing, census data, and geographically weighted regression. The study objectives are threefold: (1) to move slums from a dichotomous into a continuous classification and examine the spatial patterns of the gradient, (2) to develop measures of vulnerability for a developing-world city and model the relationship between slums and vulnerability, and (3) to assess whether the most vulnerable individuals live in the worst slums. A previously developed slum index is utilized, and four new measures of vulnerability are developed through principal components analysis, including a novel component of health vulnerability based on child mortality. Visualizations of the vulnerability measures assess spatial patterns of vulnerability in Accra. Ordinary least squares, spatial, and geographically weighted regressions model the ability of the slum index to predict the four vulnerability measures. The slum index performs well for three of the four vulnerability measures, but is least able to predict health vulnerability, underscoring the complex relationship between slums and child mortality in Accra. Finally, quintile analysis demonstrates the elevated prevalence of high vulnerability in places with high slum index scores. PMID:22379509

  14. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107

  15. The SeIsmic monitoring and vulneraBilitY framework for civiL protection (SIBYL) Project: An overview and preliminary results

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Parolai, Stefano; Iervolino, Iunio; Pitilakis, Kyriazis; Petryna, Yuriy

    2016-04-01

    The SIBYL project sets out to enhance the capacity of Civil Protection (CP) authorities to rapidly and cost-effectively assess the seismic vulnerability of the built environment. The need arises from the occurrence of seismic swarms or foreshocks, which require CP authorities to rapidly assess the threatened area's vulnerability. This is especially important for those regions where there is a dearth of up-to-date and reliable information. The result will be a multi-faceted framework, made up of methodologies and software tools, that provides information to advise decision makers as to the most appropriate preventative actions to take. It will cover cases where there is a need for short-notice vulnerability assessment in a pre-event situation, as well as the monitoring of the built environment's dynamic vulnerability during a seismic sequence. Coupled with this will be the ability to stimulate long-term management plans, independent of the hazard or disaster of concern. The monitoring itself will involve low-cost sensing units that can be easily installed in critical infrastructure. The framework will be flexible enough to be employed over multiple spatial scales, and it will be developed with a modular structure that will ease its applicability to other natural hazard types. Likewise, it will be adaptable to the needs of CP authorities in different countries within their own hazard context. This presentation therefore provides an overview of the aims and expected outcomes of SIBYL, explaining the tools currently being developed and refined, as well as preliminary results of several field campaigns.

  16. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Assessment Program (GSHAP) there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this type of region, seismic hazard assessment is a very hard task due to the limited amount of data available on the seismic sources, the earthquake catalogue, or ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in South-East Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. The analysis of S-wave Fourier spectra can help to better resolve source, path and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to

  17. Stochastic seismic analysis in the Messina strait area

    SciTech Connect

    Cacciola, P.; Maugeri, N.; Muscolino, G.

    2008-07-08

    Since the 1908 Messina earthquake, significant progress has been made in the field of earthquake engineering. Seismic action is usually represented via the so-called elastic response spectrum, or alternatively by time histories of ground motion acceleration. Due to the random nature of seismic action, alternative representations model it as a zero-mean Gaussian process fully defined by the so-called power spectral density function. The aim of this paper is a comparative study of the response of linearly behaving structures under the above representations of the seismic action, using recorded earthquakes in the Messina strait area. In this regard, a handy method for determining the power spectral density function of recorded earthquakes is proposed. Numerical examples conducted on an existing space truss located in Torre Faro (Messina) show the effectiveness of the stochastic approach for the seismic analysis of structures.
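    A handy way to see what such a method must produce: the one-sided periodogram of a recorded accelerogram is a raw estimate of its power spectral density. The sketch below uses a plain DFT on a synthetic record (a 2 Hz sinusoid in noise) rather than a real Messina strait recording; practical PSD estimates would add windowing and averaging.

```python
import cmath
import math
import random

def periodogram(samples, dt):
    """One-sided periodogram estimate of the power spectral density of a
    ground-motion record (plain O(n^2) DFT; fine for short records)."""
    n = len(samples)
    freqs, psd = [], []
    for k in range(n // 2 + 1):
        X = sum(x * cmath.exp(-2j * math.pi * k * i / n) for i, x in enumerate(samples))
        scale = 2.0 if 0 < k < n // 2 else 1.0  # fold negative frequencies
        freqs.append(k / (n * dt))
        psd.append(scale * abs(X) ** 2 * dt / n)
    return freqs, psd

# Synthetic zero-mean record: a 2 Hz sinusoid in noise, sampled at 100 Hz
random.seed(0)
dt = 0.01
record = [math.sin(2 * math.pi * 2.0 * i * dt) + 0.1 * random.gauss(0, 1)
          for i in range(256)]
freqs, psd = periodogram(record, dt)
peak_freq = freqs[max(range(len(psd)), key=psd.__getitem__)]
print(f"dominant frequency: {peak_freq:.2f} Hz")
```

The dominant spectral peak lands at the sinusoid's frequency (up to the record's frequency resolution), which is the kind of spectral content a stochastic representation of seismic action has to reproduce.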

  18. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework.

  19. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
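    The distribution fits reported above can be reproduced in miniature: the maximum-likelihood estimate for a log-normal is the mean and standard deviation of the log-transformed data, and for an exponential it is the reciprocal of the sample mean. The synthetic counts below stand in for the operational data, which is not public; the parameter values are invented.

```python
import math
import random
import statistics

def fit_lognormal(data):
    """MLE for a log-normal: mean and std of the log-transformed data."""
    logs = [math.log(x) for x in data]
    return statistics.fmean(logs), statistics.pstdev(logs)

def fit_exponential(data):
    """MLE for an exponential: rate = 1 / sample mean."""
    return 1.0 / statistics.fmean(data)

# Synthetic daily vulnerability arrivals (log-normal) and deletions (exponential)
random.seed(1)
arrivals = [random.lognormvariate(2.0, 0.5) for _ in range(1000)]
deletions = [random.expovariate(0.25) for _ in range(1000)]

mu, sigma = fit_lognormal(arrivals)
rate = fit_exponential(deletions)
print(f"log-normal mu={mu:.2f} sigma={sigma:.2f}, exponential rate={rate:.2f}")
```

Fitted parameters like these are what would serve as the prior probabilities for the future Bayesian analysis the authors mention.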

  20. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  1. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY TC; RINKER MW; CARPENTER BG; HENDRIX C; ABATT FG

    2009-01-15

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratories (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses. The original scope of the project was to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). Although Milestone M-48-14 has been met, Revision I is being issued to address external review comments, with emphasis on changes in the modeling of the anchor bolts connecting the concrete dome and the steel primary tank. The work statement provided to M&D (PNNL 2003) required that a nonlinear soil-structure interaction (SSI) analysis be performed on the DSTs. The analysis is required to include the effects of sliding interfaces and fluid sloshing (fluid-structure interaction). SSI analysis has traditionally been treated by frequency domain computer codes such as SHAKE (Schnabel et al. 1972) and SASSI (Lysmer et al. 1999a). Such frequency domain programs are limited to the analysis of linear systems. Because of the contact surfaces, the response of the DSTs to a seismic event is inherently nonlinear and consequently outside the range of applicability of the linear frequency domain programs. That is, the nonlinear response of the DSTs to seismic excitation requires the use of a time domain code. The capabilities and limitations of the commercial time domain codes ANSYS® and MSC Dytran® for performing seismic SSI analysis of the DSTs, and the methodology required to perform the detailed seismic analysis of the DSTs, have been addressed in Rinker et al. (2006a). On the basis of the results reported in Rinker et al

  2. Benchmark analysis for quantifying urban vulnerability to terrorist incidents.

    PubMed

    Piegorsch, Walter W; Cutter, Susan L; Hardisty, Frank

    2007-12-01

    We describe a quantitative methodology to characterize the vulnerability of U.S. urban centers to terrorist attack, using a place-based vulnerability index and a database of terrorist incidents and related human casualties. Via generalized linear statistical models, we study the relationships between vulnerability and terrorist events, and find that our place-based vulnerability metric significantly describes both terrorist incidence and occurrence of human casualties from terrorist events in these urban centers. We also introduce benchmark analytic technologies from applications in toxicological risk assessment to this social risk/vulnerability paradigm, and use these to distinguish levels of high and low urban vulnerability to terrorism. It is seen that the benchmark approach translates quite flexibly from its biological roots to this social scientific archetype.

  3. Seismic refraction analysis: the path forward

    USGS Publications Warehouse

    Haines, Seth S.; Zelt, Colin; Doll, William

    2012-01-01

    Seismic Refraction Methods: Unleashing the Potential and Understanding the Limitations; Tucson, Arizona, 29 March 2012. A workshop focused on seismic refraction methods took place on 29 March 2012, associated with the 2012 Symposium on the Application of Geophysics to Engineering and Environmental Problems. The workshop was convened to assess the current state of the science and discuss paths forward, with a primary focus on near-surface problems but with an eye on all applications. The agenda included talks on these topics from a number of experts, interspersed with discussion, and a dedicated discussion period to finish the day. Discussion proved lively at times, and workshop participants delved into many topics central to seismic refraction work.

  4. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations across all return periods. Moreover, the seismic hazard of most Algarve parishes is dominated by the seismicity located

  5. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station in the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs prove the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider it as a desired signal.

  6. Seismic analysis for translational failure of landfills with retaining walls.

    PubMed

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In the seismic impact zone, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for the seismic analysis of translational failures of landfills with retaining walls. An approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid-waste shear strength and the retaining wall on the translational failure of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases as the seismic coefficient increases, while it increases quickly with the minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall leads to serious underestimation of the factor of safety. An approximate solution for the yield acceleration coefficient of the landfill is also presented based on the proposed method.
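    The parameter trends reported above can be illustrated with a one-wedge pseudo-static simplification (a single block sliding on an inclined liner), not the paper's three-part wedge method: the factor of safety falls as the horizontal seismic coefficient k grows, rises with the liner friction angle, and setting FS = 1 recovers a yield acceleration coefficient. The slope and friction angles below are arbitrary illustrative values.

```python
import math

def pseudo_static_fs(beta_deg, phi_deg, k):
    """Factor of safety of a single block on a liner inclined at beta, with
    interface friction angle phi, under horizontal seismic coefficient k:
    FS = resisting friction / (driving gravity + inertia components)."""
    b, p = math.radians(beta_deg), math.radians(phi_deg)
    return (math.cos(b) - k * math.sin(b)) * math.tan(p) / (math.sin(b) + k * math.cos(b))

def yield_coefficient(beta_deg, phi_deg):
    """Seismic coefficient at which FS = 1; algebra gives k_y = tan(phi - beta)."""
    return math.tan(math.radians(phi_deg - beta_deg))

# FS drops as the seismic coefficient grows and recovers with friction angle
print(pseudo_static_fs(10, 20, 0.0))   # static case
print(pseudo_static_fs(10, 20, 0.15))  # moderate shaking
print(yield_coefficient(10, 20))       # k at which sliding initiates
```

The three-part wedge method adds the inter-wedge forces and the retaining-wall reaction to this balance, which is why ignoring the wall underestimates FS.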

  7. A graph-based system for network-vulnerability analysis

    SciTech Connect

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
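
    The final step described above, finding high-probability attack paths with a shortest-path algorithm, can be sketched by converting each arc's success probability p into a weight of -log(p); Dijkstra's algorithm on those weights then returns the path maximizing the product of probabilities. The attack graph and probabilities below are hypothetical.

```python
import heapq, math

def most_probable_path(graph, start, goal):
    """Dijkstra on -log(p) edge weights: the shortest path in this metric is
    the attack path with the highest product of success probabilities."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    done = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            path = [node]
            while node != start:           # walk predecessors back to start
                node = prev[node]
                path.append(node)
            return list(reversed(path)), math.exp(-d)
        for nxt, p in graph.get(node, []):
            nd = d - math.log(p)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    return None, 0.0

# Hypothetical attack graph: nodes are attacker states (machine class +
# privilege level), arcs carry illustrative success probabilities.
attack_graph = {
    "outside": [("web_server", 0.7), ("vpn", 0.2)],
    "web_server": [("db_server", 0.5)],
    "vpn": [("db_server", 0.9)],
}
path, prob = most_probable_path(attack_graph, "outside", "db_server")
```

    Here the route through the web server wins (0.7 x 0.5 = 0.35) over the VPN route (0.2 x 0.9 = 0.18); costs representing attacker level-of-effort could be used as weights directly instead.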

  8. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  9. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  10. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards incorporating induced seismic hazard analysis into the design of reservoir stimulation.

  11. Seismic Hazard Analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the CRISIS2007 software by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
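
    The link between a hazard curve's annual exceedance frequency and a design return period follows directly from the Poisson occurrence assumption mentioned above, PoE = 1 - exp(-t/T). A minimal sketch (not part of CRISIS2007):

```python
import math

def return_period(poe, years):
    """Return period T for a given probability of exceedance over an exposure
    time t, assuming Poissonian occurrence: PoE = 1 - exp(-t/T)."""
    return -years / math.log(1.0 - poe)

def prob_of_exceedance(return_period_, years):
    """Inverse relation: PoE over `years` for a hazard with return period T."""
    return 1.0 - math.exp(-years / return_period_)
```

    For example, the familiar 10%-in-50-years design level corresponds to a return period of about 475 years, and 2%-in-50-years to about 2475 years.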

  12. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY, T.C.

    2006-03-17

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled 'Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses'. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The work statement provided to M&D (PNNL 2003) required that the seismic analysis of the DSTs assess the impacts of potentially non-conservative assumptions in previous analyses and account for the additional soil mass due to the as-found soil density increase, the effects of material degradation, additional thermal profiles applied to the full structure including the soil-structure response with the footings, the non-rigid (low-frequency) response of the tank roof, the asymmetric seismic-induced soil loading, the structural discontinuity between the concrete tank wall and the support footing, and the sloshing of the tank waste. The seismic analysis considers the interaction of the tank with the surrounding soil and the effects of the primary tank contents. The DSTs and the surrounding soil are modeled as a system of finite elements. The depth and width of the soil incorporated into the analysis model are sufficient to obtain appropriately accurate analytical results.
The analyses required to support the work statement differ from previous analyses of the DSTs in that the soil-structure interaction (SSI) model includes several (nonlinear) contact surfaces in the tank structure, and the contained waste must be modeled explicitly in order to capture the fluid-structure interaction behavior between the primary tank and contained

  13. A new passive seismic method based on seismic interferometry and multichannel analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Cheng, Feng; Xia, Jianghai; Xu, Yixian; Xu, Zongbo; Pan, Yudi

    2015-06-01

    We proposed a new passive seismic method (PSM) based on seismic interferometry and multichannel analysis of surface waves (MASW) to meet the demand for increased investigation depth by acquiring surface-wave data in a low-frequency range (1 Hz ≤ f ≤ 10 Hz). We utilize seismic interferometry to sort common virtual source gathers (CVSGs) from ambient noise and analyze the obtained CVSGs to construct a 2D shear-wave velocity (Vs) map using the MASW. Standard ambient noise processing procedures were applied to the computation of cross-correlations. To enhance the signal-to-noise ratio (SNR) of the empirical Green's functions, a new weighted stacking method was implemented. In addition, we proposed a bidirectional shot mode based on the virtual source method to sort CVSGs repeatedly. The PSM was applied to two field data examples. For the test along the Han River levee, the results of the PSM were compared with the improved roadside passive MASW and the spatial autocorrelation method (SPAC). For the test in the Western Junggar Basin, the PSM was applied to a 70-km-long linear survey array with a prominent directional urban noise source, and a 60-km-long Vs profile extending to 1.5 km depth was mapped. Further, a comparison of the dispersion measurements was made between the PSM and the frequency-time analysis (FTAN) technique to assess the accuracy of the PSM. These examples and comparisons demonstrated that the new method is efficient, flexible, and capable of studying near-surface velocity structures from seismic ambient noise.
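
    The core interferometry step, cross-correlating ambient-noise records so that one station acts as a virtual source for another, can be sketched as follows. This toy example (pure Python, synthetic noise, a single correlation rather than the paper's weighted stack) shows how the inter-station travel time emerges as the lag of the correlation peak.

```python
import random

def cross_correlate(a, b, max_lag):
    """Cross-correlation of two equal-length noise records over integer lags;
    in seismic interferometry the stacked cross-correlation approximates the
    inter-station empirical Green's function."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out[lag] = s
    return out

random.seed(0)
src = [random.gauss(0, 1) for _ in range(2000)]   # ambient noise wavefield
delay = 7                                         # samples of inter-station travel time
rec_a = src
rec_b = [0.0] * delay + src[:-delay]              # station B sees the field later
cc = cross_correlate(rec_a, rec_b, 20)
best_lag = max(cc, key=cc.get)                    # peak lag = travel time
```

    In practice the correlation peak, stacked over many noise windows, supplies the surface-wave arrival whose dispersion the MASW step then inverts for Vs.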

  14. Regional risk assessment for contaminated sites part 1: vulnerability assessment by multicriteria decision analysis.

    PubMed

    Zabeo, A; Pizzol, L; Agostini, P; Critto, A; Giove, S; Marcomini, A

    2011-11-01

    As highlighted in the EU Soil Communication, local contamination is one of the main soil threats; it is often related to present and past industrial activities, which left a legacy of a large number of contaminated sites in Europe. These contaminated sites can be harmful to many different receptors according to their sensitivity/susceptibility to contamination, and specific vulnerability evaluations are needed in order to manage this widespread environmental issue. In this paper a novel comprehensive vulnerability assessment framework to assess regional receptor susceptibility to contaminated sites is presented. The developed methodology, which combines multi-criteria decision analysis (MCDA) techniques and spatial analysis, can be applied to different receptors recognized as relevant for regional assessment. In order to characterize each receptor, parameters significant for estimating vulnerability to contaminated sites have been selected, normalized and aggregated by means of MCDA techniques. The developed MCDA methodology, based on the Choquet integral, allows expert judgments to be included to elicit synergistic and conflicting effects between the involved criteria, and it is applied to all the geographical objects representing the identified receptors. To test the potential of the vulnerability methodology, it was applied to a specific case study area in the Upper Silesia region of Poland, where it proved to be reliable and consistent with the environmental experts' expected results. The vulnerability assessment results indicate that groundwater is the most vulnerable receptor, characterized by a wide area with vulnerability scores belonging to the highest vulnerability class.
As far as the other receptors are concerned, human health and surface water are characterized by quite homogeneous vulnerability scores falling in the medium-high vulnerability classes, while protected areas proved to be the least
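
    The discrete Choquet integral used for aggregation can be computed with the standard sorting formula. The criterion names and fuzzy-measure weights below are illustrative assumptions, not values from the paper; the measure's non-additivity (the weight of the pair differing from the sum of the singleton weights) is what encodes synergistic or conflicting criteria.

```python
def choquet(values, mu):
    """Discrete Choquet integral of criterion scores `values` (name -> score
    in [0, 1]) with respect to a fuzzy measure `mu` (frozenset -> weight)."""
    items = sorted(values, key=values.get)     # criteria in ascending score order
    total, prev = 0.0, 0.0
    remaining = set(values)                    # criteria with score >= current one
    for name in items:
        v = values[name]
        total += (v - prev) * mu[frozenset(remaining)]
        prev = v
        remaining.discard(name)
    return total

# Illustrative criteria scores and fuzzy measure (all weights are assumptions):
scores = {"contaminant_load": 0.9, "receptor_density": 0.5}
mu = {
    frozenset({"contaminant_load", "receptor_density"}): 1.0,
    frozenset({"contaminant_load"}): 0.7,   # > 0.5 would make it super-additive
    frozenset({"receptor_density"}): 0.4,
}
vulnerability = choquet(scores, mu)
```

    Here the aggregate is 0.5 x 1.0 + (0.9 - 0.5) x 0.7 = 0.78; with an additive measure the same formula reduces to a plain weighted mean.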

  15. A power flow based model for the analysis of vulnerability in power networks

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoyang; Chen, Guo; Hill, David J.; Dong, Zhao Yang

    2016-10-01

    An innovative model which considers power flow, one of the most important characteristics in a power system, is proposed for the analysis of power grid vulnerability. Moreover, based on the complex network theory and the Max-Flow theorem, a new vulnerability index is presented to identify the vulnerable lines in a power grid. In addition, comparative simulations between the power flow based model and existing models are investigated on the IEEE 118-bus system. The simulation results demonstrate that the proposed model and the index are more effective in power grid vulnerability analysis.
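
    As a hedged illustration of how a max-flow quantity can rank line vulnerability (the paper's index is more elaborate than this), the sketch below computes the generator-to-load maximum flow of a toy four-node network with an Edmonds-Karp routine and scores each line by the flow lost when that line is removed.

```python
from collections import deque

def max_flow(edges, s, t):
    """Edmonds-Karp maximum flow; `edges` is a list of (u, v, capacity)."""
    cap, adj = {}, {}
    for u, v, c in edges:
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)              # residual reverse edge
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    total = 0
    while True:
        parent = {s: None}                     # BFS for a shortest augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        v, add = t, float("inf")               # bottleneck along the path
        while parent[v] is not None:
            add = min(add, cap[(parent[v], v)])
            v = parent[v]
        v = t
        while parent[v] is not None:           # push flow, update residuals
            cap[(parent[v], v)] -= add
            cap[(v, parent[v])] += add
            v = parent[v]
        total += add

# Hypothetical 4-node grid: generator G feeds load L through buses 1 and 2.
lines = [("G", "1", 10), ("G", "2", 5), ("1", "L", 8), ("2", "L", 7)]
base = max_flow(lines, "G", "L")
drop = {ln: base - max_flow([e for e in lines if e != ln], "G", "L") for ln in lines}
```

    Lines whose removal erases the most deliverable flow rank as the most vulnerable, which is the intuition behind flow-aware indices as opposed to purely topological ones.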

  16. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows how local seismic response analysis is influenced by accounting for dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. As a first step, a 1D numerical model is developed accounting for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived, for instance, from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, a single deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each characterized by an appropriate level of probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we outline a 2D numerical analysis to also investigate the spatial variability of soil properties.
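
    The Monte Carlo idea, replacing one deterministic site-response number with a statistical ensemble, can be shown with a deliberately simple stand-in for the full 1D analysis: the quarter-wavelength estimate of a soil layer's fundamental frequency, f = Vs/(4H), with the shear-wave velocity Vs sampled from a lognormal distribution. All numbers below are illustrative assumptions.

```python
import math
import random
import statistics

def site_fundamental_frequency(vs, h):
    """Quarter-wavelength estimate of a uniform soil layer's fundamental
    frequency: f = Vs / (4 H)."""
    return vs / (4.0 * h)

random.seed(42)
# Vs treated as lognormal: median 200 m/s, logarithmic std 0.2, over a 30 m deposit.
samples = [site_fundamental_frequency(200.0 * math.exp(random.gauss(0.0, 0.2)), 30.0)
           for _ in range(5000)]
f_median = statistics.median(samples)              # ~ deterministic answer, 200/120 Hz
f84 = sorted(samples)[int(0.84 * len(samples))]    # 84th-percentile estimate
```

    Instead of the single value 1.67 Hz, the analyst reports an ensemble of frequencies (or, in the paper's setting, response spectra) with attached exceedance probabilities.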

  17. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, a 2 and 10 % POE in the next 50 years of 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was found to lie within a low-seismic-hazard area in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
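
    The seismicity parameters in item (3) enter the hazard calculation through the Gutenberg-Richter relation log10 N(>=m) = a - b m, which converts a catalogue's a and b values into annual rates and return periods. A minimal sketch with illustrative coefficients (not the paper's fitted values):

```python
def gr_annual_rate(a, b, m):
    """Annual rate of earthquakes with magnitude >= m from the
    Gutenberg-Richter relation log10 N = a - b*m."""
    return 10.0 ** (a - b * m)

def gr_return_period(a, b, m):
    """Mean recurrence interval (years) of events with magnitude >= m."""
    return 1.0 / gr_annual_rate(a, b, m)
```

    With a = 4.0 and b = 1.0, for example, events of M >= 6 recur about once every 100 years, and each unit increase in magnitude is tenfold rarer.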

  18. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  19. How Does Calcification Influence Plaque Vulnerability? Insights from Fatigue Analysis

    PubMed Central

    Wu, Baijian; Pei, Xuan; Li, Zhi-Yong

    2014-01-01

    Background. Calcification is commonly believed to be associated with cardiovascular disease burden, but whether calcifications have a negative effect on plaque vulnerability is still under debate. Methods and Results. Fatigue rupture analysis and fatigue life were used to evaluate rupture risk. An idealized baseline model containing no calcification was first built. Based on the baseline model, we investigated the influence of calcification on the rupture path and fatigue life by adding a circular calcification and changing its location within the fibrous cap area. Results show that 84.0% of calcified cases increase the fatigue life by up to 11.4%. For rupture paths 10D away from the calcification, the change in life is negligible. Calcifications close to the lumen increase fatigue life more than those close to the lipid pool, and calcifications in the middle area of the fibrous cap increase fatigue life more than those in the shoulder area. Conclusion. Calcifications may play a positive role in plaque stability. The influence of a calcification exists only in a local area: calcifications close to the lumen appear to be influenced more than those close to the lipid pool, and calcifications in the middle area of the fibrous cap seem to be influenced more than those in the shoulder area. PMID:24955401
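
    The study's fatigue-life estimates come from finite-element stress analysis of the plaque, but the generic notion of a fatigue life behind such analyses can be illustrated with the classical Paris-law crack-growth integral. This is a hedged, textbook-style sketch with made-up coefficients, not the paper's model.

```python
import math

def paris_life(c, m, delta_sigma, y, a0, ac):
    """Fatigue life in cycles by integrating Paris' law da/dN = C (dK)^m,
    with dK = Y * dSigma * sqrt(pi * a); closed form valid for m != 2."""
    k = c * (y * delta_sigma * math.sqrt(math.pi)) ** m
    exp = 1.0 - m / 2.0
    # N = integral from a0 to ac of a^(-m/2) da / k
    return (ac ** exp - a0 ** exp) / (k * exp)
```

    The qualitative behavior matches intuition: a larger initial flaw or a larger cyclic stress range shortens the predicted life, which is why shifting peak stresses away from a rupture path lengthens it.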

  20. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Different from a city's water pipeline network, the long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline composed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.
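
    Empirical relationships of the kind derived in the paper typically take the form log10(D) = a + b*M. The default coefficients below follow the widely used Wells and Coppersmith (1994) all-fault-type regression for maximum displacement and are shown only as an illustration; the paper fits its own coefficients for surface-wave magnitude using East Asian data.

```python
def max_fault_displacement(magnitude, a=-5.46, b=0.82):
    """Empirical maximum fault displacement in metres, log10(D) = a + b*M.
    Default a, b follow the Wells & Coppersmith (1994) all-fault-type
    regression (illustrative only; not the paper's East Asia relation)."""
    return 10.0 ** (a + b * magnitude)
```

    For an M 8.0 event this regression predicts a maximum displacement on the order of 10 m, the magnitude class relevant to the Wenchuan example above.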

  1. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    SciTech Connect

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara; Vignoli, Andrea

    2008-07-08

    The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building is of historical interest and is subject to conservation restrictions. The investigation involves a preliminary phase of research into the historical documents and a second phase of in situ and laboratory tests to determine the mechanical characteristics of the masonry. This investigation was conceived in order to obtain the 'LC2 Knowledge Level' and to perform the non-linear pushover analysis according to the new Italian Standards for seismic upgrading of existing masonry buildings.

  2. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  3. Sideband analysis and seismic detection in a large ring laser

    NASA Astrophysics Data System (ADS)

    Stedman, G. E.; Li, Z.; Bilger, H. R.

    1995-08-01

    A ring laser unlocked by the Earth's Sagnac effect has attained a frequency resolution of 1 part in 3 × 10^21 and a rotational resolution of 300 prad. We discuss both theoretically and experimentally the sideband structure induced in the microhertz-hertz region of the Earth-rotation spectral line by frequency modulation associated with extra mechanical motion, such as seismic events. The relative sideband height is an absolute measure of the rotational amplitude of that Fourier component. An initial analysis is given of the ring laser record from the Arthur's Pass-Coleridge seismic event of 18 June 1994.

  4. Principal Component Analysis for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, Katharina; Jellinek, A. Mark

    2016-04-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we have developed a pattern recognition technique based on a combination of Principal Component Analysis and hierarchical clustering applied to volcano seismic spectra. This technique can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. Preliminary results from applying our method to volcanic tremor from a range of volcanoes including Kīlauea, Okmok, Pavlof, and Redoubt suggest that spectral patterns from Kīlauea and Okmok are similar, whereas at Pavlof and Redoubt spectra have their own, distinct patterns.
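
    The first stage of the technique, extracting dominant spectral patterns with PCA before clustering, can be sketched in pure Python with power iteration on the covariance matrix. The six synthetic "spectra" below mimic two signal classes (low- versus high-frequency dominated); their projections onto the first principal component separate the classes by sign.

```python
import math
import random

def first_principal_component(data, iters=200, seed=1):
    """Leading PCA eigenvector of `data` (rows = observations, columns =
    spectral bins), found by power iteration on the covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]   # center
    rng = random.Random(seed)
    v = [rng.gauss(0, 1) for _ in range(d)]
    for _ in range(iters):
        xv = [sum(xi[j] * v[j] for j in range(d)) for xi in x]    # X v
        w = [sum(x[i][j] * xv[i] for i in range(n)) for j in range(d)]  # X^T X v
        norm = math.sqrt(sum(c * c for c in w)) or 1.0
        v = [c / norm for c in w]
    return v, means

# Synthetic spectra: three low-frequency-dominated, three high-frequency-dominated.
spectra = [[5.0, 4.0, 1.0, 0.0], [6.0, 5.0, 1.0, 0.0], [5.5, 4.5, 0.8, 0.2],
           [0.0, 1.0, 4.0, 5.0], [0.0, 1.0, 5.0, 6.0], [0.2, 0.8, 4.5, 5.5]]
v, means = first_principal_component(spectra)
pc1_scores = [sum((row[j] - means[j]) * v[j] for j in range(4)) for row in spectra]
```

    In the published workflow these component scores, rather than raw spectra, feed the hierarchical clustering, so no a priori signal classes are needed.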

  5. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  6. Ground motion estimation and nonlinear seismic analysis

    SciTech Connect

    McCallen, D.B.; Hutchings, L.J.

    1995-08-14

    Site-specific predictions of the dynamic response of structures to extreme earthquake ground motions are a critical component of seismic design for important structures. With the rapid development of computationally based methodologies and powerful computers over the past few years, engineers and scientists now have the capability to perform numerical simulations of many of the physical processes associated with the generation of earthquake ground motions and dynamic structural response. This paper describes the application of a physics-based, deterministic, computational approach for estimating earthquake ground motions which relies on site measurements of frequently occurring small (i.e., M < 3) earthquakes. Case studies are presented which illustrate application of this methodology to two different sites, and nonlinear analyses of a typical six-story steel frame office building are performed to illustrate the potential sensitivity of nonlinear response to site conditions and proximity to the causative fault.

  7. Elastic structure and seismicity of Donegal (Ireland): insights from passive seismic analysis

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, Nicola

    2016-04-01

    Ireland's crust is the result of a complex geological history, which began in the Palaeozoic with the oblique closure of the Iapetus Ocean and is probably still ongoing. In the northwestern portion of the island, the geology of Donegal was the subject of detailed geological investigation by many workers in the last century. The most widely represented rock types in Donegal are metasediments of Dalradian and Moinian age, invaded by several granites of Caledonian age (the so-called Donegal granite). Smaller, separate intrusions are present (e.g. Fanad Head). By contrast, it is widely accepted that the deep crustal structure of the northern portion of Ireland has been re-worked in more recent times. Several phases of lithospheric stretching associated with the opening of the Atlantic Ocean affected this portion of Ireland, with the extrusion of flood basalts. Moreover, the presence of a hot, low-density asthenospheric plume spreading from Iceland has been suggested, with the formation of a thick, high-velocity layer of magmatic underplated material at the base of the crust. Oddly, at present, Donegal is the only seismically active area in Ireland, with an average rate of one Mw=2-3 event every 3-4 years. In the last three years, passive seismic data have been recorded at 12 seismic stations deployed across the most seismically active area in Co. Donegal, with the aim of reconstructing the seismic structure down to upper-mantle depth and of locating the microseismic activity within the investigated volume. Both local and teleseismic events were recorded, giving the opportunity to integrate results from different techniques for seismic data analysis and to interpret them jointly together with surface geology and mapped fault traces. Local events have been used to constrain faulting volumes and focal mechanisms and to reconstruct low-resolution 3D Vp and Vp/Vs velocity models. Teleseismic events have been used to compute receiver function data

  8. Vulnerability analysis for a drought Early Warning System

    NASA Astrophysics Data System (ADS)

    Angeluccetti, Irene; Demarchi, Alessandro; Perez, Francesca

    2014-05-01

    Early Warning Systems (EWS) for drought are often based on risk models that do not, or only marginally, take into account the vulnerability factor. The multifaceted nature of drought (hydrological, meteorological, and agricultural) means that different ways of measuring this phenomenon and its effects coexist. This, together with the complexity of the impacts generated by this hazard, explains the current underdevelopment of drought EWS compared to those for other hazards. In Least Developed Countries, where drought events cause the highest numbers of affected people, correct monitoring and forecasting are considered essential. Existing early warning and monitoring systems for drought, produced at different geographic levels, provide only in a few cases an actual spatial model that tries to describe the cause-effect link between where the hazard is detected and where impacts occur. Integrating vulnerability information into such systems would permit better estimation of the affected zones and livelihoods, improving the effectiveness of the hazard-related datasets and maps produced. In fact, the need for simplification and, in general, for direct applicability of scientific outputs is still a matter of concern for field experts and end-users of early warning products. Even if the surplus of hazard-related information produced right after catastrophic events has, in some cases, led to the creation of specific data-sharing platforms, the conveyed meaning and usefulness of each product has not yet been addressed. The present work is an attempt to fill this gap, which is still an open issue for the scientific community as well as for the humanitarian aid world. The study aims at conceiving a simplified vulnerability model to embed into an existing EWS for drought, which is based on the monitoring of vegetation phenological parameters and the Standardized Precipitation Index, both produced using freely available satellite-derived datasets. The proposed vulnerability model includes (i) a
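
    One of the two hazard inputs mentioned, the Standardized Precipitation Index, maps a precipitation total onto a standard-normal scale. The operational SPI fits a gamma distribution to the climatology before transforming; the sketch below uses the common z-score first approximation, and the monthly totals are made-up illustrative values.

```python
import statistics

def spi_zscore(precip_series, current):
    """Simplified Standardized Precipitation Index: z-score of the current
    precipitation total against the climatology. (The operational SPI fits a
    gamma distribution and maps it to a standard normal; the plain z-score is
    a common first approximation.)"""
    mu = statistics.mean(precip_series)
    sd = statistics.stdev(precip_series)
    return (current - mu) / sd

climatology = [80, 95, 100, 110, 90, 105, 120, 85, 100, 115]  # illustrative mm totals
```

    Values near 0 indicate normal conditions, while values below about -1.5 to -2 flag severe drought; an EWS would combine such an index with the vulnerability layer described above to estimate affected livelihoods.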

  9. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of the Swiss nuclear power plants, the PEGASOS project (2000-2004), challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are, to a large extent, driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * the ambiguous solution of PSHA logic trees; * the inadequate mathematical treatment of the results of expert elicitations, based on the assumption of bias-free expert estimates; * the problems associated with the "think model" of the separation of epistemic and aleatory uncertainties; * the consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by the U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  10. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which corresponds to a hazard level with a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to the uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly in Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilon, which accounts for the spectral shape of the ground motion time history, is also presented.
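
    The epsilon calculation described in this abstract, the departure of a target ground motion from the GMPE prediction measured in logarithmic standard deviations, can be sketched numerically. The GMPE median and sigma below are hypothetical illustrations, not values from the thesis:

    ```python
    import math

    def epsilon(sa_target, ln_sa_median, sigma_ln):
        """Number of logarithmic standard deviations by which the target
        spectral acceleration departs from the GMPE-predicted median."""
        return (math.log(sa_target) - ln_sa_median) / sigma_ln

    # Hypothetical values: a GMPE predicting median Sa = 0.2 g with
    # sigma_ln = 0.6, and a target (e.g. UHS) ordinate of 0.35 g.
    eps = epsilon(0.35, math.log(0.2), 0.6)
    print(round(eps, 2))  # 0.93
    ```

    A positive epsilon means the target motion is stronger than the GMPE median; weighted modal epsilon values, as proposed in the thesis, would aggregate such numbers across a logic tree of GMPEs.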

  11. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    NASA Astrophysics Data System (ADS)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimating induced seismicity hazard, but it requires adaptation to address the particular occurrence characteristics of induced earthquakes and the estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. To accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010), augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low-magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short

  12. The application of seismic risk-benefit analysis to land use planning in Taipei City.

    PubMed

    Hung, Hung-Chih; Chen, Liang-Chun

    2007-09-01

    In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology that enables planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan--the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations and aims to assist city planners in evaluating the appropriateness of their planning decisions.

  13. Seismic Earth: Array Analysis of Broadband Seismograms

    NASA Astrophysics Data System (ADS)

    Levander, Alan; Nolet, Guust

    Seismology is one of the few means available to Earth scientists for probing the mechanical structure of the Earth's interior. The advent of modern seismic instrumentation at the end of the 19th century and its installation across the globe was shortly followed by mankind's first general understanding of the Earth's interior: The Croatian seismologist Andrija Mohorovičić discovered the crust-mantle boundary in central Europe in 1909, the German Beno Gutenberg determined the radius of the Earth's core in 1913, Great Britain's Sir Harold Jeffreys established its fluid character by 1926, and the Dane Inge Lehmann discovered the solid inner core in 1936. It is notable that seismology, even in its earliest days, was an international science. Unlike much of the Earth sciences, seismology has its roots in physics, notably optics (many university seismology programs are, or initially were, attached to meteorology, astronomy, or physics departments), and draws from the literatures of imaging systems and statistical communications theory developed by, or employed in, astronomy, electrical engineering, medicine, ocean acoustics, and nondestructive materials testing. Seismology has close ties to petrophysics and mineral physics, the measurements of these disciplines being compared to infer the chemical and physical structure of the Earth's interior.

  14. [Human vulnerability under cosmetic surgery. A bioethic analysis].

    PubMed

    Ramos-Rocha de Viesca, Mariablanca

    2012-01-01

    Cosmetic surgery is one of the best examples of the current health empowerment. Aesthetic surgical interventions have been criticized because they expose a healthy individual to unnecessary risk. In modern society the body has turned into a beauty depository with a commercial value. In the published bioethics literature, analyses of cosmetic surgery have focused on freedom, autonomy and distributive justice. Mexico ranks fifth in the world in number of cosmetic surgeries. Vulnerability is an inherent condition of human existence and marks the limit of human dignity. UNESCO recognizes that some populations are more prone to vulnerability. The aim of this work is to show that those who wish to make a physical change may have yielded to social coercion and psychological problems.

  15. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.

  16. An integrated analysis of controlled- and passive source seismic data

    NASA Astrophysics Data System (ADS)

    Rumpfhuber, Eva-Maria

    This dissertation consists of two parts, which include a study using passive source seismic data, and one using the dataset from a large-scale refraction/wide-angle reflection seismic experiment as the basis for an integrated analysis. The goal of the dissertation is the integration of the two different datasets and a combined interpretation of the results of the "Continental Dynamics of the Rocky Mountains" (CD-ROM) 1999 seismic experiment. I have determined the crustal structure using four different receiver function methods applied to data collected from the northern transect of the CD-ROM passive seismic experiment. The resulting migrated image and crustal thickness determinations confirm and refine prior crustal thickness measurements based on the CD-ROM and Deep Probe datasets. The new results show a very strong lower crustal layer (LCL) with variable thickness beneath the Wyoming Province. In addition, I was able to show that it terminates at 42° latitude and provide a seismic tie between the CD-ROM and Deep Probe seismic experiments so that they represent a continuous N-S transect extending from New Mexico into Alberta, Canada. This new tie is particularly important because it occurs close to a major tectonic boundary, the Cheyenne belt, between an Archean craton and a Proterozoic terrane. The controlled-source seismic dataset was analyzed with the aid of forward modeling and inversion to establish a two-dimensional velocity and interface model of the area. I have developed a picking strategy that helps identify the seismic phases and improves the quality and quantity of the picks. In addition, I was able to pick and identify S-wave phases, which allowed me to establish an independent S-wave model, and hence Poisson's ratio and the Vp/Vs ratio. The final velocity and interface model was compared to prior results, and the results were jointly interpreted with the receiver function results. 
Thanks to the integration of the controlled-source and receiver function

  17. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that Naples and its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults around the study area, well known from geological investigations but not associated with any recorded earthquake, that could nevertheless shake the city has been taken into account. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. 
The results are posterior hazard assessments for three exposure times (50, 10 and 5 years) on a dense grid that covers the municipality of Naples, considering bedrock soil

  18. Utilizing Semantic Big Data for realizing a National-scale Infrastructure Vulnerability Analysis System

    SciTech Connect

    Chinthavali, Supriya; Shankar, Mallikarjun

    2016-01-01

    Critical Infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CI interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches on vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) constructing and managing a CI network-of-networks graph, (2) performing analytic operations at scale, and (3) interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline to realize a national-scale network-based vulnerability analysis system.

  19. Probabilistic seismic demand analysis using advanced ground motion intensity measures

    USGS Publications Warehouse

    Tothong, P.; Luco, N.

    2007-01-01

    One of the objectives in performance-based earthquake engineering is to quantify the seismic reliability of a structure at a site. For that purpose, probabilistic seismic demand analysis (PSDA) is used as a tool to estimate the mean annual frequency of exceeding a specified value of a structural demand parameter (e.g. interstorey drift). This paper compares and contrasts the use, in PSDA, of certain advanced scalar versus vector and conventional scalar ground motion intensity measures (IMs). One of the benefits of using a well-chosen IM is that more accurate evaluations of seismic performance are achieved without the need to perform detailed ground motion record selection for the nonlinear dynamic structural analyses involved in PSDA (e.g. record selection with respect to seismic parameters such as earthquake magnitude, source-to-site distance, and ground motion epsilon). For structural demands that are dominated by a first mode of vibration, using inelastic spectral displacement (Sdi) can be advantageous relative to the conventionally used elastic spectral acceleration (Sa) and the vector IM consisting of Sa and epsilon (ε). This paper demonstrates that this is true for ordinary and for near-source pulse-like earthquake records. The latter ground motions cannot be adequately characterized by either Sa alone or the vector of Sa and ε. For structural demands with significant higher-mode contributions (under either of the two types of ground motions), even Sdi (alone) is not sufficient, so an advanced scalar IM that additionally incorporates higher modes is used.
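
    The PSDA quantity described here, the mean annual frequency of exceeding a demand level, is obtained by integrating a demand fragility over the ground-motion hazard curve. A minimal numerical sketch, with an entirely hypothetical hazard curve and lognormal fragility (none of these numbers come from the paper):

    ```python
    import math

    # Hypothetical hazard curve: mean annual rate of exceeding each IM level.
    im_levels = [0.1, 0.2, 0.4, 0.8]        # e.g. Sa(T1) in g
    exceed_rate = [1e-1, 2e-2, 3e-3, 4e-4]  # lambda(IM > im)

    # Hypothetical fragility: P(demand > d | IM), a lognormal CDF in IM.
    def p_exceed_demand(im, median=0.4, beta=0.5):
        return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2))))

    # Discrete PSDA sum: lambda(D > d) ~= sum_i P(D > d | im_i) * |d lambda_i|
    maf = 0.0
    for i in range(len(im_levels) - 1):
        dlam = exceed_rate[i] - exceed_rate[i + 1]           # rate of IM falling in bin i
        im_mid = math.sqrt(im_levels[i] * im_levels[i + 1])  # log-midpoint of the bin
        maf += p_exceed_demand(im_mid) * dlam
    print(f"mean annual frequency of exceedance: {maf:.2e}")
    ```

    A well-chosen IM (such as Sdi) makes the fragility term less dispersed, which is what allows accurate results without detailed record selection.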

  20. Probabilistic seismic hazard analysis for the city of Quetta, Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Shafiq; Lindholm, Conrad; Ahmed, Najeeb; Rafi, Zahid

    2014-08-01

    Seismic hazard assessment for Quetta is carried out using the probabilistic seismic hazard analysis technique, based on area sources and augmented by line sources, used for the first time in Pakistan. Seismic data have been collected and analyzed in the spatial and temporal domains. Five seismic zones have been modeled in line with the tectonics of the region, with a b-value of 1.14 obtained by regression. The b-value is slightly high, which is attributed to the fact that aftershocks were not removed, as doing so distorted the dataset. Five fault sources are modeled, three as reverse and two as strike-slip, with 7.8 as the maximum magnitude. The Mach structure is included in the tectonics for the first time. The attenuation relation used in the present study is one recommended by various researchers. The expected peak ground acceleration for a 500-year return period is 4.79 m/s2 for rock outcrop, characterized as very high. Furthermore, variation in spectral acceleration within Quetta city is observed, for which spectral curves are developed for four different places.
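
    The b-value quoted above enters the hazard model through the Gutenberg-Richter recurrence law, log10 N(M) = a - b*M. A minimal sketch with the study's b-value of 1.14 and a purely hypothetical a-value (the abstract does not report one):

    ```python
    def annual_rate(m, a=4.0, b=1.14):
        """Annual number of earthquakes with magnitude >= m under the
        Gutenberg-Richter law. The a-value here is a hypothetical placeholder."""
        return 10 ** (a - b * m)

    # A higher b-value (e.g. from a catalog not declustered of aftershocks)
    # means small events dominate the rate even more strongly:
    for m in (5.0, 6.0, 7.0):
        print(f"M >= {m}: {annual_rate(m):.2e} events/yr")
    ```

    This is why retaining aftershocks, as the authors note, inflates the regression b-value: it adds many small events to the catalog.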

  1. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    NASA Astrophysics Data System (ADS)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address the specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generating an induced seismicity catalog by numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  2. Parental alienation syndrome. A developmental analysis of a vulnerable population.

    PubMed

    Price, J L; Pioske, K S

    1994-11-01

    1. Parental alienation syndrome is the systematic denigration by one parent of the other parent with the intent of alienating the child. 2. Parents who engage in alienating activity have experienced loss, leading to depression, anger, and aggression. The family system experiences loss during divorce and is adversely affected by the alienating activities of one parent. 3. Understanding the dynamics of parental alienation syndrome will position the nurse to recognize it as a symptom of depression and dependence, and bring care to the vulnerable population.

  3. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    NASA Astrophysics Data System (ADS)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises as to how to identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms, as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. 
In this respect, working at
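
    The cluster-based pattern recognition described above can be sketched with a minimal k-means run on hypothetical, normalized vulnerability indicators (the real studies use carefully chosen and validated indicator sets, plus the robustness and validation steps listed in the abstract):

    ```python
    import random

    def kmeans(points, k, iters=50, seed=0):
        """Minimal k-means on tuples of equal length; returns (centroids, labels)."""
        rnd = random.Random(seed)
        centroids = rnd.sample(points, k)
        labels = [0] * len(points)
        for _ in range(iters):
            # assign each point to its nearest centroid (squared Euclidean distance)
            labels = [min(range(k),
                          key=lambda j: sum((p - c) ** 2
                                            for p, c in zip(pt, centroids[j])))
                      for pt in points]
            # recompute each centroid as the mean of its cluster members
            for j in range(k):
                members = [pt for pt, lab in zip(points, labels) if lab == j]
                if members:
                    centroids[j] = tuple(sum(dim) / len(members)
                                         for dim in zip(*members))
        return centroids, labels

    # Hypothetical dryland regions described by two normalized indicators,
    # e.g. (water scarcity, poverty); two distinct patterns should emerge.
    regions = [(0.9, 0.8), (0.85, 0.9), (0.8, 0.75),   # high-vulnerability pattern
               (0.2, 0.1), (0.15, 0.2), (0.1, 0.15)]   # low-vulnerability pattern
    cents, labels = kmeans(regions, k=2)
    print(labels)
    ```

    Each resulting cluster corresponds to one candidate vulnerability pattern; the abstract's five-step approach then tests whether such groupings are robust and valid before using them for decision-making.
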

  4. Seismic and hydroacoustic analysis relevant to MH370

    SciTech Connect

    Stead, Richard J.

    2014-07-03

    The vicinity of the Indian Ocean is searched for open and readily available seismic and/or hydroacoustic stations that might have recorded a possible impact of MH370 with the ocean surface. Only three stations are identified: the IMS hydrophone arrays H01 and H08, and the Geoscope seismic station AIS. Analysis of the data from these stations shows an interesting arrival on H01 that suffers some interference from an Antarctic ice event, large-amplitude repeating signals at H08 that obscure any possible arrivals, and large-amplitude chaotic noise at AIS that precludes any analysis at the higher frequencies of interest. The results are therefore rather inconclusive but may point to a more southerly impact location within the overall Indian Ocean search region. The results would be more useful if combined with other data that are not readily available.

  5. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  6. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  7. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. CORSSA now includes seven articles with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  8. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    PubMed

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously, which makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, which allows the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from least vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights, and they are also compared with historical contamination data at the utilities. The investigation demonstrates that the MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.
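
    A much-simplified sketch of the kind of aggregation and weight-sensitivity check described above. The actual study uses the MACBETH methodology; the barrier names, scores, and weights below are hypothetical placeholders:

    ```python
    # Hypothetical barrier scores (0 = weak barrier, 100 = strong) for three DWUs.
    scores = {
        "DWU-A": {"source_protection": 80, "treatment": 60, "monitoring": 70},
        "DWU-B": {"source_protection": 40, "treatment": 90, "monitoring": 50},
        "DWU-C": {"source_protection": 30, "treatment": 30, "monitoring": 60},
    }
    weights = {"source_protection": 0.5, "treatment": 0.3, "monitoring": 0.2}

    def mvi(utility, w):
        """Weighted-sum indicator; lower means more vulnerable in this sketch."""
        return sum(w[k] * utility[k] for k in w)

    ranking = sorted(scores, key=lambda u: mvi(scores[u], weights))
    print(ranking)  # most vulnerable first

    def perturbed(w, key, factor):
        """Scale one barrier weight and renormalize the set to sum to 1."""
        w2 = {k: (v * factor if k == key else v) for k, v in w.items()}
        total = sum(w2.values())
        return {k: v / total for k, v in w2.items()}

    # Crude sensitivity analysis: the ranking should survive +/-10% weight changes.
    for key in weights:
        for factor in (0.9, 1.1):
            w2 = perturbed(weights, key, factor)
            assert sorted(scores, key=lambda u: mvi(scores[u], w2)) == ranking
    print("ranking stable under +/-10% weight perturbations")
    ```

    MACBETH itself derives the value scores from pairwise qualitative judgments rather than assigning them directly, but the downstream aggregation and sensitivity testing follow this general shape.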

  9. Noise analysis of the seismic system employed in the northern and southern California seismic nets

    USGS Publications Warehouse

    Eaton, J.P.

    1984-01-01

    The seismic networks have been designed and operated to support recording on Develocorders (less than 40 dB dynamic range) and analog magnetic tape (about 50 dB dynamic range). The principal analysis of the records has been based on Develocorder films, and background earth noise levels have been adjusted to be about 1 to 2 mm p-p on the film readers. Since the traces are separated by only 10 to 12 mm on the reader screen, they become hopelessly tangled when signal amplitudes on several adjacent traces exceed 10 to 20 mm p-p. Thus, the background noise level is hardly more than 20 dB below the level of the largest readable signals. The situation is somewhat better on tape playbacks, but the high level of background noise set to accommodate processing from film records effectively limits the range of maximum-signal to background-earth-noise on high gain channels to a little more than 30 dB. Introduction of the PDP 11/44 seismic data acquisition system has increased the potential dynamic range of recorded network signals to more than 60 dB. To make use of this increased dynamic range we must evaluate the characteristics and performance of the seismic system. In particular, we must determine whether the electronic noise in the system is, or can be made, sufficiently low that background earth noise levels can be lowered significantly to take advantage of the increased dynamic range of the digital recording system. To come to grips with the complex problem of system noise, we have carried out a number of measurements and experiments to evaluate critical components of the system as well as to determine the noise characteristics of the system as a whole.
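
    The dynamic-range figures quoted in this abstract follow from the usual amplitude-ratio definition, dB = 20*log10(A_max/A_min); a quick check of the numbers:

    ```python
    import math

    def dynamic_range_db(a_max, a_min):
        """Dynamic range in decibels for an amplitude ratio."""
        return 20 * math.log10(a_max / a_min)

    # A largest-readable-signal to background ratio of ~10x corresponds to
    # the ~20 dB figure quoted for the film records:
    print(round(dynamic_range_db(10, 1)))    # 20
    # The >60 dB quoted for the PDP 11/44 digital system implies a
    # signal-to-background amplitude ratio of more than 1000x:
    print(round(dynamic_range_db(1000, 1)))  # 60
    ```

    This is why lowering the electronic and background noise floor matters: every factor of 10 in usable amplitude ratio adds 20 dB of recoverable dynamic range.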

  10. A watershed-based cumulative risk impact analysis: environmental vulnerability and impact criteria.

    PubMed

    Osowski, S L; Swick, J D; Carney, G R; Pena, H B; Danielson, J E; Parrish, D A

    2001-01-01

    Swine Concentrated Animal Feeding Operations (CAFOs) have received much attention in recent years. As a result, a watershed-based screening tool, the Cumulative Risk Index Analysis (CRIA), was developed to assess the cumulative impacts of multiple CAFO facilities in a watershed subunit. The CRIA formula calculates an index number based on: 1) the area of one or more facilities compared to the area of the watershed subunit, 2) the average of the environmental vulnerability criteria, and 3) the average of the industry-specific impact criteria. Each vulnerability or impact criterion is ranked on a 1 to 5 scale, with a low rank indicating low environmental vulnerability or impact and a high rank indicating high environmental vulnerability or impact. The individual criterion ranks, as well as the total CRIA score, can be used to focus the environmental analysis and facilitate discussions with industry, the public, and other stakeholders in the Agency decision-making process. PMID:11214349
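    As a rough sketch of how an index of this shape can be computed (the abstract lists the three ingredients but not the exact combination rule; a simple product of the three terms is assumed here purely for illustration):

```python
import statistics

def cria_score(facility_area, watershed_area, vuln_ranks, impact_ranks):
    """Illustrative Cumulative Risk Index Analysis (CRIA) score.

    Combines (1) the ratio of facility area to watershed-subunit area,
    (2) the mean of the 1-5 environmental vulnerability ranks, and
    (3) the mean of the 1-5 industry-specific impact ranks.
    The product used below is an assumption, not the published formula.
    """
    area_ratio = facility_area / watershed_area
    return area_ratio * statistics.mean(vuln_ranks) * statistics.mean(impact_ranks)
```

    With this rule, higher criterion ranks or a larger facility footprint both push the index up, matching the screening intent described in the abstract.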

  11. Assessing the Climate Change Vulnerability of Physical Infrastructures through a Spatial Analysis

    NASA Astrophysics Data System (ADS)

    Myeong, S.

    2012-12-01

    Natural hazards can destroy or damage physical infrastructures and thus incur socioeconomic losses and threaten the safety of people. Therefore, identifying the vulnerability of a given society's physical infrastructure to climate change and developing appropriate adaptation measures are necessary. A recent trend in climate change vulnerability assessment has shifted the focus from index-based assessment to spatial analysis of vulnerability in order to see the distribution of vulnerable areas. Although some research has been conducted on the US and Southwestern Asia, no formal research on Korea has assessed vulnerable areas in terms of spatial distribution. The current study attempts to see what types of vulnerability exist in what areas of the country through an analysis of data gathered from different sectors of Korea. Three domains, i.e., sensitivity, exposure, and adaptive capacity, were investigated, with subordinate component data under each domain, to assess the vulnerability of the country. The results showed that the degree of vulnerability differs between coastal areas and inland areas. For most subordinate components, coastal areas were more vulnerable than inland areas. Within the inland areas, less urbanized areas were more sensitive to climate change than more urbanized areas, while large metropolitan areas were more exposed to climate change due to the density of physical infrastructures. Some southern areas of the country had greater adaptive capacity economically and institutionally; however, Seoul and its vicinity had greater adaptive capacity related to physical infrastructures. The study concludes that since damages from natural disasters such as floods and typhoons are becoming increasingly serious around the world as well as in Korea, it is necessary to develop appropriate measures for physical infrastructure to adapt to climate change, customized to the specific needs of different

  12. A seismic survey in Antarctica, parallel schemes for seismic migration and target oriented velocity analysis

    NASA Astrophysics Data System (ADS)

    Sen, Vikramaditya

    This dissertation comprises three different studies. The first part describes the acquisition and data processing techniques utilized during a seismic survey conducted in the austral summer of 1994-95 in the interior of Antarctica. Three multichannel seismic reflection profiles and two wide-angle profiles were collected over the central West Antarctic ice sheet to investigate methods to obtain a shallow to mid-crustal section of the lithosphere below the Byrd subglacial basin. The multichannel seismic data were analysed to develop images of the shallow crustal structure, the base of ice, and intra-ice reflections that (with minor exceptions) conform to the ice-floor topography. The high-energy, low-frequency signals generated by the larger charges of the wide-angle profiles were more successful in imaging the deep crustal section. The upper crust in this area was determined to be fairly non-reflective. Along the main traverse, the base of ice has significant topographical undulation in both inline and crossline directions, and several half grabens and localized basins can be identified. More efficient surveys can be conducted and better signal quality can be obtained by using longer streamers (˜4.5 km) and larger, buried charges. The second part describes a parallel implementation of 3D pre-stack Kirchhoff depth migration using the Parallel Virtual Machine (PVM) environment of message passing and clustering. A simple yet robust strategy has been proposed to distribute the computation load among the nodes of a virtual parallel machine, and the performance of the parallel method has been compared with conventional sequential schemes. A near-linear speedup was achieved in this implementation, which implies that the reduction in computation time (compared to the sequential run time) was almost directly proportional to the number of nodes in the virtual machine. The third part of this dissertation describes an approach for target oriented migration velocity

  13. Seismic Noise Analysis and Reduction through Utilization of Collocated Seismic and Atmospheric Sensors at the GRO Chile Seismic Network

    NASA Astrophysics Data System (ADS)

    Farrell, M. E.; Russo, R. M.

    2013-12-01

    The installation of Earthscope Transportable Array-style geophysical observatories in Chile expands open data seismic recording capabilities in the southern hemisphere by nearly 30%, and has nearly tripled the number of seismic stations providing freely-available data in southern South America. Through the use of collocated seismic and atmospheric sensors at these stations we are able to analyze how local atmospheric conditions generate seismic noise, which can degrade data in seismic frequency bands at stations in the 'roaring forties' (S latitudes). Seismic vaults that are climate-controlled and insulated from the local environment are now employed throughout the world in an attempt to isolate seismometers from as many noise sources as possible. However, this is an expensive solution that is neither practical nor possible for all seismic deployments; moreover, the increasing number and scope of temporary seismic deployments has resulted in the collection and archiving of terabytes of seismic data that are affected to some degree by natural seismic noise sources such as wind and atmospheric pressure changes. Changing air pressure can result in a depression and subsequent rebound of Earth's surface - which generates low frequency noise in seismic frequency bands - and even moderate winds can apply enough force to ground-coupled structures or to the surface above the seismometers themselves, resulting in significant noise. The 10 stations of the permanent Geophysical Reporting Observatories (GRO Chile), jointly installed during 2011-12 by IRIS and the Chilean Servicio Sismológico, include instrumentation in addition to the standard three seismic components. These stations, spaced approximately 300 km apart along the length of the country, continuously record a variety of atmospheric data including infrasound, air pressure, wind speed, and wind direction. The collocated seismic and atmospheric sensors at each station allow us to analyze both datasets together, to

  14. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, a substitution of the calendar variable by a corresponding astronomical one has been performed and the epoch superposition method was applied. In essence, the method consists in differentiating the massifs of information on volcanic eruptions (over the period 1900 to 1977) and seismic events (1900-1994) with respect to the values of astronomical parameters which correspond to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of volcanic eruption and violent earthquake distribution in the fields of the Earth's orbital movement parameters were used as a basis for calculation of frequency spectra and diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as GIS design for monitoring and forecast of volcanic and seismic activities. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS-monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. Diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity has been established in the duration of dormant (repose) periods. A relationship has been found between the distribution of the repose period probability density and the duration of the period. 3

  15. Mapping Upper Mantle Seismic Discontinuities Using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gu, Y. J.; Dokht, R.; Sacchi, M. D.

    2015-12-01

    Seismic discontinuities are fundamental to the understanding of mantle composition and dynamics. Their depth and impedance are generally determined using secondary seismic phases, most commonly SS precursors and P-to-S converted waves. However, the analysis and interpretation using these approaches often suffer from incomplete data coverage, high noise levels and interfering seismic phases, especially near tectonically complex regions such as subduction zones and continental margins. To overcome these pitfalls, we apply Singular Spectrum Analysis (SSA) to remove random noise, reconstruct missing traces and enhance the robustness of SS precursors and P-to-S conversions from seismic discontinuities. Our method takes advantage of the predictability of time series in the frequency-space domain and performs a rank reduction using a singular value decomposition of the trajectory matrix. We apply SSA to synthetic record sections as well as observations of 1) SS precursors beneath the northwestern Pacific subduction zones, and 2) P-to-S converted waves from the Western Canada Sedimentary Basin (WCSB). In comparison with raw or interpolated data, the SSA-enhanced reflectivity maps show a greater resolution and a stronger negative correlation between the depths of the 410 and 660 km discontinuities. These effects can be attributed to the suppression of incoherent noise, which tends to reduce the signal amplitude during normal averaging procedures, through rank reduction and the emphasis of principal singular values. Our new results suggest a more laterally coherent 520 km reflection in the western Pacific regions. Similar improvements in data imaging are achieved in western Canada, where strong lateral variations in discontinuity topography are observed in the craton-Cordillera boundary zone. Improvements from SSA relative to conventional approaches are most notable in under-sampled regions.
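    The rank-reduction step described above can be sketched in a few lines of numpy; the window length and rank below are illustrative choices, not the authors' settings:

```python
import numpy as np

def ssa_denoise(x, window, rank):
    """Denoise a 1-D series via Singular Spectrum Analysis (sketch).

    Embeds the series in a Hankel (trajectory) matrix, truncates its SVD
    to the leading singular values, and reconstructs the series by
    anti-diagonal (Hankel) averaging.
    """
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of x.
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Rank reduction: keep only the leading singular components.
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Anti-diagonal averaging maps the low-rank matrix back to a series.
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):
        out[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1
    return out / cnt
```

    A single noiseless sinusoid has a rank-2 trajectory matrix, so a rank-2 reconstruction recovers it essentially exactly; with noisy records the discarded singular components carry the incoherent noise.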

  16. SeismicWaveTool: Continuous and discrete wavelet analysis and filtering for multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Rosa-Herranz, J. L.; Rosa-Cintas, S.; Martinez-Espla, J. J.

    2013-01-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of multichannel seismic data. The considered time-frequency transforms include the continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform. The developed approaches provide a fast and precise time-frequency examination of the seismograms at different frequency bands. Moreover, filtering methods for noise, transients or even baseline removal are implemented. The primary motivation is to support seismologists with a user-friendly and fast program for wavelet analysis, providing practical and understandable results.
    Program summary
    Program title: SeismicWaveTool
    Catalogue identifier: AENG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 611072
    No. of bytes in distributed program, including test data, etc.: 14688355
    Distribution format: tar.gz
    Programming language: MATLAB (MathWorks Inc.) version 7.8.0.347 (R2009a) or higher. Wavelet Toolbox is required.
    Computer: Developed on a MacBook Pro. Tested on Mac and PC. No computer-specific optimization was performed.
    Operating system: Any supporting MATLAB (MathWorks Inc.) v7.8.0.347 (R2009a) or higher. Tested on Mac OS X 10.6.8, Windows XP and Vista.
    Classification: 13.
    Nature of problem: Numerous research works have developed a great number of free or commercial wavelet-based software packages, which provide specific solutions for the analysis of seismic data. On the other hand, standard toolboxes, packages or libraries, such as the MathWorks' Wavelet Toolbox for MATLAB, offer command-line functions and interfaces for the wavelet analysis of one-component signals. Thus, software is usually focused on very specific problems
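    The program itself is MATLAB-based; as a language-neutral illustration of the underlying idea (decompose, threshold the detail coefficients, reconstruct), here is a minimal single-level Haar-wavelet sketch, not taken from SeismicWaveTool:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (even-length input)."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail (high-pass)
    return s, d

def haar_idwt(s, d):
    """Inverse of one Haar DWT level."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def denoise(x, threshold):
    """Hard-threshold the detail coefficients, then reconstruct."""
    s, d = haar_dwt(x)
    d = np.where(np.abs(d) < threshold, 0.0, d)
    return haar_idwt(s, d)
```

    A full wavelet package cascades this decomposition over several levels and offers many wavelet families; the transform is orthogonal, so with a zero threshold the reconstruction is exact.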

  17. Livelihood security, vulnerability and resilience: a historical analysis of Chibuene, southern Mozambique.

    PubMed

    Ekblom, Anneli

    2012-07-01

    A sustainable livelihood framework is used to analyse livelihood security, vulnerability and resilience in the village of Chibuene, Vilanculos, southern Mozambique from a historical and contemporary perspective. Interviews, assessments, archaeology, palaeoecology and written sources are used to address tangible and intangible aspects of livelihood security. The analysis shows that livelihood strategies for building resilience, diversification of resource use, social networks and trade, have long historical continuities. Vulnerability is contingent on historical processes such as long-term socio-environmental insecurity and resultant biodiversity loss. These contingencies affect the social capacity to cope with vulnerability in the present. The study concludes that contingency and the extent and strength of social networks should be added as factors in livelihood assessments. Furthermore, policies for mitigating vulnerability must build on the reality of environmental insecurity, and strengthen local structures that diversify and spread risk.

  18. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2013-10-01

    Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine the capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem, a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into three main different
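    A composite indicator of this kind is typically a weighted aggregation of normalized component scores; the sketch below assumes a weighted arithmetic mean over the four components and a crude one-at-a-time weight perturbation for the sensitivity check (both are assumptions, not the authors' exact scheme):

```python
import numpy as np

def dvi(components, weights):
    """Composite Drought Vulnerability Indicator as a weighted mean (sketch).

    `components` holds normalized (0-1) scores for the four DVI components:
    renewable natural capital, economic capacity, human and civic resources,
    and infrastructure and technology. Weights are normalized to sum to one.
    """
    w = np.asarray(weights, dtype=float)
    return float(np.dot(components, w / w.sum()))

def weight_sensitivity(components, weights, perturb=0.1):
    """One-at-a-time sensitivity check: perturb each weight up and down by
    `perturb` (fractional) and report the resulting DVI range."""
    base = np.asarray(weights, dtype=float)
    scores = []
    for i in range(len(base)):
        for sign in (-1.0, 1.0):
            w = base.copy()
            w[i] = max(w[i] * (1.0 + sign * perturb), 1e-9)
            scores.append(dvi(components, w))
    return min(scores), max(scores)
```

    A narrow [min, max] range under weight perturbation is one simple way to argue, as the abstract does, that the composite indicator is robust to the weighting choices.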

  19. Understanding North Texas Seismicity: A Joint Analysis of Seismic Data and 3D Pore Pressure Modeling

    NASA Astrophysics Data System (ADS)

    DeShon, H. R.; Hornbach, M. J.; Ellsworth, W. L.; Oldham, H. R.; Hayward, C.; Stump, B. W.; Frohlich, C.; Olson, J. E.; Luetgert, J. H.

    2014-12-01

    In November 2013, a series of earthquakes began along a mapped ancient fault system near Azle, Texas. The Azle events are the third felt earthquake sequence in the Fort Worth (Barnett Shale) Basin since 2008, and several production and injection wells in the area are drilled to depths near the recent seismic activity. Understanding if and/or how injection and removal of fluids in the crystalline crust reactivate faults has important implications for seismology, the energy industry, and society. We assessed whether the Azle earthquakes were induced using a joint analysis of the earthquake data, subsurface geology and fault structure, and 3D pore pressure modeling. Using a 12-station temporary seismic deployment, we have recorded and located >300 events large enough to be recorded on multiple stations and 1000s of events during periods of swarm activity. High-resolution locations and focal mechanisms indicate that events occurred on NE-SW trending, steeply dipping normal faults associated with the southern end of the Newark East Fault Zone with hypocenters between 2-8 km depth. We considered multiple causes that might have changed stress along this system. Earthquakes resulting from natural processes, though perhaps unlikely in this historically inactive region, can be neither ruled out nor confirmed due to lack of information on the natural stress state of these faults. Analysis of lake and groundwater variations near Azle showed that no significant stress changes occurred prior to or during the earthquake sequence. In contrast, analysis of pore-pressure models shows that the combination of formation water production and wastewater injection near the fault could have caused pressure increases that induced earthquakes on near-critically stressed faults.

  20. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach in determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether or not the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse coverage situations we combine regional long-period waveforms, and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. Our synthetic data tests indicate that biases in scalar
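    The isotropic-versus-deviatoric character that underlies source-type discrimination can be illustrated with a minimal decomposition of a moment tensor; the epsilon sign convention below is one common choice, related to but simpler than the Hudson et al. (1989) parameterization:

```python
import numpy as np

def decompose_moment_tensor(M):
    """Isotropic/deviatoric split of a symmetric moment tensor (sketch).

    Returns the isotropic moment (trace/3) and a CLVD measure epsilon that
    is 0 for a pure double couple and +/-0.5 for a pure CLVD source.
    Sign conventions for epsilon vary in the literature.
    """
    M = np.asarray(M, dtype=float)
    iso = np.trace(M) / 3.0          # explosion-like (volumetric) part
    dev = M - iso * np.eye(3)        # deviatoric remainder
    eig = np.sort(np.linalg.eigvalsh(dev))  # ascending deviatoric eigenvalues
    denom = max(abs(eig[0]), abs(eig[2]))
    eps = 0.0 if denom == 0.0 else -eig[1] / denom
    return iso, eps
```

    A pure double couple gives iso = 0 and epsilon = 0, while a pure explosion gives a nonzero iso with a vanishing deviatoric part; real events plot between these end members, which is what the source-type diagram visualizes.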

  1. Analysis of Vulnerability Around The Colima Volcano, MEXICO

    NASA Astrophysics Data System (ADS)

    Carlos, S. P.

    2001-12-01

    The Colima volcano is located in the western part of the Trans-Mexican Volcanic Belt, in the central portion of the Colima Rift Zone, between the Mexican states of Jalisco and Colima. Since January 1998 the volcano has presented new activity, characterized by two stages: the first was an effusive phase that began on 20 November 1998 and finished by the middle of January 1999. On 10 February 1999 a great explosion at the summit marked the beginning of an explosive phase; the eruptive process thus changed from an effusive to an explosive model. Suárez-Plascencia et al., 2000, present hazard maps for ballistic projectiles, ashfalls and lahars for this scenario. This work presents the evaluation of vulnerability in the areas identified as hazardous in the maps for ballistics, ashfalls and lahars, based on the economic elements located in the middle and lower sections of the volcanic edifice, such as agriculture, forestry, agroindustries and communication lines (highways, power, telephone, railroad, etc.). The method is based on Geographic Information Systems, using digital cartography at scale 1:50,000, digital orthophotos from the Instituto Nacional de Estadística, Geografía e Informática, and SPOT and Landsat satellite images from 1997 and 2000 in bands 1, 2 and 3. The land use maps obtained for 1997 and 2000 were compared with the land use map reported by Suárez in 1992; from these maps an increase of 5 percent in the sugar cane and corn cultivation areas was observed compared with those of 1990 (1225.7 km2), as well as a decrease of the forest surface, moving the agricultural limits uphill, with some agave cultivation appearing on the northwest and north hillslopes of the Nevado de Colima. This increase in agricultural surface results in greater economic activity in the area, which in turn increases the vulnerability to the different volcanic products emitted during this phase of activity. The degradation of the soil by the

  2. Analysis of the seismic origin of landslides: examples from the New Madrid seismic zone

    USGS Publications Warehouse

    Jibson, R.W.; Keefer, D.K.

    1993-01-01

    By analyzing two landslides in the New Madrid seismic zone, we develop an approach for judging if a landslide or group of landslides of unknown origin was more likely to have formed as a result of earthquake shaking or in aseismic conditions. The two landslides analyzed are representative of two groups of landslides that previous research on the geomorphology and regional distribution of landslides in this region indicates may have been triggered by the 1811-1812 New Madrid earthquakes. Slope-stability models of aseismic conditions show that neither landslide is likely to have formed aseismically even in unrealistically high ground-water conditions. Our analysis yields a general relationship between Newmark landslide displacement, earthquake shaking intensity, and the critical acceleration of a landslide. -from Authors
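    The Newmark displacement mentioned above comes from a rigid sliding-block analysis: the block accumulates relative velocity whenever ground acceleration exceeds its critical acceleration. A minimal one-way sliding sketch (a simplification of the full two-way formulation, with illustrative units of m/s² and seconds):

```python
def newmark_displacement(acc, dt, a_crit):
    """Newmark rigid sliding-block displacement (sketch).

    Integrates relative velocity while the block is sliding, i.e. while the
    ground acceleration exceeds the critical acceleration `a_crit` or the
    block still has residual sliding velocity; sliding is one-way only.
    Returns the cumulative downslope displacement.
    """
    vel = 0.0
    disp = 0.0
    for a in acc:
        if vel > 0.0 or a > a_crit:
            vel += (a - a_crit) * dt
            vel = max(vel, 0.0)   # the block cannot slide backwards
            disp += vel * dt
    return disp
```

    Shaking that never exceeds the critical acceleration produces zero displacement, which is how the critical acceleration of a landslide enters the relationship described in the abstract.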

  3. Review of accident analysis calculations, 232-Z seismic scenario

    SciTech Connect

    Ballinger, M.Y.

    1993-05-01

    The 232-Z Building houses what was previously the incinerator facility, which is no longer in service. It is constructed of concrete blocks and is approximately 37 ft wide by 57 ft long. The building has a single story over the process areas and two stories over the service areas at the north end of the building. The respective roofs are 15 ft and 19 ft above grade and consist of concrete over a metal decking, with insulation and a built-up asphalt gravel covering. This facility is assumed to collapse in the seismic event evaluated in the safety analyses, resulting in the release of a portion of the residual plutonium inventory remaining in the building. The seismic scenario for 232-Z assumes that the concrete block walls collapse, allowing the roof to fall, crushing the contaminated duct and gloveboxes within. This paper is a review of the scenario and methods used to calculate the source term from the seismic event as presented in the Plutonium Finishing Plant Final Safety Analysis Report (WHC 1991), also referred to as the PFP FSAR. Alternate methods of estimating the source term are presented. Calculation of source terms based on the mechanisms of release expected in the worst-case scenario is recommended.

  4. Analysis of embedded waste storage tanks subjected to seismic loading

    SciTech Connect

    Zaslawsky, M.; Sammaddar, S.; Kennedy, W.N.

    1991-01-01

    At the Savannah River Site, High Activity Wastes are stored in carbon steel tanks that are within reinforced concrete vaults. These soil-embedded tank/vault structures are approximately 80 ft. in diameter and 40 ft. deep. The tanks were studied to determine the essentials of governing variables, to reduce the problem to the least number of governing cases to optimize analysis effort without introducing excessive conservatism. The problem reduced to a limited number of cases of soil-structure interaction and fluid (tank contents) -- structure interaction problems. It was theorized that substantially reduced input would be realized from soil structure interaction (SSI) but that it was also possible that tank-to-tank proximity would result in (re)amplification of the input. To determine the governing seismic input motion, the three dimensional SSI code, SASSI, was used. Significant among the issues relative to waste tanks is the determination of fluid response and tank behavior as a function of tank contents viscosity. Tank seismic analyses and studies have been based on low viscosity fluids (water) and the behavior is quite well understood. Typical wastes (salts, sludge), which are highly viscous, have not been the subject of studies to understand the effect of viscosity on seismic response. The computer code DYNA3D was used to study how viscosity alters tank wall pressure distribution and tank base shear and overturning moments. A parallel hand calculation was performed using standard procedures. Conclusions based on the study provide insight into the quantification of the reduction of seismic inputs for soil structure interaction for a "soft" soil site.

  6. Spectrum analysis techniques for personnel detection using seismic sensors

    NASA Astrophysics Data System (ADS)

    Houston, Kenneth M.; McGaffigan, Daniel P.

    2003-09-01

    There is a general need for improved detection range and false alarm performance for seismic sensors used for personnel detection. In this paper we describe a novel footstep detection algorithm which was developed and run on seismic footstep data collected at the Aberdeen Proving Ground in December 2000. The initial focus was an assessment of achievable detection range. The conventional approach to footstep detection is to detect transients corresponding to individual footfalls. We feel this is an error-prone approach. Because many real-world signals unrelated to human locomotion look like transients, transient-based footstep detection will inevitably either suffer from high false alarm rates or will be insensitive. Instead, we examined the use of spectrum analysis on envelope-detected seismic signals and have found the general method to be quite promising, not only for detection, but also for discrimination against other types of seismic sources. In particular, gait patterns and their corresponding signatures may help discriminate between human intruders and animals. In the APG data set, mean detection ranges of 64 meters (at PD=50%) were observed for normal walking, significantly improving on ranges previously reported. For running, mean detection ranges of 84 meters were observed. However, stealthy walking (creeping) remains a considerable problem. Even at short ranges (10 meters), in some cases the detection rate was less than 50%. In future efforts, additional data sets for a range of geologic and environmental conditions should be acquired and analyzed. Improvements to the detection algorithms are possible, including estimation of direction of travel and the number of intruders.
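    The envelope-spectrum idea described above can be sketched as rectify, smooth, FFT; the sampling rate and smoothing length below are illustrative choices, not the system's actual parameters:

```python
import numpy as np

def envelope_spectrum(sig, fs, smooth=32):
    """Spectrum of the envelope of a seismic signal (sketch).

    Rectifies the signal, smooths it with a moving average to form a crude
    envelope, and returns the FFT magnitude of the mean-removed envelope.
    A periodic gait shows up as a spectral peak near the step rate
    (on the order of 2 Hz for normal walking).
    """
    env = np.convolve(np.abs(sig), np.ones(smooth) / smooth, mode="same")
    env = env - env.mean()                      # drop the DC component
    spec = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, spec
```

    Because the detection statistic is a peak in the envelope spectrum rather than a single transient, isolated noise bursts do not mimic the periodic signature of footsteps, which is the discrimination advantage argued for in the abstract.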

  7. Seismic vulnerability assessment of a steel-girder highway bridge equipped with different SMA wire-based smart elastomeric isolators

    NASA Astrophysics Data System (ADS)

    Hedayati Dezfuli, Farshad; Shahria Alam, M.

    2016-07-01

    Shape memory alloy wire-based rubber bearings (SMA-RBs) possess enhanced energy dissipation capacity and self-centering properties compared to conventional RBs. The performance of different types of SMA-RBs with different wire configurations has been studied in detail. However, their reliability in isolating structures has not been thoroughly investigated. The objective of this study is to analytically explore the effect of SMA-RBs on the seismic fragility of a highway bridge. Steel-reinforced elastomeric isolators are equipped with SMA wires and used to isolate the bridge. Results revealed that SMA wires with superelastic behavior and re-centering capability can increase the reliability of the bearing and the bridge structure. It was observed that at the collapse level of damage, the bridge isolated by the SMA-HDRB has the lowest fragility. Findings also showed that equipping an NRB with SMA wires decreases the possibility of damage in the bridge, while replacing an HDRB with an SMA-HDRB, or an LRB with an SMA-LRB, increases the failure probability of the system at the slight, moderate, and extensive limit states.

  8. Seismic Fragility Analysis of a Degraded Condensate Storage Tank

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y-S.; Kim, M.K.; Choi, I-K.

    2011-05-16

    The Korea Atomic Energy Research Institute (KAERI) and Brookhaven National Laboratory are conducting a collaborative research project to develop seismic capability evaluation technology for degraded structures and components in nuclear power plants (NPPs). One of the goals of this collaborative endeavor is to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The essential part of this collaboration is aimed at achieving a better understanding of the effects of aging on the performance of SSCs and ultimately on the safety of NPPs. A recent search of the degradation occurrences of structures and passive components (SPCs) showed that the rate of aging-related degradation in NPPs was not significantly large but increasing as the plants get older. The slow but increasing rate of degradation of SPCs can potentially affect the safety of the older plants and become an important factor in decision making in the current trend of extending the operating license period of the plants (e.g., in the U.S. from 40 years to 60 years, and potentially even to 80 years). The condition and performance of major aged NPP structures such as the containment contribute to the life span of a plant. A frequent misconception is that such a low degradation rate of SPCs poses no significant risk to plant safety. However, under low probability high consequence initiating events, such as large earthquakes, SPCs that have slowly degraded over many years could potentially affect plant safety, and these effects need to be better understood. As part of the KAERI-BNL collaboration, a condensate storage tank (CST) was analyzed to estimate its seismic fragility capacities under various postulated degradation scenarios. CSTs were shown to have a significant impact on the seismic core damage frequency of a nuclear power plant.
The seismic fragility capacity of the CST was developed
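    Fragility capacities of this kind are commonly expressed as a two-parameter lognormal curve, P(damage | PGA = a) = Phi(ln(a/Am)/beta). A minimal sketch with hypothetical capacity values, not those of the CST study:

```python
import numpy as np
from scipy.stats import norm

def fragility(pga, median, beta):
    """Lognormal fragility curve: probability of reaching a damage state
    at a given PGA (g).

    median -- PGA at 50% probability of the damage state
    beta   -- logarithmic standard deviation (combined uncertainty)
    """
    return norm.cdf(np.log(np.asarray(pga) / median) / beta)

# hypothetical capacity for an anchored flat-bottom tank (illustrative only)
median_g, beta = 0.8, 0.4
p_at_median = fragility(0.8, median_g, beta)        # 0.5 by construction
p_low, p_high = fragility([0.3, 1.5], median_g, beta)
```

Degradation scenarios would be reflected by lowering the median capacity and/or inflating beta, shifting the curve left and flattening it.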

  9. Seismic margin review of the Maine Yankee Atomic Power Station: Fragility analysis

    SciTech Connect

    Ravindra, M. K.; Hardy, G. S.; Hashimoto, P. S.; Griffin, M. J.

    1987-03-01

    This Fragility Analysis is the third of three volumes for the Seismic Margin Review of the Maine Yankee Atomic Power Station. Volume 1 is the Summary Report of the first trial seismic margin review. Volume 2, Systems Analysis, documents the results of the systems screening for the review. The three volumes are part of the Seismic Margins Program initiated in 1984 by the Nuclear Regulatory Commission (NRC) to quantify seismic margins at nuclear power plants. The overall objectives of the trial review are to assess the seismic margins of a particular pressurized water reactor, and to test the adequacy of this review approach, quantification techniques, and guidelines for performing the review. Results from the trial review will be used to revise the seismic margin methodology and guidelines so that the NRC and industry can readily apply them to assess the inherent quantitative seismic capacity of nuclear power plants.

  10. Social vulnerability assessment using spatial multi-criteria analysis (SEVI model) and the Social Vulnerability Index (SoVI model) - a case study for Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Armaş, I.; Gavriş, A.

    2013-06-01

    In recent decades, the development of vulnerability frameworks has enlarged the research in the natural hazards field. Despite progress in vulnerability studies, the quantitative approach and the conceptual explanation of the social component require further investigation. At the same time, some disaster-prone areas receive limited attention. Among these, Romania's capital city, Bucharest, is the most earthquake-prone capital in Europe and the tenth in the world. The location is used to assess two multi-criteria methods for aggregating complex indicators: the social vulnerability index (SoVI model) and the spatial multi-criteria social vulnerability index (SEVI model). Using the data of the 2002 census, we reduce the indicators through a factor analytical approach to create the indices and examine whether they bear any resemblance to the known vulnerability of Bucharest through an exploratory spatial data analysis (ESDA). This is a critical issue that may provide a better understanding of the social vulnerability in the city and appropriate information for authorities and stakeholders to consider in their decision making. The study emphasizes that social vulnerability is an urban process that increased in post-communist Bucharest, raising the concern that the population at risk lacks the capacity to cope with disasters. The assessment of the indices indicates a significant and similar clustering pattern of the census administrative units, with an overlap between the clustering areas affected by high social vulnerability. The proposed SEVI model shows sensitivity to expert-driven adjustments, which is useful for gauging the accuracy of expert opinion.

  11. Kinematic Seismic Rupture Parameters from a Doppler Analysis

    NASA Astrophysics Data System (ADS)

    Caldeira, Bento; Bezzeghoud, Mourad; Borges, José F.

    2010-05-01

    The radiation emitted from extended seismic sources, especially when the rupture propagates in a preferred direction, shows spectral deviations that depend on the observation location. This effect, absent for point sources and known as directivity, is manifested as an increase in the frequency and amplitude of seismic waves when the rupture propagates toward the seismic station, and a decrease in frequency and amplitude when it propagates in the opposite direction. The directivity model that supports the method is a Doppler analysis based on a kinematic source model of rupture and wave propagation through a structural medium with spherical symmetry [1]. A unilateral rupture can be viewed as a sequence of shocks produced along certain paths on the fault. According to this model, the seismic record at any point on the Earth's surface contains a signature of the rupture process that generated the recorded waveform. Calculating the rupture direction and velocity by a general Doppler equation - the goal of this work - using a dataset of common time-delays read from waveforms recorded at different distances around the epicenter requires the normalization of the measurements to a standard value of slowness. This normalization involves a non-linear inversion that we solve numerically using an iterative least-squares approach. The performance of this technique was evaluated through a set of synthetic and real applications. We present the application of the method to four real case studies, the following earthquakes: Arequipa, Peru (Mw = 8.4, June 23, 2001); Denali, AK, USA (Mw = 7.8, November 3, 2002); Zemmouri-Boumerdes, Algeria (Mw = 6.8, May 21, 2003); and Sumatra, Indonesia (Mw = 9.3, December 26, 2004). The results obtained from the dataset of the four earthquakes agreed, in general, with the values presented by other authors using different methods and data.
[1] Caldeira B., Bezzeghoud M, Borges JF, 2009; DIRDOP: a directivity approach to determining
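    The geometry behind such inversions can be illustrated with a simplified linear version (the cited method adds slowness normalization and solves a non-linear problem iteratively). For a unilateral rupture of length L, rupture velocity vr, and azimuth phi_r, the apparent duration seen at station azimuth phi is t(phi) = L/vr - (L/c)cos(phi - phi_r), which is linear in three parameters and can be fit by least squares:

```python
import numpy as np

def fit_directivity(phi, t_app):
    """Least-squares fit of t(phi) = a - b*cos(phi) - c*sin(phi), where
    a = L/vr, (b, c) = (L/C)*(cos(phi_r), sin(phi_r)) for medium speed C."""
    A = np.column_stack([np.ones_like(phi), -np.cos(phi), -np.sin(phi)])
    p, *_ = np.linalg.lstsq(A, t_app, rcond=None)
    a, b, c = p
    return a, np.hypot(b, c), np.arctan2(c, b)   # L/vr, L/C, rupture azimuth

# synthetic example (hypothetical values): L=100 km, vr=3 km/s, C=4 km/s,
# rupture toward azimuth 30 degrees; 24 stations around the epicenter
L, vr, C, phi_r = 100.0, 3.0, 4.0, np.radians(30.0)
phi = np.linspace(0.0, 2 * np.pi, 24, endpoint=False)
t_obs = L / vr - (L / C) * np.cos(phi - phi_r)

a, d, phi_est = fit_directivity(phi, t_obs)
vr_est = L / a                                   # recovered rupture velocity
```

With noise-free synthetic delays the fit recovers the rupture azimuth and velocity exactly; real data require the iterative scheme described above.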

  12. A new approach for computing a flood vulnerability index using cluster analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, Paulo; Mourato, Sandra; Moreira, Madalena; Pereira, Luísa

    2016-08-01

    A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA simplifies a large number of variables into a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas that have the same characteristics in terms of vulnerability into vulnerability classes. The grouping of the areas determines their classification, contrary to other aggregation methods in which the areas' classification determines their grouping. While other aggregation methods distribute the areas into classes in an artificial manner, by imposing a certain probability for an area to belong to a certain class, as determined by the assumption that the aggregation measure used is normally distributed, CA does not constrain the distribution of the areas by the classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia, where several flood events have taken place in the recent past. The FloodVI sensitivity was assessed using three different aggregation methods: the sum of component scores, the first component score and the weighted sum of component scores. The results highlight the sensitivity of the FloodVI to different aggregation methods. Both the sum of component scores and the weighted sum of component scores show similar results. The first component score aggregation method classifies almost all areas as having medium vulnerability, while the results obtained using CA show a distinct differentiation of the vulnerability, where hot spots can be clearly identified. The information provided by records of previous flood events corroborates the results obtained with CA, because the inundated areas with greater damages are those identified as high and very high vulnerability areas by CA. This supports the fact that CA provides a reliable FloodVI.
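    The PCA-then-cluster pipeline can be sketched with NumPy alone (hypothetical indicator data; the real FloodVI uses census-derived variables and a different cluster count):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical neighbourhood indicators (rows = areas, cols = variables)
X = rng.normal(size=(200, 12))
X[:50] += 2.0                                  # a visibly more vulnerable group

# PCA via SVD on standardized data: a few uncorrelated factors
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:4].T                          # first four component scores

def kmeans(P, k, iters=50, seed=0):
    """Plain k-means: areas are grouped first, and the groups (not an
    imposed score distribution) define the vulnerability classes."""
    r = np.random.default_rng(seed)
    C = P[r.choice(len(P), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((P[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([P[lab == j].mean(axis=0) if (lab == j).any() else C[j]
                      for j in range(k)])
    return lab

labels = kmeans(scores, 5)                     # five vulnerability classes
```

Contrast this with score-sum aggregation, where class boundaries are cut on a single ranked score and the class sizes are effectively dictated by the assumed score distribution.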

  13. Detection, Measurement, Visualization, and Analysis of Seismic Crustal Deformation

    NASA Technical Reports Server (NTRS)

    Crippen, R.; Blom, R.

    1995-01-01

    Remote sensing plays a key role in the analysis of seismic crustal deformation. Recently, radar interferometry has been used to measure one dimension of earthquake strain fields at a resolution of centimeters. Optical imagery is useful for measuring strain fields in both geographic dimensions down to 1/20 of the pixel size, and will soon be capable of higher resolution. Fault motion can also be detected by visual observation from space and from aerial photographs.

  14. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating probabilistic seismic hazard curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs based on empirical attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in PSHA will rely on the use of more physics-based waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003). In order to introduce 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies, including high performance computing and grid-based scientific workflows, in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERFs), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200 km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one or more Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. 
Strain Green Tensors are

  15. Exploring drought vulnerability in Africa: an indicator based analysis to be used in early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2014-05-01

    We propose a composite drought vulnerability indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level for four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem, a sensitivity analysis was performed to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at the country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into the following geographical regions: the Mediterranean coast of Africa; the Sahel region and the Horn of Africa; the Serengeti and the Eastern Miombo woodlands in eastern Africa; the western part of the Zambezi Basin, the southeastern border of the Congo Basin, and the belt of Fynbos in the Western Cape province of South Africa. The results of the DVI at the country level were
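    The core of a composite indicator of this kind is normalization plus weighted aggregation of the components. A minimal sketch with hypothetical component values and equal weights (the cited DVI uses four specific components and expert-informed weights):

```python
import numpy as np

def composite_index(components, weights):
    """Min-max normalize each component across units (e.g. countries),
    then aggregate with normalized weights into one score per unit."""
    C = np.asarray(components, dtype=float)         # rows = units, cols = components
    Cn = (C - C.min(axis=0)) / (C.max(axis=0) - C.min(axis=0))
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return Cn @ w

# hypothetical component values for three units (illustrative only)
vals = [[0.2, 0.5, 0.7, 0.1],
        [0.9, 0.4, 0.2, 0.6],
        [0.5, 0.8, 0.9, 0.3]]
scores = composite_index(vals, weights=[1, 1, 1, 1])
```

A sensitivity analysis like the one described above amounts to re-running this aggregation over perturbed weights and normalization choices and examining the spread of the resulting ranks.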

  16. Spectrum analysis of seismic surface waves and its applications in seismic landmine detection.

    PubMed

    Alam, Mubashir; McClellan, James H; Scott, Waymond R

    2007-03-01

    In geophysics, spectrum analysis of surface waves (SASW) refers to a noninvasive method for soil characterization. However, the term spectrum analysis can be used in a wider sense to mean a method for determining and identifying various modes of seismic surface waves and their properties such as velocity, polarization, etc. Surface waves travel along the free boundary of a medium and can be easily detected with a transducer placed on the free surface of the boundary. A new method based on vector processing of space-time data obtained from an array of triaxial sensors is proposed to produce high-resolution, multimodal spectra from surface waves. Then individual modes can be identified in the spectrum and reconstructed in the space-time domain; also, reflected waves can be separated easily from forward waves in the spectrum domain. This new SASW method can be used for detecting and locating landmines by analyzing the reflected waves for resonance. Processing examples are presented for numerically generated data, experimental data collected in a laboratory setting, and field data.
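    A common way to build such multimodal spectra is the phase-shift approach: Fourier-transform each trace of the array, keep only the phase, and scan trial phase velocities; the coherent sum peaks at the velocity of each mode. A single-mode, single-component sketch with hypothetical parameters (the cited method extends this with vector processing of triaxial data):

```python
import numpy as np

fs, dx, nt, nx = 500.0, 0.5, 500, 24        # sample rate, spacing, samples, sensors
f0, v_true = 50.0, 100.0                    # one surface-wave mode (Hz, m/s)
t = np.arange(nt) / fs
x = np.arange(nx) * dx
data = np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / v_true))  # (nx, nt)

f = np.fft.rfftfreq(nt, 1.0 / fs)
D = np.fft.rfft(data, axis=1)
fi = np.argmin(np.abs(f - f0))
phase = D[:, fi] / np.abs(D[:, fi])         # unit-amplitude spectra at f0

# scan trial phase velocities; coherence peaks at the mode velocity
vels = np.arange(50.0, 201.0, 1.0)
power = np.array([np.abs((phase * np.exp(2j * np.pi * f[fi] * x / v)).sum()) / nx
                  for v in vels])
v_est = vels[np.argmax(power)]
```

Repeating the scan over all frequency bins yields the velocity-frequency image in which individual modes (and, with sign-reversed steering, reflected waves) can be picked.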

  17. Seismic piping test and analysis. Volumes 1, 2, and 3

    SciTech Connect

    Not Available

    1980-09-01

    This report presents selected results to date of a dynamic testing and analysis program focusing on a piping system at Consolidated Edison Company of New York's Indian Point-1 Nuclear Generating Station. The goal of this research program is the development of more accurate and realistic models of piping systems subjected to seismic, hydraulic, operating, and other dynamic loads. The program seeks to identify piping system properties significant to dynamic response rather than seeking to simulate any particular form of excitation. The fundamental experimental approach is the excitation of piping/restraint devices/supports by a variety of dynamic test methods and the analysis of the resulting response to identify the characteristic dynamic properties of the system tested. The comparison of the identified dynamic properties to those predicted by alternative analytical approaches will support improvements in methods used in the dynamic analysis of piping, restraint, devices, and supports.

  18. Tomographic Analysis of the West Bohemia Seismic Zone

    NASA Astrophysics Data System (ADS)

    Alexandrakis, Catherine; Calo, Marco; Vavrycuk, Vaclav

    2013-04-01

    The West Bohemia Seismic Zone is located on the border between the Czech Republic and Germany. This region has several areas which experience periodic microseismic swarm activity. The installation of the West Bohemia Seismic Network (WEBNET) has allowed constant monitoring of the town of Nový Kostel and the surrounding area. Nový Kostel is one of the most active areas. Larger swarms, such as those in 1997, 2000, 2007, 2008 and 2011, have been studied in terms of source mechanisms and swarm characteristics. Despite these analyses, questions remain regarding the subsurface structure in and around the focal zone, and the swarm trigger. In this study, we investigate the seismic velocity structure within and around Nový Kostel using double-difference tomography and Weighted Average Model (WAM) post-processing analysis. To do this, we calculate a set of velocity models using a range of reasonable starting parameterizations that are compatible with the experimental information used. The WAM analysis produces a single averaged model and calculates the weighted standard deviation at each inversion node. By averaging the models together, bias and artefacts from the starting models are reduced. In addition, the weighted standard deviation is used to assess the averaged Vp and Vs models for stability and resolution. Full control over the reliability of the Vp and Vs models also allows us to calculate a Vp/Vs model by directly dividing the P and S seismic velocities. Initial results using a subset of the 2008 swarm indicated a low-Vp/Vs layer overlying the focal zone, and high Vp and Vp/Vs values along the fault zone. This hinted at a low-permeability layer acting as a fluid trap, potentially triggering the swarms. Here, we further the investigation by using the full WEBNET catalog from 1991-2011. We invert the full catalog of P and S arrival times along with detailed inversions of individual swarms to produce a structural model of the Nový Kostel area.
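    The WAM post-processing step reduces, node by node, to a weighted mean and weighted standard deviation over the ensemble of models. A minimal sketch with hypothetical per-model weights (e.g. derived from data misfit):

```python
import numpy as np

def weighted_average_model(models, weights):
    """Node-wise weighted mean and weighted standard deviation over an
    ensemble of velocity models (one row per starting parameterization)."""
    M = np.asarray(models, dtype=float)          # shape (n_models, n_nodes)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean = np.einsum('m,mn->n', w, M)
    var = np.einsum('m,mn->n', w, (M - mean) ** 2)
    return mean, np.sqrt(var)

# hypothetical Vp models at three nodes, from three starting parameterizations
vp_models = [[5.8, 6.0, 6.4],
             [5.9, 6.1, 6.0],
             [5.7, 6.0, 6.2]]
vp_mean, vp_std = weighted_average_model(vp_models, weights=[1.0, 0.5, 1.0])
```

Nodes with large weighted standard deviation are flagged as poorly constrained; the same averaging applied to Vs allows a Vp/Vs model by direct division of the averaged fields.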

  19. Decision Aid Tool and Ontology-Based Reasoning for Critical Infrastructure Vulnerabilities and Threats Analysis

    NASA Astrophysics Data System (ADS)

    Choraś, Michał; Flizikowski, Adam; Kozik, Rafał; Hołubowicz, Witold

    In this paper, a decision aid tool (DAT) for Critical Infrastructure threats analysis and ranking is presented. We propose the ontology-based approach that provides classification, relationships and reasoning about vulnerabilities and threats of the critical infrastructures. Our approach is a part of research within INSPIRE project for increasing security and protection through infrastructure resilience.

  20. On some recent definitions and analysis frameworks for risk, vulnerability, and resilience.

    PubMed

    Aven, Terje

    2011-04-01

    Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within acceptable time, composite costs, and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension. PMID:21077926

  1. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three branch points representing alternative values for the b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (~0.10 g) for PGA and in the Friuli and Central Apennines regions (~0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (~0.15 g) and PGA (~0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
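    Monte Carlo sampling of such a three-branch-point logic tree can be sketched directly; the branch values, weights, and stand-in hazard function below are hypothetical placeholders, the point being the sampling and the 95% band / COV summary:

```python
import numpy as np

rng = np.random.default_rng(42)
# (branch values, branch weights) for each logic-tree node -- illustrative only
b_values   = ([0.9, 1.0, 1.1], [0.2, 0.6, 0.2])
mmax       = ([6.5, 7.0],      [0.5, 0.5])
atten_bias = ([0.8, 1.0, 1.3], [0.3, 0.4, 0.3])   # multiplicative GMPE factor

def hazard_pga(b, m, g):
    # stand-in hazard model: PGA (g) at 10% in 50 yr for one site
    return g * 0.12 * (m / 7.0) ** 2 / b

samples = np.array([
    hazard_pga(rng.choice(b_values[0],   p=b_values[1]),
               rng.choice(mmax[0],       p=mmax[1]),
               rng.choice(atten_bias[0], p=atten_bias[1]))
    for _ in range(5000)
])

lo, hi = np.percentile(samples, [2.5, 97.5])    # 95% confidence band
cov = samples.std() / samples.mean()            # coefficient of variation
```

Parametric sensitivity is then read off by freezing one branch point at a time and comparing the shrinkage of the band, which is how the dominance of the attenuation-relation branch emerges.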

  2. MSNoise: A framework for Continuous Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien

    2016-04-01

    MSNoise is an open and free Python package, known to be the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand) and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data, focusing e.g. on fault zones, intraplate Europe, geothermal exploitation or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity makes it easy to test new algorithms for each processing step. For example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (the default is linear stacking, i.e. averaging), or dv/v estimation (the default is the moving-window cross-spectrum "MWCS", so-called "doublet", method), etc. We present the last major evolution of MSNoise from a single workflow (data archive to dv/v) to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, which uses MSNoise as a cross-correlation toolbox and demystifies surface wave tomography. Finally, the poster will be a meeting point for all those using or willing to use MSNoise, to meet the developer, exchange ideas and share wishes.
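    The default frequency-domain cross-correlation step can be illustrated in a few lines of NumPy (a standalone sketch, not MSNoise code; the synthetic delay is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
a = rng.standard_normal(n)          # synthetic noise trace at station A
shift = 15
b = np.roll(a, shift)               # station B sees the same noise, delayed (circular)

# circular cross-correlation of b against a via the frequency domain:
# CC = IFFT( FFT(b) * conj(FFT(a)) ); the peak index is the delay of b
cc = np.fft.irfft(np.fft.rfft(b) * np.conj(np.fft.rfft(a)), n=n)
lag = int(np.argmax(cc))
```

In a dv/v workflow, small drifts of such lags across many day-pairs and lag windows are what the MWCS step turns into a relative velocity change estimate.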

  3. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore area of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions, and by the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL), in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities, and for 69 locations of nuclear plants in the Eastern United States, for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternative spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997), and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area, but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), and members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  4. A simplified inelastic seismic analysis method for piping systems

    SciTech Connect

    Not Available

    1990-05-01

    This report presents results of a three-year EPRI-funded effort to develop a simplified inelastic-dynamic analysis method for piping systems under earthquake loadings. The method uses a simplified plastic analysis that replaces highly loaded components with the idealized moment-rotation behavior observed in dynamic tests of piping components. The method uses increments of increased loading whose equivalence to seismic loads is established using the system ductility predicted by the simplified plastic solution. Results of high-level shaker table tests of piping systems are compared to the method's predictions. A conservative design qualification method is proposed in the format of an ASME Code Case. Results are provided for linear and nonlinear detailed time history ABAQUS solutions of shaker table tests. 91 refs., 72 figs., 11 tabs.

  5. Requalification analysis of a circular composite slab for seismic load

    SciTech Connect

    Srinivasan, M.G.; Kot, C.A.

    1992-11-01

    The circular roof slab of an existing facility was analyzed to requalify the structure for supporting a significant seismic load that it was not originally designed for. The slab has a clear span of 66 ft and consists of a 48 in thick reinforced concrete member and a steel liner plate. Besides a number of smaller penetrations, the slab contains two significant cutouts: a 9 ft square opening and a 3 ft dia hole. The issues that complicated the analysis of this non-typical structure, i.e., composite action and nonlinear stiffness of reinforced concrete (R. C.) sections, are discussed. It was possible to circumvent the difficulties by making conservative and simplifying assumptions. If codes incorporate guidelines on practical methods for dynamic analysis of R. C. structures, some of the unneeded conservatism could be eliminated in future designs.

  6. Analysis of piping system response to seismic excitations

    SciTech Connect

    Wang, C.Y.

    1987-01-01

    This paper describes a numerical algorithm for analyzing piping system response to seismic excitations. The numerical model of the piping considers hoop, flexural, axial, and torsional modes of deformation. Hoop modes generated from internal hydrodynamic loading are superimposed on the bending and twisting modes by two extra degrees of freedom. A time-history analysis technique using an implicit temporal integration scheme is addressed. The time integrator uses a predictor-corrector successive iterative scheme which satisfies the equation of motion. Both geometrical and material nonlinearities are considered. Multiple support excitations, fluid effects, piping insulation, and material damping can be included in the analysis. Two problems are presented to illustrate the method. The results are discussed in detail.
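    The implicit time-integration core of such a solver can be illustrated with a Newmark average-acceleration scheme for a linear single-degree-of-freedom system (a minimal stand-in: the paper's algorithm additionally iterates on geometric and material nonlinearity and handles many coupled degrees of freedom):

```python
import numpy as np

def newmark(m, c, k, f, dt, beta=0.25, gamma=0.5):
    """Implicit Newmark integration of m*u'' + c*u' + k*u = f(t),
    average-acceleration form (unconditionally stable for linear systems)."""
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (f[0] - c * v[0] - k * u[0]) / m
    keff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    for i in range(n - 1):
        rhs = (f[i + 1]
               + m * (u[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                      + (1 / (2 * beta) - 1) * a[i])
               + c * (gamma / (beta * dt) * u[i] + (gamma / beta - 1) * v[i]
                      + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = rhs / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u

# sanity check with hypothetical parameters: a damped SDOF under a step load
# should settle to the static deflection f/k
m, c, k, dt = 1.0, 0.8, 40.0, 0.005
f = np.full(int(8.0 / dt), 10.0)
u = newmark(m, c, k, f, dt)
```

Support excitation enters the same loop as an effective force proportional to the ground acceleration, which is how multiple-support time histories are accommodated.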

  7. Latest development in seismic texture analysis for subsurface structure, facies, and reservoir characterization: A review

    SciTech Connect

    Gao, Dengliang

    2011-03-01

    In exploration geology and geophysics, seismic texture is still a developing concept that is not yet widely understood, although a number of different algorithms have been published in the literature. This paper reviews seismic texture concepts and methodologies, focusing on the latest developments in seismic amplitude texture analysis, with particular reference to the gray level co-occurrence matrix (GLCM) and the texture model regression (TMR) methods. The GLCM method evaluates spatial arrangements of amplitude samples within an analysis window using a matrix (a two-dimensional histogram) of amplitude co-occurrence. The matrix is then transformed into a suite of texture attributes, such as homogeneity, contrast, and randomness, which provide the basis for seismic facies classification. The TMR method uses a texture model as a reference to discriminate among seismic features based on a linear, least-squares regression analysis between the model and the data within an analysis window. By implementing customized texture model schemes, the TMR algorithm has the flexibility to characterize subsurface geology for different purposes. A texture model with a constant phase is effective at enhancing the visibility of seismic structural fabrics, a texture model with a variable phase is helpful for visualizing seismic facies, and a texture model with variable amplitude, frequency, and size is instrumental in calibrating seismic data to reservoir properties. Preliminary case studies in the very recent past have indicated that the latest developments in seismic texture analysis have added to existing amplitude interpretation theories and methodologies. These and future developments in seismic texture theory and methodology will hopefully lead to a better understanding of the geologic implications of the seismic texture concept and to an improved geologic interpretation of reflection seismic amplitude
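
    The GLCM computation described above can be sketched in a few lines. This is an illustrative implementation, not the paper's code: the quantization level count, the single-pixel offset, and the attribute formulas (homogeneity, contrast, and entropy as a randomness measure) follow the standard Haralick-style definitions.

```python
import numpy as np

def glcm_attributes(window, levels=8, offset=(0, 1)):
    """Gray-level co-occurrence matrix (GLCM) texture attributes for a 2-D
    amplitude window: quantize, count co-occurrences at one pixel offset,
    normalize, and reduce to homogeneity, contrast, and entropy."""
    w = np.asarray(window, dtype=float)
    # Quantize amplitudes to integer gray levels 0..levels-1.
    q = ((w - w.min()) / (np.ptp(w) + 1e-12) * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    dy, dx = offset
    glcm = np.zeros((levels, levels))
    for i in range(q.shape[0] - dy):
        for j in range(q.shape[1] - dx):
            glcm[q[i, j], q[i + dy, j + dx]] += 1  # co-occurrence count
    p = glcm / glcm.sum()                          # joint probabilities
    ii, jj = np.indices(p.shape)
    homogeneity = float(np.sum(p / (1.0 + np.abs(ii - jj))))
    contrast = float(np.sum(p * (ii - jj) ** 2))
    entropy = float(-np.sum(p[p > 0] * np.log2(p[p > 0])))  # "randomness"
    return homogeneity, contrast, entropy
```

    A uniform window scores maximum homogeneity and zero contrast, while random amplitudes score high contrast and entropy, which is what makes these attributes usable for facies classification.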

  8. 77 FR 69509 - Combining Modal Responses and Spatial Components in Seismic Response Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... COMMISSION Combining Modal Responses and Spatial Components in Seismic Response Analysis AGENCY: Nuclear... Components in Seismic Response Analysis'' as an administratively changed guide in which there are minor... the NRC staff considers acceptable for combining modal responses and spatial components in...

  9. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB) and in particular, central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground-motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic the Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. 
Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as
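
    The site-based stochastic simulation idea can be illustrated with a minimal sketch in the spirit of stochastic (Boore-type) ground-motion methods: windowed Gaussian noise is shaped in the frequency domain by an omega-squared source spectrum. The envelope, corner frequency, and normalization below are illustrative assumptions, not the calibrated model parameters developed in the study.

```python
import numpy as np

def stochastic_accelerogram(duration=20.0, dt=0.01, f_corner=1.0, seed=0):
    """Minimal stochastic-method sketch: envelope-windowed Gaussian noise
    shaped by a Brune omega-squared acceleration spectrum."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    t = np.arange(n) * dt
    # Saragoni-Hart-style envelope so the noise resembles a transient record.
    env = (t / 2.0) * np.exp(1.0 - t / 2.0)
    noise = rng.standard_normal(n) * env
    spec = np.fft.rfft(noise)
    f = np.fft.rfftfreq(n, dt)
    # Brune acceleration spectral shape: |S(f)| ~ f^2 / (1 + (f/fc)^2).
    shape = f**2 / (1.0 + (f / f_corner) ** 2)
    shaped = spec * shape / (np.abs(spec).mean() + 1e-12)  # crude normalization
    acc = np.fft.irfft(shaped, n)
    return t, acc
```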

  10. The earthquake vulnerability of a utility system

    SciTech Connect

    Burhenn, T.A.; Hawkins, H.G.; Ostrom, D.K.; Richau, E.M.

    1992-01-01

    This paper describes a method to assess the earthquake vulnerability of a utility system and presents an example application. First, the seismic hazard of the system is modeled. Next, the damage and operational disruption to the facilities are estimated. The approach described herein formulates the problem so that the best documented and judgmental information on the earthquake performance of a utility's components can be utilized. Finally, the activities and estimates of the time necessary to restore the system to different levels of service are developed. This method of analysis provides a realistic picture of the resiliency of utility service, not just the vulnerabilities of various types of equipment.

  11. Multi-hole seismic modeling in 3-D space and cross-hole seismic tomography analysis for boulder detection

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Jiangping; Wang, Jing; Zong, Yuquan; Yu, Mingyu

    2016-11-01

    A boulder stone, a common geological feature in south China, is the remnant of a granite body that has been unevenly weathered. Undetected boulders can adversely impact the schedule and safety of subway construction when the tunnel boring machine (TBM) method is used, so boulder detection has always been a key issue that must be resolved before construction. Cross-hole seismic tomography is a high-resolution technique capable of boulder detection; however, the method can only solve for velocity in a 2-D slice between two wells, and the size and central position of the boulder are generally difficult to obtain accurately. In this paper, the authors conduct a multi-hole wave field simulation and characteristic analysis of a boulder model based on 3-D elastic-wave staggered-grid finite difference theory, together with a 2-D imaging analysis based on first-arrival travel time. The results indicate that (1) full wave field records can be obtained from multi-hole seismic wave simulations; the simulations describe in detail the seismic wave propagation pattern around cross-hole high-velocity spherical geological bodies and can serve as a basis for wave field analysis. (2) When a cross-hole seismic section cuts through the boulder, the proposed method provides satisfactory tomography results; however, when the section is positioned close to the boulder, such a high-velocity object in 3-D space affects the surrounding wave field. The received diffracted wave interferes with the primary wave, so the picked first-arrival travel time is not derived from the profile, which results in a false appearance of high-velocity geologic features. Finally, the results of the 2-D analysis in the 3-D modeling space are compared with the physical model test with respect to the effect of a high-velocity body on seismic tomographic measurements.

  12. Two-way traveltime analysis for seismic reservoir characterization

    NASA Astrophysics Data System (ADS)

    Sil, Samik

    Two-way traveltime (TWT) is one of the most important seismic attributes for reservoir characterization. Erroneous analysis of TWT can lead to incorrect estimates of velocity models, resulting in improper structural interpretation of the subsurface. TWT analysis starts with the most fundamental step of seismic data processing, namely Normal Moveout (NMO) correction. NMO correction is generally performed in the offset-time (X-t) domain by fitting a hyperbolic curve to the observed traveltime of each reflection event. The performance of NMO correction depends on the quality of the data in the prestack domain and on the underlying geology. Even when ideal data sets are available (high signal-to-noise ratio) and the underlying geology is simple (flat layers), the NMO correction can still be erroneous because of (1) its non-hyperbolic behavior at long offsets and (2) the presence of seismic anisotropy. Although several equations have been developed in the X-t domain to account for anisotropy-induced non-hyperbolic moveout, they are prone to error when multiple anisotropic and isotropic layers are present. The non-hyperbolic moveout equations are also approximate, as they are truncated Taylor series and can only estimate effective root-mean-square (rms) parameters for each reflection event. In the plane-wave (tau-p) domain, layer parameters can be estimated using an exact equation for delay time, free from the approximation errors present in the X-t domain. In this domain a layer-stripping approach can also be used to account for the presence of multiple anisotropic and isotropic layers. It is therefore attractive to develop an NMO correction equation in the tau-p domain for an anisotropic medium, which in its limiting case is useful for the isotropic medium as well. The simplest anisotropic media are Transversely Isotropic (TI) media, which are also common in exploration seismology.
One of the TI media, with a vertical
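
    The conventional X-t domain NMO step discussed above can be sketched as follows. The single stacking velocity and linear interpolation are simplifying assumptions; the dissertation's point is precisely that this hyperbolic model degrades at long offsets and in anisotropic media.

```python
import numpy as np

def nmo_correct(gather, offsets, dt, v_nmo):
    """Hyperbolic NMO correction of a (nt, ntraces) gather: each trace is
    remapped so that t(x) = sqrt(t0^2 + x^2 / v^2) lands at the
    zero-offset time t0. `v_nmo` is a single assumed stacking velocity."""
    nt, ntr = gather.shape
    t0 = np.arange(nt) * dt
    out = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        tx = np.sqrt(t0**2 + (x / v_nmo) ** 2)   # hyperbolic traveltime
        # Linearly interpolate the trace onto the corrected times.
        out[:, j] = np.interp(tx, t0, gather[:, j], left=0.0, right=0.0)
    return out
```

    Applied with the correct velocity, a hyperbolic event flattens to its zero-offset time across all traces, which is the flatness criterion used in velocity analysis.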

  13. Analysis of Treasure Island earthquake data using seismic interferometry

    NASA Astrophysics Data System (ADS)

    Mehta, K.; Snieder, R.; Graizer, V.

    2005-12-01

    Seismic interferometry is a powerful tool for extracting the ground-motion response. We show the use of seismic interferometry for the analysis of an earthquake recorded by the Treasure Island Geotechnical Array near San Francisco, California on 06/26/94: a magnitude 4.0 event at a depth of 6.6 km and a distance of 12.6 km from the borehole sensors. Six 3-component sensors were located at different depths. This problem is similar to the analysis by Snieder and Safak for the Robert A. Millikan Library in Pasadena, California, where they deconvolved the wavefield recorded at each library floor with that at the top floor to obtain the upgoing and downgoing waves and, from these, estimated a shear velocity and a quality factor. They also showed that for such applications of seismic interferometry, deconvolution of waveforms is superior to correlation. For the Treasure Island data, deconvolving the vertical component of the wavefield at each sensor with the sensor at the surface gives a similar superposition of an upgoing and a downgoing wave. The velocity of these waves agrees well with the compressional wave velocity. We compute the radial and transverse components. When we window the shear wave arrivals in the transverse components at each depth and deconvolve with the one at the surface, the resulting up- and downgoing waves travel with the shear wave velocity. Similar windowing and deconvolution for the radial component also agrees with the shear wave velocity. However, when the radial component is windowed around the compressional waves and deconvolved, the up- and downgoing waves travel with the shear wave velocity. In the absence of any P-to-S conversion, the deconvolved waves should travel with the compressional wave velocity, which suggests that there is a conversion at a depth below the deepest sensor. Receiver functions, defined as the spectral ratio of the radial component with the vertical component, can be used to characterize
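
    The deconvolution operation at the heart of this analysis can be sketched as a stabilized spectral division; the water-level constant below is an illustrative choice, not a value from the study.

```python
import numpy as np

def deconvolve(trace, reference, eps=0.01):
    """Frequency-domain deconvolution of one receiver recording by another.
    A water-level term `eps` (relative to the peak reference power)
    stabilizes the spectral division."""
    n = len(trace)
    T = np.fft.rfft(trace)
    R = np.fft.rfft(reference)
    power = np.abs(R) ** 2
    d = T * np.conj(R) / (power + eps * power.max())  # regularized T/R
    return np.fft.irfft(d, n)
```

    Deconvolving a delayed copy of a wavelet by the wavelet itself returns a band-limited spike whose lag gives the travel time between the two sensors, which is how the up- and downgoing waves above are timed.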

  14. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram.
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
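
    A minimal forward model for surface-source downhole travel times can be sketched by assuming straight rays through a stack of constant-slowness layers. This only illustrates the layered-model idea (in Python rather than the report's R); the published method refines ray paths and supports the weighted inversion described above.

```python
import numpy as np

def downhole_traveltimes(depths, layer_tops, slowness, offset=0.0):
    """Predicted first arrivals for a surface source offset from the
    borehole: sum slowness * thickness down to each receiver depth, then
    stretch by the straight-ray obliquity (a simplifying assumption)."""
    times = []
    for z in depths:
        # Obliquity factor: straight-ray length over vertical depth.
        stretch = np.hypot(z, offset) / z if z > 0 else 1.0
        t = 0.0
        for i, s in enumerate(slowness):
            top = layer_tops[i]
            bot = layer_tops[i + 1] if i + 1 < len(layer_tops) else np.inf
            thick = max(0.0, min(z, bot) - top)  # portion of layer above z
            t += s * thick
        times.append(t * stretch)
    return np.array(times)
```

    Comparing such predicted times against picked arrivals, layer by layer, is the basis of the interface-identification and fitting steps the report describes.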

  15. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    When Seismic Electric Signals (SES) data are available, as in Greece, the natural time analysis of the seismicity after the initiation of the SES allows the determination of the time window of the impending mainshock through the evolution of the value of κ1 itself. This was found to work also for the 1989 M7.1 Loma Prieta earthquake. If SES data are not available, we rely solely on the evolution of the fluctuations of κ1, obtained by computing κ1 values within a natural time window of certain length sliding through the earthquake catalog. The fluctuations of the order parameter, in terms of variability (i.e., standard deviation divided by average), were found to increase dramatically when approaching the 11 March M9 super-giant earthquake. Such an increase was also found before the M7.1 Kobe earthquake in 1995, the M8.0 Tokachi-oki earthquake in 2003, and the Landers and Hector Mine earthquakes in Southern California. It is worth mentioning that this increase is obtained straightforwardly from ordinary earthquake catalogs without any adjustable parameters.
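
    The order parameter κ1 and its variability can be computed directly from a catalog's event energies. The formulas below follow the standard natural time definitions (χ_k = k/N, with κ1 the variance of χ weighted by normalized energies); the sliding-window length is an illustrative choice, not the authors' operational setting.

```python
import numpy as np

def kappa1(energies):
    """kappa_1 = <chi^2> - <chi>^2 in natural time chi_k = k/N,
    with weights p_k = E_k / sum(E)."""
    E = np.asarray(energies, dtype=float)
    N = len(E)
    chi = np.arange(1, N + 1) / N
    p = E / E.sum()
    return float(np.sum(p * chi**2) - np.sum(p * chi) ** 2)

def variability(energies, window):
    """Fluctuation measure of the order parameter: standard deviation of
    kappa_1 over a sliding natural-time window, divided by its average."""
    E = np.asarray(energies, dtype=float)
    k = np.array([kappa1(E[i:i + window]) for i in range(len(E) - window + 1)])
    return float(k.std() / k.mean())
```

    For equal event energies, κ1 approaches the uniform-distribution value 1/12; it is departures of the variability from its background level, not κ1 alone, that the abstract reports as precursory.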

  16. Data Quality Analysis for the Bighorn Arch Seismic Array Experiment

    NASA Astrophysics Data System (ADS)

    Mancinelli, N. J.; Yang, Z.; Yeck, W. L.; Sheehan, A. F.

    2010-12-01

    We analyze background noise to assess differences in station noise levels among seismic sensor types and deployment sites, and to identify local noise sources, using data from the Bighorn Arch Seismic Experiment (BASE). Project BASE is an EarthScope Flexible Array (FA) project comprising 38 broadband seismometers (Guralp CMG3T), 173 short-period seismometers (L22 and CMG40T-1s), and 1850 high-frequency geophones with Reftek RT125 "Texans" in northern Wyoming, providing a continuous dataset spanning various seismic sensor types and site locations in different geologic settings (basins and mountains). We carry out our analysis using the recently developed approach of probability density functions (PDF) to display the distribution of seismic power spectral density (PSD) [McNamara and Buland, 2004]. This approach bypasses the tedious pre-screening for transient signals (earthquakes, mass recentering, calibration pulses, etc.) required by traditional PSD analysis. Using the program PQLX, we were able to correlate specific noise sources (mine blasts, teleseisms, passing cars, etc.) with features seen on PDF plots. We analyzed eight months of continuous BASE broadband and short-period data for this study. The power spectral density plots suggest that, of the three instrument types used in the BASE project, the broadband CMG3T stations have the lowest background noise in the period range of 0.1-1 s, while the short-period L22 stations have the highest. As expected, stations located in the Bighorn Mountain Range are closer to the Low Noise Model [Peterson, 1993] than those in the adjacent Bighorn Basin and Powder River Basin, particularly in the 0.1-1 s period range. This is mainly attributed to proximity to bedrock, though increased distance from cultural noise also contributes. At longer periods (1-100 s), the noise level of broadband instruments is lower
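
    The McNamara-Buland PDF approach can be sketched as follows: overlapping windowed periodograms are histogrammed per frequency, so transients simply occupy low-probability regions instead of requiring pre-screening. Segment length, window, and dB binning are illustrative choices, not PQLX's exact parameters.

```python
import numpy as np

def psd_pdf(trace, fs, nseg=256, bins=np.arange(-120.0, 41.0, 2.0)):
    """Probability density of PSD levels per frequency: cut the record into
    50%-overlapping Hanning-windowed segments, compute one-sided
    periodograms in dB, and histogram the dB values at each frequency."""
    step = nseg // 2
    win = np.hanning(nseg)
    freqs = np.fft.rfftfreq(nseg, 1.0 / fs)
    psds = []
    for s in range(0, len(trace) - nseg + 1, step):
        seg = (trace[s:s + nseg] - trace[s:s + nseg].mean()) * win
        spec = np.abs(np.fft.rfft(seg)) ** 2
        psd = 2.0 * spec / (fs * (win**2).sum())   # one-sided, window-normalized
        psds.append(10.0 * np.log10(psd + 1e-30))
    psds = np.array(psds)
    # Per-frequency probability density of dB levels.
    pdf = np.array([np.histogram(psds[:, i], bins=bins, density=True)[0]
                    for i in range(len(freqs))])
    return freqs, pdf
```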

  17. Analysis of Global Seismicity In The Period 1976-2000

    NASA Astrophysics Data System (ADS)

    Alves, E. I.

    After the pioneering works of Beno Gutenberg and Charles F. Richter, not many attempts have been made to quantify world seismicity as a whole. The problem today is twofold. First, there is the issue of the amount of data that is available. Then there is the issue of the usefulness of such an analysis: is there such a thing as a "global seismicity"? The present work managed the first issue by analysing only earthquakes of magnitude M>4 in the 25-year period between 1976 and 2000, a total of 39035 events from the NEIC catalogue. This truncated time-series adjusts well to a Gutenberg-Richter frequency-magnitude relation with a=9.88 and b=1.07, with R=0.99. The question of the existence of a global seismicity - that is, whether there are long-ranged tectonic interactions on a global scale - must, however, be answered by other means. A first approach was to compute the Omori power law for aftershock decay rate parameter p for the 13 strongest earthquakes in the series (8.0analysis because they are discrete and not periodically sampled. To obviate this problem, it was chosen to analyse the sequence of times of pause between successive earthquakes. This sequence of pauses was analysed by recurrence quantification analysis, and it was found that the optimal embedding dimension for phase-space reconstruction was 4, corresponding to 52.63% false nearest neighbours, and the optimal time-delay was 6, corresponding to the first minimum in average mutual information. Spatio-temporal entropy was found to be 82%, far enough from 100% not to be considered random. An implementation of
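
    A Gutenberg-Richter fit of the kind quoted above (a=9.88, b=1.07) can in principle be reproduced with a simple regression on cumulative counts. The binning, the sparse-bin cutoff, and the least-squares choice (rather than maximum likelihood) are illustrative simplifications, not the paper's procedure.

```python
import numpy as np

def gutenberg_richter_fit(magnitudes, m_min=4.0, dm=0.1):
    """Least-squares fit of log10 N(>=M) = a - b*M to a catalog, using
    cumulative counts at magnitude thresholds and dropping sparsely
    populated thresholds to stabilize the regression."""
    m = np.asarray(magnitudes, dtype=float)
    edges = np.arange(m_min, m.max(), dm)
    counts = np.array([(m >= e).sum() for e in edges])  # N(>= M)
    keep = counts >= 10                 # drop noisy high-magnitude tail
    A = np.vstack([np.ones(keep.sum()), -edges[keep]]).T
    a, b = np.linalg.lstsq(A, np.log10(counts[keep]), rcond=None)[0]
    return float(a), float(b)
```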

  18. Storey building early monitoring based on rapid seismic response analysis

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Sunardi, Bambang; Rudyanto, Ariska

    2016-05-01

    Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to growth in the number of structures instrumented with such systems. An equally important factor for such growth can be attributed to demands by stakeholders for rapid answers to important questions related to the functionality or state of "health" of structures during and immediately after seismic events. Consequently, this study aims to monitor a storey building based on seismic response, i.e., earthquake and tremor analysis at short time lapse, using accelerograph data. This study used a storey building (X) in Jakarta that experienced the effects of the Kebumen earthquake of January 25th 2014, the Pandeglang earthquake of July 9th 2014, and the Lebak earthquake of November 8th 2014. The tremors used in this study are those following these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), A/V ratio, acceleration amplification and effective duration (te), and then the natural frequency (f0) and peak of H/V ratio using the H/V ratio method. The earthquake data processing results show that the values of peak ground motion, spectral response, A/V ratio and acceleration amplification increase with height, while the effective duration gives a different view of the building dynamics: for the Kebumen earthquake the highest energy is on the highest floor, but for the Pandeglang and Lebak earthquakes it is on the lowest floor. Tremor data processing one month after each earthquake shows the natural frequency of the building to be constant.
    The increase of peak ground motion, spectral response, A/V ratio and acceleration amplification, and the decrease of effective duration, with increasing building floor indicate that the building construction supports the
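
    The H/V ratio step used above to estimate the natural frequency f0 can be sketched as a smoothed spectral division. The boxcar smoothing and the low-frequency cutoff for peak picking are illustrative assumptions (Konno-Ohmachi smoothing is more common in practice).

```python
import numpy as np

def hv_ratio(ns, ew, ud, fs, smooth=11):
    """H/V spectral ratio from two horizontal components (ns, ew) and the
    vertical (ud): averaged horizontal amplitude spectrum divided by the
    vertical one; the peak frequency estimates f0."""
    def amp(x):
        a = np.abs(np.fft.rfft(x - np.mean(x)))
        return np.convolve(a, np.ones(smooth) / smooth, mode="same")  # smooth
    h = np.sqrt(0.5 * (amp(ns) ** 2 + amp(ew) ** 2))  # mean horizontal
    v = amp(ud)
    f = np.fft.rfftfreq(len(ud), 1.0 / fs)
    ratio = h / (v + 1e-12)
    lo = f > 0.2                     # skip the DC/very-low-frequency end
    f0 = f[lo][np.argmax(ratio[lo])]
    return f, ratio, float(f0)
```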

  19. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    SciTech Connect

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele

    2008-07-08

    Analysis of the seismic performance of solid waste landfills generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should differ. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strength of liner materials due to the effect of inertial forces in the refuse mass. In this paper the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations in waste properties, fill heights, site conditions, and design rock motions.

  20. An analysis of a seismic reflection from the base of a gas hydrate zone, offshore Peru

    USGS Publications Warehouse

    Miller, J.J.; Lee, M.W.; Von Huene, R.

    1991-01-01

    Seismic reflection data recorded near ODP Site 688, offshore Peru, exhibit a persistent bottom-simulating reflector (BSR) at a depth corresponding to the theoretical base of the gas hydrate stability field. To carry out a quantitative analysis of the BSR, the seismic data were reprocessed using signature deconvolution and true amplitude recovery techniques. Results indicate the BSR is laterally discontinuous. -from Authors

  1. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    SciTech Connect

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer, and many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and assumptions used for the analysis.

  2. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats, including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms, along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and as a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of the trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license, and both source and executables are available through the IRIS SeisCode repository.
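
    Basic processing steps of the kind listed above (demean, detrend, taper) reduce to a few lines of NumPy. This is a generic sketch, not SeismicCanvas code; the cosine-taper fraction is an illustrative default.

```python
import numpy as np

def preprocess(trace, taper_frac=0.05):
    """Demean, remove the best-fit linear trend, and apply a cosine
    (Tukey-style) taper to both ends of a waveform."""
    x = np.asarray(trace, dtype=float)
    n = len(x)
    t = np.arange(n)
    slope, intercept = np.polyfit(t, x, 1)   # linear trend (includes mean)
    x = x - (slope * t + intercept)
    m = max(1, int(taper_frac * n))
    ramp = 0.5 * (1.0 - np.cos(np.pi * np.arange(m) / m))  # 0 -> ~1
    win = np.ones(n)
    win[:m] = ramp
    win[-m:] = ramp[::-1]
    return x * win
```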

  3. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazard assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to those being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop these results, a summary of important application issues is described, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.
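
    The probabilistic seismic hazard calculation that such standards build on can be illustrated with a toy single-source integration, λ(a) = ν·Σ_m P(m)·P(A > a | m, r). Every numerical value below, including the attenuation relation, is an illustrative assumption, not a real ground-motion prediction equation or a DOE criterion.

```python
import numpy as np
from math import erf, sqrt

def hazard_curve(accels, rate=0.05, b=1.0, m_min=5.0, m_max=7.5,
                 r=30.0, sigma_ln=0.6):
    """Annual rate of exceedance of acceleration levels `accels` (in g) for
    one source at distance r km with activity rate `rate` (events/yr),
    a truncated Gutenberg-Richter magnitude distribution, and a lognormal
    toy attenuation ln A = -4.0 + 1.0*m - 1.3*ln(r)."""
    accels = np.asarray(accels, dtype=float)
    mags = np.linspace(m_min, m_max, 26)
    w = 10.0 ** (-b * mags)          # Gutenberg-Richter magnitude weights
    w /= w.sum()
    lam = np.zeros_like(accels)
    for m, wm in zip(mags, w):
        mu = -4.0 + 1.0 * m - 1.3 * np.log(r)   # median ln(PGA in g)
        z = (np.log(accels) - mu) / sigma_ln
        # Lognormal exceedance probability via the error function.
        p_exceed = np.array([0.5 * (1.0 - erf(zi / sqrt(2.0))) for zi in z])
        lam += rate * wm * p_exceed
    return lam
```

    Inverting the resulting curve at a target rate (e.g. 1/1,000 per year) gives the design ground motion for the corresponding return period, which is how graded design criteria are tied to hazard.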

  4. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS-analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case-studies based on real data, such as reactor core inventory and estimations from known accidents, for a number of typical meteorological conditions and different accident scenarios. Using these simulations and a population database as input data, the GIS-analysis reveals the administrative units at highest risk with respect to the mean individual and collective doses received by the population. Then, two alternative methods were suggested to assess the probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples) based on social-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The two latter probabilities were calculated by atmospheric trajectory models and statistical methods over many years. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow the development of a common methodological approach for complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation, etc. The approach was capable of creating risk/vulnerability maps of the Nordic countries and of revealing the most vulnerable provinces with respect to the radiation risk sites.

  5. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  6. Letter report seismic shutdown system failure mode and effect analysis

    SciTech Connect

    KECK, R.D.

    1999-09-01

    The Supply Ventilation System Seismic Shutdown ensures that the 234-52 building supply fans, the dry air process fans, and the vertical development calciner are shut down following a seismic event. This report evaluates the failure modes and determines their effects.

  7. Seismic signature analysis for discrimination of people from animals

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Mehmood, Asif; Sabatier, James M.

    2013-05-01

    Cadence analysis has been the main focus for discriminating between the seismic signatures of people and animals. However, cadence analysis fails when multiple targets are generating the signatures. We analyze the mechanism of human walking and the signature generated by a human walker, and compare it with the signature generated by a quadruped. We develop Fourier-based analysis to differentiate human signatures from animal signatures. We extract a set of basis vectors to represent the human and animal signatures using non-negative matrix factorization, and use them to separate and classify both targets. Grazing animals such as deer and cows often produce sporadic signals as they move from patch to patch of grass, and these must be characterized to differentiate their signatures from those generated by a horse steadily walking along a path. These differences in the signatures are used in developing a robust algorithm to distinguish the signatures of animals from those of humans. The algorithm is tested on real data collected in a remote area.
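    As a rough illustration of the factorization step (not the authors' implementation), the sketch below applies basic multiplicative-update non-negative matrix factorization to synthetic "spectra"; the data, rank, and iteration count are all assumptions:

    ```python
    import numpy as np

    def nmf(V, rank, iters=200, seed=0):
        """Basic multiplicative-update NMF: V ~ W @ H with all entries non-negative."""
        rng = np.random.default_rng(seed)
        n, m = V.shape
        W = rng.random((n, rank)) + 1e-3
        H = rng.random((rank, m)) + 1e-3
        for _ in range(iters):
            # Lee-Seung updates for the Frobenius objective
            H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
            W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
        return W, H

    # Toy "spectra": human energy concentrated in low bins, quadruped in high bins
    human = np.array([5.0, 4.0, 0.5, 0.2])
    animal = np.array([0.3, 0.5, 4.0, 5.0])
    # Observed columns are non-negative mixtures of the two sources
    V = np.column_stack([human * a + animal * b
                         for a, b in [(1, 0), (0, 1), (0.8, 0.2), (0.1, 0.9)]])
    W, H = nmf(V, rank=2)
    err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)  # relative reconstruction error
    ```

    For this exactly rank-2 toy matrix the factorization recovers the mixtures closely; the mixing weights in `H` then drive separation and classification.
    
    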

  8. Dynamic behavior of ground for seismic analysis of lifeline systems

    NASA Astrophysics Data System (ADS)

    Sato, T.; Der Kiureghian, A.

    1982-01-01

    A mathematical formula is derived for the general wave transfer function in multilayered media with inhomogeneous and nonlinear soil properties. It is assumed that the ground consists of horizontally stratified layers overlying a homogeneous half-space which is excited by vertically incident, plane shear waves. To formulate the nonlinear harmonic wave solution, the surface layer is regarded as a multilayered system consisting of an infinite number of sublayers of infinitesimal thickness. The mode superposition procedure based on response spectra provides an expedient tool for dynamic analysis of surficial ground. The characteristic equation for obtaining natural frequencies and free vibration modes is derived using the proposed wave transfer function. To apply modal analysis to nonlinear systems, an iterative scheme for calculating the modal stiffness and damping is proposed which is an adaptation of the equivalent linearization technique. The estimation of the intensity of ground shaking is based on a response spectrum for stationary random vibration analysis. The results, in conjunction with fatigue theory, are used to study the liquefaction problem in soil layers with general topography. Application of the proposed methods to seismic reliability assessment of lifeline systems is discussed.
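    The single-layer special case of such a transfer function can be sketched as follows: the classic 1-D amplification of a vertically incident SH wave through one viscoelastic layer over a half-space. The layer properties and damping are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    def sh_transfer(freq, h, vs_layer, rho_layer, vs_half, rho_half, damping=0.05):
        """Surface amplification of a vertically incident SH wave for one
        viscoelastic layer of thickness h over an elastic half-space."""
        vs_c = vs_layer * np.sqrt(1 + 2j * damping)        # complex shear velocity
        k = 2 * np.pi * freq / vs_c                        # complex wavenumber
        alpha = (rho_layer * vs_c) / (rho_half * vs_half)  # impedance ratio
        return 1.0 / np.abs(np.cos(k * h) + 1j * alpha * np.sin(k * h))

    # Soft 30 m layer (Vs = 200 m/s) over stiff half-space (Vs = 800 m/s)
    freqs = np.linspace(0.1, 10.0, 500)
    amp = sh_transfer(freqs, h=30.0, vs_layer=200.0, rho_layer=1.8,
                      vs_half=800.0, rho_half=2.2)
    f0 = freqs[np.argmax(amp)]   # fundamental resonance, ~ Vs/(4h) = 1.67 Hz
    ```

    The peak near Vs/(4h) is the fundamental mode whose frequency the paper's characteristic equation generalizes to many layers.
    
    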

  9. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  10. Seismic analysis of liquid-filled tanks with an eccentric core barrel

    SciTech Connect

    Liu, W.K.; Gvildys, J.

    1985-01-01

    The seismic analysis of fluid-coupled concentric cylindrical shells is reviewed. A coupled fluid-structure finite element method which considers the sloshing effect is then developed for the seismic analysis of liquid-filled systems with internal components. The theoretical development of the mixed finite element formulation is also included. The resulting fluid-structure interaction algorithm has been integrated into the computer code FLUSTR II and the seismic analysis of liquid-filled tanks with an eccentric core barrel is performed. Numerical results show the method yields accurate solutions with large increases in efficiency.

  11. Sampling and Analysis Plan Waste Treatment Plant Seismic Boreholes Project.

    SciTech Connect

    Brouns, Thomas M.

    2007-07-15

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the Saddle Mountains Basalt, up to three new deep rotary boreholes through the Saddle Mountains Basalt and sedimentary interbeds, and one corehole through the Saddle Mountains Basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials (ASTM) standards, Hanford Site procedures, and other guidance to be followed for data collection activities. Revision 3 incorporates all interim change notices (ICN) that were issued to Revision 2 prior to completion of sampling and analysis activities for the WTP Seismic Boreholes Project. This revision also incorporates changes to the exact number of samples submitted for dynamic testing as directed by the U.S. Army Corps of Engineers. Revision 3 represents the final version of the SAP.

  12. Regional analysis of earthquake occurrence and seismic energy release

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1980-01-01

    The historic temporal variation in earthquake occurrence and seismic energy release on a regional basis throughout the world was studied. The regionalization scheme employed divided the world into large areas based either on seismic and tectonic considerations (Flinn-Engdahl scheme) or on geographic (longitude and latitude) criteria. The data set is the worldwide earthquake catalog of the National Geophysical and Solar-Terrestrial Data Center. An apparent relationship exists between the maximum energy released in a limited time within a seismic region and the average or background energy per year averaged over a long time period. In terms of average or peak energy release, the most seismic regions of the world during the 50 to 81 year period ending in 1977 were the Japan, Andean South America, and Alaska-Aleutian Arc regions. The year-to-year fluctuations in regional seismic energy release are greater, by orders of magnitude, than the corresponding variations in worldwide seismic energy release. The b values of seismic regions range from 0.7 to 1.4 for earthquake magnitudes in the range 6.0 to 7.5.
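    The b values quoted above are commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (M̄ − Mc). A sketch on a synthetic Gutenberg-Richter catalog (the catalog is simulated, not the NGSDC data):

    ```python
    import math
    import random

    def b_value(mags, m_c):
        """Aki (1965) maximum-likelihood b-value for magnitudes >= completeness m_c."""
        sel = [m for m in mags if m >= m_c]
        mean_m = sum(sel) / len(sel)
        return math.log10(math.e) / (mean_m - m_c)

    # Synthetic catalog: magnitudes above Mc = 6.0 drawn from the exponential
    # (Gutenberg-Richter) distribution implied by b = 1.0
    random.seed(1)
    beta = 1.0 * math.log(10)                      # beta = b * ln(10)
    mags = [6.0 + random.expovariate(beta) for _ in range(5000)]
    b = b_value(mags, 6.0)                         # should recover ~1.0
    ```

    With 5000 events the estimate lands close to the true b = 1.0; real regional catalogs, as in the paper, yield values between 0.7 and 1.4.
    
    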

  13. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and of their time window has an obvious effect on the validity of the classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
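    A minimal 1D grid SOM for waveform clustering, in the spirit of the method described, is sketched below on synthetic waveforms. The EMD de-noising step that precedes clustering in the proposed workflow is omitted, and all waveforms and parameters are assumptions:

    ```python
    import numpy as np

    def train_som_1d(data, n_nodes=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
        """Train a 1-D self-organizing map; returns the node weight vectors."""
        rng = np.random.default_rng(seed)
        nodes = data[rng.integers(0, len(data), n_nodes)].astype(float).copy()
        for t in range(epochs):
            lr = lr0 * (1 - t / epochs)                     # decaying learning rate
            sigma = max(sigma0 * (1 - t / epochs), 0.5)     # shrinking neighborhood
            for x in data[rng.permutation(len(data))]:
                bmu = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))
                d = np.abs(np.arange(n_nodes) - bmu)        # distance on the 1-D grid
                h = np.exp(-(d ** 2) / (2 * sigma ** 2))    # neighborhood function
                nodes += lr * h[:, None] * (x - nodes)
        return nodes

    def classify(nodes, x):
        """Best-matching unit index for a waveform x."""
        return int(np.argmin(np.linalg.norm(nodes - x, axis=1)))

    # Two synthetic "facies" waveforms plus noise
    rng = np.random.default_rng(42)
    t = np.linspace(0, 1, 32)
    f1 = np.sin(2 * np.pi * 3 * t)
    f2 = np.sign(np.sin(2 * np.pi * 5 * t))
    data = np.array([f + 0.05 * rng.standard_normal(32)
                     for f in [f1] * 20 + [f2] * 20])
    nodes = train_som_1d(data, n_nodes=4)
    ```

    After training, the two underlying waveform shapes map to different nodes, which is the facies labeling the paper's workflow produces after EMD de-noising.
    
    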

  14. Seismic data interpretation using the Hough transform and principal component analysis

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Martin, R.; Ávila-Carrera, R.; Rodríguez-Castellanos, A.

    2011-03-01

    In this work two novel image processing techniques are applied to detect and delineate complex salt bodies in seismic exploration profiles: the Hough transform and principal component analysis (PCA). It is well recognized by the geophysical community that the lack of resolution and poor structural identification in seismic data recorded at sub-salt plays represent severe technical and economic problems. Under such circumstances, seismic interpretation based on the human eye alone is inaccurate. Additionally, petroleum field development decisions and production planning depend on good-quality seismic images that generally are not attainable in salt tectonics areas. To address this, morphological erosion, region growing and, especially, a generalization of the Hough transform (closely related to the Radon transform) are applied to build parabolic shapes that are useful in the idealization and recognition of salt domes in 2D seismic profiles. In a similar way, PCA is also used to identify shapes associated with complex salt bodies in seismic profiles extracted from 3D seismic data. To show the validity of the new set of seismic results, comparisons between the two image processing techniques are presented. The main contribution of this work is to provide seismic interpreters with new semi-automatic computational tools. The novel image processing approaches presented here may be helpful in the identification of diapirs and other complex geological features from seismic images. Conceivably, in the near future, a new branch of seismic attributes could be recognized by geoscientists and engineers based on the encouraging results reported here.

  15. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms, and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.

  16. Discrimination of porosity and fluid saturation using seismic velocity analysis

    DOEpatents

    Berryman, James G.

    2001-01-01

    The method of the invention is employed for determining the state of saturation in a subterranean formation using only seismic velocity measurements (e.g., shear and compressional wave velocity data). Seismic velocity data collected from a region of the formation of like solid material properties can provide relatively accurate partial saturation data derived from a well-defined triangle plotted in a (ρ/μ, λ/μ)-plane. When the seismic velocity data are collected over a large region of a formation having both like and unlike materials, the method first distinguishes the like materials by initially plotting the seismic velocity data in a (ρ/λ, μ/λ)-plane to determine regions of the formation having like solid material properties and porosity.
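    The plotted ratios follow from the standard elasticity relations μ = ρVs² and λ = ρVp² − 2μ; a sketch with illustrative velocity and density values, not data from the patent:

    ```python
    def lame_ratios(vp, vs, rho):
        """Lame-parameter ratios from P/S velocities and density.
        Returns (rho/lambda, mu/lambda), the coordinates used in the second plane."""
        mu = rho * vs ** 2                 # shear modulus
        lam = rho * vp ** 2 - 2 * mu       # Lame's first parameter
        return rho / lam, mu / lam

    # Illustrative sandstone-like values (km/s and g/cm^3; assumptions)
    x, y = lame_ratios(vp=3.0, vs=1.5, rho=2.2)
    # Here mu = 4.95, lam = 9.9, so (rho/lam, mu/lam) = (0.2222..., 0.5)
    ```

    Points computed this way for many measurements populate the (ρ/λ, μ/λ)-plane in which the method groups like materials.
    
    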

  17. MOBB: Data Analysis from an Ocean Floor Broadband Seismic Observatory

    NASA Astrophysics Data System (ADS)

    Uhrhammer, R. A.; Dolenc, D.; Romanowicz, B.; Stakes, D.; McGill, P.; Neuhauser, D.; Ramirez, T.

    2003-12-01

    MOBB (Monterey bay Ocean floor Broad Band project) is a collaborative project between the Monterey Bay Aquarium Research Institute (MBARI) and the Berkeley Seismological Laboratory (BSL). Its goal is to install and operate a permanent seafloor broadband station as a first step towards extending the on-shore broadband seismic network in northern California to the seaward side of the North America/Pacific plate boundary, providing improved azimuthal coverage for regional earthquake and structure studies. The MOBB station was installed on the seafloor in Monterey Bay, 40 km offshore and at a depth of 1000 m below the sea surface, in April 2002, and is completely buried beneath the seafloor. The installation made use of MBARI's ship Point Lobos and ROV Ventana, and the station currently records data autonomously. Dives are scheduled regularly (about every three months) to recover and replace the recording and battery packages. Some data were lost in the first half of 2003 due to hardware and software problems in the recording system. The ocean-bottom MOBB station currently comprises a three-component seismometer package (Guralp CMG-1T), a current meter, a digital pressure gauge (DPG), and recording and battery packages. The seismometer package is mounted on a cylindrical titanium pressure vessel 54 cm in height and 41 cm in diameter, custom built by the MBARI team and outfitted for underwater connection. Since the background noise in the near-shore ocean floor environment is high in the passband of interest for the study of regional and teleseismic signals, an important focus of this project is to develop methods to a posteriori increase signal-to-noise ratios by deconvolving contributions from various noise sources. We present results involving analysis of the correlation of background noise with tide, ocean current, and pressure records, combining data from MOBB and regional land-based stations of the Berkeley Digital Seismic Network (BDSN).
We also present preliminary

  18. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  19. Earthquake Cluster Analysis for Turkey and its Application for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake clusters are an important element in general seismology and also in seismic hazard assessment. In probabilistic seismic hazard assessment, the occurrence of earthquakes is often modeled as an independent Monte Carlo process following a stationary Poisson model. But earthquakes are dependent and clustered, especially in terms of earthquake swarms, fore- and aftershocks, or even larger sequences such as the Landers sequence in California or the Darfield-Christchurch sequence in New Zealand. For earthquake catalogues, declustering is an important step to capture earthquake frequencies while avoiding a bias towards small magnitudes due to aftershocks. On the other hand, declustered catalogues for independent probabilistic seismicity will underestimate the total number of earthquakes by neglecting dependent seismicity. In this study, the effect of clusters on probabilistic seismic hazard assessment is investigated in detail. To capture the features of earthquake clusters, a uniform framework for earthquake cluster analysis is introduced using methodologies from geostatistics and machine learning. These features represent important cluster characteristics such as cluster b-values, temporal decay, rupture orientations and many more. Cluster parameters are mapped in space using kriging. Furthermore, a detailed data analysis is undertaken to provide magnitude-dependent relations for various cluster parameters. The acquired features are used to introduce dependent seismicity into stochastic earthquake catalogues. In addition, the development of smooth seismicity maps based on historic databases is generally biased toward the more complete recent decades. A filling methodology is introduced which adds dependent seismicity to catalogues where none has been recorded, to avoid the above-mentioned bias. As a case study, Turkey has been chosen due to its inherent seismic activity and well-recorded data coverage. Clustering
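    Declustering of the kind discussed above is often done with space-time windows; the sketch below uses the classic Gardner-Knopoff (1974) window fits on a toy catalog. It is a generic illustration, not the authors' cluster framework:

    ```python
    import math

    def gk_window(mag):
        """Gardner & Knopoff (1974) aftershock window: (distance km, time days)."""
        d = 10 ** (0.1238 * mag + 0.983)
        t = 10 ** (0.032 * mag + 2.7389) if mag >= 6.5 else 10 ** (0.5409 * mag - 0.547)
        return d, t

    def decluster(catalog):
        """Drop events inside the space-time window of a larger, earlier shock.
        catalog: list of (time_days, x_km, y_km, mag), assumed sorted by time."""
        keep = [True] * len(catalog)
        for i, (t1, x1, y1, m1) in enumerate(catalog):
            d_win, t_win = gk_window(m1)
            for j in range(i + 1, len(catalog)):
                t2, x2, y2, m2 = catalog[j]
                if t2 - t1 > t_win:
                    break                   # later events are outside the time window
                if m2 < m1 and math.hypot(x2 - x1, y2 - y1) <= d_win:
                    keep[j] = False         # flagged as dependent (aftershock)
        return [ev for ev, k in zip(catalog, keep) if k]

    # Mainshock (M6.0), two nearby aftershocks, one distant unrelated M5.0
    cat = [(0.0, 0.0, 0.0, 6.0), (1.0, 5.0, 0.0, 4.5),
           (2.0, -3.0, 4.0, 4.0), (3.0, 500.0, 0.0, 5.0)]
    declustered = decluster(cat)
    ```

    The two nearby aftershocks fall inside the M6.0 window and are removed; the distant event survives. The removed events are exactly the dependent seismicity the paper reintroduces into stochastic catalogues.
    
    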

  20. China's water resources vulnerability: A spatio-temporal analysis during 2003-2013

    NASA Astrophysics Data System (ADS)

    Cai, J.; Varis, O.; Yin, H.

    2015-12-01

    The present highly serious situation of China's water environment and aquatic ecosystems has developed in the context of its stunning socioeconomic development over the past several decades. Therefore, a vulnerability assessment of water resources (VAWR) in China with a high spatio-temporal resolution is urgently needed. However, to our knowledge, the temporal dimension of VAWR has not yet been addressed. Consequently, we performed, for the first time, a comprehensive spatio-temporal analysis of China's water resources vulnerability (WRV), using a composite index approach built from an array of aspects highlighting key challenges that China's water resources system faces today. During our study period of 2003-2013, the political weight of China's integrated water resources management increased continuously. Hence, it is essential, in light of the historical socioeconomic changes shaped by water-environment policy making and implementation, to reveal China's WRV in order to pinpoint key challenges to the healthy functioning of its water resources system. The water resources system in the North and the Central Coast appeared more vulnerable than that in Western China. China's water use efficiency grew substantially over the study period, as did water supply and sanitation coverage. In contrast, water pollution worsened remarkably in most parts of China, as did water scarcity and shortage in the most stressed parts of the country. This spatio-temporal analysis implies that the key challenges to China's water resources system are rooted not only in the geographical mismatch between socioeconomic development (e.g. water demand) and water resources endowments (e.g. water resources availability), but also in the intertwinement of socioeconomic development with national strategic policy making.
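    A composite index of the kind described is typically a weighted aggregation of normalized indicators. A minimal sketch with invented indicator names, weights, and values (not the study's actual indicator set):

    ```python
    def min_max_normalize(values):
        """Rescale raw values to [0, 1] across regions."""
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    def composite_index(indicators, weights):
        """Weighted linear aggregation of min-max-normalized indicators.
        indicators: dict name -> list of raw values per region (higher = more vulnerable)."""
        normed = {n: min_max_normalize(vals) for n, vals in indicators.items()}
        n_regions = len(next(iter(indicators.values())))
        return [sum(weights[n] * normed[n][r] for n in indicators)
                for r in range(n_regions)]

    # Hypothetical indicators for three regions (all values are assumptions)
    ind = {"water_stress":   [0.9, 0.4, 0.1],
           "pollution":      [0.8, 0.6, 0.2],
           "use_inefficiency": [0.7, 0.3, 0.2]}
    w = {"water_stress": 0.4, "pollution": 0.4, "use_inefficiency": 0.2}
    scores = composite_index(ind, w)   # one vulnerability score per region
    ```

    The resulting scores preserve the regional ordering implied by the raw indicators; repeating the aggregation per year yields the spatio-temporal picture the study reports.
    
    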

  1. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators

    PubMed Central

    Beccari, Benjamin

    2016-01-01

    related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised, on average, only 12% of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. Discussion: A number of potential limitations of the present state of practice and how these might impact on decision makers are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development. PMID:27066298

  2. Application and Validation of a GIS Model for Local Tsunami Vulnerability and Mortality Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.

    2015-12-01

    The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally, a given tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. The model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as the main risk drivers. The research leading to these results has received funding from the CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of
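    The hazard-exposure-vulnerability convolution can be sketched cell-wise on a grid: expected fatalities per cell equal exposure times the probability of death at that cell's flow depth. The depth-mortality curve and all numbers below are illustrative assumptions, not the NGI model:

    ```python
    import numpy as np

    def mortality_risk(flow_depth, pop_density, p_death):
        """Cell-wise expected fatalities = exposure * P(death | flow depth).
        In the full model p_death also depends on building class; a single
        illustrative curve is used here."""
        return pop_density * p_death(flow_depth)

    # Assumed logistic depth-mortality curve (parameters are invented)
    p_death = lambda d: 1.0 / (1.0 + np.exp(-2.0 * (d - 2.0)))

    depth = np.array([[0.0, 1.0], [3.0, 6.0]])     # max flow depth per cell (m)
    pop   = np.array([[500, 800], [1200, 300]])    # persons per cell
    risk = mortality_risk(depth, pop, p_death)     # expected fatalities per cell
    expected_fatalities = risk.sum()
    ```

    The cells with the highest product of depth and population dominate the total, which is how the model flags high-mortality-risk areas.
    
    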

  3. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is based on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important to understanding the nature of risks and to making correct, ethical decisions. Today a major part of risk analysts recognize the new nature of modern risks. We face coherent or systemic risks, whose realization leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates in the complicated nature of the heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, whose analysis requires sophisticated tools applied to multi-source data. A formal basis for the analysis of this type of risk has been developed over the last 5-7 years, but issues of social fairness, ethics, and education require further development. One aspect of the analysis of social issues in risk management is studied in this paper. A formal algorithm for quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological base and algorithm, it is possible to obtain a regularized spatial-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The disaster data analysis demonstrates that about half of direct disaster damage may be caused by social factors: education, experience and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: the relation between education, age, experience, and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that on a wide-scale range, education determines risk perception and thus the vulnerability of societies, but at the local level there are important heterogeneities. Land-use and urbanization structure essentially influence vulnerability. The way to

  4. Analysis of infrasonic and seismic events related to the 1998 Vulcanian eruption at Sakurajima

    NASA Astrophysics Data System (ADS)

    Morrissey, M.; Garces, M.; Ishihara, K.; Iguchi, M.

    2008-08-01

    We present results from a detailed analysis of seismic and infrasonic data recorded over a four-day period prior to the Vulcanian eruptive event at Sakurajima volcano on May 19, 1998. Nearly one hundred seismic and infrasonic events were recorded on at least one of the nine seismic-infrasonic stations located within 3 km of the crater. Four unique seismic event types are recognized based on the spectral features of seismograms, including weak seismic tremor characterized by a 5-6 Hz peak mode that later shifted to 4-5 Hz. Long-period events are characterized by a short-duration, wide spectral band signal with an emergent, high-frequency onset followed by a wave coda lasting 15-20 s and a fundamental mode of 4.2-4.4 Hz. Values of Q for long-period events range between 10 and 22, suggesting that a gas-rich fluid was involved. Explosive events are the third seismic type, characterized by a narrow spectral band signal with an impulsive high-frequency onset followed by a 20-30 s wave coda and a peak mode of 4.0-4.4 Hz. Volcano-tectonic earthquakes are the fourth seismic type. Prior to May 19, 1998, only the tremor and explosion seismic events are found to have an infrasonic component. Like seismic tremor, infrasonic tremor is typically observed as a weak background signal. Explosive infrasonic events were recorded 10-15 s after the explosive seismic events and with audible explosions prior to May 19. On May 19, high-frequency impulsive infrasonic events occurred sporadically and as swarms within hours of the eruption. These infrasonic events are observed to be coincident with swarms of long-period seismic events. Video coverage during the seismic-infrasonic experiment recorded intermittent releases of gases and ash during times when seismic and acoustic events were recorded. The sequence of seismic and infrasonic events is interpreted as representing a gas-rich fluid moving through a series of cracks and conduits beneath the active summit crater.

  5. Review of seismic probabilistic risk assessment and the use of sensitivity analysis

    SciTech Connect

    Shiu, K.K.; Reed, J.W.; McCann, M.W. Jr.

    1985-01-01

    This paper presents results of sensitivity reviews performed to address a range of questions which arise in the context of seismic probabilistic risk assessment (PRA). These questions are the subject of this paper. A seismic PRA involves evaluation of seismic hazard, component fragilities, and system responses. They are combined in an integrated analysis to obtain various risk measures, such as frequency of plant damage states. Calculation of these measures depends on combination of non-linear functions based on a number of parameters and assumptions used in the quantification process. Therefore, it is often difficult to examine seismic PRA results and derive useful insights from them if detailed sensitivity studies are absent. In a seismic PRA, sensitivity evaluations can be divided into three areas: hazard, fragility, and system modeling. As a part of the review of a standard boiling water reactor seismic PRA, a reassessment of the plant damage states frequency and a detailed sensitivity analysis were conducted. Seismic event trees and fault trees were developed to model the different system and plant accident sequences. Hazard curves which represent various sites on the east coast were obtained; alternate structure and equipment fragility data were postulated. Various combinations of hazard and fragility data were analyzed. In addition, system modeling was perturbed to examine the impact upon the final results. Orders of magnitude variation were observed in the plant damage state frequency among the different cases. 6 refs., 2 figs., 3 tabs.
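    The integrated combination of hazard and fragility described above is commonly computed as freq = ∫ F(a) |dH/da| da, where H(a) is the annual exceedance hazard curve and F(a) a lognormal fragility. A numerical sketch with an assumed power-law hazard curve (all parameters are illustrative, not from the reviewed PRA):

    ```python
    import math

    def lognormal_cdf(x, median, beta):
        """CDF of a lognormal fragility with given median capacity and log-std beta."""
        return 0.5 * (1 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

    def damage_frequency(hazard, fragility_median, beta, a_max=5.0, n=2000):
        """Annual damage-state frequency: sum over acceleration bins of
        P(failure | a) times the occurrence frequency of that bin."""
        da = a_max / n
        total = 0.0
        for i in range(n):
            a0, a1 = i * da + 1e-6, (i + 1) * da + 1e-6
            dh = hazard(a0) - hazard(a1)   # occurrence frequency within the bin
            total += lognormal_cdf(0.5 * (a0 + a1), fragility_median, beta) * dh
        return total

    # Assumed hazard curve: H(a) = 1e-3 * a^-2 per year (capped at 1/yr), a in g
    H = lambda a: min(1.0, 1e-3 * a ** -2.0)
    freq = damage_frequency(H, fragility_median=0.6, beta=0.4)
    ```

    Perturbing the hazard curve or the fragility median/beta and re-evaluating this integral is exactly the kind of sensitivity study the paper describes, and it shows why the result can swing by orders of magnitude.
    
    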

  6. Pre-stack-texture-based reservoir characteristics and seismic facies analysis

    NASA Astrophysics Data System (ADS)

    Song, Cheng-Yun; Liu, Zhi-Ning; Cai, Han-Peng; Qian, Feng; Hu, Guang-Min

    2016-03-01

    Seismic texture attributes are closely related to seismic facies and reservoir characteristics and are thus widely used in seismic data interpretation. However, information is lost in the stacking process when traditional texture attributes are extracted from post-stack data, which is detrimental to complex reservoir description. In this study, pre-stack texture attributes are introduced; they are not only capable of precisely depicting the lateral continuity of waveforms between different reflection points but also reflect amplitude versus offset, anisotropy, and heterogeneity in the medium. Owing to their strong ability to represent stratigraphy, a pre-stack-data-based seismic facies analysis method is proposed using the self-organizing map algorithm. This method is tested on wide-azimuth seismic data from China, and the advantages of pre-stack texture attributes in describing lateral changes in strata are verified, in addition to the method's ability to reveal anisotropy and heterogeneity characteristics. The pre-stack texture classification results effectively distinguish different seismic reflection patterns, thereby providing reliable evidence for use in seismic facies analysis.

  7. Optimization Strategies for the Vulnerability Analysis of the Electric Power Grid

    SciTech Connect

    Pinar, A.; Meza, J.; Donde, V.; Lesieutre, B.

    2007-11-13

    Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, enabling vulnerability analyses of real-sized problems with more than 10,000 power lines.
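    The Laplacian correspondence the authors exploit can be illustrated with spectral graph partitioning: the Fiedler vector (the eigenvector of the second-smallest Laplacian eigenvalue) exposes a small, high-impact cut. A toy sketch on a hypothetical 6-bus network (not one of the paper's benchmarks):

```python
import numpy as np

# Hypothetical 6-bus grid: two triangles joined by a single weak tie (0-3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (0, 3)]
n = 6
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1        # degree on the diagonal
    L[i, j] -= 1; L[j, i] -= 1        # -1 for each adjacency

# Fiedler vector: eigenvector for the second-smallest Laplacian eigenvalue.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]

# Its sign pattern partitions the network; edges crossing the partition are
# candidates for a small cut whose removal splits the grid.
side = fiedler > 0
cut = [(i, j) for i, j in edges if side[i] != side[j]]
```

    On this toy graph the crossing set is exactly the single weak tie, which is the kind of small line group the vulnerability analysis searches for.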

  8. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C; Choun, Y-S; Kim, MK; Choi, I-K

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has collaborated with KAERI to support its development of seismic capability evaluation technology for degraded structures and components, under an agreement intended to continue over a five-year period. The goal of this collaboration is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. It was developed as an update to the Year 3 report, incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider a further degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined with imperfect correlation. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method using optimum Latin Hypercube samples was applied to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. The results are summarized in Section 5.
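    A compact sketch of the sampling idea: Latin Hypercube samples over degradation variables feed a capacity simulation, from which a lognormal fragility is fitted. The capacity model and all numbers are invented for illustration; this is not the report's CDFM procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_dims, rng):
    """Stratified uniform samples on [0,1): one sample per stratum per dimension."""
    u = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        perm = rng.permutation(n_samples)             # shuffle strata per dimension
        u[:, d] = (perm + rng.random(n_samples)) / n_samples
    return u

# Hypothetical capacity model: a tank's seismic capacity (g) knocked down by
# three degradation mechanisms (shell corrosion, bolt corrosion, anchorage cracking).
n = 200
u = latin_hypercube(n, 3, rng)
degradation = 0.05 + 0.15 * u                         # 5-20% capacity loss per mechanism
capacity = 1.2 * np.prod(1.0 - degradation, axis=1)

# Fit a lognormal fragility to the sampled capacities (median and log-std beta).
median = np.exp(np.mean(np.log(capacity)))
beta = np.std(np.log(capacity))
```

    Compared with plain Monte Carlo, the stratification gives even coverage of each degradation variable with far fewer samples, which is why LHS is attractive for expensive fragility simulations.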

  9. Characterising Seismic Hazard Input for Analysis Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and extension to Probabilistic Risk

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those in Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, is insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g. short-period acceleration, long-period displacement, cumulative energy input etc.). The accuracy and efficiency of the risk analysis is dependent on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European Project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). Seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies assessed using idealised case study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored. 
Whilst the initial application of the seismic risk analysis is

  10. On analysis-based two-step interpolation methods for randomly sampled seismic data

    NASA Astrophysics Data System (ADS)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of regularly or irregularly sampled seismic records is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform-domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms: the analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and the FPOCS (fast projection onto convex sets) algorithm, derived from the IST (iterative shrinkage-thresholding) algorithm and the POCS (projection onto convex sets) algorithm, respectively. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
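    A self-contained sketch of the basic (non-accelerated) POCS iteration that these methods build on, assuming a Fourier-sparse synthetic trace and random sampling; the linearly decreasing threshold schedule is one common illustrative choice, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
# Synthetic trace whose spectrum is exactly sparse (integer-bin sinusoids)
clean = (np.sin(2 * np.pi * 12 * t / n)
         + 0.5 * np.sin(2 * np.pi * 31 * t / n)
         + 0.3 * np.sin(2 * np.pi * 60 * t / n))
mask = rng.random(n) < 0.6                 # 60% of samples observed at random
observed = clean * mask

# POCS: alternate projection onto the sparse-spectrum set (thresholding)
# and onto the data-consistency set (reinsert the known samples).
x = observed.copy()
tmax = np.abs(np.fft.fft(observed)).max()
for k in range(100):
    tau = tmax * (1 - k / 100)             # linearly decreasing threshold
    X = np.fft.fft(x)
    X = np.where(np.abs(X) > tau, X, 0)    # zero out small coefficients
    x = np.fft.ifft(X).real
    x[mask] = observed[mask]               # enforce data consistency

err = np.linalg.norm(x - clean) / np.linalg.norm(clean)
```

    The FISTA/FPOCS variants in the paper add a momentum (extrapolation) step to this loop to accelerate convergence.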

  11. Seismic detection and analysis of icequakes at Columbia Glacier, Alaska

    USGS Publications Warehouse

    O'Neel, Shad; Marshall, Hans P.; McNamara, Daniel E.; Pfeffer, William Tad

    2007-01-01

    Contributions to sea level rise from rapidly retreating marine-terminating glaciers are large and increasing. Strong increases in iceberg calving occur during retreat, which allows mass transfer to the ocean at a much higher rate than possible through surface melt alone. To study this process, we deployed an 11-sensor passive seismic network at Columbia Glacier, Alaska, during 2004–2005. We show that calving events generate narrow-band seismic signals, allowing frequency domain detections. Detection parameters were determined using direct observations of calving and validated using three statistical methods and hypocenter locations. The 1–3 Hz detections provide a good measure of the temporal distribution and size of calving events. Possible source mechanisms for the unique waveforms are discussed, and we analyze potential forcings for the observed seismicity.
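    The frequency-domain detection idea, flagging windows whose 1-3 Hz band energy stands out above the background, can be sketched as follows. The sampling rate, window length, threshold, and synthetic "calving" burst are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 50.0                                  # sampling rate (Hz), hypothetical
t = np.arange(0, 60, 1 / fs)
signal = 0.1 * rng.standard_normal(t.size)
burst = (t > 30) & (t < 34)                # a 4-s "calving event" at 2 Hz
signal[burst] += np.sin(2 * np.pi * 2.0 * t[burst])

def band_detections(x, fs, fmin=1.0, fmax=3.0, win=5.0, thresh=10.0):
    """Flag windows whose fmin-fmax energy exceeds thresh x the median window energy."""
    n = int(win * fs)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    energies = []
    for start in range(0, x.size - n + 1, n):
        spec = np.abs(np.fft.rfft(x[start:start + n])) ** 2
        energies.append(spec[band].sum())
    energies = np.array(energies)
    return np.where(energies > thresh * np.median(energies))[0], energies

hits, energies = band_detections(signal, fs)
```

    In practice the detector would be tuned against the direct calving observations, as the authors did when validating their detection parameters.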

  12. Sensor Emplacement Techniques and Seismic Noise Analysis for USArray Transportable Array Seismic Stations

    NASA Astrophysics Data System (ADS)

    Frassetto, A.; Busby, R. W.; Hafner, K.; Woodward, R.; Sauter, A.

    2013-12-01

    In preparation for the upcoming deployment of EarthScope's USArray Transportable Array (TA) in Alaska, the National Science Foundation (NSF) has supported exploratory work on seismic station design, sensor emplacement, and communication concepts appropriate for this challenging high-latitude environment. IRIS has installed several experimental stations to evaluate different sensor emplacement schemes, both in Alaska and in the lower 48 states of the U.S. The goal of these tests is to maintain or enhance a station's noise performance while minimizing its footprint and the weight of the equipment, materials, and overall expense required for its construction. Motivating this approach are recent developments in posthole broadband seismometer design and the unique conditions for operating in Alaska, where there are few roads, cellular communications are scarce, most areas are only accessible by small plane or helicopter, and permafrost underlies much of the state. We will review the methods used for directly emplacing broadband seismometers in comparison to the current methods used for the lower-48 TA. These new methods primarily focus on using a portable drill to make a borehole three to five meters deep, reaching beneath the active layer of the permafrost, or on coring 1-2 meters into surface bedrock. Both methods have been logistically effective in preliminary trials. Subsequent station performance has been assessed quantitatively using probability density functions summed from power spectral density estimates, calculated for the continuous time series of seismic data recorded on each channel of the seismometer. There are five test stations currently operating in Alaska. One was deployed in August 2011 and the remaining four in October 2012. Our results show that the performance of seismometers in Alaska with auger-hole or core-hole installations can sometimes exceed that of the quietest TA stations in the lower 48, particularly on horizontal components at long periods. A
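    The noise-assessment step, building a distribution of per-segment power spectral density estimates, can be sketched with plain FFT periodograms. The record here is synthetic (a microseism-like 0.2 Hz tone in noise) and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 40.0                                  # sampling rate (Hz), hypothetical
t = np.arange(0, 600, 1 / fs)
x = 0.5 * np.sin(2 * np.pi * 0.2 * t) + 0.1 * rng.standard_normal(t.size)

nseg = 2048
freqs = np.fft.rfftfreq(nseg, 1 / fs)
window = np.hanning(nseg)
norm = fs * (window ** 2).sum()            # PSD normalization for the taper

# One periodogram per segment; station-performance PDFs are built from the
# distribution of per-segment PSD values (in dB) at each frequency bin.
psds = []
for start in range(0, t.size - nseg + 1, nseg):
    seg = (x[start:start + nseg] - x[start:start + nseg].mean()) * window
    psds.append(np.abs(np.fft.rfft(seg)) ** 2 / norm)
psd_db = 10 * np.log10(np.array(psds) + 1e-20)

median_psd = np.median(psd_db, axis=0)     # summary curve across segments
peak_freq = freqs[np.argmax(median_psd)]
```

    Summing histograms of `psd_db` per frequency bin yields the probability density functions used to compare emplacement schemes against reference noise models.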

  13. Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne

    2013-08-01

    Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, in which penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) has been developed based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis, and the potential for subsequent design optimizations with respect to micrometeoroid and space debris (MM/SD) impact risk.

  14. Using wavefront coding technique as an optical encryption system: reliability analysis and vulnerabilities assessment

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.

    2012-04-01

    The wavefront coding paradigm can be used not only for compensation of aberrations and depth-of-field improvement but also for optical encryption. When a diffractive optical element (DOE) with a known point spread function (PSF) is placed in the optical path, an optical convolution of the image with that PSF occurs. In this case, an optically encoded image is registered instead of the true image. Decoding of the registered image can be performed using standard digital deconvolution methods. In such a class of optical-digital systems, the PSF of the DOE is used as an encryption key. Therefore, the reliability and cryptographic resistance of such an encryption method depend on the size and complexity of the PSF used for optical encoding. This paper gives a preliminary analysis of the reliability and possible vulnerabilities of such an encryption method. Experimental results on brute-force attacks on optically encrypted images are presented, reliability estimates of optical coding based on the wavefront coding paradigm are given, and an analysis of possible vulnerabilities is provided.
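    The encode/decode cycle, convolution with the DOE's PSF followed by deconvolution using the PSF as the key, can be sketched in 1-D. The short invertible blur kernel below stands in for a real wavefront-coding PSF, which would be far more complex (that complexity is precisely what provides the cryptographic resistance):

```python
import numpy as np

n = 128
true_img = np.zeros(n)
true_img[40:60] = 1.0                        # a simple 1-D "scene"

# Hypothetical stand-in for the DOE's PSF: a short, invertible circular blur.
psf = np.zeros(n)
psf[0], psf[1], psf[-1] = 0.6, 0.2, 0.2
H = np.fft.fft(psf)                          # optical transfer function (the key)

encoded = np.fft.ifft(np.fft.fft(true_img) * H).real   # optically encoded image

# Decoding with the known key: regularized (Wiener-style) deconvolution.
lam = 1e-9                                   # regularization against noise/zeros
G = np.conj(H) / (np.abs(H) ** 2 + lam)
decoded = np.fft.ifft(np.fft.fft(encoded) * G).real
err = np.abs(decoded - true_img).max()
```

    A brute-force attack in this picture amounts to searching the space of candidate PSFs until a deconvolution yields a plausible image, which is why PSF size and complexity govern the method's strength.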

  15. Genetic analysis reveals demographic fragmentation of grizzly bears yielding vulnerably small populations

    PubMed Central

    Proctor, Michael F; McLellan, Bruce N; Strobeck, Curtis; Barclay, Robert M.R

    2005-01-01

    Ecosystem conservation requires the presence of native carnivores, yet in North America, the distributions of many larger carnivores have contracted. Large carnivores live at low densities and require large areas to thrive at the population level. Therefore, if human-dominated landscapes fragment remaining carnivore populations, small and demographically vulnerable populations may result. Grizzly bear range contraction in the conterminous USA has left four fragmented populations, three of which remain along the Canada–USA border. A tenet of grizzly bear conservation is that the viability of these populations requires demographic linkage (i.e. inter-population movement of both sexes) to Canadian bears. Using individual-based genetic analysis, our results suggest this demographic connection has been severed across their entire range in southern Canada by a highway and associated settlements, limiting female and reducing male movement. Two resulting populations are vulnerably small (≤100 animals) and one of these is completely isolated. Our results suggest that these trans-border bear populations may be more threatened than previously thought and that conservation efforts must expand to include international connectivity management. They also demonstrate the ability of genetic analysis to detect gender-specific demographic population fragmentation in recently disturbed systems, a traditionally intractable yet increasingly important ecological measurement worldwide. PMID:16243699

  17. Seismic structural analysis of a glovebox by the equivalent static method

    SciTech Connect

    Hsieh, B.J.

    1994-06-01

    Seismic strength evaluation of equipment requires efficient and accurate methods. Such an evaluation generally calls for dynamic analysis requiring detailed accelerations and advanced mathematical modeling. The analysis may be tedious, but in theory works for any structure with any boundary conditions. Much equipment does not justify such an extensive and expensive evaluation; hence, efficient and inexpensive, though possibly more conservative, methods of analysis are used instead. The equivalent static method (ESM) is such a method. Being a static method, the ESM cannot be directly applied to equipment that is not simply anchored to, or only rests on, the ground. In this paper, we show how a glovebox with ambiguous anchorage conditions is analyzed by the ESM when subjected to seismic load. Also outlined are the retrofits to increase its seismic resistance. The recommendations include fixing the legs to the floor and using inclined braces. The use of braces is effective in resisting the lateral seismic load: it redistributes the seismically generated moment and force in a more benign way, and it significantly stiffens the glovebox's supporting table structure, raising the vibration frequency of the table away from the high-energy range of the seismic load and drastically reducing the displacement of the glovebox.

  18. Seismic Attribute Analysis of the Mississippian Chert at the Wellington Field, south-central Kansas

    NASA Astrophysics Data System (ADS)

    Sirazhiev, Ayrat

    Mississippian chert reservoirs, important hydrocarbon resources in North America, are highly heterogeneous, typically below seismic resolution and, therefore, present a challenging task for predicting reservoir properties from seismic data. In this study, I conducted a seismic attribute analysis of the Mississippian chert reservoir at the Wellington Field, south-central Kansas using well and 3D PSTM seismic data. The microporous cherty dolomite reservoir exhibits a characteristic vertical gradational porosity reduction and associated increase in acoustic velocity, known as a ramp-transition velocity function. I investigated possible relationships of the Mississippian reservoir thickness and porosity with post-stack seismic attributes, including inverted acoustic impedance. The analysis of well-log and seismic data revealed that fault #1 divides the Wellington Field diagonally from the southwestern corner to the northeastern corner. The reservoir in the southeastern part of the field is characterized by a vertical gradational porosity decrease (from 25-30 to 4-6%), variable thickness (6-20 m), lower seismic amplitude and frequency content, locally developed double reflector, and high correlation between seismic amplitude and reservoir thickness conformable with the theoretical amplitude response of a ramp-transition velocity function. Amplitude envelope was used to predict the reservoir thickness in this part of the field. The Mississippian reservoir in the northwestern part of the field has more heterogeneous porosity distribution within the reservoir interval, thins in the north-north-west direction, while no clear relationship was found between reservoir thickness and instantaneous seismic attributes. The model-based inversion and porosity model predicted from inverted impedance supported the well-log and seismic attribute interpretation. The reliability of the predicted porosity model is tested by cross-validation. 
Resolution limits were determined using wedge

  19. Analysis of the Far-Field Co-seismic and Post-seismic Responses Caused by the 2011 Mw 9.0 Tohoku-Oki Earthquake

    NASA Astrophysics Data System (ADS)

    Shao, Zhigang; Zhan, Wei; Zhang, Langping; Xu, Jing

    2016-02-01

    We analyzed the far-field co-seismic response of the Mw 9.0 Tohoku-Oki earthquake, which occurred on March 11, 2011 at the Japan Trench plate boundary. Our analysis indicates that the far-field co-seismic displacement was very sensitive to the magnitude of this event, and that a significant co-seismic surface displacement from earthquakes in the Japan Trench region can be observed in Eurasia only for events of Mw ≥ 8.0. We also analyzed the temporal characteristics of the near-field post-seismic deformation caused by the afterslip and the viscoelastic relaxation following the Japan earthquake. Next, we performed a simulation to analyze the influence of the two post-seismic effects previously mentioned on the far-field post-seismic crustal deformation. The simulation results help explain the post-seismic crustal deformation observed on the Chinese mainland 1.5 years after the event. Fitting results revealed that after the Mw 9.0 Tohoku-Oki earthquake, the afterslip decayed exponentially, and may eventually disappear after 4 years. The far-field post-seismic displacement in Eurasia caused by the viscoelastic relaxation following this earthquake will reach the same magnitude as the co-seismic displacement in approximately 10 years. In addition, the co- and post-seismic Coulomb stress on several NE-trending faults in the northeastern and northern regions of the Chinese mainland were significantly enhanced because of the Mw 9.0 earthquake, especially on the Yilan-Yitong and the Dunhua-Mishan faults (the northern section of the Tan-Lu fault zone) as well as the Yalujiang and the Fuyu-Zhaodong faults.

  20. Altered autonomic arousal in psychosis: an analysis of vulnerability and specificity.

    PubMed

    Clamor, Annika; Hartmann, Maike M; Köther, Ulf; Otte, Christian; Moritz, Steffen; Lincoln, Tania M

    2014-04-01

    Vulnerability-stress models posit that alterations of the autonomic nervous system contribute to the development of psychosis. Previous research has found autonomic arousal alterations in psychotic disorders and at-risk individuals that are not explained by medication alone. To test whether these alterations are associated with the extent of an individual's vulnerability and whether they are specific to psychosis, we compared participants with psychosis (n=23), first-degree relatives of individuals with psychosis (n=21), and healthy participants with attenuated positive symptoms (n=23) to participants with depression (n=24) and healthy controls (n=24). At rest, skin conductance level was assessed and photoplethysmography was applied to measure time- and frequency-domain heart rate variability (HRV). Univariate and multivariate analyses of covariance with perceived stress and psychophysiological values as dependent variables showed significant between-group differences for perceived stress (p=.010), heart rate (p=.022), time-domain HRV indices (all ps≤.027), and vagal activity (p=.017). Group differences in sympathetic activity were nonsignificant (p=.069). In an additional analysis with medication as a second between-group factor, the physiological between-group differences remained significant or trend-significant (all ps≤.060). With the exception of sympathetic activity, participants with psychosis exhibited more extreme arousal than the control groups. First-degree relatives and participants with attenuated symptoms showed autonomic activity comparable to that of healthy controls. Thus, the hypothesized association between an alteration of arousal and vulnerability to psychosis was not confirmed. However, particularly low time-domain HRV was found for psychosis, with significant differences to healthy controls (all ps≤.007) and to depression (all ps≤.004), with the latter indicating a specificity to psychosis. PMID:24582038
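    The time-domain HRV indices referred to above are simple to compute from a series of inter-beat (RR) intervals; a minimal sketch with invented resting values:

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """SDNN and RMSSD, two standard time-domain HRV indices.

    rr_ms: successive inter-beat (RR) intervals in milliseconds.
    """
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                        # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat (vagally mediated) variability
    return sdnn, rmssd

# Hypothetical resting RR series (ms), for illustration only
sdnn, rmssd = time_domain_hrv([812, 798, 825, 801, 817, 793, 830])
```

    RMSSD emphasizes short-term, beat-to-beat differences and is the time-domain index most closely tied to vagal activity, which is why low values of it are of interest in the comparison above.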

  1. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    A search for space and time regularities in volcanic and seismic events for the purpose of forecast method development is of current concern, both scientifically and practically. The seismic and volcanic processes take place in the Earth's field of gravity, which in turn is closely related to the gravitational fields of the Moon, the Sun, and the planets of the Solar System. It is mostly gravity and tidal forces that exercise control over the Earth's configuration and relief. Dynamic gravitational interaction between the Earth and other celestial bodies makes itself evident in tidal phenomena and other effects in the geospheres (including the Earth's crust). The dynamics of the tidal and attractive forces are responsible for periodical changes in the gravity force, both in value and direction [Darwin, 1965], and in the rate of rotation and orbital speed; this implies related changes in the endogenic activity of the Earth. The Earth's rotation in the alternating gravitational field accounts to a considerable extent for the regular pattern of crustal deformations and dislocations; it is among the principal factors that control the Earth's form and structure, the distribution of oceans and continents and, probably, continental drift [Peive, 1969; Khain, 1973; Kosygin, 1983]. The energy of gravitational interaction is transmitted through the tidal energy to planetary spheres and feeds various processes there, including volcanic and seismic ones. Determining the degree, character, and special features of the tidal-force contribution to volcanic and seismic processes is of primary importance for understanding the genetic and dynamic aspects of volcanism and seismicity. Both volcanic and seismic processes are involved in the evolution of celestial bodies; they are operative on the planets of the Earth group and many satellites [Essays…, 1981; Lukashov, 1996]. 
From this standpoint, studies of those processes are essential with a view to development of scenarios of the Earth's evolution as a celestial

  2. Detailed seismicity analysis of the southern Dead Sea area

    NASA Astrophysics Data System (ADS)

    Braeuer, Benjamin; Asch, Guenter; Hofstetter, Rami; Haberland, Christian; Jaser, Darwish; El-Kelani, Radwan; Weber, Michael

    2013-04-01

    While the Dead Sea basin has been studied for a long time, the available knowledge about the micro-seismicity, its distribution and characteristics is limited. Therefore, within the framework of the international DESIRE (DEad Sea Integrated REsearch) project, a dense temporary local seismological network was operated in the southern Dead Sea area. Within 18 months of recording, 650 events were detected. Based on an already published tomography study, the clustering, focal mechanisms, statistics, and distribution of the micro-seismicity in relation to the velocity models from the tomography are analyzed. The determined b-value of 0.7 indicates a relatively high risk of large earthquakes compared to the moderate microseismic activity. The distribution of the seismicity suggests an asymmetric basin with a vertical strike-slip fault forming the eastern boundary of the basin and an inclined western boundary made up of strike-slip and normal faults. Furthermore, significant differences between the areas north and south of the Boqeq fault were observed. South of the Boqeq fault the western boundary is inactive, while the entire seismicity occurs at the eastern boundary and below the basin-fill sediments. The largest events occurred here; their focal mechanisms represent the northwards transform motion of the Arabian plate along the Dead Sea Transform. The vertical extension of the spatial and temporal cluster from February 2007 is interpreted as being related to the locking of the region around the Boqeq fault. North of the Boqeq fault similar seismic activity occurs at both boundaries, most notably within the basin-fill sediments, displaying mainly small events with strike-slip mechanisms and normal faulting in an EW direction. Therefore, we suggest that the Boqeq fault forms the border between the "single" transform fault and the pull-apart basin with two active border faults.
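    A b-value like the 0.7 reported above is commonly estimated with the Aki maximum-likelihood formula; a sketch on a synthetic Gutenberg-Richter catalogue (the completeness magnitude and catalogue are invented, not the DESIRE data):

```python
import numpy as np

rng = np.random.default_rng(11)

def aki_b_value(mags, mc, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes >= completeness mc.

    dm is the magnitude binning width (0 for continuous magnitudes).
    """
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalogue with a true b-value of 0.7:
# magnitudes above mc are exponentially distributed with rate b*ln(10).
b_true, mc = 0.7, 1.0
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)
b_hat = aki_b_value(mags, mc)
```

    A b-value below 1 means large events are relatively frequent compared to small ones, which is the basis for the abstract's remark about elevated risk of large earthquakes.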

  3. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SUMMARY OF COMBINED THERMAL AND OPERATING LOADS WITH SEISMIC ANALYSIS

    SciTech Connect

    MACKEY TC; DEIBLER JE; RINKER MW; JOHNSON KI; ABATT FG; KARRI NK; PILLI SP; STOOPS KL

    2009-01-15

    This report summarizes the results of the Double-Shell Tank Thermal and Operating Loads Analysis (TaLA) combined with the Seismic Analysis. This combined analysis provides a thorough, defensible, and documented analysis that will become a part of the overall analysis of record for the Hanford double-shell tanks (DSTs). The bases of the analytical work presented herein are two ANSYS® finite element models that were developed to represent a bounding-case tank. The TaLA model includes the effects of temperature on material properties, creep, concrete cracking, and various waste and annulus pressure-loading conditions. The seismic model considers the interaction of the tanks with the surrounding soil, including a range of soil properties, and the effects of the waste contents during a seismic event. The structural evaluations completed with the representative tank models do not reveal any structural deficiencies in the integrity of the DSTs. The analyses represent 60 years of use, which extends well beyond the current date. In addition, the temperature loads imposed on the model are significantly more severe than any service to date or proposed for the future. Bounding material properties were also selected to provide the most severe combinations. While the focus of the analyses was a bounding-case tank, it was necessary during various evaluations to conduct tank-specific analyses. The primary tank buckling evaluation was carried out on a tank-specific basis because of its sensitivity to waste height, specific gravity, tank wall thickness, and the primary tank vapor space vacuum limit. For this analysis, the occurrence of maximum tank vacuum was classified as a service level C, emergency load condition. The only area of potential concern in the analysis was the buckling evaluation of the AP tank, which showed the current limit on demand of 12-inch water gauge vacuum to exceed the allowable of 10.4 inches. This determination was based on analysis at the

  4. Singular spectral analysis based filtering of seismic signal using new Weighted Eigen Spectrogram

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.

    2016-09-01

    Filtering of non-stationary noisy seismic signals using fixed basis functions (sine and cosine) generates artifacts in the final output and thereby leads to wrong interpretation. In order to circumvent this problem, we propose here a new Weighted Eigen Spectrogram (WES) based robust time-domain Singular Spectrum Analysis (SSA) frequency-filtering algorithm. The new WES is used to simplify the Eigen-triplet grouping procedure in SSA. We tested the robustness of the algorithm on synthetic seismic data contaminated with field-simulated noise. We then applied the method to filter high-resolution seismic reflection field data. The band-pass filtering of noisy seismic records suggests that the underlying algorithm is efficient for improving the signal-to-noise ratio (S/N) and is also user-friendly.
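    The SSA pipeline the method builds on (embed, decompose, group leading components, diagonally average) can be sketched as follows; the grouping here simply keeps the leading Eigen pair, whereas the paper's WES-based grouping is its novel contribution:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
t = np.arange(n)
clean = np.sin(2 * np.pi * t / 50.0)                  # synthetic "signal"
noisy = clean + 0.3 * rng.standard_normal(n)

# 1) Embed: build the L x K trajectory (Hankel) matrix of lagged windows
L = 80
K = n - L + 1
traj = np.column_stack([noisy[i:i + L] for i in range(K)])

# 2) Decompose: SVD of the trajectory matrix
U, s, Vt = np.linalg.svd(traj, full_matrices=False)

# 3) Group: a single sinusoid is captured by one pair of Eigen triplets
rank = 2
approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]

# 4) Diagonal (Hankel) averaging back to a length-n series
recon = np.zeros(n)
count = np.zeros(n)
for j in range(K):
    recon[j:j + L] += approx[:, j]
    count[j:j + L] += 1
recon /= count

err_noisy = np.linalg.norm(noisy - clean)
err_recon = np.linalg.norm(recon - clean)
```

    Because the basis comes from the data itself rather than fixed sines and cosines, the reconstruction adapts to non-stationary signals, which is the motivation stated in the abstract.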

  5. Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis

    NASA Astrophysics Data System (ADS)

    Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.

    2015-12-01

    Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence, in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts on structures, taken as penetration distance into the structure, are shown to follow a power-law distribution with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for structural analysis were developed using lumped-mass 2-D rock fall runout models based on 1-m resolution lidar elevation data. Model inputs were based on calibrated surface parameters from mapped runout paths of 198 additional boulder runouts. Terrestrial lidar scans and structure from motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and impacting boulders. By combining velocity distributions from the 2-D analysis with high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and the resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
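
    A fragility relation of the kind described, penetration distance scaling as a power of impact energy, can be fitted by linear regression in log-log space. The sketch below uses made-up energy/penetration pairs purely for illustration; the exponent and coefficient are not the study's values.

```python
import numpy as np

# Hypothetical impact-energy (kJ) / penetration-distance (m) pairs
energy = np.array([50., 120., 300., 800., 2000.])
penetr = np.array([0.4, 0.7, 1.2, 2.1, 3.6])

# Power law d = a * E^b is linear in log space: log d = log a + b log E
b, log_a = np.polyfit(np.log(energy), np.log(penetr), 1)
a = np.exp(log_a)
predicted = a * energy ** b  # fitted penetration at each observed energy
```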

  6. Low carbon technology performance vs infrastructure vulnerability: analysis through the local and global properties space.

    PubMed

    Dawson, David A; Purnell, Phil; Roelich, Katy; Busch, Jonathan; Steinberger, Julia K

    2014-11-01

    Renewable energy technologies, necessary for low-carbon infrastructure networks, are being adopted to help reduce fossil fuel dependence and meet carbon mitigation targets. The evolution of these technologies has progressed based on the enhancement of technology-specific performance criteria, without explicitly considering the wider system (global) impacts. This paper presents a methodology for simultaneously assessing local (technology) and global (infrastructure) performance, allowing key technological interventions to be evaluated with respect to their effect on the vulnerability of wider infrastructure systems. We use exposure of low carbon infrastructure to critical material supply disruption (criticality) to demonstrate the methodology. A series of local performance changes are analyzed; and by extension of this approach, a method for assessing the combined criticality of multiple materials for one specific technology is proposed. Via a case study of wind turbines at both the material (magnets) and technology (turbine generators) levels, we demonstrate that analysis of a given intervention at different levels can lead to differing conclusions regarding the effect on vulnerability. Infrastructure design decisions should take a systemic approach; without these multilevel considerations, strategic goals aimed to help meet low-carbon targets, that is, through long-term infrastructure transitions, could be significantly jeopardized.

  7. Arctic indigenous youth resilience and vulnerability: comparative analysis of adolescent experiences across five circumpolar communities.

    PubMed

    Ulturgasheva, Olga; Rasmus, Stacy; Wexler, Lisa; Nystad, Kristine; Kral, Michael

    2014-10-01

    Arctic peoples today find themselves on the front line of rapid environmental change brought about by globalizing forces, shifting climates, and destabilizing physical conditions. The weather is not the only thing undergoing rapid change here. Social climates are intrinsically connected to physical climates, and changes within each have profound effects on the daily life, health, and well-being of circumpolar indigenous peoples. This paper describes a collaborative effort between university researchers and community members from five indigenous communities in the circumpolar north aimed at comparing the experiences of indigenous Arctic youth in order to come up with a shared model of indigenous youth resilience. The discussion introduces a sliding scale model that emerged from the comparative data analysis. It illustrates how a "sliding scale" of resilience captures the inherent dynamism of youth strategies for "doing well" and what forces represent positive and negative influences that slide towards either personal and communal resilience or vulnerability. The model of the sliding scale is designed to reflect the contingency and interdependence of resilience and vulnerability and their fluctuations between lowest and highest points based on timing, local situation, larger context, and meaning.

  10. Modal seismic analysis of a nuclear power plant control panel and comparison with SAP 4

    NASA Technical Reports Server (NTRS)

    Pamidi, M. R.; Pamidi, P. R.

    1976-01-01

    The application of NASTRAN to seismic analysis is demonstrated by considering the example of a nuclear power plant control panel. A modal analysis of a three-dimensional model of the panel, consisting of beam and quadrilateral membrane elements, is performed. Using the results of this analysis and a typical earthquake response spectrum, the seismic response of the structure is obtained. The ALTERs required in the program to compute the maximum modal responses as well as the resultant response are given. The results are compared with those obtained using the SAP IV computer program.
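
    Once peak responses are obtained mode by mode from the response spectrum, they must be combined into a resultant response. A common combination rule (assumed here for illustration, not necessarily the one used in this study) is the square root of the sum of squares (SRSS):

```python
import math

def srss(modal_responses):
    """Square-root-of-sum-of-squares combination of peak modal responses."""
    return math.sqrt(sum(r * r for r in modal_responses))

# Illustrative peak modal displacements (mm) from a response-spectrum analysis
print(round(srss([3.0, 4.0]), 3))  # 5.0
```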

  11. Seismic response analysis of NAGRA-Net stations using advanced geophysical techniques

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Edwards, Benjamin; Dal Moro, Giancarlo; Keller, Lorenz; Fäh, Donat

    2015-04-01

    In cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra), the Swiss Seismological Service (SED) has recently completed the installation of ten new seismological observation stations, three of them including a co-located borehole sensor. The ultimate goal of the project is to densify the existing Swiss Digital Seismic Network (SDSNet) in northern Switzerland, in order to improve the detection of very-low-magnitude events and the accuracy of future location solutions. This is strategic for unbiased monitoring of microseismicity at the locations of proposed nuclear waste repositories. To further improve the quality and usability of the recordings, a seismic characterization of the area surrounding each installation was performed. The investigation consisted of a preliminary geological and geotechnical study, followed by a seismic site response analysis by means of state-of-the-art geophysical techniques. For the borehole stations in particular, the characterization was performed by combining different types of active seismic methods (P-S refraction tomography, surface wave analysis, Vertical Seismic Profiling - VSP) with ambient-vibration-based approaches (wavelet decomposition, H/V spectral ratio, polarization analysis, three-component f-k analysis). The results of all analyses converged to the definition of a mean velocity profile for each site, which was later used for the computation of engineering parameters (travel-time average velocity and quarter-wavelength parameters) and the analytical SH-wave transfer function. Empirical site-amplification functions are automatically determined for any station connected to the Swiss seismic networks. They are determined by building statistical models of systematic site-specific effects in recordings of small earthquakes compared to the Swiss stochastic ground-motion model. Computed site response is validated through comparison with these empirical
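
    The travel-time average velocity mentioned above has a simple definition: total depth divided by the vertical S-wave travel time through the layered profile. A minimal sketch, with an invented two-layer profile:

```python
def travel_time_average_velocity(thicknesses_m, velocities_ms, depth_m=30.0):
    """Travel-time average shear-wave velocity over the top `depth_m` metres
    (e.g. Vs30): total depth divided by the vertical S-wave travel time."""
    t, z = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        use = min(h, depth_m - z)  # clip the last layer at the target depth
        t += use / v
        z += use
        if z >= depth_m:
            break
    return depth_m / t

# Invented profile: 10 m at 200 m/s over a 600 m/s layer
print(round(travel_time_average_velocity([10, 40], [200, 600]), 1))  # 360.0
```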

  12. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of ''soft'' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset ''attractiveness'' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  13. Analysis of broadband seismic noise at the German Regional Seismic Network and search for improved alternative station sites

    NASA Astrophysics Data System (ADS)

    Bormann, P.; Wylegalla, K.; Klinge, K.

    The German Regional Seismic Network (GRSN) now comprises 16 digital broadband stations equipped with Wielandt-Streckeisen STS-2 seismometers and 24-bit dataloggers, and a seismological data center at Erlangen. It covers the whole territory of Germany with station spacings between 80 and 240 km. The stations are sited in very different environments, ranging from near shore at the Baltic Sea coast up to distances of about 700 km from the coast, both within cities and up to about 10 km away from any major settlement, industry or traffic roads. The underground varies from outcropping hard rock in Hercynian mountain areas and sedimentary rocks in areas of Mesozoic platform cover to up to 1.5 km of unconsolidated Quaternary and Tertiary subsoil. Accordingly, seismic background noise varies in a wide range between the upper and lower bounds of the new global noise model. The noise conditions at the GRSN have been investigated systematically by means of displacement power spectral analysis within the frequency range from 10^-2 Hz upwards. Noise reductions of > 5 for RUE and > 10 for BSEG have been confirmed for frequencies between about 0.6 Hz and 3 Hz. Strong lateral velocity and impedance contrasts between the outcropping Triassic/Permian sedimentary rocks and the surrounding unconsolidated Quaternary/Tertiary sediments are shown to be the main cause of the strong noise reduction and signal-to-noise ratio improvement at RUE, and can account for about 50% of the noise reduction at BSEG.

  14. Discrimination between induced and natural seismicity by means of nonlinear analysis

    NASA Astrophysics Data System (ADS)

    Turuntaev, S. B.; Melchaeva, O. Yu.; Vorohobina, S. V.

    2012-04-01

    Uch-Terek Rivers in Kyrgyzstan; (3) the seismicity in the region of the Geysers geothermal complex in California, US; (4) the seismicity in the region of the Bishkek geophysical test site, Kyrgyzstan, recorded before and after strong electromagnetic discharges. The nonlinear analysis of the seismicity data sets showed that technogenic action on the geophysical medium increases the regularity of the seismic regime. This manifests itself in the formation of stable states characterized by a finite fractal dimension of the attractor and a reasonably small dimension of the embedding space. The presence of the stable states opens the possibility of forecasting the development of induced seismic activity. We also present the results of a nonlinear analysis of the rate-and-state model, which allows us to describe the mechanics of the studied phenomenon. In this context, the model of motion in fault zones that obey the two-parameter friction law suggests that if the external action causes the critical stresses to decrease, e.g. due to the growth of pore pressure or heating of the fault zone, we should expect the deterministic component of the seismic process to increase.
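
    The fractal (correlation) dimension of an attractor referred to above is commonly estimated with the Grassberger-Procaccia correlation sum; the slope of log C(r) versus log r over small r approximates the correlation dimension. A minimal sketch on an arbitrary scalar series, with invented embedding parameters:

```python
import numpy as np

def correlation_sum(series, dim, delay, r):
    """Grassberger-Procaccia correlation sum C(r): the fraction of embedded
    state-vector pairs closer than r (delay embedding in `dim` dimensions)."""
    n = len(series) - (dim - 1) * delay
    emb = np.column_stack([series[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)  # distinct pairs only
    return float(np.mean(dists[iu] < r))

# C(r) grows with r; its log-log slope at small r estimates the dimension
rng = np.random.default_rng(3)
x = rng.uniform(size=400)
c_small = correlation_sum(x, dim=3, delay=1, r=0.2)
c_big = correlation_sum(x, dim=3, delay=1, r=0.6)
```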

  15. Seismic Activity in the Gulf of Mexico: a Preliminary Analysis

    NASA Astrophysics Data System (ADS)

    Franco, S. I.; Canet, C.; Iglesias, A.; Valdes-Gonzales, C. M.

    2013-05-01

    The southwestern corner of the Gulf of Mexico (around the northern Isthmus of Tehuantepec) is exposed to intense deep (> 100 km) seismic activity caused by the subduction of the Cocos plate. Aside from this, the gulf has been considered a zone of low or no seismicity. However, sparse shallow seismic activity is observed across the Gulf of Mexico; some of these earthquakes have been strongly felt (e.g. 23/05/2007 and 10/09/2006), and the 1959 Jaltipan earthquake caused fatalities and severe destruction in central and southern Veracruz. In this work we analyze 5 relevant earthquakes that have occurred since 2001. In the central Gulf of Mexico, focal mechanisms show reverse faults oriented approximately NW-SE with dips near 45 degrees, suggesting a link to sediment loading and/or salt tectonics. On the other hand, in the southwestern corner of the gulf we analyzed some clear examples of strike-slip faulting probably related to the Veracruz Fault. One anomalous earthquake, recorded in 2007 on the western margin of the gulf, shows a strike-slip mechanism indicating a transform regime probably related to the East Mexican Fault. The recent improvement of the Mexican broadband seismological network has made it possible to record small earthquakes distributed in and around the Gulf of Mexico. Although intermediate and large earthquakes in the region are infrequent, the historical evidence indicates that magnitudes could reach Mw~6.4. This fact should be taken into consideration to reassess the seismic hazard for oil and industrial infrastructure in the region.

  16. Synergy of seismic, acoustic, and video signals in blast analysis

    SciTech Connect

    Anderson, D.P.; Stump, B.W.; Weigand, J.

    1997-09-01

    The range of mining applications from hard rock quarrying to coal exposure to mineral recovery leads to a great variety of blasting practices. A common characteristic of many of the sources is that they are detonated at or near the earth's surface and thus can be recorded by camera or video. Although the primary interest is in the seismic waveforms that these blasts generate, the visual observations of the blasts provide important constraints that can be applied to the physical interpretation of the seismic source function. In particular, high-speed images can provide information on detonation times of individual charges, the timing and amount of mass movement during the blasting process and, in some instances, evidence of wave propagation away from the source. All of these characteristics can be valuable in interpreting the equivalent seismic source function for a set of mine explosions and quantifying the relative importance of the different processes. This paper documents work done at the Los Alamos National Laboratory and Southern Methodist University to take standard Hi-8 video of mine blasts, recover digital images from them, and combine them with ground motion records for interpretation. The steps in the data acquisition, processing, display, and interpretation are outlined. The authors conclude that the combination of video with seismic and acoustic signals can be a powerful diagnostic tool for the study of blasting techniques and seismology. A low-cost system for generating similar diagnostics using a consumer-grade video camera and direct-to-disk video hardware is proposed. The application is verification of the Comprehensive Test Ban Treaty.

  17. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    SciTech Connect

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

    Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this wealth of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding-block analysis (pseudo-dynamic). The analysis results give indications about the most appropriate seismic coefficient to be used in pseudo-static analysis after establishing an allowable permanent displacement. These indications are commented on in the light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.
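
    The Newmark sliding-block analysis mentioned above integrates the portion of ground acceleration exceeding a yield acceleration to obtain a permanent displacement. A simplified rigid-block sketch, with an idealized rectangular acceleration pulse in place of a real record:

```python
import numpy as np

def newmark_displacement(acc, dt, ay):
    """Simplified Newmark rigid sliding-block analysis: the block slides when
    acceleration exceeds the yield acceleration `ay` (m/s^2) or while relative
    velocity persists; returns permanent displacement in metres."""
    v, d = 0.0, 0.0
    for a in acc:
        if v > 0.0 or a > ay:
            v = max(v + (a - ay) * dt, 0.0)  # sliding decelerates once a < ay
            d += v * dt
    return d

# Idealized input: a 3 m/s^2 pulse lasting 0.5 s, against ay = 1 m/s^2
dt = 0.001
acc = np.concatenate([np.full(500, 3.0), np.zeros(2000)])
disp = newmark_displacement(acc, dt, ay=1.0)  # analytic answer is 0.75 m
```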

  18. An interdisciplinary perspective on social and physical determinants of seismic risk

    NASA Astrophysics Data System (ADS)

    Lin, K.-H. E.; Chang, Y.-C.; Liu, G.-Y.; Chan, C.-H.; Lin, T.-H.; Yeh, C.-H.

    2015-10-01

    While disaster studies researchers usually view risk as a function of hazard, exposure, and vulnerability, few studies have systematically examined the relationships among the various physical and socioeconomic determinants underlying disasters, and fewer have done so through seismic risk analysis. In the context of the 1999 Chi-Chi earthquake in Taiwan, this study constructs three statistical models to test different determinants that affect disaster fatality at the village level, including seismic hazard, exposure of population and fragile buildings, and demographic and socioeconomic vulnerability. A Poisson regression model is used to estimate the impact of these factors on fatalities. The results indicate that although all of the determinants have an impact on seismic fatality, certain indicators of vulnerability, such as gender ratio, percentages of young and aged population, and income and its standard deviation, are important determinants exacerbating seismic risk. These findings have strong social implications for policy interventions to mitigate such disasters.
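
    A Poisson regression of fatality counts on explanatory variables, as used in the study, can be fitted by iteratively reweighted least squares. The sketch below uses synthetic village-level data with an invented vulnerability index; it is not the study's data or model specification.

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Poisson GLM with log link, fitted by iteratively reweighted least
    squares. X includes an intercept column; returns coefficients beta."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)          # fitted mean counts
        W = mu                         # Poisson variance equals the mean
        z = X @ beta + (y - mu) / mu   # working response for the log link
        XtW = (X * W[:, None]).T
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# Synthetic data: log fatality rate rises with a made-up vulnerability index
rng = np.random.default_rng(1)
vuln = rng.uniform(0, 2, 400)
y = rng.poisson(np.exp(0.3 + 0.8 * vuln))        # true coefficients 0.3, 0.8
X = np.column_stack([np.ones_like(vuln), vuln])
beta = poisson_irls(X, y)
```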

  19. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as ''the guidelines'' hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for PPPL are in the modeling of the strong ground motion at the site and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity. 28 refs.

  20. Analysis of earthquake data recorded by digital field seismic systems, Jackass Flats, Nevada

    SciTech Connect

    Tarr, A.C.; Rogers, A.M.

    1986-12-31

    Analysis of 59 time series from ten small-magnitude earthquakes recorded in 1981 by portable digital seismic systems at the southern Nevada Test Site (NTS) yielded several significant results. We find that moment magnitude M_L (local magnitude determined from seismic moment) correlates closely with the coda-duration magnitudes M_d determined by the Southern Great Basin Seismic Network (SGBSN). Further, local magnitudes M_WA computed from displacement seismograms simulating Wood-Anderson response are, on average, 0.38 magnitude units larger than M_d and 0.44 magnitude units larger than M_L. Another result is that stress drops for the ten earthquakes are significantly smaller than typical stress drops for earthquakes of similar seismic moment in California. Similarly, determinations of the peak ground motion parameter Rv are up to 10 to 20 times smaller than predicted by a previously determined empirical formula relating Rv to seismic moment. We conclude that seismic waves from southern Nevada Test Site earthquakes suffer significant anelastic attenuation, possibly in the near-surface crust under the recording sites, yielding reduced amplitude and frequency of the peak ground motion and shifting the apparent corner frequency of the source spectrum to lower values, thereby producing unexpectedly low stress drops.
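
    The moment magnitudes and stress drops discussed above rest on standard relations: the Hanks-Kanamori moment magnitude from seismic moment, and the Brune stress drop from moment and source radius. A minimal sketch:

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks & Kanamori (1979) moment magnitude from seismic moment
    expressed in dyne-cm: Mw = (2/3) log10(M0) - 10.7."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

def brune_stress_drop(m0, source_radius):
    """Brune (1970) stress drop, 7*M0/(16*r^3), in the stress units
    implied by M0 / r^3 (use consistent units for both arguments)."""
    return 7.0 * m0 / (16.0 * source_radius ** 3)

print(round(moment_magnitude(1e24), 2))  # 5.3
```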

  1. Review of seismic probabilistic risk assessment and the use of sensitivity analysis

    SciTech Connect

    Shiu, K.K.; Reed, J.W.; McCann, M.W. Jr.

    1985-01-01

    This paper presents results of sensitivity reviews performed to address a range of questions which arise in the context of seismic probabilistic risk assessment (PRA). A seismic PRA involves evaluation of the seismic hazard, component fragilities, and system responses. These are combined in an integrated analysis to obtain various risk measures, such as the frequency of plant damage states. Calculation of these measures depends on the combination of non-linear functions based on a number of parameters and assumptions used in the quantification process. It is therefore often difficult to examine seismic PRA results and derive useful insights from them if detailed sensitivity studies are absent. This has been exemplified in the process of trying to understand the role of low-acceleration earthquakes in overall seismic risk. It is useful to understand, within a probabilistic framework, what uncertainties in the physical properties of the plant can be tolerated if the risk from a safe shutdown earthquake is to be considered negligible. Seismic event trees and fault trees were developed to model the different system and plant accident sequences. Hazard curves representing various sites on the east coast were obtained, and alternate structure and equipment fragility data were postulated. Various combinations of hazard and fragility data were analyzed. In addition, the system modeling was perturbed to examine the impact on the final results. Orders-of-magnitude variations were observed in the plant damage state frequencies among the different cases. 7 refs.
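
    The combination of hazard and fragility at the heart of a seismic PRA can be sketched numerically: the annual frequency of a damage state is the fragility convolved with the occurrence density implied by the hazard curve. The hazard shape and fragility parameters below are invented purely for illustration:

```python
import numpy as np
from math import erf, log, sqrt

def fragility(a, median=0.6, beta=0.45):
    """Lognormal fragility: P(damage | PGA = a); parameters are assumed."""
    return 0.5 * (1.0 + erf(log(a / median) / (beta * sqrt(2.0))))

# Invented power-law hazard curve: annual frequency of exceeding PGA a (in g)
a = np.linspace(0.05, 3.0, 600)
haz = 1e-2 * (a / 0.05) ** -2.0

# Occurrence density of PGA = a is the negative slope of the hazard curve;
# integrating fragility against it gives the annual damage-state frequency
density = -np.gradient(haz, a)
frag = np.array([fragility(x) for x in a])
da = a[1] - a[0]
lam = float(np.sum(frag * density) * da)  # annual frequency of damage
```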

  2. Detection capability of seismic network based on noise analysis and magnitude of completeness

    NASA Astrophysics Data System (ADS)

    Fischer, Tomáš; Bachura, Martin

    2014-01-01

    Assessing the detection threshold of seismic networks is of increasing importance, particularly in the context of monitoring induced seismicity due to underground operations. Achieving the maximum possible sensitivity of industrial seismic monitoring is a precondition for successful control of technological procedures. Similarly, the lowest detection threshold is desirable when monitoring natural seismic activity aimed at imaging fault structures in 3D and understanding the ongoing processes in the crust. We compare the application of two different methods to data of the seismic network WEBNET, which monitors the earthquake swarm activity of the West-Bohemia/Vogtland region. First, we evaluate the absolute noise level and its possibly non-stationary character, which hampers the detectability of the seismic network by producing false alarms. This is realized by statistical analysis of the noise amplitudes using the ratio of the 99th and 95th percentiles. Second, the magnitude of completeness is determined for each of the nine stations by analysing the automatic detections of an intensive swarm period from August 2011. The magnitude-frequency distributions of all detected events and of events detected at individual stations are compared to determine the magnitude of completeness at a selected completeness level. The resulting magnitude of completeness M_c of most of the stations varies between -0.9 and -0.5; an anomalously high M_c of 0.0 is found at the most distant station, probably due to inadequate correction for attenuation. We find that while the absolute noise level has no significant influence on station sensitivity, noise stationarity correlates with station sensitivity expressed as a low magnitude of completeness, and vice versa. This qualifies the analysis of the stationary character of seismic noise as an effective tool for site surveying during seismic station deployment.
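
    The percentile-ratio diagnostic described above is straightforward to compute. The sketch below contrasts stationary Gaussian noise with the same noise contaminated by sparse large spikes; all amplitudes are synthetic.

```python
import numpy as np

def stationarity_ratio(amplitudes):
    """Ratio of the 99th to the 95th percentile of absolute noise amplitudes.
    Values near 1 indicate stationary noise; large values betray bursts."""
    p99, p95 = np.percentile(np.abs(amplitudes), [99, 95])
    return p99 / p95

rng = np.random.default_rng(2)
steady = rng.standard_normal(100_000)      # stationary Gaussian noise
bursty = steady.copy()
bursty[::50] *= 50.0                       # sparse spikes mimic disturbances
ratio_steady = stationarity_ratio(steady)
ratio_bursty = stationarity_ratio(bursty)  # much larger than ratio_steady
```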

  3. Seismic fragility analysis of typical pre-1990 bridges due to near- and far-field ground motions

    NASA Astrophysics Data System (ADS)

    Mosleh, Araliya; Razzaghi, Mehran S.; Jara, José; Varum, Humberto

    2016-03-01

    Bridge damage during past earthquakes has caused severe physical and economic impacts to transportation systems. Many of the existing bridges in earthquake-prone areas are pre-1990 bridges designed with out-of-date codes. Strong motions occurring every year in different parts of the world demonstrate the vulnerability of these structures. Nonlinear dynamic time history analyses were conducted to assess the seismic vulnerability of typical pre-1990 bridges. A family of existing concrete bridges representative of the most common bridges in the highway system in Iran is studied. The seismic demand consists of a set of far-field and near-field strong motions used to evaluate the likelihood of exceeding the seismic capacity of the mentioned bridges. The peak ground accelerations (PGAs) were scaled and applied incrementally to the 3D models to evaluate the seismic performance of the bridges. The superstructure was assumed to remain elastic, and the nonlinear behavior of the piers was modeled by assigning plastic hinges in the columns. In this study the displacement ductility and the PGA are selected as the seismic performance indicator and intensity measure, respectively. The results show that pre-1990 bridges subjected to near-fault ground motions reach minor and moderate damage states.
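
    Fragility curves derived from such incremental analyses are commonly summarized by a lognormal fit to the intensity measure at which each record first drives the structure past a damage state. The PGA values below are invented for illustration, and the method-of-moments fit is an assumed, standard choice rather than the paper's exact procedure:

```python
import numpy as np
from math import erf, sqrt

# Hypothetical PGA values (g) at which each scaled record first pushed the
# pier displacement ductility past the "moderate damage" limit
pga_at_damage = np.array([0.21, 0.26, 0.30, 0.34, 0.41, 0.47, 0.55, 0.63])

# Method-of-moments lognormal fragility fit
theta = np.exp(np.mean(np.log(pga_at_damage)))   # median capacity (g)
beta = np.std(np.log(pga_at_damage))             # lognormal dispersion

def p_exceed(pga):
    """Probability the damage state is reached at a given PGA."""
    return 0.5 * (1.0 + erf(np.log(pga / theta) / (beta * sqrt(2.0))))

print(float(p_exceed(theta)))  # 0.5 at the median, by construction
```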

  4. Analysis of bathymetric surveys to identify coastal vulnerabilities at Cape Canaveral, Florida

    USGS Publications Warehouse

    Thompson, David M.; Plant, Nathaniel G.; Hansen, Mark E.

    2015-10-07

    The purpose of this work is to describe an updated bathymetric dataset collected in 2014 and compare it to previous datasets. The updated data focus on the bathymetric features and sediment transport pathways that connect the offshore regions to the shoreline and, therefore, are related to the protection of other portions of the coastal environment, such as dunes, that support infrastructure and ecosystems. Previous survey data include National Oceanic and Atmospheric Administration’s (NOAA) National Ocean Service (NOS) hydrographic survey from 1956 and a USGS survey from 2010 that is augmented with NOS surveys from 2006 and 2007. The primary result of this analysis is documentation and quantification of the nature and rates of bathymetric changes that are near (within about 2.5 km) the current Cape Canaveral shoreline and interpretation of the impact of these changes on future erosion vulnerability.
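
    The change quantification described above amounts to differencing gridded bathymetric surfaces on a common grid and dividing by the elapsed time. A minimal sketch with invented elevations (NaN marking cells absent from a survey):

```python
import numpy as np

# Two hypothetical gridded bathymetry surveys (elevations in metres) on a
# common grid; NaN marks cells not covered by a survey
z_1956 = np.array([[-4.0, -5.0], [-6.0, -7.0]])
z_2014 = np.array([[-3.0, -5.5], [np.nan, -8.0]])

dz = z_2014 - z_1956          # elevation change; positive means accretion
rate = dz / (2014 - 1956)     # m/yr, ignoring exact survey dates
mean_rate = np.nanmean(rate)  # grid-average change rate over covered cells
```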

  6. A non-extensive statistical physics analysis of the Hellenic subduction zone seismicity

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, have been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. NESP, a generalization of Boltzmann-Gibbs statistical physics, is a suitable framework for studying complex systems [Vallianatos, 2011]. Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of seismicity in California and Japan, and more recently Darooneh & Dadashinia (2008) did so in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q for the magnitude distribution of the southern California earthquake catalogue. Using the external seismic zones of the 36 shallow-earthquake seismic sources in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset of shallow (focal depth ≤ 60 km) subduction-zone seismicity based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12,800 seismic events corresponding to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field that characterizes a subducting regime. For each event, moment magnitude was calculated from ML following the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, revealing a variation of the q-triplet along the Hellenic subduction zone. The models used fit rather well to the observed
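    As a rough illustration of the NESP approach, one can fit a Tsallis q-exponential survival function to a set of inter-event times. The synthetic heavy-tailed sample and the grid-search ranges below are invented, not catalogue data.

```python
import numpy as np

def q_exp_survival(x, q, x0):
    """Tsallis q-exponential survival function P(>x), the NESP counterpart
    of the ordinary exponential (recovered as q -> 1)."""
    return (1.0 + (q - 1.0) * x / x0) ** (-1.0 / (q - 1.0))

rng = np.random.default_rng(0)
# synthetic heavy-tailed inter-event times standing in for a real catalogue
times = rng.pareto(2.5, 2000) * 10.0

x = np.sort(times)
emp_sf = 1.0 - np.arange(1, len(x) + 1) / (len(x) + 1)   # empirical survival

# brute-force grid search for the best-fitting (q, x0) in log-survival space
best = min(
    (np.mean((np.log(emp_sf) - np.log(q_exp_survival(x, q, x0))) ** 2), q, x0)
    for q in np.arange(1.05, 1.8, 0.01)
    for x0 in np.linspace(1.0, 20.0, 39)
)
print(f"fitted q = {best[1]:.2f}, x0 = {best[2]:.1f}")
```

    The fitted q quantifies how far the distribution departs from the exponential (Poissonian) case, which is the role it plays in the q-triplet discussed above.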

  7. Analysis of the Risk and Vulnerability of the Cancun Beach System-Wilma Hurricane Case

    NASA Astrophysics Data System (ADS)

    Silva, R.; Ruiz, G.; Escalante, E.

    2007-05-01

    In the last decade, many researchers have focused on the growth in risk associated with global warming and its implications, such as rising sea levels and increasing cyclone frequency and intensity. However, in some cases an adequate understanding of the processes also requires short-timescale analysis of the anthropogenic modifications that increase vulnerability, as exemplified by the effects of Hurricane Wilma (2005) at Cancun, Mexico. Cancun is located on the Mexican Caribbean Sea (latitude 21º05' N, longitude 86º46' W) and is the most important tourist destination in Mexico. For this research, several studies were carried out integrating previous reports, historical photo analysis, field work and the application of several numerical models (waves, currents, storm surge, sediment transport, etc.) to characterize the system under normal and extreme conditions. Measurements of wave conditions during the passage of Hurricane Wilma off Cancun show maximum wave heights of around 18 m, mean wave periods of 16 s, and surface and bottom currents of 2 m/s. Remarkably, more than 7 million cubic meters of sand were moved from the Cancun beach system to other coastal cells, leaving the resort with no beach. The data presented on modifications to the barrier island demonstrate that these extreme meteorological events were responsible for the littoral changes, owing to the loss of flexibility in the biological dynamics and physical equilibrium of the system, with social, environmental and economic implications. The main conclusion of this work is that local anthropogenic modifications have brought more vulnerability and risk to Cancun beach than those associated with global warming.

  8. Real time magma transport imaging and earthquake localization using seismic amplitude ratio analysis

    NASA Astrophysics Data System (ADS)

    Taisne, B.; Brenguier, F.; Nercessian, A.; Beauducel, F.; Smith, P. J.

    2011-12-01

    Seismic amplitude ratio analysis (SARA) has been used successfully to track the sub-surface migration of magma prior to an eruption at Piton de la Fournaise volcano, La Réunion. The methodology is based on the temporal analysis of the seismic amplitude ratio between different pairs of stations, combined with a model of seismic wave attenuation. Applied to continuous records, this method has already highlighted the complexity of magma migration in the shallower part of the volcanic edifice during a seismic crisis. We show that it can also be applied to the localization of individual earthquakes triggered by monitoring systems, prior to human intervention such as phase picking. As examples, the analysis is performed on two kinds of seismic events observed at Soufrière Hills Volcano, Montserrat during the last 15 years: hybrid events and volcano-tectonic earthquakes. Finally, we present the implementation of a fully automatic SARA method for monitoring Piton de la Fournaise volcano using continuous data in real time.

  9. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011, M9.0, Tohoku-Oki, Japan earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. These Peninsular Ranges of Baja California exhibit high microseismic activity and moderate-size earthquakes. In the eastern region of Baja California, shearing between the Pacific and North American plates takes place, and the Imperial and Cerro Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the seismic network RESNOM, operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE). This network consists of 13 three-component seismic stations. We use the RESNOM seismic catalog to search for changes in local seismic rates that occurred after the passage of surface waves generated by the Tohoku-Oki earthquake. Comparing one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.
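    A standard way to quantify such before/after rate changes (not necessarily the procedure used here) is the Matthews-Reasenberg beta statistic; the event counts below are invented for illustration.

```python
import numpy as np

def beta_statistic(n_after, n_total, t_after, t_total):
    """Matthews-Reasenberg beta statistic: departure of the post-mainshock
    event count from a uniform-rate expectation, in standard deviations."""
    p = t_after / t_total
    expected = n_total * p
    return (n_after - expected) / np.sqrt(n_total * p * (1.0 - p))

# invented counts: one month before vs one month after the mainshock
n_before, n_after = 40, 46
beta = beta_statistic(n_after, n_before + n_after, 30.0, 60.0)
print(f"beta = {beta:.2f}")      # |beta| >~ 2 would suggest triggered seismicity
```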

  10. Vulnerability assessment of medieval civic towers as a tool for retrofitting design

    SciTech Connect

    Casciati, Sara; Faravelli, Lucia

    2008-07-08

    The seismic vulnerability of an ancient civic bell tower is studied. Rather than treating it as an intermediate stage toward a risk analysis, the assessment of vulnerability is here pursued for the purpose of optimizing the retrofit design. The vulnerability curves are drawn by carrying out a single time-history analysis of a model calibrated against experimental data. From the results of this analysis, the medians of three selected performance parameters are estimated and used to compute, for each parameter, the probability of attaining or exceeding the three corresponding levels of light, moderate and severe damage. The same numerical model is then used to incorporate the effects of several retrofitting solutions and to re-estimate the associated vulnerability curves. The ultimate goal is a numerical tool able to drive the optimization of a retrofit design by comparing the vulnerability estimates associated with the different retrofitting solutions.
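    Vulnerability curves of this kind are commonly represented as lognormal fragility functions of the median capacity and a dispersion; a minimal sketch, with invented medians and dispersion standing in for the estimated performance parameters:

```python
from math import erf, log, sqrt

def p_exceed(im, median, beta):
    """Lognormal fragility: probability of attaining or exceeding a damage
    state at intensity `im`, given median capacity and dispersion beta."""
    return 0.5 * (1.0 + erf(log(im / median) / (beta * sqrt(2.0))))

# Invented median capacities (PGA, in g) for the three damage levels,
# before and after a candidate retrofit; beta is an invented dispersion.
as_built = {"light": 0.15, "moderate": 0.30, "severe": 0.55}
retrofit = {"light": 0.22, "moderate": 0.42, "severe": 0.75}
beta = 0.5

for pga in (0.1, 0.3, 0.5):
    for name, medians in (("as-built", as_built), ("retrofit", retrofit)):
        probs = ", ".join(f"{s}={p_exceed(pga, m, beta):.2f}"
                          for s, m in medians.items())
        print(f"PGA {pga:g} g, {name}: {probs}")
```

    Comparing the two sets of curves at a given intensity is exactly the kind of retrofit comparison the abstract describes.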

  11. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    NASA Astrophysics Data System (ADS)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social conditions, demography, and political or religious situations and opinions, influence the way events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often difficult to get a global view of all the information provided by the historical documents, to extract cross-correlated information from them, and to draw a precise historical context. Cartographic and geographic tools in GIS software are well suited to the synthesis, interpretation and contextualization of this historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc = VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area reflects low human activity rather than weak seismic effects in this zone. Topographic features, geographical position, flood hazard, road and pathway locations, vineyard distribution and forest coverage, mentioned in the archives and reported on the Cassini map, confirm this

  12. BNL NONLINEAR PRE-TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM.

    SciTech Connect

    DEGRASSI,G.; HOFMAYER,C.; MURPHY,C.; SUZUKI,K.; NAMITA,Y.

    2003-08-17

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper.
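    The bilinear kinematic hardening model mentioned above can be sketched as a one-dimensional return-mapping stress update; the steel-like parameters below are illustrative, not NUPEC test values.

```python
import numpy as np

def bilinear_kinematic(strain_path, E=200e3, H=10e3, sigma_y=250.0):
    """1D return-mapping stress update for a bilinear kinematic hardening
    model (units: MPa). The backstress shifts the yield surface, so load
    reversals re-yield early, the mechanism behind ratcheting."""
    stress_hist = []
    eps_p, back = 0.0, 0.0          # plastic strain, backstress
    for eps in strain_path:
        trial = E * (eps - eps_p)   # elastic predictor
        xi = trial - back
        f_yield = abs(xi) - sigma_y
        if f_yield > 0.0:           # plastic corrector (return mapping)
            dgamma = f_yield / (E + H)
            sign = np.sign(xi)
            eps_p += dgamma * sign
            back += H * dgamma * sign
            trial = E * (eps - eps_p)
        stress_hist.append(trial)
    return np.array(stress_hist)

# symmetric strain cycling: stress saturates just above the yield stress
path = np.concatenate([np.linspace(0.0, 0.004, 50),
                       np.linspace(0.004, -0.004, 100),
                       np.linspace(-0.004, 0.004, 100)])
stress = bilinear_kinematic(path)
print(f"stress range: {stress.min():.1f} to {stress.max():.1f} MPa")
```

    Multilinear and genuinely nonlinear (e.g. Chaboche-type) hardening models, as compared in the paper, refine this same predictor-corrector structure.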

  13. Prioritizing health: a human rights analysis of disaster, vulnerability, and urbanization in New Orleans and Port-au-Prince.

    PubMed

    Carmalt, Jean

    2014-06-14

    Climate change prompts increased urbanization and vulnerability to natural hazards. Urbanization processes are relevant to a right to health analysis of natural hazards because they can exacerbate pre-disaster inequalities that create vulnerability. The 2010 earthquake in Port-au-Prince and the 2005 hurricane in New Orleans provide vivid illustrations of the relationship between spatial inequality and the threats associated with natural hazards. The link between urbanization processes, spatial inequality, and vulnerability to natural hazards is important in terms of an analysis of the right to health; in particular, it provides a basis for arguing that states should prioritize equitable land use and development as a matter of human rights. This article draws on work by geographers, disaster specialists, and international legal scholars to argue that inequitable urbanization processes violate the obligations to respect, protect, and fulfill the human right to health in disaster-prone regions.

  14. Vulnerability of Karangkates dams area by means of zero crossing analysis of data magnetic

    SciTech Connect

    Sunaryo, E-mail: sunaryo.geofis.ub@gmail.com; Susilo, Adi

    2015-04-24

    This study assesses the vulnerability of the Karangkates dam area by means of zero-crossing analysis of magnetic data. It aims to obtain information on the vulnerability of the two dams at Karangkates: the Lahor dam, inaugurated in 1977, and the Sutami dam, inaugurated in 1981. Three reasons motivate the study: 1) the age of the dams, 36 years for Lahor and 32 years for Sutami; 2) geologically, the dams lie close to the Pohgajih local shear fault, the Selorejo local fault, and the Selorejo limestone-andesite contact plane; and 3) Karangkates is an important hydroelectric power plant (PLTA), generating about 400 million kWh per year out of a total of about 29,373 MW installed in Indonesia. Magnetic data were acquired between coordinates (112.4149°E; 8.2028°S) and (112.4839°E; 8.0989°S) using a Proton Precession Magnetometer G-856, along radial directions from the dams within a diameter of about 10 km and with a spacing of about 500 m between measurements. The acquisition yielded total magnetic field values in the range of 44,450 nT to 45,800 nT. Residual anomalies were obtained by applying corrections, including the diurnal correction, the International Geomagnetic Reference Field (IGRF) correction, and reductions, resulting in values in the range of -650 nT to 700 nT. The residual anomalies indicate the presence of 2 zones of closed dipole-pair closures, out of 5 zones in total, located west of the Sutami dam and northwest of the Lahor dam. Overlaying the local geological map shows the lineament of zero-crossing patterns in the residual anomaly contours coinciding with the Pohgajih shear fault, located approximately 4 km to the west of the Sutami dam, and the andesite-limestone rocks contact where located
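    The zero-crossing analysis itself reduces to locating sign changes along a residual-anomaly profile; a minimal sketch on a synthetic traverse (the anomaly shape and the 500 m spacing mimic, but are not, the survey data):

```python
import numpy as np

def zero_crossings(profile, positions):
    """Positions where a residual magnetic anomaly profile changes sign,
    refined by linear interpolation between the bracketing samples."""
    s = np.sign(profile)
    idx = np.where(s[:-1] * s[1:] < 0)[0]
    frac = profile[idx] / (profile[idx] - profile[idx + 1])
    return positions[idx] + frac * (positions[idx + 1] - positions[idx])

# synthetic residual anomaly (nT) along a 10 km traverse sampled every 0.5 km,
# mimicking a dipole pair over a fault contact near km 4.2
x = np.arange(0.0, 10.5, 0.5)
anomaly = 700 * np.sin(2 * np.pi * (x - 4.2) / 8.0) * np.exp(-((x - 4.2) / 3.0) ** 2)
print(zero_crossings(anomaly, x))
```

    On real data the crossings of the residual contours are what get overlaid on the geological map to trace fault and contact lineaments.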

  16. Improving resolution of crosswell seismic section based on time-frequency analysis

    SciTech Connect

    Luo, H.; Li, Y.

    1994-12-31

    According to signal theory, improving the resolution of a seismic section means extending the high-frequency band of the seismic signal. In a crosswell section, a sonic log can be regarded as a reliable source of high-frequency information for the trace near the borehole; the task is then to introduce this high-frequency information into the whole section. However, neither traditional deconvolution algorithms nor some newer inversion methods such as BCI (Broad Constraint Inversion) are satisfactory, because of high-frequency noise and the non-uniqueness of inversion results, respectively. To overcome these disadvantages, this paper presents a new algorithm based on Time-Frequency Analysis (TFA), a technology that has increasingly received attention as a useful signal-analysis tool. Practical applications show that the new method is a stable scheme that greatly improves the resolution of crosswell seismic sections without decreasing the signal-to-noise ratio (SNR).
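    A minimal example of the kind of time-frequency decomposition TFA relies on, here a short-time Fourier transform of a synthetic two-wavelet trace (the window and hop sizes are arbitrary choices, not the paper's):

```python
import numpy as np

def stft(trace, dt, win=64, hop=16):
    """Short-time Fourier transform magnitude: a basic time-frequency
    decomposition of a seismic trace (Hann window, 75% overlap here)."""
    w = np.hanning(win)
    frames = [trace[i:i + win] * w for i in range(0, len(trace) - win, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(win, dt)
    return spec, freqs

# synthetic trace: a 30 Hz wavelet followed by an 80 Hz wavelet
dt = 0.002
t = np.arange(0.0, 1.0, dt)
trace = (np.exp(-((t - 0.25) / 0.03) ** 2) * np.sin(2 * np.pi * 30 * t)
         + np.exp(-((t - 0.75) / 0.03) ** 2) * np.sin(2 * np.pi * 80 * t))
spec, freqs = stft(trace, dt)

# dominant frequency in the early vs late windows
early = freqs[spec[:5].sum(axis=0).argmax()]
late = freqs[spec[-5:].sum(axis=0).argmax()]
print(f"early window peak ~{early:.0f} Hz, late window peak ~{late:.0f} Hz")
```

    Localizing frequency content in time like this is what lets a TFA-based scheme inject high frequencies selectively rather than boosting broadband noise.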

  17. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... COMMISSION Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment AGENCY: Nuclear Regulatory Commission (NRC.../ COL-ISG-020 titled ``Implementation of a Seismic Margin Analysis for New Reactors Based...

  18. Statistical analysis and modeling of seismicity related to the exploitation of geothermal energy

    NASA Astrophysics Data System (ADS)

    Dinske, Carsten; Langenbruch, Cornelius; Shapiro, Serge

    2016-04-01

    catalogs of the considered reservoirs contain approximately 50 per cent of the number of events in the original catalogs. Furthermore, we perform ETAS modeling (Epidemic Type Aftershock Sequence model; Ogata, 1985, 1988) for two reasons. First, we want to understand whether the different reservoirs are also comparable in their earthquake interaction patterns, and hence in the aftershock triggering following larger-magnitude induced events. Second, if we identify systematic patterns, ETAS modeling can contribute to the forecast, and consequently to the mitigation, of seismicity during the production of geothermal energy. We find that stationary ETAS models cannot accurately capture the observed seismicity rate changes. One reason for this finding is that the rate of induced events (the background activity in the ETAS model) is not constant in time. We therefore apply non-stationary ETAS modeling, which results in good agreement between observation and model. However, the required non-stationarity complicates the application of ETAS modeling to forecasting seismicity during production. Thus, its implementation in so-called traffic-light systems for the mitigation of possible seismic hazard requires further detailed analysis.
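    For reference, the stationary ETAS conditional intensity that the authors find insufficient (their non-stationary variant lets the background rate vary in time) can be sketched as follows, with an invented catalogue and invented parameter values:

```python
import numpy as np

def etas_intensity(t, times, mags, mu, K, alpha, c, p, m0):
    """Stationary ETAS conditional intensity (Ogata, 1988): constant
    background rate mu plus Omori-law contributions from all prior events."""
    prior = times < t
    aftershock = (K * np.exp(alpha * (mags[prior] - m0))
                  * (t - times[prior] + c) ** (-p))
    return mu + aftershock.sum()

# invented induced-seismicity catalogue: occurrence times (days) and magnitudes
times = np.array([1.0, 2.5, 2.6, 4.0])
mags = np.array([1.2, 3.0, 1.5, 1.8])
params = dict(mu=0.2, K=0.05, alpha=1.8, c=0.01, p=1.1, m0=1.0)

for t in (2.51, 3.0, 5.0):
    print(f"lambda({t}) = {etas_intensity(t, times, mags, **params):.2f} events/day")
```

    Making `mu` a function of time (e.g. tied to the injection rate) is the essence of the non-stationary extension the abstract describes.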

  19. Proteomic analysis of the nucleus accumbens of rats with different vulnerability to cocaine addiction.

    PubMed

    del Castillo, Carmen; Morales, Lidia; Alguacil, Luis F; Salas, Elisabet; Garrido, Elisa; Alonso, Elba; Pérez-García, Carmen

    2009-07-01

    Vulnerability to the addictive effects of drugs of abuse varies among individuals, but the biological basis of these differences is poorly understood. This work aims to increase this knowledge by comparing the brain proteome of animals with different rates of extinction of cocaine-seeking behaviour. To this end, we used a place-preference paradigm to separate Sprague-Dawley rats into two groups: rats that extinguished (E) and rats that did not extinguish (NE) cocaine-seeking behaviour after a five-day period of drug abstinence. Once the phenotype was established, we compared protein expression in the nucleus accumbens (NAC) of these animals after a single injection of either saline (SAL) or cocaine (COC, 15 mg/kg). The analysis of protein expression was performed by 2-dimensional electrophoresis followed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Comparing E SAL and NE SAL animals, we found significant differences in the expression levels of 5 proteins: ATP synthase subunit alpha, fumarate hydratase, transketolase, NADH dehydrogenase [ubiquinone] flavoprotein 2, and glutathione transferase omega-1. A single injection of COC alters the NAC proteome of E and NE rats differently: in E COC animals the expression of 6 proteins was altered, including dihydropyrimidinase-related protein 2 and NADH dehydrogenase [ubiquinone] 1 alpha subcomplex subunit 10, whereas in NE COC rats 9 proteins were altered (including alpha-synuclein, peroxiredoxin-2 and peroxiredoxin-5). These proteins could be potential biomarkers of individual vulnerability to cocaine abuse and may be helpful in designing new treatments for cocaine addiction.

  20. Seismic fragility evaluation of a piping system in a nuclear power plant by shaking table test and numerical analysis

    SciTech Connect

    Kim, M. K.; Kim, J. H.; Choi, I. K.

    2012-07-01

    In this study, a seismic fragility evaluation of a piping system in a nuclear power plant was performed. The evaluation proceeded in three steps. First, several piping element capacity tests were performed: monotonic and cyclic loading tests were conducted under the internal pressure level of actual nuclear power plants, with cracks and wall thinning considered as degradation factors of the piping system. Second, a shaking table test was performed to evaluate the seismic capacity of a selected piping system; multi-support seismic excitation was applied to account for differences in support elevation. Finally, a numerical analysis was performed to assess the seismic fragility of the piping system. As a result, the seismic fragility of a piping system of an NPP in Korea was evaluated by means of a shaking table test and numerical analysis. (authors)

  1. Gas hydrate and seismic data analysis by using theoretical approaches

    NASA Astrophysics Data System (ADS)

    Tinivella, U.; Accaino, F.; Giustiniani, M.; Loreto, M. F.

    2009-04-01

    In order to quantify the concentrations of gas hydrate and free gas in the pore space, we use a procedure based on a theoretical model (the Biot equations and their approximations at seismic frequencies). This approach models the different layers associated with the BSR (two solids, grains and clathrates, and two fluids, water and free gas), including an explicit dependence on differential pressure and depth, and the effects of cementation by hydrate on the shear modulus of the sediment matrix. The theory gives both compressional and shear wave velocities, and requires physical parameters (porosity, compressibility, rigidity, density, frequency dependence) that are easy to hypothesise; these can be determined from available lithostratigraphic information, and from experimental datasets if no direct measurements are available. In particular, detailed geological knowledge of the area is essential for assuming normal gradients of the physical properties (porosity, density, rigidity and compressibility) of marine sediments, so that velocity anomalies are correctly associated with the presence of clathrate and free gas and misinterpretations, as in the case of over- and/or under-consolidated sediments, are avoided. The theory can be applied in the cases of i) full water saturation (to reproduce the absence of gas in either hydrated or gaseous phase), ii) water and gas hydrates in the pore space, and iii) water and free gas in the pore space, even if the free gas is overpressured. The effect of grain cementation when the concentration of gas hydrates is high is considered by applying a percolation model, which describes the transition of a two-phase system from a continuous (grain cementation) to a discontinuous (no cementation) state. Finally, it is worth mentioning that a coupling factor describes the degree of coupling between the pore fluid and the solid frame.
The concentrations can be estimated by fitting the theoretical velocity to the experimental P-wave velocity obtained from travel

  2. Tremor patches in Cascadia revealed by seismic array analysis

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhijit; Vidale, John E.; Sweet, Justin R.; Creager, Kenneth C.; Wech, Aaron G.

    2009-09-01

    Episodic tremor and slip (ETS) events in Cascadia have recently been observed, illuminating the general area that radiates seismic energy in the form of non-volcanic tremor (NVT). However, the picture of the ETS zone remains fuzzy because of difficulties in tremor detection and location. To observe the intimate details of tremor, we deployed a dense 84-element small-aperture seismic array on the Olympic Peninsula, Washington, above the tremor migration path. It recorded the main ETS event in May 2008, as well as a weaker tremor episode two months earlier. Using a beamforming technique, we are able to capture and track tremor activity with unprecedented resolution from southern Puget Sound to the Strait of Juan de Fuca. The array technique reveals up to four times more tremor duration than the conventional envelope cross-correlation method. Our findings suggest that NVT is not uniformly distributed on the subduction interface, and they unveil several distinct patches that release much of the tremor moment. The patches appear to be devoid of ordinary earthquakes and may indicate heterogeneity in fault strength that affects the modes of stress release within the ETS zone.
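    Delay-and-sum beamforming of the kind used here can be sketched in the frequency domain; the array geometry, source wavelet and slowness grid below are illustrative, not the Olympic Peninsula deployment.

```python
import numpy as np

def beam_power(data, coords, dt, slowness_grid):
    """Frequency-domain delay-and-sum beamforming: for each trial slowness
    vector, phase-align the array traces and measure the stacked power."""
    spec = np.fft.rfft(data, axis=1)
    freqs = np.fft.rfftfreq(data.shape[1], dt)
    power = []
    for s in slowness_grid:
        delays = coords @ s                      # plane-wave delay per station (s)
        shifts = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
        beam = (spec * shifts).mean(axis=0)      # aligned stack
        power.append(np.sum(np.abs(beam) ** 2))
    return np.array(power)

rng = np.random.default_rng(1)
coords = rng.uniform(-0.5, 0.5, (10, 2))         # station x, y in km
dt, n = 0.01, 512
true_s = np.array([0.2, -0.1])                   # slowness of the wavefront (s/km)
t = np.arange(n) * dt

def wavelet(tt):                                 # band-limited test pulse
    return np.exp(-((tt - 2.0) / 0.1) ** 2) * np.sin(2 * np.pi * 4 * (tt - 2.0))

data = np.array([wavelet(t - coords[i] @ true_s) for i in range(10)])

grid = [np.array([sx, sy]) for sx in np.linspace(-0.4, 0.4, 17)
        for sy in np.linspace(-0.4, 0.4, 17)]
best = grid[int(np.argmax(beam_power(data, coords, dt, grid)))]
print(f"best slowness ({best[0]:.2f}, {best[1]:.2f}) s/km")
```

    Stacking coherently across the array is what lets weak, emergent tremor signals rise above noise without any pickable phases.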

  3. The Effect Analysis of Strain Rate on Power Transmission Tower-Line System under Seismic Excitation

    PubMed Central

    Tian, Li; Wang, Wenming; Qian, Hui

    2014-01-01

    The effect analysis of strain rate on power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that the effect of strain rate on the transmission tower generally decreases the maximum top displacements, but it would increase the maximum base shear forces, and thus it is necessary to consider the effect of strain rate on the seismic analysis of the transmission tower. The effect of strain rate could be ignored for the seismic analysis of the conductors and ground lines, but the responses of the ground lines considering strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of the transmission tower-line system. PMID:25105157

  6. Resolution analysis of high-resolution marine seismic data acquired off Yeosu, Korea

    NASA Astrophysics Data System (ADS)

    Lee, Ho-Young; Kim, Wonsik; Koo, Nam-Hyung; Park, Keun-Pil; Yoo, Dong-Geun; Kang, Dong-Hyo; Kim, Young-Gun; Seo, Gab-Seok; Hwang, Kyu-Duk

    2014-05-01

High-resolution marine seismic surveys have been conducted for mineral exploration and engineering purposes. To improve the quality of high-resolution seismic data, small-scale multi-channel seismic techniques are used. In this study, we designed a high-resolution marine seismic survey using a small airgun and an 8-channel streamer cable and analyzed the resolution of the seismic data in relation to acquisition and processing parameters. The field survey was conducted off Yeosu, Korea, where stratified thin sedimentary layers are deposited. We used a 30 in3 airgun and an 8-channel streamer cable with a 5 m group interval. We shot the airgun at a 5 m shot interval and recorded digital data with a 0.1 ms sample interval and a 1 s record length. The offset between the source and the first channel was 20 m. We processed the acquired data with a simple procedure comprising gain recovery, deconvolution, digital filtering, CMP sorting, NMO correction, static correction and stacking. To understand the effect of the acquisition parameters on the vertical and horizontal resolution, we resampled the acquired data using various sample intervals and CMP intervals and produced seismic sections. The analysis results show that detailed subsurface structures can be imaged with good resolution and continuity using a sample interval shorter than 0.2 ms and a CMP interval shorter than 2.5 m. A high-resolution marine 8-channel airgun seismic survey using appropriate acquisition and processing parameters can be effective in imaging marine subsurface structure with high resolution. This study is part of a National Research Laboratory (NRL) project and part of an Energy Technology Innovation (ETI) project of the Korea Institute of Energy Technology Evaluation and Planning (KETEP), funded by the Ministry of Trade, Industry and Energy (MOTIE). The authors thank the officers and crew of the R/V Tamhae II for their efforts in the field survey.
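The trade-off between sample interval and resolution can be illustrated with the quarter-wavelength rule: the sample interval caps the usable bandwidth at the Nyquist frequency, which in turn bounds the best-case vertical resolution. The velocity below is an assumed sediment value, not one measured in this survey.

```python
# Best-case vertical resolution as a function of sample interval.
# v is an assumed shallow-sediment velocity, for illustration only.
v = 1600.0  # m/s
results = {}
for dt_ms in (0.1, 0.2, 1.0):
    f_nyq = 1.0 / (2.0 * dt_ms * 1e-3)   # Nyquist frequency in Hz
    results[dt_ms] = v / (4.0 * f_nyq)   # quarter-wavelength bound in m
```

With a 0.2 ms sample interval the bound is 0.16 m, while a 1 ms interval degrades it to 0.8 m, which is consistent with the study's recommendation of intervals shorter than 0.2 ms for imaging thin layers.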

  7. Analysis of the seismicity in the region of Mirovo salt mine after 8 years monitoring

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Solakov, Dimcho; Simeonova, Stela; Aleksandrova, Irena; Georgieva, Gergana

    2015-04-01

The Mirovo salt deposit is situated in the NE part of Bulgaria, 5 kilometers away from the town of Provadiya. The mine has been in operation since 1956. The salt is produced by dilution and extraction of the brine to the surface. A system of chambers and pillars is formed within the salt body as a result of the applied technology. The mine is situated in a seismically quiet part of the country. The region is characterized by a complex geological structure and several faults. During the last 3 decades a large number of small and moderate earthquakes (M<4.5) have occurred in the close vicinity of the salt deposit. A local seismological network (LSN) is deployed in the region to monitor the local seismicity. It consists of 6 three-component digital stations. Real-time data transfer from the LSN stations to the National Data Center (in Sofia) is implemented using the VPN and MAN networks of the Bulgarian Telecommunication Company. Common processing and interpretation of the data from the LSN and the national seismic network is performed. Real-time and interactive data processing are performed by the Seismic Network Data Processor (SNDP) software package. More than 700 earthquakes were registered by the LSN within a 30 km region around the mine during the 8 years of monitoring. First we processed the data and compiled a catalogue of the earthquakes that occurred within the studied region (30 km around the salt mine). The spatial pattern of seismicity is analyzed. A large number of the seismic events occurred within the northern and north-western part of the salt body. Several earthquakes occurred in the close vicinity of the mine. Considering that the earthquakes could be tectonic and/or induced, an attempt is made to find criteria to distinguish natural from induced seismicity. To characterize and distinguish the main processes active in the area we also performed waveform and spectral analysis of a number of earthquakes.

  8. Vulnerability analysis in terms of food insecurity and poverty using GIS and remote sensing technology applied to Sri Lanka

    NASA Astrophysics Data System (ADS)

    Shahriar, Pervez M.; Ramachandran, Mahadevan; Mutuwatte, Lal

    2003-03-01

It is becoming increasingly recognized that computer methods such as models and Geographic Information Systems (GIS) can be valuable tools for analyzing a geographical area in terms of its hazard vulnerability. Vulnerability is an important aspect of households' experience of poverty. The measurement and analysis of poverty, inequality and vulnerability are crucial for cognitive purposes (to know what the situation is), for analytical purposes (to understand the factors determining this situation), for policy-making purposes (to design interventions best adapted to the issues), and for monitoring and evaluation purposes (to assess whether current policies are effective, and whether the situation is changing). Here vulnerability is defined as the probability or risk today of being in poverty, or falling deeper into poverty, in the future. Vulnerability is a key dimension of well-being since it affects individuals' behavior (in terms of investment, production patterns and coping strategies) and their perception of their own situation. This study was conducted in joint collaboration between the World Food Programme (WFP) and the International Water Management Institute (IWMI) in Sri Lanka to identify regions and populations that are food insecure, to analyze the reasons for vulnerability to food insecurity in order to provide decision-makers with information to identify possible sectors of intervention, and to identify where and for whom food aid can be best utilized in Sri Lanka. This approach integrates GIS and remote sensing with statistical packages, allowing consideration of additional spatial and physical parameters, such as accessibility to economic resources (particularly land and the assets of the built environment), employment creation and investment attraction, so that the analysis better represents the real scenario.
For this study, detailed topographic data are being used

  9. Illustrating the coupled human–environment system for vulnerability analysis: Three case studies

    PubMed Central

    Turner, B. L.; Matson, Pamela A.; McCarthy, James J.; Corell, Robert W.; Christensen, Lindsey; Eckley, Noelle; Hovelsrud-Broda, Grete K.; Kasperson, Jeanne X.; Kasperson, Roger E.; Luers, Amy; Martello, Marybeth L.; Mathiesen, Svein; Naylor, Rosamond; Polsky, Colin; Pulsipher, Alexander; Schiller, Andrew; Selin, Henrik; Tyler, Nicholas

    2003-01-01

    The vulnerability framework of the Research and Assessment Systems for Sustainability Program explicitly recognizes the coupled human–environment system and accounts for interactions in the coupling affecting the system's responses to hazards and its vulnerability. This paper illustrates the usefulness of the vulnerability framework through three case studies: the tropical southern Yucatán, the arid Yaqui Valley of northwest Mexico, and the pan-Arctic. Together, these examples illustrate the role of external forces in reshaping the systems in question and their vulnerability to environmental hazards, as well as the different capacities of stakeholders, based on their access to social and biophysical capital, to respond to the changes and hazards. The framework proves useful in directing attention to the interacting parts of the coupled system and helps identify gaps in information and understanding relevant to reducing vulnerability in the systems as a whole. PMID:12815106

  10. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as is known, may play a significant role in the level, duration and frequency content of ground motion. Ridges and crests are often lost inside the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article of Massa et al. about topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results have highlighted the inadequacy of the national provisions concerning the definition of the seismic load at the top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.
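Amending a GMPE with an additive topographic term can be sketched as follows. The functional form and coefficients are illustrative placeholders, not those of Bindi et al.; in log space an additive term multiplies the predicted ground motion by the crest amplification factor.

```python
import numpy as np

def ln_gmpe(M, R, a=-1.0, b=0.9, c=1.3, h=6.0):
    """Toy GMPE: ln(PGA) from magnitude M and distance R (km).
    Coefficients are illustrative, not a published model."""
    return a + b * M - c * np.log(np.sqrt(R ** 2 + h ** 2))

def ln_gmpe_topo(M, R, topo_factor, d=1.0, **kw):
    """Same model amended with an additive topographic term:
    d * ln(topo_factor), where topo_factor is an empirical
    crest-to-base amplification ratio (>= 1 on ridges)."""
    return ln_gmpe(M, R, **kw) + d * np.log(topo_factor)

pga_flat = float(np.exp(ln_gmpe(6.0, 10.0)))
pga_crest = float(np.exp(ln_gmpe_topo(6.0, 10.0, topo_factor=1.8)))
```

With `d = 1`, a crest factor of 1.8 scales the predicted PGA by exactly 1.8; regressing `d` on residuals at topographic sites is the calibration step.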

  11. Safe Spaces, Support, Social Capital: A Critical Analysis of Artists Working with Vulnerable Young People in Educational Contexts

    ERIC Educational Resources Information Center

    Sellman, Edward

    2015-01-01

    This article provides a critical and thematic analysis of three research projects involving artists working with vulnerable young people in educational contexts. It argues that artists create safe spaces in contrast to traditional educational activities but it will also raise questions about what constitutes such a space for participants. It will…

  12. Assessing the vulnerability of Brazilian municipalities to the vectorial transmission of Trypanosoma cruzi using multi-criteria decision analysis.

    PubMed

    Vinhaes, Márcio Costa; de Oliveira, Stefan Vilges; Reis, Priscilleyne Ouverney; de Lacerda Sousa, Ana Carolina; Silva, Rafaella Albuquerque E; Obara, Marcos Takashi; Bezerra, Cláudia Mendonça; da Costa, Veruska Maia; Alves, Renato Vieira; Gurgel-Gonçalves, Rodrigo

    2014-09-01

    Despite the dramatic reduction in Trypanosoma cruzi vectorial transmission in Brazil, acute cases of Chagas disease (CD) continue to be recorded. The identification of areas with greater vulnerability to the occurrence of vector-borne CD is essential to prevention, control, and surveillance activities. In the current study, data on the occurrence of domiciliated triatomines in Brazil (non-Amazonian regions) between 2007 and 2011 were analyzed. Municipalities' vulnerability was assessed based on socioeconomic, demographic, entomological, and environmental indicators using multi-criteria decision analysis (MCDA). Overall, 2275 municipalities were positive for at least one of the six triatomine species analyzed (Panstrongylus megistus, Triatoma infestans, Triatoma brasiliensis, Triatoma pseudomaculata, Triatoma rubrovaria, and Triatoma sordida). The municipalities that were most vulnerable to vector-borne CD were mainly in the northeast region and exhibited a higher occurrence of domiciliated triatomines, lower socioeconomic levels, and more extensive anthropized areas. Most of the 39 new vector-borne CD cases confirmed between 2001 and 2012 in non-Amazonian regions occurred within the more vulnerable municipalities. Thus, MCDA can help to identify the states and municipalities that are most vulnerable to the transmission of T. cruzi by domiciliated triatomines, which is critical for directing adequate surveillance, prevention, and control activities. The methodological approach and results presented here can be used to enhance CD surveillance in Brazil.
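A weighted-sum score is one of the simplest MCDA aggregations; the sketch below ranks hypothetical municipalities from normalized indicators with assumed weights (the paper's actual indicator set and weighting scheme are richer).

```python
import numpy as np

# Illustrative indicator matrix: rows = municipalities, columns =
# normalized indicators (triatomine occurrence, poverty index,
# anthropized area share). Values and weights are made up.
names = ["A", "B", "C"]
X = np.array([
    [0.9, 0.8, 0.7],
    [0.2, 0.3, 0.4],
    [0.5, 0.9, 0.2],
])
w = np.array([0.5, 0.3, 0.2])   # assumed criterion weights, summing to 1

scores = X @ w                          # weighted-sum vulnerability score
ranking = [names[i] for i in np.argsort(-scores)]  # most vulnerable first
```

The ranking directs surveillance toward the highest-scoring municipalities; sensitivity of the ranking to the weights is the usual robustness check in MCDA.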

  13. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at ~11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.
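The core of the SARA method is comparing observed inter-station amplitude ratios with ratios predicted for candidate source locations. A minimal forward model, assuming body-wave geometrical spreading plus anelastic attenuation with illustrative frequency, Q and velocity values:

```python
import numpy as np

def theoretical_ratio(r1, r2, f=5.0, Q=100.0, v=2000.0, n=1.0):
    """Predicted amplitude ratio A1/A2 for stations at hypocentral
    distances r1, r2 (m): r^-n spreading plus exp(-pi*f*r/(Q*v))
    attenuation. f, Q, v, n are assumed values for illustration."""
    B = np.pi * f / (Q * v)
    return (r2 / r1) ** n * np.exp(-B * (r1 - r2))

# As a source migrates toward station 1, the predicted A1/A2 grows.
r1 = np.array([8000.0, 4000.0, 2000.0])   # shrinking distance to station 1
r2 = np.full(3, 10000.0)                  # fixed distance to station 2
ratios = theoretical_ratio(r1, r2)
```

In practice the observed ratios from continuous data are matched against such predictions over a grid of candidate locations to track the migration.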

  14. Temporal patterns in southern Aegean seismicity revealed by the multiresolution wavelet analysis

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Hloupis, George; Nikolintaga, Irini; Vallianatos, Filippos

    2007-12-01

We applied multiresolution wavelet analysis to the sequence of interevent times of earthquakes that occurred between 1970 and 2003 in the southern Aegean area, one of the most seismically active areas in the Mediterranean. We observed a twofold feature in the wavelet-coefficient standard deviation σwav: (i) at low scales it decreases in correspondence with the occurrence of the strongest earthquakes, mainly due to the aftershock activation mechanism; (ii) at high scales it is characterized by an oscillating behaviour, which is typical of background seismicity.
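The scale-by-scale wavelet-coefficient standard deviation can be computed with a plain Haar discrete wavelet transform; the sketch below applies it to synthetic interevent times with an embedded aftershock-like burst (illustrative data, not the Aegean catalogue).

```python
import numpy as np

def haar_dwt_std(x, levels=4):
    """Standard deviation of Haar detail coefficients at each scale."""
    x = np.asarray(x, dtype=float)
    stds = []
    for _ in range(levels):
        n = len(x) // 2 * 2
        a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2.0)   # approximation
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)   # detail coefficients
        stds.append(float(d.std()))
        x = a                                      # recurse on approximation
    return stds

# synthetic interevent times: exponential background with an embedded
# burst of short intervals mimicking an aftershock sequence
rng = np.random.default_rng(0)
iet = rng.exponential(10.0, 512)
iet[200:232] = rng.exponential(0.5, 32)
stds = haar_dwt_std(iet, levels=4)
```

Tracking these per-scale standard deviations in a sliding window over the catalogue is what reveals the low-scale drops at aftershock sequences described in the abstract.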

  15. Application of the SASSI soil structure interaction method to CANDU 6 NPP seismic analysis

    SciTech Connect

    Ricciuti, R.A.; Elgohary, M.; Usmani, S.A.

    1996-12-01

    The standard CANDU 6 NPP has been conservatively qualified for a Design Basis Earthquake (DBE) peak horizontal ground acceleration of 0.2 g. Currently there are potential opportunities for siting the CANDU 6 at higher seismicity sites. In order to be able to extend the use of a standardized design for sites with higher seismicity than the standard plant, various design options, including the use of the SASSI Soil Structure Interaction (SSI) analysis method, are being evaluated. This paper presents the results of a study to assess the potential benefits from utilization of the SASSI computer program and the use of more realistic damping ratios for the structures.

  16. Unique problems associated with seismic analysis of partially gas-saturated unconsolidated sediments

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2009-01-01

Gas hydrate stability conditions restrict the occurrence of gas hydrate to unconsolidated and high water-content sediments at shallow depths. Because of these host-sediment properties, seismic and well log data acquired for the detection of free gas and associated gas hydrate-bearing sediments often require nonconventional analysis. For example, a conventional method of identifying free gas using the compressional/shear-wave velocity (Vp/Vs) ratio at the logging frequency will not work unless the free-gas saturations are more than about 40%. The P-wave velocity dispersion of partially gas-saturated sediments causes a problem in interpreting well log velocities and seismic data. Using the White, J.E. [1975. Computed seismic speeds and attenuation in rocks with partial gas saturation. Geophysics 40, 224-232] model for partially gas-saturated sediments, the difference between well log and seismic velocities can be reconciled. The inclusion of P-wave velocity dispersion in interpreting well log data is, therefore, essential to identify free gas and to tie surface seismic data to synthetic seismograms.

  17. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-02-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on reflection characteristics and structural features of the seismic facies. The following reflection characteristics and structural features are used: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at a facies boundary, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which successive glacier terminus positions and consequent rates of glacial retreat are known. In this environment the depositional processes and sediment characteristics vary with distance from a glacier terminus, such that during a retreat a record of these variations is preserved in the aggrading sediment fill. Sedimentary facies within the basins of lower Muir Inlet are correlated with observed depositional processes near the present glacier terminus in the upper inlet. The areal distribution of sedimentary facies within the basins is interpreted using the seismic facies architecture and inferences from known sediment characteristics proximal to present glacier termini.

  18. Analysis of the seismic performance of isolated buildings according to life-cycle cost.

    PubMed

    Dang, Yu; Han, Jian-Ping; Li, Yong-Tao

    2015-01-01

    This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment.
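The proposed indicator, the ratio of lifetime damage loss to life-cycle cost, can be sketched with a simple present-value calculation. The costs, loss rates, discount rate and horizon below are assumptions for illustration only.

```python
def performance_indicator(initial_cost, expected_annual_loss,
                          years=50, rate=0.03):
    """Ratio of discounted lifetime damage loss to life-cycle cost
    (initial cost + discounted loss); lower means better seismic
    performance. Horizon and discount rate are assumed values."""
    pv_loss = sum(expected_annual_loss / (1.0 + rate) ** t
                  for t in range(1, years + 1))
    return pv_loss / (initial_cost + pv_loss)

# hypothetical figures; the isolated building's expected loss is set
# near 37% of the fixed-base loss, echoing the paper's case study
fixed_base = performance_indicator(10e6, 0.30e6)
isolated = performance_indicator(11e6, 0.11e6)
```

Even with a higher initial cost for the isolation system, the sharply reduced expected loss drives the isolated building's indicator well below the fixed-base value.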

  19. An enhancement of NASTRAN for the seismic analysis of structures. [nuclear power plants

    NASA Technical Reports Server (NTRS)

    Burroughs, J. W.

    1980-01-01

    New modules, bulk data cards and DMAP sequence were added to NASTRAN to aid in the seismic analysis of nuclear power plant structures. These allow input consisting of acceleration time histories and result in the generation of acceleration floor response spectra. The resulting system contains numerous user convenience features, as well as being reasonably efficient.

  20. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinct higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic
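The gradient-based estimation of apparent phase velocity can be illustrated on a synthetic plane wave: for u = f(t - x/c), the spatial derivative equals -(1/c) times the temporal derivative, so the ratio of their RMS values recovers c. The geometry below is denser than the actual four-geophone LSPE array, purely for numerical stability of the finite differences.

```python
import numpy as np

dx, dt, c, f = 5.0, 0.001, 300.0, 5.0   # spacing (m), sample (s), velocity, Hz
x = np.arange(0.0, 50.0, dx)            # illustrative line of receivers
t = np.arange(0.0, 0.5, dt)
T, X = np.meshgrid(t, x, indexing="ij")
u = np.sin(2.0 * np.pi * f * (T - X / c))   # plane wave crossing the array

dudt = np.gradient(u, dt, axis=0)   # temporal derivative
dudx = np.gradient(u, dx, axis=1)   # spatial (array) derivative
# for u = f(t - x/c): du/dx = -(1/c) du/dt, so the RMS ratio recovers c
v_app = float(np.sqrt(np.mean(dudt ** 2) / np.mean(dudx ** 2)))
```

Low apparent velocity estimated this way, together with the rotational-motion proxy from the horizontal gradients, is the basis on which the shear-wave arrivals were picked out of the reverberant lunar records.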

  1. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  2. An analysis of seismic risk from a tourism point of view.

    PubMed

    Mäntyniemi, Päivi

    2012-07-01

    Global awareness of natural calamities increased after the destructive Indian Ocean tsunami of December 2004, largely because many foreigners lost their lives, especially in Thailand. This paper explores how best to communicate the seismic risk posed by different travel destinations to crisis management personnel in tourists' home countries. The analysis of seismic risk should be straightforward enough for non-specialists, yet powerful enough to identify the travel destinations that are most at risk. The output for each location is a point in 3D space composed of the natural and built-up environment and local tourism. The tourism-specific factors can be tailored according to the tourists' nationality. The necessary information can be collected from various directories and statistics, much of it available over the Internet. The output helps to illustrate the overall seismic risk conditions of different travel destinations, allows for comparison across destinations, and identifies the places that are most at risk.
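Representing each destination as a point in a 3D risk space can be sketched directly; the coordinates below are illustrative values, not data from the paper, and the distance-from-origin ranking is one simple way to compare destinations.

```python
import numpy as np

# Each destination is a point (natural hazard, built environment,
# tourism exposure), all scaled to [0, 1]. Hypothetical values.
destinations = {
    "dest_A": (0.9, 0.7, 0.8),
    "dest_B": (0.3, 0.4, 0.9),
    "dest_C": (0.6, 0.2, 0.1),
}

# rank by distance from the origin, the low-risk corner of the space
risk = {name: float(np.linalg.norm(p)) for name, p in destinations.items()}
most_at_risk = max(risk, key=risk.get)
```

The tourism axis can be recomputed per nationality (e.g. arrivals from the tourists' home country), which is the tailoring step the abstract describes.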

  4. An Integrated Approach for Urban Earthquake Vulnerability Analyses

    NASA Astrophysics Data System (ADS)

    Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.

    2009-04-01

-economical, structural, coastal, ground condition, and organizational vulnerabilities, as well as accessibility to critical services within the framework. The proposed framework has the following eight components: seismic hazard analysis, soil response analysis, tsunami inundation analysis, structural vulnerability analysis, socio-economic vulnerability analysis, accessibility to critical services, GIS-based integrated vulnerability assessment, and visualization of vulnerabilities in a 3D virtual city model. The integrated model for the various vulnerabilities of the urban area is developed in a GIS environment using individual vulnerability assessments for the considered elements at risk, and serves as the backbone of the spatial decision support system. The stages followed in the model are: determination of a common mapping unit for each aspect of urban earthquake vulnerability, formation of a geo-database for the vulnerabilities, evaluation of urban vulnerability based on multi-attribute utility theory with various weighting algorithms, and mapping of the evaluated integrated earthquake risk in geographic information systems (GIS) at the neighborhood scale. The framework is also applicable at larger geographical mapping scales, for example, the building scale. When illustrating the results at the building scale, 3-D visualizations with remote sensing data are used so that decision-makers can easily interpret the outputs. The proposed vulnerability assessment framework is flexible and can easily be applied to urban environments at various geographical scales with different mapping units. The obtained total vulnerability maps for the urban area provide a baseline for the development of risk reduction strategies by decision makers. Moreover, as several aspects of the elements at risk for an urban area are considered through the vulnerability analyses, the effect of changes in vulnerability conditions on the total can easily be determined.
The developed approach also enables decision makers to

  5. Effects of relay chatter in seismic probabilistic safety analysis

    SciTech Connect

    Reed, J.W.; Shiu, K.K.

    1985-01-01

    In the Zion and Indian Point Probabilistic Safety Studies, relay chatter was dismissed as a credible event and hence was not formally included in the analyses. Although little discussion is given in the Zion and Indian Point PSA documentation concerning the basis for this decision, it has been expressed informally that it was assumed that the operators will be able to reset all relays in a timely manner. Currently, it is the opinion of many professionals that this may be an oversimplification. The three basic areas which must be considered in addressing relay chatter include the fragility of the relays per se, the reliability of the operators to reset the relays and finally the systems response aspects. Each of these areas is reviewed and the implications for seismic PSA are discussed. Finally, recommendations for future research are given.

  6. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output.The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces.In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  7. Shallow prospect evaluation in Shahbazpur structure using seismic attributes analysis, Southern Bangladesh.

    NASA Astrophysics Data System (ADS)

    Rahman, M.

    2015-12-01

    The Shahbazpur structure is located within the Hatia Trough, a southern extension of the prolific Surma Basin, where all of the largest gas fields of Bangladesh lie. A method is established to map the structure precisely by interpreting four 2D seismic lines acquired over the Shahbazpur structure. Moreover, attributes related to direct hydrocarbon indicators (DHI) were analyzed for further confirmation of the presence of hydrocarbons. To do this, synthetic generation, seismic well tie, velocity modelling and depth conversion were performed. The seismic attribute analysis used in this study is mostly related to bright-spot identification in reservoir zones, as well as to identifying similar responses both below and above the reservoir zones. Seismic interpretation shows that the Shahbazpur structure is a roughly oval-shaped anticline with a simple four-way dip closure, which is a good trap for hydrocarbon accumulation. A limited number of seismic attribute functions available in an academic version of the Petrel software were applied in the attribute analysis. Taking possible interpretation pitfalls into consideration, the attribute analysis confirmed that bright spots exist in the shallower part of the structure above the present reservoir zones, which might represent a potential shallow gas reserve. The bright spots are located within Shahbazpur sequence I of the Dupi Tila Group of Pleistocene age and Shahbazpur sequence II of the Tipam Group of Pleistocene-Pliocene age. This signature will play a very important role in planning the next well on the same structure to test the shallow accumulation of hydrocarbon. For better understanding of this shallow reserve, it is suggested to acquire 3D seismic data over the Shahbazpur structure, which will help to evaluate the hydrocarbon accumulation and to identify gas migration pathways.
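
    Bright-spot screening of the kind described above typically starts from the instantaneous-amplitude (reflection strength) attribute. A small illustration of that generic attribute on a synthetic trace, not the Petrel workflow used in the study:

```python
import numpy as np
from scipy.signal import hilbert

def reflection_strength(trace):
    """Instantaneous amplitude (envelope) attribute used for bright-spot
    screening: the magnitude of the analytic signal of a seismic trace."""
    return np.abs(hilbert(trace))

# synthetic trace: weak background reflectivity plus one high-amplitude anomaly
t = np.linspace(0.0, 1.0, 1000)
trace = 0.2 * np.sin(2.0 * np.pi * 30.0 * t)
trace[400:450] *= 5.0                        # the "bright spot"
env = reflection_strength(trace)
print(env[400:450].mean() / env[500:900].mean() > 3)   # True: anomaly stands out
```

Real attribute work is done on migrated, well-tied data; the envelope simply makes amplitude anomalies polarity-independent and easier to threshold.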

  8. Retrospective analysis of seismic regime features before the Taiwan earthquakes of 1999 and 2002

    NASA Astrophysics Data System (ADS)

    Kuzin, I. P.; Flyonov, A. B.

    2016-05-01

    The features of the seismic regime before the strongest earthquakes of Taiwan in the late 20th century (Chi-Chi, September 21, 1999, Mw = 7.6) and the early 21st century (March 31, 2002, Mw = 7.4) are analyzed. Based on 1990-1999 and 1994-2002 data, respectively, three seismic regime parameters are studied retrospectively: the total annual number of earthquakes NΣ in the range ML = 2.5-5.5 and Mw = 3.0-7.0; the total annual released seismic energy ΣE, J; and the slope b of the earthquake recurrence graphs. Two explicit subperiods are revealed in the course of the seismic regime: quiescence in 1990-1996 before the Chi-Chi earthquake and in 1994-1997 before the March 2002 earthquake, followed by seismic activation in 1997-1999 and 1998-2002, respectively. Due to the predominance of weak earthquakes during the Chi-Chi earthquake preparation, the factor b was relatively high (-1.16 on average); in contrast, before the March 2002 earthquake, due to the occurrence of foreshocks with Mw = 6.8-7.0, the factor b values were relatively low (-0.55 and -0.74 for the quiescence and activation subperiods, respectively). Despite the fundamental difference in the seismotectonic situation between the domains where the two mainshocks occurred, and the significantly different energy ranges of the initial seismic events, the analysis results are similar for both earthquakes. In both cases the mainshock occurred at the peak of released energy, which can hardly be considered a coincidence. Solid verification of this positive tendency requires the accumulation of seismological statistics.
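
    The recurrence-graph slope b discussed above is often estimated with the Aki-Utsu maximum-likelihood formula rather than by fitting the graph directly. A hedged sketch on a synthetic catalog, not the authors' procedure:

```python
import math
import random

def b_value_mle(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value.

    mags  : magnitudes at or above the completeness magnitude m_min
    dm    : magnitude binning width (Utsu's correction; 0 for continuous data)
    """
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# synthetic catalog drawn from the exponential law implied by b = 1.0
random.seed(1)
b_true = 1.0
beta = b_true * math.log(10.0)
m_min = 2.5
mags = [m_min + random.expovariate(beta) for _ in range(20000)]
print(round(b_value_mle(mags, m_min, dm=0.0), 2))   # close to b_true = 1.0
```

The estimator depends critically on the completeness magnitude m_min; choosing it too low biases b downward, which matters when comparing quiescence and activation subperiods.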

  9. Future Directions in Vulnerability to Depression among Youth: Integrating Risk Factors and Processes across Multiple Levels of Analysis

    PubMed Central

    Hankin, Benjamin L.

    2014-01-01

    Depression is a developmental phenomenon. Considerable progress has been made in describing the syndrome, establishing its prevalence and features, providing clues as to its etiology, and developing evidence-based treatment and prevention options. Despite considerable headway in distinct lines of vulnerability research, there is an explanatory gap in the field's ability to more comprehensively explain and predict who is likely to become depressed, when, and why. Despite clear success in predicting moderate variance for future depression, especially with empirically rigorous methods and designs, the heterogeneous and multi-determined nature of depression suggests that additional etiologies need to be included to advance knowledge of developmental pathways to depression. This paper advocates a multiple levels of analysis approach to investigating vulnerability to depression across the lifespan, to provide a more comprehensive understanding of its etiology. One example of a multiple levels of analysis model of vulnerabilities to depression is provided that integrates the most accessible, observable factors (e.g., cognitive and temperament risks), intermediate processes and endophenotypes (e.g., information processing biases, biological stress physiology, and neural activation and connectivity), and genetic influences (e.g., candidate genes and epigenetics). Evidence for each of these factors, as well as their cross-level integration, is provided. Methodological and conceptual considerations important for conducting integrative, multiple levels of analysis research on depression vulnerability are discussed. Finally, translational implications for how a multiple levels of analysis perspective may confer additional leverage to reduce the global burden of depression and improve care are considered. PMID:22900513

  10. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the most relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by the risk perception of people. In order to develop effective information and risk communication strategies, the perception of risks and the factors influencing it should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, several perspectives and their interactions must be considered: social, psychological and cultural. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing the perception of hazard, vulnerability and exposure with real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) uses the semantic differential method, with opposite terms rated on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by computer-assisted telephone interview (CATI) to a national statistical sample of over 4,000 people in January-February 2015. Results show that risk perception seems to be underestimated for all the indicators considered. In particular, scores on the seismic Vulnerability factor are extremely low compared with data on the respondents' housing. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence relative to other natural hazards, participation in risk reduction activities, and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by

  11. Seismic analysis of the large 70-meter antenna, part 1: Earthquake response spectra versus full transient analysis

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    As a check on structural safety, two approaches to seismic analysis of the large 70-m antennas are presented. The first approach, commonly used by civil engineers, utilizes recommended design response spectra. The second approach, full transient analysis, is versatile and applicable not only to earthquake loading but also to other dynamic forcing functions. The results obtained at the fundamental structural frequency show that the two approaches are in good agreement and that both indicate a safe design. The results also confirm past 64-m antenna seismic studies done by the Caltech Seismology Staff.
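
    At a single frequency, both approaches come down to the response of a damped single-degree-of-freedom oscillator to a ground-acceleration history (a response-spectrum ordinate is exactly its peak). A minimal time-stepping sketch using the average-acceleration Newmark-beta scheme; illustrative, not the JPL analysis:

```python
import numpy as np

def sdof_peak_response(ag, dt, freq_hz, zeta=0.05):
    """Peak relative displacement of a damped SDOF oscillator (unit mass)
    under ground acceleration ag (m/s^2): one response-spectrum ordinate.
    Average-acceleration Newmark-beta scheme (gamma = 1/2, beta = 1/4)."""
    wn = 2.0 * np.pi * freq_hz
    c, k = 2.0 * zeta * wn, wn ** 2
    beta, gamma = 0.25, 0.5
    p = -np.asarray(ag, dtype=float)       # effective force per unit mass
    u, v = 0.0, 0.0
    a = p[0] - c * v - k * u               # equilibrium at t = 0
    keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
    umax = 0.0
    for pn in p[1:]:
        peff = (pn
                + (u / (beta * dt ** 2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
                + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                       + dt * (0.5 * gamma / beta - 1.0) * a))
        un = peff / keff
        an = (un - u) / (beta * dt ** 2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        vn = v + dt * ((1.0 - gamma) * a + gamma * an)
        u, v, a = un, vn, an
        umax = max(umax, abs(u))
    return umax

# sanity check: step ground acceleration of 1 m/s^2 on a 2 Hz oscillator.
# The damped step response overshoots the static offset 1/k by ~1.85x.
dt, f = 0.001, 2.0
ag = np.ones(int(5.0 / dt))
peak = sdof_peak_response(ag, dt, f)
print(peak * (2.0 * np.pi * f) ** 2)   # ≈ 1.85
```

Sweeping freq_hz over the structure's modes and enveloping the peaks reproduces a design response spectrum; feeding the same scheme a full finite-element model is the "full transient analysis" of the title.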

  12. Acoustic radiation force impulse imaging of vulnerable plaques: a finite element method parametric analysis

    PubMed Central

    Doherty, Joshua R.; Dumont, Douglas M.; Trahey, Gregg E.; Palmeri, Mark L.

    2012-01-01

    Plaque rupture is the most common cause of complications such as stroke and coronary heart failure. Recent histopathological evidence suggests that several plaque features, including a large lipid core and a thin fibrous cap, are associated with plaques most at risk for rupture. Acoustic Radiation Force Impulse (ARFI) imaging, a recently developed ultrasound-based elasticity imaging technique, shows promise for imaging these features noninvasively. Clinically, this could be used to distinguish vulnerable plaques, for which surgical intervention may be required, from those less prone to rupture. In this study, a parametric analysis using Finite-Element Method (FEM) models was performed to simulate ARFI imaging of five different carotid artery plaques across a wide range of material properties. It was demonstrated that ARFI could resolve the softer lipid pool from the surrounding, stiffer media and fibrous cap and was most dependent upon the stiffness of the lipid pool component. Stress concentrations due to an ARFI excitation were located in the media and fibrous cap components. In all cases, the maximum Von Mises stress was < 1.2 kPa. In comparing these results with others investigating plaque rupture, it is concluded that while the mechanisms may be different, the Von Mises stresses imposed by ARFI are orders of magnitude lower than the stresses associated with blood pressure. PMID:23122224

  13. The ecological structures as components of flood and erosion vulnerability analysis in coastal landscapes

    NASA Astrophysics Data System (ADS)

    Valentini, E.; Taramelli, A.; Martina, M.; Persichillo, M. G.; Casarotti, C.; Meisina, C.

    2014-12-01

    The direct and indirect modification of natural habitats for coastal development can affect the level of exposure to erosion and flooding (inundation). Although engineered structures are still preferred for coastal safety, ecosystem-based solutions are increasingly applied worldwide, such as building-with-nature approaches and the emerging evaluation of natural capital. A question to which we should respond is whether the wide range of satellite data and the already available Earth Observation based products can be used to make a synoptic structural and environmental vulnerability assessment. By answering this, we could also understand if, and how many, markers/signals can be identified in the landscape components to define transitions to and from nonlinear processes (to and from scale-invariant spatial distributions) characterizing the evolution of the environmental patch-size mosaic, the landscape. The Wadden Sea, for example, is a productive estuarine area in the south-eastern coastal zone of the North Sea. It is characterized by extensive tidal mud flats, saltmarshes and the tidal channel network between the mainland and the chain of islands along the North Sea side. The area has UNESCO World Heritage status and Natura 2000 status. Here, we identified thresholds to distinguish spatial and temporal patterns controlled by changes in environmental variables. These patterns are represented by the percent cover and the structural level of vegetation and sediment/soil in each identified patch. The environmental variables are those able to act on the patch size distribution, such as the forcing factors from the sea (wind and wave fields) or the climate and hydrology drivers. The Bayesian approach defines the dependencies of the spatial patch-size distribution on the major flooding and erosion environmental variables. When the analysis is scaled up from the ecosystem units to the landscape level thanks to the satellite

  14. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding earthquake phenomena and their effects at the Earth's surface is an important step toward educating the population in earthquake-affected regions of the country and raising awareness of earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. A large amount of such data will thus be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating awareness of seismic risk, for experimenting with the efficacy of scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers

  15. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS IN SUPPORT OF INCREASED LIQUID LEVEL IN 241-AP TANK FARMS

    SciTech Connect

    MACKEY TC; ABBOTT FG; CARPENTER BG; RINKER MW

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  16. The R-package 'eseis' - towards a toolbox for comprehensive seismic data analysis

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2015-04-01

    There are plenty of software solutions to process seismic data. However, most of these are either not free and open source, are focused on specialised tasks, lack appropriate documentation and examples, or are limited to command-line processing. R is among the most widely used and fastest growing scientific software environments worldwide. This free and open-source software allows the contribution of user-built function packages (currently 6091) that cover nearly all scientific research fields; however, its support for seismic data is limited. This contribution presents the R package 'eseis', a collection of functions to handle seismic data, mostly for but not limited to "environmental seismology", i.e. the analysis of seismic signals emitted by Earth surface processes such as landslides, rockfalls or debris flows. The package allows import/export/conversion of different data formats (cube, mseed, sac), signal processing (deconvolution, filtering, clipping/merging, power spectral density estimates), event handling (triggering, locating) and data visualisation (2D plots, images, animations). The main advantages of using this package are the embedding of processed data in a huge framework of other scientific analysis approaches, sound documentation and tested examples, the benefit of a worldwide help and discussion network, and the possibility for users to modify all functions and extend the functionality.
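
    The package itself is written in R; for readers outside that ecosystem, the filtering and power-spectral-density steps it lists look roughly like this in Python with scipy (an illustration of the kind of processing, not the package's code; window and corner frequencies are arbitrary):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch

def psd_and_bandpass(x, fs, f_lo=1.0, f_hi=20.0):
    """Return (freqs, one-sided PSD) of x plus a zero-phase band-passed copy,
    mirroring typical environmental-seismology preprocessing."""
    f, pxx = welch(x, fs=fs, nperseg=1024)
    sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
    return f, pxx, sosfiltfilt(sos, x)

rng = np.random.default_rng(2)
fs = 100.0
x = rng.normal(0.0, 1.0, 60_000)           # 10 min of white noise at 100 Hz
f, pxx, xf = psd_and_bandpass(x, fs)
print(np.mean(pxx))   # white noise: flat one-sided PSD near 2*sigma^2/fs = 0.02
```

Deconvolution of the instrument response, triggering and locating are further steps the package bundles on top of primitives like these.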

  17. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

    In response to the growing recognition by engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to seismic landslide zonation based on the popular Newmark sliding-block analogy for modeling coherent landslides. Four existing models based on the sliding-block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy, along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed that will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow the use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification allowing the construction of models from modular objects, and a direct-manipulation, dynamic user interface that adapts to the particular seismic landslide model configuration.
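
    The sliding-block analogy integrates ground acceleration in excess of a slope's critical (yield) acceleration into permanent downslope displacement. A minimal rigid-block sketch of that idea (the models compared in the paper add refinements; the pulse test values are illustrative):

```python
import numpy as np

def newmark_displacement(ag, dt, a_crit):
    """Cumulative downslope displacement of a rigid sliding block (Newmark, 1965).

    ag     : ground acceleration time series (m/s^2), downslope positive
    dt     : time step (s)
    a_crit : critical (yield) acceleration of the slope (m/s^2)
    """
    v = 0.0          # velocity of the block relative to the ground
    d = 0.0          # accumulated permanent displacement
    for a in ag:
        if v > 0.0 or a > a_crit:
            v += (a - a_crit) * dt   # sliding: net acceleration a - a_crit
            v = max(v, 0.0)          # the block cannot slide back upslope
            d += v * dt
    return d

# sanity check: a 1 s rectangular pulse of 3 m/s^2 against a_crit = 1 m/s^2.
# Analytic Newmark displacement: 0.5*(A - ac)*T^2 * (A/ac) = 3.0 m
dt = 0.001
ag = np.concatenate([np.full(1000, 3.0), np.zeros(3000)])
print(round(newmark_displacement(ag, dt, 1.0), 2))   # 3.0
```

Zonation maps then bin such displacements (computed per cell from a_crit derived from slope and strength maps) into hazard classes; the proposed SDSS would let any such model be swapped in.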

  18. Site specific seismic hazard analysis at the DOE Kansas City Plant

    SciTech Connect

    Lynch, D.T.; Drury, M.A.; Meis, R.C.; Bieniawski, A.; Savy, J.B.; Llopis, J.L.; Constantino, C.; Hashimoto, P.S.; Campbell, K.W.

    1995-10-01

    A site specific seismic hazard analysis is being conducted for the Kansas City Plant to support an ongoing structural evaluation of existing buildings. This project is part of the overall review of facilities being conducted by DOE. The seismic hazard was probabilistically defined at the theoretical rock outcrop by Lawrence Livermore National Laboratory. The U.S. Army Engineer Waterways Experiment Station conducted a subsurface site investigation to characterize in situ S-wave velocities and other subsurface physical properties related to the geology in the vicinity of the Main Manufacturing Building (MMB) at the Bannister Federal Complex. The test program consisted of crosshole S-wave testing, seismic cone penetrometer testing, and laboratory soil analyses. The information acquired from this investigation was used in a site response analysis by City College of New York to determine the earthquake motion at grade. Ground response spectra appropriate for design and evaluation of Performance Category 1 and 2 structures, systems, and components were recommended. The effects of seismic loadings on the buildings will be used to aid in designing any structural modifications.

  19. A Comparison of seismic instrument noise coherence analysis techniques

    USGS Publications Warehouse

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
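
    A two-sensor coherence estimate of incoherent noise, of the kind compared in the paper, can be sketched as follows on synthetic data. This is not the authors' processing chain, and real tests must control exactly the conditions the abstract lists (timing, alignment, SNR):

```python
import numpy as np
from scipy.signal import coherence, welch

def incoherent_noise(x, y, fs, nperseg=4096):
    """Two-sensor estimate of instrument self-noise: the part of sensor x's
    PSD that is incoherent with a co-located, co-aligned sensor y."""
    f, pxx = welch(x, fs=fs, nperseg=nperseg)
    _, cxy = coherence(x, y, fs=fs, nperseg=nperseg)   # magnitude-squared
    return f, pxx * (1.0 - np.sqrt(cxy))

# two synthetic "sensors": a shared ground signal plus independent self-noise
rng = np.random.default_rng(3)
n, fs = 2 ** 18, 100.0
common = rng.normal(0.0, 3.0, n)
x = common + rng.normal(0.0, 1.0, n)       # self-noise sigma = 1
y = common + rng.normal(0.0, 1.0, n)
f, nxx = incoherent_noise(x, y, fs)
print(np.median(nxx))   # near the true self-noise PSD level 2/fs = 0.02
```

With equal-noise sensors, subtracting the coherent part recovers one sensor's noise level; three-sensor methods (Sleeman et al., 2006) drop the equal-noise assumption, which is one reason the techniques compared here can disagree.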

  20. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
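
    The chain from fragility curves conditioned on spectral acceleration to a vulnerability (loss) ordinate can be sketched with lognormal fragilities and damage-state loss ratios. The medians, dispersions, and loss ratios below are illustrative placeholders, not HAZUS values:

```python
import math

def p_exceed(sa, median, beta):
    """Lognormal fragility: P[damage state >= ds | Sa = sa]."""
    z = (math.log(sa) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_loss_ratio(sa, fragilities, loss_ratios):
    """Combine fragility curves (ordered slight -> complete) with damage-state
    loss ratios into one vulnerability ordinate at spectral acceleration sa."""
    p_exc = [p_exceed(sa, m, b) for m, b in fragilities]
    loss = 0.0
    for i, lr in enumerate(loss_ratios):
        p_next = p_exc[i + 1] if i + 1 < len(p_exc) else 0.0
        loss += (p_exc[i] - p_next) * lr    # P[exactly in state i] * loss ratio
    return loss

# illustrative (median Sa in g, dispersion) per damage state, and loss ratios
frag = [(0.2, 0.6), (0.4, 0.6), (0.8, 0.6), (1.2, 0.6)]
lrs = [0.02, 0.10, 0.50, 1.00]
print(round(expected_loss_ratio(0.4, frag, lrs), 3))   # 0.124
```

Evaluating this over a grid of Sa values yields the vulnerability curve; forcing complete nonstructural damage whenever structural damage is complete, as the abstract describes, couples two such chains instead of treating them independently.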

  1. An analysis of seismic hazard in the Upper Rhine Graben enlightened by the example of the New Madrid seismic zone.

    NASA Astrophysics Data System (ADS)

    Doubre, Cécile; Masson, Frédéric; Mazzotti, Stéphane; Meghraoui, Mustapha

    2014-05-01

    Seismic hazard in "stable" continental regions and low-level deformation zones is one of the most difficult issues to address in Earth sciences. In these zones, instrumental and historical seismicity are not well known (sparse seismic networks, a seismic cycle too long to be covered by human history, episodic seismic activity) and many active structures remain poorly characterized or unknown. This is the case of the Upper Rhine Graben, the central segment of the European Cenozoic rift system (ECRIS) of Oligocene age, which extends from the North Sea through Germany and France to the Mediterranean coast over a distance of some 1100 km. Even though this region has already experienced some destructive earthquakes, its present-day seismicity is moderate and the deformation observed by geodesy is very small (below the current measurement accuracy). The strain rate does not exceed 10⁻¹⁰ and paleoseismic studies indicate an average return period of 2.5 to 3 × 10³ years for large earthquakes. The largest earthquake known for this zone is the 1356 Basel earthquake, with a magnitude generally estimated at about 6.5 (Meghraoui et al., 2001) but recently re-evaluated at between 6.7 and 7.1 (Fäh et al., 2009). A comparison of the Upper Rhine Graben with equivalent regions around the world could help improve our evaluation of the seismic hazard of this region. This is the case of the New Madrid seismic zone, one of the best-studied intraplate systems in the central USA, which experienced M 7.0 - 7.5 earthquakes in 1811-1812 and shares several characteristics with the Upper Rhine Graben, i.e. the general framework of inherited geological structures (reactivation of a failed rift / graben), seismicity patterns (spatial variability of small and large earthquakes), the null or low rate of deformation, and the location in a "stable" continental interior. Looking at the Upper Rhine Graben as an analogue of the New Madrid seismic zone, we can re-evaluate its seismic hazard and consider the

  2. Extended defense systems :I. adversary-defender modeling grammar for vulnerability analysis and threat assessment.

    SciTech Connect

    Merkle, Peter Benedict

    2006-03-01

    Vulnerability analysis and threat assessment require systematic treatments of adversary and defender characteristics. This work addresses the need for a formal grammar for the modeling and analysis of adversary and defender engagements of interest to the National Nuclear Security Administration (NNSA). Analytical methods treating both linguistic and numerical information should ensure that neither aspect has disproportionate influence on assessment outcomes. The adversary-defender modeling (ADM) grammar employs classical set theory and notation. It is designed to incorporate contributions from subject matter experts in all relevant disciplines, without bias. The Attack Scenario Space U_S is the set universe of all scenarios possible under physical laws. An attack scenario is a postulated event consisting of the active engagement of at least one adversary with at least one defended target. Target Information Space I_S is the universe of information about targets and defenders. Adversary and defender groups are described by their respective Character super-sets, (A)_P and (D)_F. Each super-set contains six elements: Objectives, Knowledge, Veracity, Plans, Resources, and Skills. The Objectives are the desired end-state outcomes. Knowledge is comprised of empirical and theoretical a priori knowledge and emergent knowledge (learned during an attack), while Veracity is the correspondence of Knowledge with fact or outcome. Plans are ordered activity-task sequences (tuples) with logical contingencies. Resources are the a priori and opportunistic physical assets and intangible attributes applied to the execution of associated Plans elements. Skills, for both adversary and defender, include the assumed general and task competencies for the associated plan set, the realized value of competence in execution or exercise, and the opponent's planning assumption of the task competence.
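
    The six-element Character super-set lends itself to a record type. A hypothetical rendering in Python, with all field contents invented for illustration; the grammar itself is set-theoretic notation, not code:

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    """The six-element Character super-set of the ADM grammar as a record;
    field names follow the abstract, contents are illustrative only."""
    objectives: set = field(default_factory=set)   # desired end-state outcomes
    knowledge: set = field(default_factory=set)    # a priori + emergent items
    veracity: dict = field(default_factory=dict)   # knowledge item -> matches fact?
    plans: list = field(default_factory=list)      # ordered activity-task tuples
    resources: set = field(default_factory=set)    # assets applied to Plans
    skills: dict = field(default_factory=dict)     # task -> assumed competence

adversary = Character(
    objectives={"acquire material"},
    knowledge={"site layout (a priori)"},
    veracity={"site layout (a priori)": False},    # outdated, hence low Veracity
    plans=[("breach perimeter", "reach target")],
    resources={"vehicle"},
    skills={"breach perimeter": 0.6},
)
print(len(adversary.plans[0]))   # each Plans element is an ordered task tuple
```

The point of the record shape is the one the abstract makes: linguistic entries (Objectives, Plans) and numeric ones (Skills) sit side by side without either dominating the assessment.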

  3. Large-scale seismic signal analysis with Hadoop

    NASA Astrophysics Data System (ADS)

    Addair, T. G.; Dodge, D. A.; Walter, W. R.; Ruppert, S. D.

    2014-05-01

    In seismology, waveform cross correlation has been used for years to produce high-precision hypocenter locations and for sensitive detectors. Because correlated seismograms generally are found only at small hypocenter separation distances, correlation detectors have historically been reserved for spotlight purposes. However, many regions have been found to produce large numbers of correlated seismograms, and there is growing interest in building next-generation pipelines that employ correlation as a core part of their operation. In an effort to better understand the distribution and behavior of correlated seismic events, we have cross correlated a global dataset consisting of over 300 million seismograms. This was done using a conventional distributed cluster, and required 42 days. In anticipation of processing much larger datasets, we have re-architected the system to run as a series of MapReduce jobs on a Hadoop cluster. In doing so we achieved a factor of 19 performance increase on a test dataset. We found that fundamental algorithmic transformations were required to achieve the maximum performance increase. Whereas in the original IO-bound implementation, we went to great lengths to minimize IO, in the Hadoop implementation where IO is cheap, we were able to greatly increase the parallelism of our algorithms by performing a tiered series of very fine-grained (highly parallelizable) transformations on the data. Each of these MapReduce jobs required reading and writing large amounts of data. But, because IO is very fast, and because the fine-grained computations could be handled extremely quickly by the mappers, the net was a large performance gain.
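
    The core operation being scaled here, normalized waveform cross correlation, can be sketched for a single seismogram pair; the Hadoop work distributes hundreds of millions of such pairs:

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak normalized cross-correlation between two equal-length traces
    and the lag (in samples) at which it occurs."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    k = int(np.argmax(cc))
    return float(cc[k]), k - (len(b) - 1)

# a delayed, slightly noisy copy of the same waveform correlates near 1.0
rng = np.random.default_rng(4)
sig = rng.normal(0.0, 1.0, 900)
x = np.concatenate([np.zeros(50), sig, np.zeros(50)])
y = np.concatenate([np.zeros(70), sig, np.zeros(30)]) + rng.normal(0.0, 0.05, 1000)
c, lag = max_norm_xcorr(x, y)
print(round(c, 2), lag)   # correlation near 1.0 at a 20-sample offset
```

High peak values flag co-located (repeating) events; in the MapReduce reformulation, the expensive all-pairs loop around this kernel is what gets broken into fine-grained, highly parallel map tasks.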

  4. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess the vulnerability of coasts to risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping draws on multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at the Monika and Sablette beaches. This approach relies on the efficiency of GIS tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
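
    The criteria-weight step of FAHP can be illustrated with its crisp special case: deriving weights from a reciprocal pairwise-comparison matrix by the geometric-mean method. This is a simplification (true FAHP uses triangular fuzzy numbers), and the matrix below is invented:

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from a reciprocal pairwise-comparison matrix using
    the geometric-mean method (the crisp core of the FAHP weighting step)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# hypothetical 3-criterion matrix, e.g. sea level rise vs wave height vs elevation;
# entry [i][j] is how much more important criterion i is than criterion j
m = [[1.0,     3.0,     5.0],
     [1.0/3.0, 1.0,     3.0],
     [1.0/5.0, 1.0/3.0, 1.0]]
w = ahp_weights(m)
print([round(x, 3) for x in w])   # ≈ [0.637, 0.258, 0.105], summing to 1
```

In the GIS step, each raster layer (erosion, elevation, distance to urban area, ...) is then multiplied by its weight and the layers are summed into the vulnerability index map.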

  5. A Vulnerability-Benefit Analysis of Fossil Fuel CO2 Emissions

    NASA Astrophysics Data System (ADS)

    Delman, E. M.; Stephenson, S. R.; Davis, S. J.; Diffenbaugh, N. S.

    2015-12-01

    Although we can anticipate continued improvements in our understanding of future climate impacts, the central challenge of climate change is not scientific, but rather political and economic. In particular, international climate negotiations center on how to share the burden of uncertain mitigation and adaptation costs. We expose the relative economic interests of different countries by assessing and comparing their vulnerability to climate impacts and the economic benefits they derive from the fossil fuel-based energy system. Vulnerability refers to the propensity of humans and their assets to suffer when impacted by hazards, and we draw upon the results from a number of prior studies that have quantified vulnerability using multivariate indices. As a proxy for benefit, we average CO2 related to each country's extraction of fossil fuels, production of CO2 emissions, and consumption of goods and services (Davis et al., 2011), which should reflect benefits accrued in proportion to national economic dependence on fossil fuels. We define a nondimensional vulnerability-benefit ratio for each nation and find a large range across countries. In general, we confirm that developed and emerging economies such as the U.S., Western Europe, and China rely heavily on fossil fuels and have substantial resources to respond to the impacts of climate change, while smaller, less-developed economies such as Sierra Leone and Vanuatu benefit little from current CO2 emissions and are much more vulnerable to adverse climate impacts. In addition, we identify some countries with a high vulnerability and benefit, such as Iraq and Nigeria; conversely, some nations exhibit both a low vulnerability and benefit, such as New Zealand. In most cases, the ratios reflect the nature of energy-climate policies in each country, although certain nations - such as the United Kingdom and France - assume a level of responsibility incongruous with their ratio and commit to mitigation policy despite
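
    Once the indices are normalized, the nondimensional vulnerability-benefit ratio reduces to simple arithmetic. A hypothetical sketch in which the function name and all input values are invented; per the abstract, the benefit proxy averages the extraction-, production-, and consumption-based CO2 accounts:

```python
def vulnerability_benefit_ratio(vulnerability_index, extraction, production, consumption):
    """Nondimensional ratio of a country's climate-vulnerability index to the
    mean of three normalized CO2 accounting perspectives (illustrative only)."""
    benefit = (extraction + production + consumption) / 3.0
    return vulnerability_index / benefit

# hypothetical normalized values for two contrasting country profiles
print(round(vulnerability_benefit_ratio(0.9, 0.05, 0.10, 0.15), 2))   # 9.0
print(round(vulnerability_benefit_ratio(0.3, 0.80, 0.90, 1.00), 2))   # 0.33
```

A high ratio marks countries that are exposed but derive little from fossil fuels (the Vanuatu-type case in the abstract); a low ratio marks the opposite profile.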

  6. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough in the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent property of the magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of a smart seismic isolation system. The novel isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid to provide a sufficient and uniform magnetic field for energizing the field-dependent property of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the developed adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. In this paper, a comprehensive review of the development of the adaptive seismic isolator is presented, including the design, analysis and testing of two prototypical adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototypical MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable increase in lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive controls.

  7. Structure soil structure interaction effects: Seismic analysis of safety related collocated concrete structures

    SciTech Connect

    Joshi, J.R.

    2000-06-20

    The Process, Purification and Stack Buildings are collocated safety-related concrete shear wall structures with plan dimensions in excess of 100 feet. An important aspect of their seismic analysis was the determination of structure-soil-structure interaction (SSSI) effects, if any. The SSSI analysis of the Process Building, with one other building at a time, was performed with the SASSI computer code for up to 50 frequencies. Each combined model had about 1500 interaction nodes. Results of the SSSI analysis were compared with those from soil-structure interaction (SSI) analyses of the individual buildings, performed with the ABAQUS and SASSI codes, for three parameters: peak accelerations, seismic forces and the in-structure floor response spectra (FRS). The results may be of wider interest due to the model size and the potential applicability to other deep-soil layered sites. Results obtained from the ABAQUS analysis were consistently higher, as expected, than those from the SSI and SSSI analyses using SASSI. The SSSI effect between the Process and Purification Buildings was not significant. The Process and Stack Building results demonstrated that under certain conditions a massive structure can have an observable effect on the seismic response of a smaller and less stiff structure.

  8. Vulnerability of the Vulnerability Thesis.

    ERIC Educational Resources Information Center

    Lutz, Frank W.

    1996-01-01

    Reexamines Callahan's book, "Education and the Cult of Efficiency" (1962), and his vulnerability thesis regarding school superintendents, discussing recommendations it made and highlighting public education in the 1990s. Callahan's recommendations were well-received but not well-heeded, and the vulnerability thesis did not provide the stimulus for…

  9. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
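    The logic-tree combination over three source zonings and three ground-motion models can be sketched as below. The branch values and weights are invented for illustration, and a real PSHA combines full hazard curves rather than single numbers.

```python
import itertools

def logic_tree_mean(branch_hazard, zoning_weights, gmpe_weights):
    # branch_hazard[(z, g)]: hazard estimate (e.g. 475-year PGA, in g)
    # from source zoning z combined with ground-motion model g; the
    # weighted mean over all 3 x 3 branches is the logic-tree estimate.
    return sum(wz * wg * branch_hazard[(z, g)]
               for (z, wz), (g, wg) in itertools.product(
                   zoning_weights.items(), gmpe_weights.items()))

def logic_tree_cov(branch_hazard, zoning_weights, gmpe_weights):
    # Coefficient of variation across branches: a measure of how much
    # the result depends on the zoning and attenuation-model choices.
    mean = logic_tree_mean(branch_hazard, zoning_weights, gmpe_weights)
    var = sum(wz * wg * (branch_hazard[(z, g)] - mean) ** 2
              for (z, wz), (g, wg) in itertools.product(
                  zoning_weights.items(), gmpe_weights.items()))
    return var ** 0.5 / mean

zonings = {"Z1": 0.4, "Z2": 0.3, "Z3": 0.3}
gmpes = {"G1": 0.5, "G2": 0.25, "G3": 0.25}
pga = {(z, g): 0.20 for z in zonings for g in gmpes}  # identical branches
print(logic_tree_mean(pga, zonings, gmpes))  # approx. 0.2 (weights sum to 1)
print(logic_tree_cov(pga, zonings, gmpes))   # approx. 0 (no branch spread)
```

    A COV near zero means the input options barely matter; the study's finding that the attenuation relationship dominates would show up as a large COV when only the GMPE branch varies.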

  10. Nonlinear Seismic Correlation Analysis of the JNES/NUPEC Large-Scale Piping System Tests.

    SciTech Connect

    Nie,J.; DeGrassi, G.; Hofmayer, C.; Ali, S.

    2008-06-01

    The Japan Nuclear Energy Safety Organization/Nuclear Power Engineering Corporation (JNES/NUPEC) large-scale piping test program has provided valuable new test data on high-level seismic elasto-plastic behavior and failure modes for typical nuclear power plant piping systems. The component and piping system tests demonstrated the strain ratcheting behavior that is expected to occur when a pressurized pipe is subjected to cyclic seismic loading. Under a collaboration agreement between the US and Japan on seismic issues, the US Nuclear Regulatory Commission (NRC)/Brookhaven National Laboratory (BNL) performed a correlation analysis of the large-scale piping system tests using detailed state-of-the-art nonlinear finite element models. Techniques are introduced to develop material models that can closely match the test data. The shaking table motions are examined. The analytical results are assessed in terms of the overall system responses and the strain ratcheting behavior at an elbow. The paper concludes with insights about the accuracy of the analytical methods for use in performance assessments of highly nonlinear piping systems under large seismic motions.

  11. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-01-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on reflection characteristics and structural features of the seismic facies. The following reflection characteristics and structural features are used: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at a facies boundary, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which successive glacier terminus positions and consequent rates of glacial retreat are known. In this environment the depositional processes and sediment characteristics vary with distance from a glacier terminus, such that during a retreat a record of these variations is preserved in the aggrading sediment fill. Sedimentary facies within the basins of lower Muir Inlet are correlated with observed depositional processes near the present glacier terminus in the upper inlet.

  12. Feasibility of time-lapse AVO and AVOA analysis to monitor compaction-induced seismic anisotropy

    NASA Astrophysics Data System (ADS)

    He, Y.-X.; Angus, D. A.; Yuan, S. Y.; Xu, Y. G.

    2015-11-01

    Hydrocarbon reservoir production generally results in observable time-lapse physical property changes, such as velocity increases within a compacting reservoir. However, the physical property changes that lead to velocity changes can be difficult to isolate uniquely. Thus, integrated hydro-mechanical simulation, stress-sensitive rock physics models and time-lapse seismic modelling workflows can be employed to study the influence of velocity changes and induced seismic anisotropy due to reservoir compaction. We study the influence of reservoir compaction and compartmentalization on time-lapse seismic signatures for reflection amplitude variation with offset (AVO) and azimuth (AVOA). Specifically, the time-lapse AVO and AVOA responses are predicted for two models: a laterally homogeneous four-layer dipping model and a laterally heterogeneous graben structure reservoir model. Seismic reflection coefficients for different offsets and azimuths are calculated for compressional (P-P) and converted shear (P-S) waves using an anisotropic ray tracer as well as using approximate equations for AVO and AVOA. The simulations help assess the feasibility of using time-lapse AVO and AVOA signatures to monitor reservoir compartmentalization as well as evaluate induced stress anisotropy due to changes in the effective stress field. The results of this study indicate that time-lapse AVO and AVOA analysis can be applied as a potential means for qualitatively and semi-quantitatively linking azimuthal anisotropy changes caused by reservoir production to pressure/stress changes.

  13. Cluster analysis of Landslide Vulnerable region on an urban Area in South Korea

    NASA Astrophysics Data System (ADS)

    Moon, Yonghee; Lee, Sangeun; Kim, Myoungsoo; Baek, Jongrak

    2016-04-01

    Mountain areas occupy about 65% of the territory of South Korea. Due to rapid population growth and urbanization, many cities suffer from limited space, and hence commercial buildings, educational facilities, and housing settlement areas continue to stretch to the foot of the mountains. As a result, residents become more and more vulnerable to landslides and debris flow. This led the central government to perceive the need for strengthening regulations relevant to urban planning. In order to consider risks due to landslides and debris flow at the urban planning stage, the present authors suggested strategies including: first, selecting priority areas in which landslide-related disasters must be strictly managed; second, establishing an integrated management system to offer technical assistance to persons in charge of urban planning in those areas; third, promoting disaster awareness programs with those persons along with the central government. As a first attempt, this study mainly discusses the GIS-application procedures by which the authors selected the priority areas, summarized as follows: 1. Collect the landslide historical data for the period 1999-2012, when the disasters particularly threatened the whole country. 2. Define the areas within a one-kilometer radius around the landslide occurrence places. 3. Exclude the areas where the population is less than 100 persons per km2. 4. Exclude the areas where mountains with Grade I or II landslide risk (announced by the Korea Forest Service) fall below a certain portion of the area. 5. Carry out the cluster analysis with the remaining areas. 6. Classify the types from the standpoint of landslide disaster risk management. Through these procedures, this study obtained a total of 86 priority areas, which were also classified into 24 areas - Type A (high population exposure and mid landslide occurrence likelihood) -, 25 areas - Type B (mid population exposure and high landslide occurrence
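    The screening steps above can be sketched as a simple filter-and-label pipeline. The field names, the step-4 threshold, the classification cut-offs, and the sample records are all illustrative assumptions; the actual study works on GIS layers, not dictionaries.

```python
def select_priority_areas(areas, min_pop_density=100.0, min_risk_fraction=0.3):
    # Steps 3-4: drop sparsely populated areas and areas with too small
    # a share of Grade I/II landslide-risk terrain (the study's exact
    # step-4 threshold is not stated, so 0.3 is assumed here). The
    # survivors would go on to cluster analysis (step 5).
    return [a for a in areas
            if a["pop_per_km2"] >= min_pop_density
            and a["grade12_fraction"] >= min_risk_fraction]

def classify(area, pop_cut=1000.0, risk_cut=0.6):
    # Step 6 (cut-offs assumed): label by population exposure vs.
    # landslide occurrence likelihood, as in Types A and B above.
    exposure = "high" if area["pop_per_km2"] >= pop_cut else "mid"
    likelihood = "high" if area["grade12_fraction"] >= risk_cut else "mid"
    return exposure, likelihood

areas = [
    {"pop_per_km2": 1500.0, "grade12_fraction": 0.5},  # Type A-like
    {"pop_per_km2": 300.0, "grade12_fraction": 0.8},   # Type B-like
    {"pop_per_km2": 50.0, "grade12_fraction": 0.9},    # excluded (step 3)
]
kept = select_priority_areas(areas)
print([classify(a) for a in kept])  # [('high', 'mid'), ('mid', 'high')]
```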

  14. Common and Rare Variant Analysis in Early-Onset Bipolar Disorder Vulnerability

    PubMed Central

    Jamain, Stéphane; Cichon, Sven; Etain, Bruno; Mühleisen, Thomas W.; Georgi, Alexander; Zidane, Nora; Chevallier, Lucie; Deshommes, Jasmine; Nicolas, Aude; Henrion, Annabelle; Degenhardt, Franziska; Mattheisen, Manuel; Priebe, Lutz; Mathieu, Flavie; Kahn, Jean-Pierre; Henry, Chantal; Boland, Anne; Zelenika, Diana; Gut, Ivo; Heath, Simon; Lathrop, Mark; Maier, Wolfgang; Albus, Margot; Rietschel, Marcella; Schulze, Thomas G.; McMahon, Francis J.; Kelsoe, John R.; Hamshere, Marian; Craddock, Nicholas; Nöthen, Markus M.; Bellivier, Frank; Leboyer, Marion

    2014-01-01

    Bipolar disorder is one of the most common and devastating psychiatric disorders, whose mechanisms remain largely unknown. Despite a strong genetic contribution demonstrated by twin and adoption studies, a polygenic background influences this multifactorial and heterogeneous psychiatric disorder. To identify susceptibility genes for a severe and more familial sub-form of the disease, we conducted a genome-wide association study focused on 211 patients of French origin with an early age at onset and 1,719 controls, and then replicated our data in a German sample of 159 patients with early-onset bipolar disorder and 998 controls. The replication study and subsequent meta-analysis revealed two genes encoding proteins involved in the phosphoinositide signalling pathway (PLEKHA5 and PLCXD3). We performed additional replication studies in two datasets from the WTCCC (764 patients and 2,938 controls) and GAIN-TGen (1,524 patients and 1,436 controls) cohorts and found nominal P-values at both the PLCXD3 and PLEKHA5 loci in the WTCCC sample. In addition, we identified in the French cohort one affected individual with a deletion at the PLCXD3 locus and another carrying a missense variation in PLCXD3 (p.R93H), both supporting a role of the phosphatidylinositol pathway in early-onset bipolar disorder vulnerability. Although the current nominally significant findings should be interpreted with caution and need replication in independent cohorts, this study supports the strategy of combining genetic approaches to determine the molecular mechanisms underlying bipolar disorder. PMID:25111785

  15. Common and rare variant analysis in early-onset bipolar disorder vulnerability.

    PubMed

    Jamain, Stéphane; Cichon, Sven; Etain, Bruno; Mühleisen, Thomas W; Georgi, Alexander; Zidane, Nora; Chevallier, Lucie; Deshommes, Jasmine; Nicolas, Aude; Henrion, Annabelle; Degenhardt, Franziska; Mattheisen, Manuel; Priebe, Lutz; Mathieu, Flavie; Kahn, Jean-Pierre; Henry, Chantal; Boland, Anne; Zelenika, Diana; Gut, Ivo; Heath, Simon; Lathrop, Mark; Maier, Wolfgang; Albus, Margot; Rietschel, Marcella; Schulze, Thomas G; McMahon, Francis J; Kelsoe, John R; Hamshere, Marian; Craddock, Nicholas; Nöthen, Markus M; Bellivier, Frank; Leboyer, Marion

    2014-01-01

    Bipolar disorder is one of the most common and devastating psychiatric disorders, whose mechanisms remain largely unknown. Despite a strong genetic contribution demonstrated by twin and adoption studies, a polygenic background influences this multifactorial and heterogeneous psychiatric disorder. To identify susceptibility genes for a severe and more familial sub-form of the disease, we conducted a genome-wide association study focused on 211 patients of French origin with an early age at onset and 1,719 controls, and then replicated our data in a German sample of 159 patients with early-onset bipolar disorder and 998 controls. The replication study and subsequent meta-analysis revealed two genes encoding proteins involved in the phosphoinositide signalling pathway (PLEKHA5 and PLCXD3). We performed additional replication studies in two datasets from the WTCCC (764 patients and 2,938 controls) and GAIN-TGen (1,524 patients and 1,436 controls) cohorts and found nominal P-values at both the PLCXD3 and PLEKHA5 loci in the WTCCC sample. In addition, we identified in the French cohort one affected individual with a deletion at the PLCXD3 locus and another carrying a missense variation in PLCXD3 (p.R93H), both supporting a role of the phosphatidylinositol pathway in early-onset bipolar disorder vulnerability. Although the current nominally significant findings should be interpreted with caution and need replication in independent cohorts, this study supports the strategy of combining genetic approaches to determine the molecular mechanisms underlying bipolar disorder. PMID:25111785

  16. Seismic observations and multidisciplinary interpretation of their analysis: understanding the unrest at Turrialba volcano (Costa Rica)

    NASA Astrophysics Data System (ADS)

    Martini, F.; Ovsicori-Una Volcano Monitoring Group

    2009-04-01

    significant interference on tropospheric O3 measurements at 2-3 km altitude ~50 km W of the volcano, detected by the Ozone Monitoring Instrument (OMI) on NASA's EOS-Aura satellite. The current geodetic network at Turrialba volcano (comprising two small EDM networks, leveling lines, an electronic tiltmeter and periodic GPS campaigns), which has been measuring during the reawakening of the volcano for the past decade, is very limited, but it has detected an inflationary trend in the crater area over the last 2 years. The 2007 peak in seismic activity marked an important change in the seismicity patterns as well as in the geochemical, geodetic and field observations. Prior to it, mainly VT-type events were recorded, typically showing a spindle-shaped waveform most likely due to the strongly scattering volcanic environment. Since late 2007, gas-driven deep impulsive events have dominated the seismicity, often followed by episodes of harmonic tremor. In this work, we present a summary of the activity of the volcano and the data collected during more than 10 years of monitoring, with particular emphasis on the changes that occurred over the last 2 years. We show results from analysis of the seismic data collected by the permanent seismic network and by a small-aperture short-period seismic array deployed in 2008, as well as the initial observations recorded by several broad-band arrays due to be deployed at the end of January 2009. Integrating the available geochemical, geophysical, geodetic, and field data, we present an interpretation of the seismic observations and the current status of the volcano.

  17. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they are put to use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in bringing out the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition

  18. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges, describes the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.

  19. Nonlinear dynamic analysis of multi-base seismically isolated structures with uplift potential I: formulation

    NASA Astrophysics Data System (ADS)

    Tsopelas, Panos C.; Roussis, Panayiotis C.; Constantinou, Michael C.

    2009-09-01

    The complexity of modern seismically isolated structures requires the analysis of the structural system and the isolation system in its entirety and the ability to capture potentially discontinuous phenomena such as isolator uplift and their effects on the superstructures and the isolation hardware. In this paper, an analytical model is developed and a computational algorithm is formulated to analyze complex seismically isolated superstructures even when undergoing highly nonlinear phenomena such as uplift. The computational model has the capability of modeling various types of isolation devices with strong nonlinearities, analyzing multiple superstructures (up to five separate superstructures) on multiple bases (up to five bases), and capturing the effects of lateral loads on bearing axial forces, including bearing uplift. The model developed herein has been utilized to form the software platform 3D-BASIS-ME-MB, which provides the practicing engineering community with a versatile tool for the analysis and design of complex structures with modern isolation systems.

  20. Vulnerable Plaque

    MedlinePlus

    ... all vulnerable plaque ruptures, and researchers at the Texas Heart Institute are looking at ways to determine ...

  1. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  2. Stability analysis of the Ischia Mt. Nuovo block, Italy, under extreme seismic shaking

    NASA Astrophysics Data System (ADS)

    Ausilia Paparo, Maria; Tinti, Stefano

    2016-04-01

    In this work we investigate the equilibrium conditions of the Mt. Nuovo block, a unit found on the northwestern flank of Mt. Epomeo on the island of Ischia, Italy, using the Minimum Lithostatic Deviation method (Tinti and Manucci 2006, 2008; Paparo et al. 2013). The block, involved in a deep-seated gravitational slope deformation (DSGSD; Della Seta et al., 2012) process, forms an interesting scenario for studying earthquake-induced instability because i) Ischia is a seismically active volcanic island; ii) the slopes of Mt. Epomeo are susceptible to mass movements; iii) there exists abundant literature on historical local seismicity and on slope geology. In our slope stability analysis, we account for seismic load by means of peak ground acceleration (PGA) values taken from Italian seismic hazard maps (Gruppo di Lavoro MPS, 2004) and integrated with estimates based on local seismicity and suitable (MCS) I-PGA regression laws. We find that the Mt. Nuovo block could not be destabilised by the 1883 Casamicciola earthquake (the largest known historical earthquake on the island, which took place on a fault to the north of the block), but also that if an earthquake of the same size occurred in the Mt. Nuovo zone, the block would be mobilised and would therefore generate a tsunami (Zaniboni et al., 2013), with disastrous consequences not only for Ischia but also for the surrounding region. This work was carried out in the framework of the EU project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe (Grant 603839, 7th FP, ENV.2013.6.4-3).

  3. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity - from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established so many source description formats and variations thereof, so that conceptually equivalent source models are often expressed in different ways. Despite the resultant practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domain of most ontology work to date, earthquake sources can be described by a very precise mathematical framework. Another unique aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well-formed and properly described; additionally, the source will be used for performing calculations. Representation and manipulation of complex mathematical objects presents a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium, contains the conceptual definitions and relationships necessary for source translation services. 
For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double

  4. Seismic Analysis of Four Solar-like Stars Observed during More Than Eight Months by Kepler

    NASA Astrophysics Data System (ADS)

    Mathur, S.; Campante, T. L.; Handberg, R.; García, R. A.; Appourchaux, T.; Bedding, T. R.; Mosser, B.; Chaplin, W. J.; Ballot, J.; Benomar, O.; Bonanno, A.; Corsaro, E.; Gaulme, P.; Hekker, S.; Régulo, C.; Salabert, D.; Verner, G.; White, T. R.; Brandão, I. M.; Creevey, O. L.; Dogan, G.; Bazot, M.; Cunha, M. S.; Elsworth, Y.; Huber, D.; Hale, S. J.; Houdek, G.; Karoff, C.; Lundkvist, M.; Metcalfe, T. S.; Molenda-Zakowicz, J.; Monteiro, M. J. P. F. G.; Thompson, M. J.; Stello, D.; Christensen-Dalsgaard, J.; Gilliland, R. L.; Kawaler, S. D.; Kjeldsen, H.; Clarke, B. D.; Girouard, F. R.; Hall, J. R.; Quintana, E. V.; Sanderfer, D. T.; Seader, S. E.

    2012-09-01

    Having started science operations in May 2009, the Kepler photometer has been able to provide exquisite data for solar-like stars. Five of the 42 stars observed continuously during the survey phase show evidence of oscillations, even though they are rather faint (magnitudes from 10.5 to 12). In this paper, we present an overview of the results of the seismic analysis of four of these stars observed for more than eight months.

  5. Analysis and models of pre-injection surface seismic array noise recorded at the Aquistore carbon storage site

    NASA Astrophysics Data System (ADS)

    Birnie, Claire; Chambers, Kit; Angus, Doug; Stork, Anna L.

    2016-08-01

    Noise is a persistent feature in seismic data and so poses challenges to extracting increased accuracy in seismic images and to the physical interpretation of the subsurface. In this paper, we analyse passive seismic data from the Aquistore carbon capture and storage pilot project permanent seismic array to characterise, classify and model seismic noise. We perform noise analysis for a three-month subset of passive seismic data from the array and provide conclusive evidence that the noise field is not white, stationary, or Gaussian, characteristics commonly yet erroneously assumed in most conventional noise models. We introduce a novel noise modelling method that provides a significantly more accurate characterisation of real seismic noise compared to conventional methods, which is quantified using the Mann-Whitney-White statistical test. This method is based on a statistical covariance modelling approach created through the modelling of individual noise signals. The identification of individual noise signals, broadly classified as stationary, pseudo-stationary and non-stationary, provides a basis on which to build an appropriate spatial and temporal noise field model. Furthermore, we have developed a workflow to incorporate realistic noise models within synthetic seismic data sets, providing an opportunity to test and analyse detection and imaging algorithms under realistic noise conditions.
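    The kind of distribution comparison the authors quantify with a Mann-Whitney-type test can be illustrated with a minimal pairwise U statistic. This is a generic textbook implementation, not the paper's code, and the toy amplitude samples stand in for real noise records.

```python
def mann_whitney_u(x, y):
    # U counts, over all (xi, yi) pairs, how often xi exceeds yi, with
    # ties counted as 0.5. For samples drawn from the same distribution,
    # U is close to len(x) * len(y) / 2.
    u = 0.0
    for xi in x:
        for yi in y:
            if xi > yi:
                u += 1.0
            elif xi == yi:
                u += 0.5
    return u

def similarity(real, modelled):
    # Normalized U in [0, 1]; values near 0.5 indicate that modelled
    # noise amplitudes are statistically similar to the real ones,
    # while values near 0 or 1 indicate a systematic mismatch.
    return mann_whitney_u(real, modelled) / (len(real) * len(modelled))

real = [0.1, -0.3, 0.2, 0.05, -0.15]
print(similarity(real, real))       # 0.5: identical distributions
print(similarity(real, [5.0] * 5))  # 0.0: model badly overestimates
```

    A noise model that reproduces the real noise field well should score near 0.5 against held-out real data, which is the intuition behind using such a rank test to rank modelling methods.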

  6. Watershed Reliability, Resilience And Vulnerability Analysis Under Uncertainty Using Water Quality Data

    EPA Science Inventory

    A method for assessment of watershed health is developed by employing measures of reliability, resilience and vulnerability (R-R-V) using stream water quality data. Observed water quality data are usually sparse, so that a water quality time series is often reconstructed using s...
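Reliability-resilience-vulnerability measures for a time series are commonly computed in the style of Hashimoto et al. (1982). A minimal sketch on a hypothetical water quality series follows; the threshold, values, and exact metric definitions are illustrative assumptions and may differ from the EPA method:

```python
import numpy as np

def rrv(series, threshold):
    """Hashimoto-style R-R-V metrics for a series where values ABOVE
    `threshold` count as failure (a sketch, not the EPA formulation)."""
    fail = series > threshold
    # Reliability: fraction of time the system is in a satisfactory state.
    reliability = 1.0 - fail.mean()
    # Resilience: probability that a failure step is followed by recovery.
    transitions = fail[:-1] & ~fail[1:]
    resilience = transitions.sum() / fail[:-1].sum() if fail[:-1].sum() else 1.0
    # Vulnerability: mean exceedance magnitude during failure.
    vulnerability = (series[fail] - threshold).mean() if fail.any() else 0.0
    return reliability, resilience, vulnerability

# Hypothetical reconstructed concentration series (mg/L), standard of 4 mg/L.
conc = np.array([3.0, 3.5, 5.0, 6.0, 3.8, 3.9, 4.5, 3.2, 3.1, 3.0])
rel, res, vul = rrv(conc, 4.0)
print(rel, res, vul)
```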

  7. Martial Arts and Socially Vulnerable Youth. An Analysis of Flemish Initiatives

    ERIC Educational Resources Information Center

    Theeboom, Marc; De Knop, Paul; Wylleman, Paul

    2008-01-01

    Notwithstanding the lack of empirical support for its positive socio-psychological effects, numerous educators and welfare workers make use of martial arts in their work with socially vulnerable youth. Using qualitative methodology, the aims, approaches and personal experiences were analysed of teachers and co-ordinators involved in specific…

  8. Vulnerability to cavitation of central California Arctostaphylos (Ericaceae): a new analysis.

    PubMed

    Jacobsen, Anna L; Brandon Pratt, R

    2013-02-01

    A recent study, 'Influence of summer marine fog and low cloud stratus on water relations of evergreen woody shrubs (Arctostaphylos: Ericaceae) in the chaparral of central California' by M. Vasey, M.E. Loik, and V.T. Parker (2012, Oecologia, in press), presented data on the vulnerability to cavitation of eight Arctostaphylos species. We reanalyzed the vulnerability data presented in that manuscript using a different statistical model and have arrived at different conclusions than those reported previously. We suggest that regional differences have not led to differentiation in cavitation resistance among populations of an Arctostaphylos species and that, contrary to the conclusions of Vasey et al., the xylem of maritime species appears to be "overbuilt" for their current environment; these species do not appear to be especially vulnerable to water stress. Importantly, data on vulnerability to cavitation are limited for Arctostaphylos species from these sites. More specifically, the treatment factors of site and region were not replicated, and therefore conclusions drawn from these data are necessarily limited.

  9. A parametric study of nonlinear seismic response analysis of transmission line structures.

    PubMed

    Tian, Li; Wang, Yanming; Yi, Zhenhua; Qian, Hui

    2014-01-01

    The nonlinear seismic response of transmission line structures subjected to earthquake loading is studied parametrically in this paper. The transmission lines are modeled, based on a real project, with cable elements that account for the nonlinearity of the cable. Nonuniform ground motions are generated using a stochastic approach based on random vibration analysis. The effects of multicomponent ground motions, correlations among multicomponent ground motions, wave travel, coherency loss, and local site conditions on the responses of the cables are investigated using nonlinear time history analysis. The results show that multicomponent seismic excitations should be considered, but that the correlations among multicomponent ground motions can be neglected. The wave passage effect has a significant influence on the responses of the cables. The responses of the cables are affected significantly by the presence of coherency loss, but little by changes in its degree, and they change little with the degree of difference in site conditions. The effects of multicomponent ground motions, wave passage, coherency loss, and local site conditions should therefore be considered in the seismic design of transmission line structures.
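The wave-passage effect mentioned in the abstract amounts to the same ground motion arriving at successive supports with a delay of distance divided by apparent wave velocity. A minimal sketch with an invented accelerogram and an assumed apparent velocity (not values from the paper):

```python
import numpy as np

dt = 0.01                                            # time step (s)
t = np.arange(0, 10, dt)
accel = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)  # toy accelerogram

def delayed(record, distance, v_app, dt):
    """Shift a support motion by the wave-passage delay (whole samples)."""
    lag = int(round(distance / v_app / dt))
    return np.concatenate([np.zeros(lag), record[:len(record) - lag]])

v_app = 500.0                                        # apparent velocity (m/s), assumed
# Support motions at towers 0 m, 400 m, and 800 m along the line:
motions = [delayed(accel, d, v_app, dt) for d in (0.0, 400.0, 800.0)]
# The 400 m tower sees the motion 0.8 s (80 samples) later than the first.
print(np.argmax(motions[1]) - np.argmax(motions[0]))
```

In a full analysis each delayed record would also be modified for coherency loss and local site response before being applied as a support excitation.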

  10. Dynamics of the Oso-Steelhead landslide from broadband seismic analysis

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekström, G.

    2015-06-01

    We carry out a combined analysis of the short- and long-period seismic signals generated by the devastating Oso-Steelhead landslide that occurred on 22 March 2014. The seismic records show that the Oso-Steelhead landslide was not a single slope failure, but a succession of multiple failures distinguished by two major collapses that occurred approximately 3 min apart. The first generated long-period surface waves that were recorded at several proximal stations. We invert these long-period signals for the forces acting at the source, and obtain estimates of the first failure runout and kinematics, as well as its mass after calibration against the mass-centre displacement estimated from remote-sensing imagery. Short-period analysis of both events suggests that the source dynamics of the second event is more complex than the first. No distinct long-period surface waves were recorded for the second failure, which prevents inversion for its source parameters. However, by comparing the seismic energy of the short-period waves generated by both events we are able to estimate the volume of the second. Our analysis suggests that the volume of the second failure is about 15-30% of the total landslide volume, giving a total volume mobilized by the two events between 7 × 106 and 10 × 106 m3, in agreement with estimates from ground observations and lidar mapping.

  11. Assessment of prey vulnerability through analysis of wolf movements and kill sites.

    PubMed

    Bergman, Eric J; Garrott, Robert A; Creel, Scott; Borkowski, John J; Jaffe, Rosemary; Watson, E G R

    2006-02-01

    Within predator-prey systems behavior can heavily influence spatial dynamics, and accordingly, the theoretical study of how spatial dynamics relate to stability within these systems has a rich history. However, our understanding of these behaviors in large mammalian systems is poorly developed. To address the relationship between predator selection patterns, prey density, and prey vulnerability, we quantified selection patterns for two fine-scale behaviors of a recovering wolf (Canis lupus) population in Yellowstone National Park, Wyoming, USA. Wolf spatial data were collected between November and May from 1998-1999 until 2001-2002. Over four winters, 244 aerial locations, 522 ground-based telemetry locations, 1287 km of movement data from snow tracking, and the locations of 279 wolf kill sites were recorded. There was evidence that elk (Cervus elaphus) and bison (Bison bison) densities had a weak effect on the sites where wolves traveled and made kills. Wolf movements showed a strong selection for geothermal areas, meadows, and areas near various types of habitat edges. Proximity to edge and habitat class also had a strong influence on the locations where elk were most vulnerable to predation. There was little evidence that wolf kill sites differed from the places where wolves traveled, indicating that elk vulnerability influenced where wolves selected to travel. Our results indicate that elk are more vulnerable to wolves under certain conditions and that wolves are capable of selecting for these conditions. As such, vulnerability plays a central role in predator-prey behavioral games and can potentially impact the systems to which they relate. PMID:16705979

  12. A rainfall risk analysis thanks to an GIS based estimation of urban vulnerability

    NASA Astrophysics Data System (ADS)

    Renard, Florent; Pierre-Marie, Chapon

    2010-05-01

    The urban community of Lyon, situated in France in the north of the Rhône valley, comprises 1.2 million inhabitants within 515 km². With such a concentration of assets, policy makers and local elected officials attach great importance to the management of hydrological risks, particularly given the inherent characteristics of the territory. While the hazards associated with these risks in the territory of Lyon have been the subject of numerous analyses, studies on the vulnerability of greater Lyon are rare and share shortcomings that impair their validity. We recall that risk is classically seen as the relationship between the probability of occurrence of a hazard and vulnerability. In this article, vulnerability comprises two parts. The first is the sensitivity of the stakes exposed to hydrological hazards such as urban runoff, that is to say, their propensity to suffer damage during a flood (Gleize and Reghezza, 2007). The second is their relative importance in the functioning of the community. Indeed, not all stakes provide the same role and contribution to Greater Lyon: damage to urban furniture such as a bus shelter is less harmful to the activities of the urban area than damage to transport infrastructure (Renard and Chapon, 2010). This communication proposes to assess the vulnerability of the Lyon urban area to hydrological hazards. The territory comprises human, environmental and material stakes. The first part of this work identifies all these stakes as completely as possible. A "vulnerability index" is then built (Tixier et al., 2006), using multicriteria decision aid methods to evaluate the two components of vulnerability: sensitivity and contribution to the functioning of the community. Finally, the results of the overall vulnerability are presented, and then coupled to various hazards related to water such as runoff associated with
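The two-component index described in this abstract can be sketched as a weighted multicriteria score per stake: one criterion for sensitivity to the hazard, one for contribution to the functioning of the community. All scores and weights below are illustrative assumptions, not values from the study:

```python
# Hypothetical stakes with sensitivity and contribution-to-functioning scores
# in [0, 1]; the bus shelter vs. transport infrastructure contrast echoes the
# abstract's example.
stakes = {
    "bus shelter":              {"sensitivity": 0.8, "contribution": 0.1},
    "transport infrastructure": {"sensitivity": 0.6, "contribution": 0.9},
    "hospital":                 {"sensitivity": 0.7, "contribution": 1.0},
}
w_sens, w_contrib = 0.4, 0.6   # assumed criterion weights (sum to 1)

# Weighted-sum vulnerability index per stake.
index = {name: w_sens * s["sensitivity"] + w_contrib * s["contribution"]
         for name, s in stakes.items()}
ranked = sorted(index, key=index.get, reverse=True)
print(ranked)
```

Real multicriteria decision-aid methods (outranking, AHP, etc.) go beyond a weighted sum, but the coupling of the resulting index with hazard maps proceeds the same way.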

  13. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA. [Human Reliability Analysis (HRA)

    SciTech Connect

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-01-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses of human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors, and medical applications of nuclear technology, an iterative process is suggested for doing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of the HRA.

  14. Numerical Analysis of JNES Seismic Tests on Degraded Combined Piping System

    SciTech Connect

    Zhang T.; Nie J.; Brust, F.; Wilkowski, G.; Hofmayer, C.; Ali, S.; Shim, D-J.

    2012-02-02

    Nuclear power plant safety under seismic conditions is an important consideration. In aged plants, piping systems may have defects caused by fatigue, stress corrosion cracking, etc. These cracks may not only affect the seismic response but also grow and break through, causing loss of coolant. Therefore, an evaluation method needs to be developed to predict crack growth behavior under seismic excitation. This paper describes efforts conducted to analyze and better understand a series of degraded pipe tests under seismic loading conducted by the Japan Nuclear Energy Safety Organization (JNES). A special 'cracked-pipe element' (CPE) concept, in which the element represents the global moment-rotation response due to the crack, was developed. This approach significantly simplifies dynamic finite element analysis in fracture mechanics applications. Model validation was conducted by comparison with a series of pipe tests with circumferential through-wall and surface cracks under different excitation conditions. These analyses showed that reasonably accurate predictions could be made using the ABAQUS connector element to model the complete transition of a circumferential surface crack to a through-wall crack under cyclic dynamic loading. The JNES primary loop recirculation piping test was analyzed in detail. This combined-component test had three crack locations and multiple applied simulated seismic block loadings. Comparisons were also made between the ABAQUS finite element (FE) analysis results and the measured displacements in the experiment. Good agreement was obtained, and it was confirmed that the simplified modeling is applicable to a seismic analysis of a cracked pipe on the basis of fracture mechanics. Pipe system leakage did occur in the JNES tests. The analytical predictions using the CPE approach did not predict leakage, suggesting that cyclic ductile tearing with large-scale plasticity was not the crack growth mode for

  15. Seismic response of a full-scale wind turbine tower using experimental and numerical modal analysis

    NASA Astrophysics Data System (ADS)

    Kandil, Kamel Sayed Ahmad; Saudi, Ghada N.; Eltaly, Boshra Aboul-Anen; El-khier, Mostafa Mahmoud Abo

    2016-09-01

    Wind turbine technology has developed tremendously over the past years. In Egypt, the Zafarana wind farm is currently generating at a capacity of 517 MW, making it one of the largest onshore wind farms in the world. It is located in an active seismic zone along the west side of the Gulf of Suez. Accordingly, seismic risk assessment is required to study the structural integrity of wind towers under expected seismic hazard events. In the context of the ongoing joint Egypt-US research project "Seismic Risk Assessment of Wind Turbine Towers in Zafarana wind Farm Egypt" (Project ID: 4588), this paper describes the dynamic performance investigation of an existing Nordex N43 wind turbine tower. Both experimental and numerical work are presented, explaining the methodology adopted to investigate the dynamic behavior of the tower under seismic load. Field dynamic testing of the full-scale tower was performed using ambient vibration techniques (AVT). Both frequency-domain and time-domain methods were utilized to identify the actual dynamic properties of the tower as built on site. The natural frequencies, their corresponding mode shapes, and the damping ratios of the tower were successfully identified using AVT. A vibration-based finite element model (FEM) was constructed using ANSYS V.12 software. The numerical and experimental modal analysis results were compared for matching purposes. Using different simulation considerations, the initial FEM was updated to finally match the experimental results with good agreement. Using the final updated FEM, the response of the tower under the AQABA earthquake excitation was investigated. Time history analysis was conducted to define the seismic response of the tower in terms of structural stresses and displacements. This work is considered one of the pioneering structural studies of wind turbine towers in Egypt. Identification of the actual dynamic properties of the existing tower was successfully performed
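Matching experimental and numerical mode shapes, as done in this study during FEM updating, is commonly quantified with the Modal Assurance Criterion (MAC); whether the authors used MAC specifically is not stated in the abstract, and the mode-shape values below are invented for illustration, not Nordex N43 results:

```python
import numpy as np

def mac(phi_exp, phi_fem):
    """Modal Assurance Criterion between an experimental and a numerical
    mode shape; 1.0 means perfectly correlated shapes."""
    num = np.abs(phi_exp @ phi_fem) ** 2
    return num / ((phi_exp @ phi_exp) * (phi_fem @ phi_fem))

# Illustrative first bending mode of a tower at four measurement heights.
phi_test = np.array([0.10, 0.35, 0.70, 1.00])   # from ambient vibration test
phi_model = np.array([0.12, 0.33, 0.72, 0.98])  # from the updated FEM
print(round(mac(phi_test, phi_model), 3))
```

MAC values near 1 for paired modes (together with close natural frequencies) are the usual acceptance criterion when declaring an updated FEM "matched" to test data.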

  16. Quantifying Distribution of Recent Sediment Using XRF Analysis and Seismic Data in the Hudson River Estuary

    NASA Astrophysics Data System (ADS)

    Haberman, M.; Nitsche, F. O.; Kenna, T. C.; Sands, E.; Bell, R. E.; Ryan, W. B.

    2006-12-01

    estimate average sedimentation rates. In general, we found good correspondence between results of the sediment core analysis and the layers identified in the seismic data. In several cases, the availability of the lead concentration information improved interpretation of the seismic data, allowing us to avoid over/underestimation of the thickness of the recent layer. The use of both seismic and sediment core analysis provides a more detailed and reliable map of the distribution of recent sediment deposition and more accurate volume estimates.

  17. Spatial Analysis of the Level of Exposure to Seismic Hazards of Health Facilities in Mexico City, Mexico

    NASA Astrophysics Data System (ADS)

    Moran, S.; Novelo-Casanova, D. A.

    2011-12-01

    Although health facilities are essential infrastructure during disasters and emergencies, they are also usually highly vulnerable installations when large and major earthquakes occur. Hospitals are among the most complex critical facilities in modern cities and are used as first response in emergency situations. The operability of a hospital must be maintained after the occurrence of a strong local earthquake in order to satisfy the need for medical care of the affected population. If a health facility is seriously damaged, it cannot fulfill its function when it is most needed; in this case, hospitals become a casualty of the disaster. To identify the level of physical exposure of hospitals to seismic hazards in Mexico City, we analyzed their geographic location with respect to the seismic response of the different soil types of the city during past earthquakes, mainly the events of September 1985 (Ms = 8.0) and April 1989 (Ms = 6.9). Seismic wave amplification in this city results from the interaction of incoming seismic waves with the soft, water-saturated clay soils on which a large part of Mexico City is built. The clay soils are remnants of the lake that existed in the Valley of Mexico, which has been drained gradually to accommodate the growing urban sprawl. Hospital facilities were converted from a simple database of names and locations into a map layer of resources. This resource layer was combined with other map layers showing areas of seismic microzonation in Mexico City. The overlay was then used to identify those hospitals that may be threatened by the occurrence of a large or major seismic event. We analyzed the public and private hospitals considered as main health facilities. Our results indicate that more than 50% of the hospitals are highly exposed to seismic hazards. Moreover, in most of these health facilities we identified a lack of preventive measures and preparedness to reduce their
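The GIS overlay described in this record reduces, at its core, to testing each hospital's coordinates against microzonation polygons. A minimal self-contained sketch using ray casting; the zone polygon, hospital names, and coordinates are invented, not real Mexico City data:

```python
def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                            # crossing to the right
                inside = not inside
    return inside

# Hypothetical high-amplification (former lake-zone) polygon and hospital
# locations in an arbitrary local grid.
lake_zone = [(0, 0), (4, 0), (4, 4), (0, 4)]
hospitals = {"H1": (1, 1), "H2": (5, 2), "H3": (3, 3)}

exposed = [name for name, (x, y) in hospitals.items()
           if point_in_polygon(x, y, lake_zone)]
print(exposed)
```

A GIS package would handle projections and polygon topology, but the exposure classification is exactly this membership test repeated over every facility and zone.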

  18. Seasonal modulation of the seismicity in Provence (France): from early historical analysis to case studies and mechanisms

    NASA Astrophysics Data System (ADS)

    Le Dortz, K.; Bollinger, L.; Leroy, Y. M.

    2011-12-01

    Southeastern France is a region of moderate seismic activity characterised by shallow seismic events, regularly felt and reported by local inhabitants in several districts. An analysis of one of the very first earthquake catalogues covering the region led Alexis Perrey, a famous author of regional monographs and annual catalogues of seismicity, to propose in 1845 that the area was characterised by an annual periodicity of earthquakes: the seismic activity appeared to be greater during the autumn and winter months than during the summer months. Although not free of biases, his analysis is interesting, his so-called seismic curves being aftershock-depleted by a 'unit-earthquake technique' he developed for the occasion. Indeed, several atypical earthquake swarms, lasting days or months and regularly associated by observers with seasonal floods or catastrophic storms, occur in the region. Several localities have also been damaged by some of these seismic events, which can be destructive. We first review Perrey's seismicity analysis. We then focus on the description of some of the Provence regions that seem prone to atypical shallow seismicity and have a significant weight in the regional catalogue. We finally discuss several mechanisms able to modulate the local and/or regional seismicity. Most of these mechanisms involve the local hydrology and influence the local stresses at shallow depths in a subtle way. Perrey, A. (1845), Mémoire sur les tremblements de terre ressentis dans le bassin du Rhône, Annales des sciences physiques et naturelles, d'agriculture et d'industrie, 265-346.

  19. Detection of ancient morphology and potential hydrocarbon traps using 3-D seismic data and attribute analysis

    SciTech Connect

    Heggland, R.

    1995-12-31

    This paper presents the use of seismic attributes on 3D data to reveal Tertiary and Cretaceous geological features in Norwegian block 9/2. Some of these features would hardly be possible to map using only 2D seismic data. The method, which involves precise interpretation of horizons, attribute analysis, and manipulation of colour displays, may be useful when studying morphology, faults, and hydrocarbon traps. The interval of interest in this study was from 0 to 1.5 s TWT. Horizontal displays (timeslices and attribute maps) highlighted geological features such as shallow channels, fractures, karst topography, and faults very clearly. The attributes used for mapping these features were amplitude, total reflection energy (a volume or time-interval attribute), dip, and azimuth. The choice of colour scale and the manipulation of colour displays were also critical to the results. The data examples clearly demonstrate how detailed a mapping of geological features can be achieved using 3D seismic data and attribute analysis. The results of this study were useful for understanding hydrocarbon migration paths and hydrocarbon traps.
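Two of the attributes named in this abstract, amplitude at a picked horizon and total reflection energy over a window, are straightforward to compute per trace. A minimal sketch on a toy synthetic trace (sample rate, horizon time, and window length are assumptions, not values from the paper):

```python
import numpy as np

dt = 0.004                                   # sample interval (s), assumed
trace = np.sin(2 * np.pi * 30 * np.arange(0, 1.5, dt))  # toy 30 Hz trace

horizon_t = 0.50                             # picked horizon time (s), assumed
i = int(round(horizon_t / dt))
amplitude = trace[i]                         # amplitude attribute at the horizon

win = slice(i - 10, i + 10)                  # 80 ms window around the horizon
total_energy = np.sum(trace[win] ** 2) * dt  # total reflection energy attribute
rms_amp = np.sqrt(np.mean(trace[win] ** 2))  # closely related RMS amplitude
print(round(rms_amp, 3))
```

In an interpretation workflow these scalars are computed for every trace along the picked horizon and displayed as a map, where channels, karst, and faults show up as lateral attribute anomalies.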

  20. A status report on the development of SAC2000: A new seismic analysis code

    SciTech Connect

    Goldstein, P.; Minner, L.

    1995-08-01

    We are developing a new Seismic Analysis Code (SAC2000) that will meet the research needs of the seismic research and treaty monitoring communities. Our first step in this development was to rewrite the original Seismic Analysis Code (SAC) -- a Fortran code that was approximately 140,000 lines long -- in the C programming language. This rewrite has resulted in a much more robust code that is faster, more efficient, and more portable than the original. We have implemented important processing capabilities such as convolution and binary monograms, and we have significantly enhanced several previously existing capabilities. For example, the spectrogram command now produces a correctly registered plot of the input time series and a color image of the output spectrogram. We have also added an image plotting capability with access to 17 predefined color tables or custom color tables. A rewritten version of the readcss command can now be used to access any of the documented css.3.0 database data formats, a capability that is particularly important to the Air Force Technical Applications Center (AFTAC) and the monitoring community. A much less visible, but extremely important contribution is the correction of numerous inconsistencies and errors that have evolved because of piecemeal development and limited maintenance since SAC was first written. We have also incorporated on-line documentation and have made SAC documentation available on the Internet via the world-wide-web at http://www-ep/tvp/sac.html.
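The spectrogram capability described for SAC2000 (a registered time series plus a time-frequency image) can be illustrated with SciPy rather than SAC itself; the synthetic record below, with frequency content rising over time, is an assumption for demonstration:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 100.0                                     # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
sig = np.sin(2 * np.pi * (1.0 + 0.4 * t) * t)  # instantaneous frequency rises

# Columns of Sxx are short-time power spectra; plotting Sxx vs. (times, f)
# gives the colour spectrogram image, registered against the input trace.
f, times, Sxx = spectrogram(sig, fs=fs, nperseg=256)
peak_freqs = f[np.argmax(Sxx, axis=0)]         # dominant frequency per column
print(peak_freqs[0] < peak_freqs[-1])
```

The "correctly registered" plot mentioned in the abstract simply means the time axis of `times` lines up with the sample times of `sig` above it.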

  1. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.
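One simple way to see why slow sediment interbeds dominate the behavior of such an S-wave velocity profile is the travel-time-averaged velocity of the layered column: each thin, slow interbed contributes disproportionately to the vertical travel time. The layer thicknesses and velocities below are illustrative assumptions, not the INL site profile:

```python
import numpy as np

def travel_time_average_vs(thickness_m, vs_m_s):
    """Time-averaged shear-wave velocity of a layered column:
    total thickness divided by total vertical S-wave travel time."""
    thickness_m = np.asarray(thickness_m, float)
    vs_m_s = np.asarray(vs_m_s, float)
    return thickness_m.sum() / (thickness_m / vs_m_s).sum()

# Illustrative alternating basalt / sediment-interbed column.
thickness = [30.0, 10.0, 40.0, 20.0]        # m
vs        = [1500.0, 400.0, 1600.0, 450.0]  # m/s

print(round(travel_time_average_vs(thickness, vs), 1))
```

Although the basalt makes up 70% of this toy column, the average velocity lands far below the basalt velocities, which is why site-specific profiles (rather than a uniform-rock assumption) were needed in the PSHA.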

  2. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  3. Source Mechanism, Stress Triggering, and Hazard Analysis of Induced Seismicity in Oil/Gas Fields in Oman and Kuwait

    NASA Astrophysics Data System (ADS)

    Gu, C.; Toksoz, M. N.; Ding, M.; Al-Enezi, A.; Al-Jeri, F.; Meng, C.

    2015-12-01

    Induced seismicity has drawn renewed attention in both academia and industry in recent years, owing to increasing seismic activity in oil/gas field regions caused by fluid injection/extraction and hydraulic fracturing. The source mechanisms and triggering stresses of these induced earthquakes are of great importance for understanding their causes and the physics of seismic processes in reservoirs. Previous research on induced seismic events in conventional oil/gas fields assumed a double couple (DC) source mechanism. The induced seismic data in this study are from both Oman and Kuwait. For the Oman data, the induced seismicity is monitored by both surface and borehole networks. We determined the full moment tensors of the induced seismicity data using a full-waveform inversion method (Song and Toksöz, 2011). With the full moment tensor inversion results, Coulomb stress is calculated to investigate the triggering features of the induced seismicity. Our results show a detailed evolution of 3D triggering stress in the oil/gas fields from 1999 to 2007 for Oman, and from 2006 to 2015 for Kuwait. In addition, the local hazard corresponding to the induced seismicity in these oil/gas fields is assessed and compared to ground motion predictions for large (M>5.0) regional tectonic earthquakes.
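The Coulomb stress calculation used for triggering analysis rests on the standard Coulomb failure stress change, dCFS = d_tau + mu' * d_sigma_n (shear stress change on the receiver fault plus effective friction times the normal stress change, with unclamping taken positive). The numbers below are illustrative, not results from the Oman/Kuwait study:

```python
def coulomb_stress_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    """Coulomb failure stress change (MPa) on a receiver fault:
    shear stress change plus effective friction times normal stress change
    (unclamping positive). mu_eff = 0.4 is a commonly assumed value."""
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# A receiver fault receiving 0.05 MPa added shear and 0.02 MPa unclamping
# is brought closer to failure:
dcfs = coulomb_stress_change(0.05, 0.02)
print(round(dcfs, 3))  # 0.058
assert dcfs > 0  # positive dCFS promotes failure on the receiver fault
```

In a full analysis the stress changes come from the inverted moment tensors resolved onto candidate fault orientations throughout the reservoir volume; positive-dCFS lobes mark regions where subsequent induced events are expected to cluster.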

  4. Seismic monitoring of the Séchilienne Rockslide (French Alps): analysis of seismic signals and their correlation with rainfalls

    NASA Astrophysics Data System (ADS)

    Helmstetter, Agnès.; Garambois, Stéphane; Kasperski, Johan; Duranthon, Jean-Paul; Pothérat, Pierre

    2010-05-01

    In the French Alps, the Séchilienne rockslide is one of the natural phenomena presenting the highest risk in terms of socio-economic consequences. The rockslide has been officially recognized as active for a few decades and has been instrumented since 1985 for surveillance purposes. The currently very active volume is roughly estimated at up to 3 million m3, located on the border of a slowly moving mass of 50 to 100 million m3. The velocity of the most active zone reached 1.4 m/yr in 2008, about twice the value of 2000. To assess the potential of seismology to supplement the current monitoring system, presently based on displacement measurements, a seismic network was installed in May 2007. It consists of three seismological stations deployed as antennas, comprising 37 velocimeters, installed through the French national observatory on landslides (OMIV). Besides its main role in monitoring the seismic activity within the landslide, the network also aims to assess potential seismic site effects in the case of earthquakes. Finally, it could also be useful for detecting and characterizing possible seismic velocity changes over time using noise correlation methods, which require long observation periods. The seismological network has now recorded several thousand events, mostly due to rockfalls, but also hundreds of local (within the landslide) and regional earthquakes. We show here that most of the recorded events can be distinguished and classified using their signal characteristics (frequency, duration). Some of the events generated by rockfalls were also recorded by a camera facing the landslide for large-volume studies. Unfortunately, the acquired images are presently unable to provide the crucial information on fallen volumes, which prevents any calibration between seismic energy and rockfall volume. 
We also found that rockfalls and micro-seismicity, which occur in bursts of activity, were weakly

  5. Detection, location, and analysis of earthquakes using seismic surface waves (Beno Gutenberg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Ekström, Göran

    2015-04-01

    For shallow sources, Love and Rayleigh waves are the largest seismic phases recorded at teleseismic distances. The utility of these waves for earthquake characterization was traditionally limited to magnitude estimation, since geographically variable dispersion makes it difficult to determine useful travel-time information from the waveforms. Path delays due to heterogeneity of several tens of seconds are typical for waves at 50 sec period, and these delays must be accounted for with precision and accuracy in order to extract propagation-phase and source-phase information. Advances in tomographic mapping of global surface-wave phase velocities, and continuous growth and improvements of seismographic networks around the world, now make possible new applications of surface waves for earthquake monitoring and analysis. Through continuous back propagation of the long-period seismic wave field recorded by globally distributed stations, nearly all shallow earthquakes greater than M=5 can be detected and located with a precision of 25 km. Some of the detected events do not appear in standard earthquake catalogs and correspond to non-tectonic earthquakes, including landslides, glacier calving, and volcanic events. With the improved ability to predict complex propagation effects of surface waves across a heterogeneous Earth, moment-tensor and force representations of seismic sources can be routinely determined for all earthquakes greater than M=5 by waveform fitting of surface waves. A current area of progress in the use of surface waves for earthquake studies is the determination of precise relative locations of remote seismicity by systematic cross correlation and analysis of surface waves generated by neighboring sources. Preliminary results indicate that a location precision of 5 km may be achievable in many areas of the world.
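The relative-location step described at the end of this lecture abstract rests on measuring surface-wave time shifts between neighboring events by cross-correlation. A minimal sketch on synthetic long-period records (the wavelet, sampling rate, and delay are invented, not real data):

```python
import numpy as np

fs = 10.0                          # long-period sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
# Toy surface-wave packet: a Gaussian-enveloped long-period oscillation.
wavelet = np.exp(-((t - 30) / 5) ** 2) * np.sin(2 * np.pi * 0.05 * t)

shift_s = 2.5                      # true extra travel time for event B (s)
rec_a = wavelet                                          # event A at the station
rec_b = np.interp(t - shift_s, t, wavelet, left=0.0, right=0.0)  # delayed event B

# Cross-correlate and convert the peak index to a lag in seconds.
xc = np.correlate(rec_b, rec_a, mode="full")
lag = (np.argmax(xc) - (len(rec_a) - 1)) / fs
print(lag)
```

Repeating this measurement over many stations and azimuths turns the per-station lags into relative location estimates for the event pair, which is how kilometre-scale precision becomes achievable.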

  6. Seismic bearing

    NASA Astrophysics Data System (ADS)

    Power, Dennis

    2009-05-01

    Textron Systems (Textron) has been using geophones for target detection for many years. This sensing capability was previously utilized for detection and classification purposes only. Recently, Textron has been evaluating multiaxis geophones to calculate bearings and track targets, more specifically personnel. This capability will not only aid the system in locating personnel in bearing space or Cartesian space, but will also enhance detection and reduce false alarms. Textron has been involved in the testing and evaluation of several sensors at multiple sites. One of the challenges of calculating seismic bearing is obtaining an adequate signal-to-noise ratio. The sensor signal-to-noise ratio is a function of sensor coupling to the ground, seismic propagation, and range to target. The goals of testing at multiple sites are to gain a good understanding of the maximum and minimum ranges for bearing and detection and to exploit that information to tailor sensor system emplacement to achieve the desired performance. Test sites include 10A Site, Devens, MA; McKenna Airfield, Ft. Benning, GA; and Yuma Proving Ground, Yuma, AZ. Geophone sensors evaluated include a 28 Hz triax spike, a 15 Hz triax spike, and a hybrid triax spike consisting of a 10 Hz vertical geophone and two 28 Hz horizontal geophones. The algorithm uses raw seismic data to calculate the bearings. All evaluated sensors have a triaxial geophone configuration mounted to a spike housing/fixture. The suite of sensors also allows various types of geophones to be compared, to evaluate the benefits of lower bandwidth. The data products of these tests include raw geophone signals, seismic features, seismic bearings, seismic detections, and GPS position truth data. The analyses produce probability of detection vs. range, bearing accuracy vs. range, and seismic feature level vs. range. These analysis products are compared across test sites and sensor types.
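A common way to estimate bearing from a triaxial geophone, sketched below, is the principal axis of the horizontal particle-motion covariance; it carries a 180-degree ambiguity that a real system must resolve (e.g. with the vertical channel). This is a generic polarization method on synthetic data, not Textron's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
true_bearing_deg = 40.0                          # assumed target bearing

t = np.linspace(0, 1, 500)
src = np.sin(2 * np.pi * 20 * t)                 # toy 20 Hz seismic signal
b = np.radians(true_bearing_deg)
# Horizontal channels: signal projected onto north/east plus sensor noise.
north = src * np.cos(b) + 0.05 * rng.standard_normal(t.size)
east  = src * np.sin(b) + 0.05 * rng.standard_normal(t.size)

# Principal axis of the 2x2 horizontal covariance matrix gives the
# polarization azimuth (modulo 180 degrees).
cov = np.cov(np.vstack([north, east]))
w, v = np.linalg.eigh(cov)
principal = v[:, np.argmax(w)]                   # [north, east] components
est = np.degrees(np.arctan2(principal[1], principal[0])) % 180.0
print(round(est, 1))
```

The abstract's point about signal-to-noise ratio shows up directly here: shrinking the signal relative to the 0.05 noise term degrades the covariance estimate and widens the bearing error.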

  7. New Software for Long-Term Storage and Analysis of Seismic Wave Data

    NASA Astrophysics Data System (ADS)

    Cervelli, D. P.; Cervelli, P. F.; Murray, T. L.

    2004-12-01

    Large seismic networks generate a substantial quantity of data that must be first archived, and then disseminated, visualized, and analyzed, in real-time, in the office or from afar. To achieve these goals for the Alaska Volcano Observatory we developed two software packages: Winston, a database for storing seismic wave data, and Swarm, an application for analyzing and browsing the data. We also modified an existing package, Valve, an internet web-browser based interface to various data sets developed at the Hawaiian Volcano Observatory, to communicate with Winston. These programs provide users with the tools necessary to monitor many commonly used geophysical parameters. Winston, Wave Information Storage Network, uses a vendor-neutral SQL database to store seismic wave data. Winston's primary design goal was simple: develop a more robust, scalable, long-term replacement for the Earthworm waveserver. Access to data within the Winston database is through a scalable internet based server application, an Earthworm waveserver emulator, or directly via SQL queries. Some benefits of using an SQL database are easy backups and exports, speed, and reliability. Swarm, Seismic Wave Analysis and Real-time Monitor, is a stand-alone application that was designed to replace the traditional drum helicorder and computer wave viewer with an intuitive and interactive interface for rapidly assessing volcanic hazard, browsing through past data, and analyzing waveforms. Users can easily view waves in traditional analytic ways, such as frequency spectra or spectrograms, and employ standard analytic tools like filtering. Swarm allows efficient dissemination of data and breaks cross-disciplinary barriers by creating an accessible interface to seismic data for non-seismologists. Swarm currently operates with many seismic data sources including Earthworm waveservers and SEED files. Lastly, Swarm can be a valuable education and outreach tool by using its Kiosk Mode: a full-screen mode that
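    Winston's actual schema is not given here, but the core idea of storing fixed blocks of wave samples keyed by channel and start time in a vendor-neutral SQL database can be sketched as follows (table and column names are invented; SQLite stands in for the production database):

```python
import sqlite3
import struct

# Hypothetical analogue of a wave-storage schema in the spirit of
# Winston: blocks of packed samples keyed by channel and start time.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE wave_blocks (
        channel    TEXT NOT NULL,   -- e.g. 'AV.SPU.EHZ'
        start_time REAL NOT NULL,   -- epoch seconds of first sample
        rate_hz    REAL NOT NULL,   -- sampling rate
        samples    BLOB NOT NULL,   -- packed integer counts
        PRIMARY KEY (channel, start_time)
    )""")

data = struct.pack("<4i", 10, -3, 7, 2)
con.execute("INSERT INTO wave_blocks VALUES (?, ?, ?, ?)",
            ("AV.SPU.EHZ", 1.0e9, 100.0, data))

# Time-window query: the core operation a waveserver emulator would run
rows = con.execute(
    "SELECT samples FROM wave_blocks "
    "WHERE channel=? AND start_time>=? AND start_time<?",
    ("AV.SPU.EHZ", 0.9e9, 1.1e9)).fetchall()
print(struct.unpack("<4i", rows[0][0]))  # → (10, -3, 7, 2)
```

Keying on (channel, start_time) is what makes the backup, export, and time-window retrieval operations mentioned in the abstract simple SQL statements.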

  8. Vulnerability analysis and passenger source prediction in urban rail transit networks.

    PubMed

    Wang, Junjie; Li, Yishuai; Liu, Jingyu; He, Kun; Wang, Pu

    2013-01-01

    Based on large-scale human mobility data collected in San Francisco and Boston, the morning peak urban rail transit (URT) origin-destination (OD) matrices were estimated and the most vulnerable URT segments, those capable of causing the largest service interruptions, were identified. In both URT networks, a few highly vulnerable segments were observed. For this small group of vital segments, the impact of failure must be carefully evaluated. A bipartite URT usage network was developed and used to determine the inherent connections between urban rail transit and its passengers' travel demands. Although passengers' origins and destinations were easy to locate for a large number of URT segments, a few show very complicated spatial distributions. Based on the bipartite URT usage network, a new layer of understanding of a URT segment's vulnerability can be achieved by taking into account the difficulty of addressing the failure of a given segment. Two proof-of-concept cases are described: the possible transfer of passenger flow to the road network is predicted for failures of two representative URT segments in San Francisco.
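    The segment-vulnerability idea can be illustrated with a toy OD matrix: rank each segment by the passenger demand whose routes traverse it. All demands and routes below are invented for illustration:

```python
# Hypothetical morning-peak OD demands (passengers) and the URT
# segments each OD pair's route uses.
od_demand = {("A", "C"): 500, ("A", "D"): 200, ("B", "C"): 300}
od_route = {("A", "C"): ["s1", "s2"],
            ("A", "D"): ["s1", "s3"],
            ("B", "C"): ["s2"]}

def segment_vulnerability(segment):
    """Passengers whose trips are interrupted if `segment` fails."""
    return sum(d for od, d in od_demand.items() if segment in od_route[od])

segments = sorted({s for route in od_route.values() for s in route},
                  key=segment_vulnerability, reverse=True)
print([(s, segment_vulnerability(s)) for s in segments])
# → [('s2', 800), ('s1', 700), ('s3', 200)]
```

The bipartite usage network in the paper refines this ranking by also asking how easily the affected passengers' origins and destinations can be served by alternatives, such as the road network.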

  9. Vulnerability Analysis and Passenger Source Prediction in Urban Rail Transit Networks

    PubMed Central

    Wang, Junjie; Li, Yishuai; Liu, Jingyu; He, Kun; Wang, Pu

    2013-01-01

    Based on large-scale human mobility data collected in San Francisco and Boston, the morning peak urban rail transit (URT) origin-destination (OD) matrices were estimated and the most vulnerable URT segments, those capable of causing the largest service interruptions, were identified. In both URT networks, a few highly vulnerable segments were observed. For this small group of vital segments, the impact of failure must be carefully evaluated. A bipartite URT usage network was developed and used to determine the inherent connections between urban rail transit and its passengers' travel demands. Although passengers' origins and destinations were easy to locate for a large number of URT segments, a few show very complicated spatial distributions. Based on the bipartite URT usage network, a new layer of understanding of a URT segment's vulnerability can be achieved by taking into account the difficulty of addressing the failure of a given segment. Two proof-of-concept cases are described: the possible transfer of passenger flow to the road network is predicted for failures of two representative URT segments in San Francisco. PMID:24260355

  10. A procedure for seismic risk reduction in Campania Region

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.

    2008-07-01

    The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings, and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies before approving the safety checks. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At present, several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from previous data and vulnerability studies, was selected for immediate retrofitting designs. Secondly, another set of buildings was identified for structural strengthening. These were selected using the criteria specified in the guidelines prepared by the scientific committee and based on

  11. A procedure for seismic risk reduction in Campania Region

    SciTech Connect

    Zuccaro, G.; Palmieri, M.; Cicalese, S.; Grassi, V.; Rauci, M.; Maggio, F.

    2008-07-08

    The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings, and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies before approving the safety checks. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At present, several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from previous data and vulnerability studies, was selected for immediate retrofitting designs. Secondly, another set of buildings was identified for structural strengthening. These were selected using the criteria specified in the guidelines prepared by the scientific committee and based on

  12. Sampling and Analysis Plan - Waste Treatment Plant Seismic Boreholes Project

    SciTech Connect

    Reidel, Steve P.

    2006-05-26

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the basalt, up to three new deep rotary boreholes through the basalt and sedimentary interbeds, and one corehole through the basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials (ASTM) standards, Hanford Site procedures, and other guidance to be followed for data collection activities.

  13. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.
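    The controlling earthquake is commonly summarized through deaggregation of the PSHA: each scenario's contribution is its occurrence rate times its probability of exceeding the target ground motion, and the hazard-weighted mean magnitude and distance characterize the result. A sketch with purely illustrative numbers:

```python
# Each scenario: (magnitude, distance_km, annual rate, P(motion > target)).
# All values below are invented for illustration.
scenarios = [
    (5.5,  20.0, 1e-2, 0.02),
    (6.5,  50.0, 2e-3, 0.10),
    (7.5, 120.0, 5e-4, 0.15),
]

# Hazard contribution of each scenario = rate * exceedance probability
contrib = [(m, r, rate * p_exc) for m, r, rate, p_exc in scenarios]
total = sum(c for _, _, c in contrib)
m_bar = sum(m * c for m, _, c in contrib) / total   # mean magnitude
r_bar = sum(r * c for _, r, c in contrib) / total   # mean distance (km)
print(round(m_bar, 2), round(r_bar, 1))  # → 6.24 48.4
```

The (m_bar, r_bar) pair is one conventional definition of the controlling earthquake; the regulatory choices reviewed in the paper concern how such scenarios and weights are characterized.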

  14. Seismic response analysis of an instrumented building structure

    USGS Publications Warehouse

    Li, H.-J.; Zhu, S.-Y.; Celebi, M.

    2003-01-01

    The Sheraton-Universal Hotel, an instrumented building located in North Hollywood, California, is selected as a case study in this paper. The finite element method is used to produce a linear time-invariant structural model, and the SAP2000 program is employed for the time-history analysis of the instrumented structure under the base excitation of strong motions recorded in the basement during the Northridge, California, earthquake of 17 January 1994. The calculated structural responses are compared with the recorded data in both the time domain and the frequency domain, and the effects of structural parameter evaluation and indeterminate factors are discussed. Some features of the structural response, such as why the peak acceleration responses on the ninth floor are larger than those on the sixteenth floor, are also explained.

  15. Seismic evaluation of rocking structures through performance assessment and fragility analysis

    NASA Astrophysics Data System (ADS)

    Vetr, Mohammad G.; Nouri, Abolfazl Riahi; Kalantari, Afshin

    2016-03-01

    Numerical studies have been conducted for low- and medium-rise rocking structures to investigate their efficiency as earthquake-resisting systems in comparison with conventional structures. Several non-linear time-history analyses have been performed to evaluate seismic performance of selected cases at desired ground shaking levels, based on key parameters such as total and flexural story drifts and residual deformations. The Far-field record set is selected as input ground motions and median peak values of key parameters are taken as best estimates of system response. In addition, in order to evaluate the probability of exceeding relevant damage states, analytical fragility curves have been developed based on the results of the incremental dynamic analysis procedure. Small exceedance probabilities and acceptable margins against collapse, together with minor associated damages in main structural members, can be considered as superior seismic performance for medium-rise rocking systems. Low-rise rocking systems could provide significant performance improvement over their conventional counterparts notwithstanding certain weaknesses in their seismic response.
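    The fragility curves mentioned above are conventionally lognormal in the intensity measure. A minimal sketch, assuming a lognormal form with an illustrative median capacity and dispersion (these are not the paper's fitted values):

```python
import math

def fragility(im, median, beta):
    """P(damage state exceeded | intensity measure im): lognormal CDF
    with median capacity `median` and log-standard deviation `beta`."""
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# At the median intensity the exceedance probability is 0.5 by construction
print(fragility(0.6, 0.6, 0.4))  # → 0.5
```

Incremental dynamic analysis supplies the (median, beta) pairs per damage state; small exceedance probabilities at design-level shaking are what the abstract reads as superior rocking-system performance.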

  16. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), a standard deviation (sigma) of the total variability is considered in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and has nothing to do with either the temporal or the spatial variation of ground motions. Since it cannot properly be considered in the integration, epistemic variability may instead be included in logic trees. This study uses Taiwan data as an example to test a case in Taipei. The results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic trees in addition to the aleatory variability in the integration, then the hazard level is similar to that obtained using the total variability; it is only slightly smaller at long return periods. Much effort in reducing the hazard level to a reasonable value still remains to be done.
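    The role of sigma in the integration can be seen in the per-scenario exceedance probability, a lognormal tail whose weight grows with sigma; this is why folding all variability into the integration inflates the hazard. A sketch with illustrative numbers:

```python
import math

def p_exceed(ln_median, sigma, ln_target):
    """P(ln ground motion > ln_target) for one scenario, with the
    variability sigma entering the hazard integral as a lognormal tail."""
    z = (ln_target - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Larger sigma fattens the tail and raises the hazard estimate, the
# effect the study quantifies (median and target values are invented).
p_aleatory = p_exceed(math.log(0.1), 0.5, math.log(0.3))
p_total = p_exceed(math.log(0.1), 0.7, math.log(0.3))
print(p_aleatory < p_total)  # → True
```

Moving the epistemic share of sigma out of this tail and into logic-tree branch weights is the alternative the study evaluates.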

  17. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Gao, J. C.; Lee, C. T.

    2014-12-01

    In modern probabilistic seismic hazard analysis (PSHA), a standard deviation (sigma) of the total variability is considered in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and has nothing to do with either the temporal or the spatial variation of ground motions. Since it cannot properly be considered in the integration, epistemic variability may instead be included in logic trees. This study uses Taiwan data as an example to test a case in Taipei. The results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic trees in addition to the aleatory variability in the integration, then the hazard level is similar to that obtained using the total variability; it is only slightly smaller at long return periods. Much effort in reducing the hazard level to a reasonable value still remains to be done.

  18. Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE

    USGS Publications Warehouse

    Lee, Myung W.; Hutchinson, Deborah R.

    1992-01-01

    Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.
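    The water-bottom reflection coefficient that controls peg-leg multiple contamination follows directly from the impedance contrast at the lake floor. A sketch with illustrative water and rock properties (not values measured in the study):

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient from acoustic impedances
    (density in kg/m^3, velocity in m/s)."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# Water over hard crystalline rock (illustrative properties): a large
# water-bottom coefficient like this (>0.6) is what makes peg-leg
# multiples so difficult to remove in parts of Lakes Superior and Huron.
r = reflection_coefficient(1000.0, 1500.0, 2900.0, 5500.0)
print(round(r, 2))  # → 0.83
```

With so much energy reflected at the lake floor, each deep primary spawns strong peg-leg multiples that contaminate the crustal amplitude curves the RTA processing tries to preserve.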

  19. Independent Analysis of Seismicity and Rock fall Scenarios for the Yucca Mountain Repository

    SciTech Connect

    Apted, M.J.; Kemeny, J.M.; Martin, C.D.; James, R.J.

    2006-07-01

    Yucca Mountain is located in the somewhat seismically active Basin and Range province. Future seismic activity is identified by the US Nuclear Regulatory Commission and the US National Academy of Sciences as a key scenario for safety assessment of a proposed repository at Yucca Mountain. As part of its on-going program of conducting independent analyses of scientific and technical issues that could be important to the licensing of the Yucca Mountain repository, EPRI has conducted an analysis of the combined scenarios of seismic activity and stability of emplacement drifts with respect to the long-term repository safety. In this paper we present the results of 3D finite element simulations of both static and dynamic loading of a degraded waste package. For the static case, the expected maximum static load is determined by utilizing relationships between cave height and the bulking factor. A static load representing 30 meters of broken rock was simulated using the finite element model. For the dynamic case, block size and velocity data from the most recent Drift Degradation AMR are used. Based on this, a rock block with a volume of 3.11 m³ and with an impact velocity of 4.81 m/s was simulated using the finite element model. In both cases, the results indicate that the waste package remains intact. (authors)

  20. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2016-09-01

    Ground motions are affected by directivity effects at near-fault regions which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to significant increase in the risk of earthquake-induced damage on engineering structures. The ordinary probabilistic seismic hazard analysis (PSHA) does not take into account such effects; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop the seismic hazard mapping of Tehran City according to near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of the simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region to derive purely empirically based pulse prediction models. The results show that the directivity effects can significantly affect the estimate of regional seismic hazard.

  1. Time lapse seismic signal analysis for Cranfield, MS, EOR and CCS site

    NASA Astrophysics Data System (ADS)

    Ditkof, J.; Caspari, E.; Pevzner, R.; Urosevic, M.; Meckel, T. A.; Hovorka, S. D.

    2012-12-01

    The Cranfield field, located in southwest Mississippi, is an EOR and CCS project that has been under continuous CO2 injection by Denbury Onshore LLC since 2008. To date, more than 3 million tons of CO2 remain in the subsurface. In 2007 and 2010, 3D seismic surveys were shot, and an initial 4D seismic response was characterized, showing coherent amplitude anomalies in some areas that received large amounts of CO2, but not in others. Previous work used Gassmann fluid substitution at two different wells, the 31F-2 observation well and the 28-1 injection well, to predict post-injection saturation curves and the acoustic impedance change through the reservoir. Subsequently, a second injection well, the 44-2 well, was added to the analysis to improve the practically unconstrained inversion. The two seismic volumes were cross-equalized, with an appropriate correlation coefficient achieved through well ties. Acoustic impedance inversions were carried out on each survey, resulting in higher acoustic impedance changes than predicted by Gassmann for the 28-1 and 44-2 injection wells. The time-lapse acoustic impedance change, however, is similar to the difference calculated from the time delay along a horizon below the reservoir.
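    The Gassmann fluid substitution used in the prediction replaces the pore-fluid bulk modulus in the standard relation; substituting CO2 for brine softens the rock and lowers impedance. A sketch with illustrative moduli, not the Cranfield well values:

```python
# Gassmann relation: saturated bulk modulus from the dry-frame, mineral,
# and fluid bulk moduli (all in GPa) and porosity phi.
def gassmann_ksat(k_dry, k_mineral, k_fluid, phi):
    numer = (1.0 - k_dry / k_mineral) ** 2
    denom = (phi / k_fluid
             + (1.0 - phi) / k_mineral
             - k_dry / k_mineral ** 2)
    return k_dry + numer / denom

# Illustrative sandstone: quartz mineral, 25% porosity
k_brine = gassmann_ksat(k_dry=8.0, k_mineral=37.0, k_fluid=2.7, phi=0.25)
k_co2 = gassmann_ksat(k_dry=8.0, k_mineral=37.0, k_fluid=0.05, phi=0.25)
# Replacing brine with CO2 softens the rock, lowering acoustic impedance:
print(k_co2 < k_brine)  # → True
```

The predicted impedance drop from this substitution is what the 4D amplitude anomalies and the inverted impedance changes are compared against.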

  2. Seismic stratigraphy and subsidence analysis in the Barrow-Dampier subbasin, northwest Australia

    SciTech Connect

    Westphal, H.; Aigner, T.

    1997-10-01

    The objective of this study is to analyze the basin-fill history of the Barrow-Dampier Subbasin (Northwest Shelf, Australia) as a spectacular and complex example of the stratigraphic record of continental breakup. From the Permian until the Holocene, the Barrow-Dampier Subbasin underwent development from a continental sedimentary basin located on the Gondwana continent to a rift graben. When extensional movement ceased, the subbasin developed as a passive continental margin. This study, based on marine seismic data (about 600 km) and logs of 20 wells, includes a seismic stratigraphic analysis tied to wells in the vicinity, chronostratigraphic charts, and the calculation of subsidence curves. The basin fill is characterized by a hierarchically organized architecture. The largest scale are four tectonic-stratigraphic units: prerift (Upper Permian to Plienobachian), rift (until Callovian), postrift (until the Upper Cretaceous), and convergence (Neogene). The tectonostratigraphic units were built by 13 sequences of the time scale of second-order sequences. Several of the sequences coincide with discrete subsidence episodes on geohistory plots. For most sequences and sequence boundaries, either a eustatic or a tectonically enhanced origin could be established. A series of seismic facies maps for these sequences visualize the sedimentary basin evolution. Several of the second-order sequences can be subdivided into systems tracts or higher order sequences.

  3. Improved implementation of the fk and Capon methods for array analysis of seismic noise

    NASA Astrophysics Data System (ADS)

    Gal, M.; Reading, A. M.; Ellingsen, S. P.; Koper, K. D.; Gibbons, S. J.; Näsholm, S. P.

    2014-08-01

    The frequency-wavenumber (fk) and Capon methods are widely used in seismic array studies of background or ambient noise to infer the backazimuth and slowness of microseismic sources. We present an implementation of these techniques for the analysis of microseisms (0.05-2 Hz) which draws on array signal processing literature from a range of disciplines. The presented techniques avoid frequency mixing in the cross-power spectral density and therefore yield an accurate slowness vector estimation of the incoming seismic waves. Using synthetic data, we show explicitly how the frequency averaged broad-band approach can result in a slowness-shifted spectrum. The presented implementation performs the slowness estimations individually for each frequency bin and sums the resulting slowness spectra over a specific frequency range. This may be termed an incoherently averaged signal, or IAS, approach. We further modify the method through diagonal loading to ensure a robust solution. The synthetic data show good agreement between the analytically derived and inferred error in slowness. Results for real (observed) data are compared between the approximate and IAS methods for two different seismic arrays. The IAS method results in the improved resolution of features, particularly for the Capon spectrum, and enables, for instance, Rg and Lg arrivals from similar backazimuths to be separated in the case of real data.
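    The per-frequency-bin estimation with diagonal loading described above can be sketched for the Capon (MVDR) power at a single bin; in the IAS approach the resulting slowness spectra are then summed over the frequency range of interest. The geometry and numbers below are invented:

```python
import numpy as np

def capon_power(R, a, loading=0.01):
    """Capon (MVDR) power for one frequency bin: R is the cross-spectral
    density matrix, a the steering vector for a trial slowness.  Diagonal
    loading stabilizes the inverse, as in the robust implementation."""
    n = R.shape[0]
    Rl = R + loading * np.trace(R).real / n * np.eye(n)
    w = np.linalg.solve(Rl, a)
    return 1.0 / np.real(np.conj(a) @ w)

# Two-sensor toy example at a single frequency: a plane wave arriving
# with a quarter-cycle phase delay between sensors (invented geometry).
a_true = np.array([1.0, np.exp(-1j * np.pi / 2)])
R = np.outer(a_true, np.conj(a_true)) + 0.01 * np.eye(2)

a_wrong = np.array([1.0, 1.0 + 0j])  # steering for a different slowness
print(capon_power(R, a_true) > capon_power(R, a_wrong))  # → True
```

Evaluating this bin by bin and summing the spectra avoids the frequency mixing in the cross-power spectral density that shifts the slowness estimate in the broad-band averaged approach.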

  4. Non-extensive statistical analysis of seismicity in the area of Javakheti, Georgia

    NASA Astrophysics Data System (ADS)

    Matcharashvili, T.; Chelidze, T.; Javakhishvili, Z.; Jorjiashvili, N.; Fra Paleo, U.

    2011-10-01

    The distribution of earthquake magnitudes in the Javakheti highlands was analyzed using a non-extensive statistical approach. Earthquakes occurring from 1960 to 2008 in this seismically active area of the Southern Caucasus were investigated. The seismic catalog was studied using different threshold magnitude values. Analyses of the whole period of observation as well as of sub-catalogs in consecutive 10-year time windows were performed. In every case the non-extensivity parameter q and the value a, the physical quantity characterizing the energy density, were calculated from the modified frequency-magnitude relationship. According to our analysis, the magnitude sequence in the Javakheti area for the whole period of observation is characterized by a non-extensivity parameter q=1.81, at the upper limit of values reported elsewhere, while the non-extensivity parameters calculated for consecutive 10-year windows fall within the range 1.6-1.7 reported worldwide. A significant increase of parameter q was identified in those 10-year sub-catalogs that included the strongest earthquakes within the period of observation. We suppose that this increase may be related to more correlated behavior within the system of 'fault fragments' when a strong earthquake strikes or immediately after, during aftershock activity. Concurrently, the smaller values of the non-extensivity parameter q found during seismically relatively quiet times could be associated with decreased correlations within the system during the earthquake generation stage, under an essentially decreased tectonic stress. The behavior of the energy-density characteristic a almost mirrors the variation of parameter q: it increases for seismically quiet periods in the Javakheti area and decreases in periods when strong earthquakes occur. We suggest that decreases of the energy-density characteristic a may point to a prevalent contribution of large fragments to the fragment-asperity interaction under the influence of a rapidly released
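    The modified frequency-magnitude relationship from which q and a are estimated derives from the fragment-asperity model; a common form (the Silva et al. variant) can be sketched as follows, with illustrative parameter values rather than the Javakheti fits:

```python
import math

def log_n_exceed(m, q, a, n_total):
    """log10 of the cumulative number of events with magnitude > m, in
    the fragment-asperity (non-extensive) form with non-extensivity
    parameter q and energy-density constant a.  Parameter values used
    below are illustrative only."""
    inner = 1.0 - ((1.0 - q) / (2.0 - q)) * (10.0 ** m / a ** (2.0 / 3.0))
    return math.log10(n_total) + ((2.0 - q) / (1.0 - q)) * math.log10(inner)

# With q in the observed 1.6-1.8 range the curve decreases with
# magnitude, generalizing the Gutenberg-Richter relation.
n4 = log_n_exceed(4.0, q=1.7, a=1.0e7, n_total=1000)
n5 = log_n_exceed(5.0, q=1.7, a=1.0e7, n_total=1000)
print(n4 > n5)  # → True
```

Fitting this curve to the observed cumulative magnitude distribution, per catalog window, yields the (q, a) pairs whose temporal variation the study interprets.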

  5. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (e.g., Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the areas and time windows that hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and to the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002 to present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, the testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to test only with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.

  6. Preliminary Ambient Noise and Seismic Interferometry Analysis of the Laguna del Maule Volcanic Field, Chile

    NASA Astrophysics Data System (ADS)

    Wespestad, C.; Thurber, C. H.; Bennington, N. L.; Zeng, X.; Cardona, C.; Keranen, K. M.; Singer, B. S.

    2015-12-01

    Laguna del Maule Volcanic Field is a large, restless, youthful rhyolitic system in the Southern Andes of Chile. We present a preliminary examination of ambient noise data at this site from 12 University of Wisconsin and 6 OVDAS (Southern Andean Volcano Observatory) broadband seismometers for a 3 month period. Ambient noise tomography seeks to correlate pairs of stations, with one station acting as a virtual source and the other a receiver, generating empirical Green's functions between each pair. The noise correlation functions (NCFs) were computed for day-long and hour-long windows, then the final NCFs were obtained from stacking each time window set. The hour-long NCFs converged more rapidly, so this time window was chosen for use in later stages. This study used phase weighted stacking of the NCFs instead of linear stacking in order to achieve a better signal to noise ratio (SNR), although linearly stacked Green's functions were also created to confirm the improvement. Phase weighted stacking can detect signals with weak amplitudes much more clearly than linear stacking by finding coherence of signals in multiple frequency bins and down-weighting the importance of amplitude for correlation (Schimmel and Gallart, 2007). The Frequency-Time Analysis Technique was utilized to measure group velocity, and initial results show it to be about 2 km/s on average. Fluctuations of the average velocity between different station pairs across this dense array will provide a preliminary indication of the location and size of the magma system. This study also applied seismic interferometry using ambient noise to determine temporal changes in seismic velocity occurring at Laguna del Maule. Initial results show temporal changes in seismic velocity correlated to seasonal changes in the hydrologic cycle (rain, snow pack, snow melt, etc.). Current work focuses on identifying changes in seismic velocity associated with ongoing volcanic processes.
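    Phase-weighted stacking down-weights samples where the instantaneous phases of the noise correlation functions disagree. A minimal sketch via the analytic signal, on synthetic traces (the power nu=2 follows common practice):

```python
import numpy as np

def phase_weighted_stack(traces, nu=2.0):
    """Phase-weighted stack: the linear stack weighted by the coherence
    of instantaneous phases, obtained from the analytic signal via an
    FFT-based Hilbert transform."""
    traces = np.asarray(traces, dtype=float)
    n = traces.shape[1]
    # Analytic signal: zero out negative frequencies
    spec = np.fft.fft(traces, axis=1)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h, axis=1)
    phasors = analytic / np.abs(analytic)            # unit phase vectors
    coherence = np.abs(phasors.mean(axis=0)) ** nu   # 1 where phases agree
    return traces.mean(axis=0) * coherence

# Coherent wavelet plus incoherent noise: the PWS suppresses noise-only
# samples relative to the plain linear stack (coherence <= 1 everywhere).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 8 * t) * np.exp(-((t - 0.5) ** 2) / 0.005)
traces = [signal + 0.5 * rng.standard_normal(t.size) for _ in range(20)]
pws = phase_weighted_stack(traces)
lin = np.mean(traces, axis=0)
print(np.abs(pws).max() <= np.abs(lin).max())  # → True
```

Because the weighting ignores amplitude and rewards only phase agreement, weak but coherent arrivals in the hour-long NCF stacks emerge with a higher SNR than in the linear stack.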

  7. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
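    Step 4, the ensemble quantification, can be illustrated with a toy example (an assumed sketch, not the authors' implementation): each alternative model contributes a hazard curve with an epistemic weight, and the ensemble statistics are weighted means and weighted quantiles per intensity level.

```python
import numpy as np

def weighted_quantile(values, weights, q):
    """Weighted quantile of a 1-D sample (linear interpolation)."""
    idx = np.argsort(values)
    v, w = values[idx], weights[idx]
    cw = (np.cumsum(w) - 0.5 * w) / w.sum()
    return np.interp(q, cw, v)

def ensemble_hazard(curves, weights, q=(0.16, 0.5, 0.84)):
    """curves: (n_models, n_intensities) exceedance probabilities;
    weights: epistemic weight of each alternative model.
    Returns the weighted-mean curve and the requested quantile curves."""
    weights = np.asarray(weights, float) / np.sum(weights)
    mean = weights @ curves
    quant = np.array([[weighted_quantile(curves[:, j], weights, qi)
                       for j in range(curves.shape[1])] for qi in q])
    return mean, quant
```

The quantile spread across the ensemble expresses epistemic uncertainty, while each curve itself encodes the aleatory part.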

  8. ANALYSIS OF DAMAGE TO WASTE PACKAGES CAUSED BY SEISMIC EVENTS DURING POST-CLOSURE

    SciTech Connect

    Alves, S W; Blair, S C; Carlson, S R; Gerhard, M; Buscheck, T A

    2008-05-27

    This paper presents methodology and results of an analysis of damage due to seismic ground motion for waste packages emplaced in a nuclear waste repository at Yucca Mountain, Nevada. A series of three-dimensional rigid body kinematic simulations of waste packages, pallets, and drip shields subjected to seismic ground motions was performed. The simulations included strings of several waste packages and were used to characterize the number, location, and velocity of impacts that occur during seismic ground motion. Impacts were categorized as either waste package-to-waste package (WP-WP) or waste package-to-pallet (WP-P). In addition, a series of simulations was performed for WP-WP and WP-P impacts using a detailed representation of a single waste package. The detailed simulations were used to determine the amount of damage from individual impacts, and to form a damage catalog, indexed according to the type, angle, location and force/velocity of the impact. Finally, the results from the two analyses were combined to estimate the total damage to a waste package that may occur during an episode of seismic ground motion. This study addressed two waste package types, four levels of peak ground velocity (PGV), and 17 ground motions at each PGV. Selected aspects of waste package degradation, such as effective wall thickness and condition of the internals, were also considered. As expected, increasing the PGV level of the vibratory ground motion increases the damage to the waste packages. Results show that most of the damage is caused by WP-P impacts. TAD-bearing waste packages with intact internals are highly resistant to damage, even at a PGV of 4.07 m/s, which is the highest level analyzed.
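    The final combination step, pairing impact histories from the kinematic runs with per-impact damage from the detailed catalog, can be sketched roughly as follows. This is a hypothetical illustration only: the catalog structure, names, and numerical values below are invented, not taken from the study.

```python
import numpy as np

# Hypothetical damage catalog: damaged area (m^2) versus impact velocity (m/s),
# one curve per impact type (values are illustrative placeholders).
CATALOG = {
    "WP-WP": {"velocity": [0.5, 1.0, 2.0, 4.0], "damage": [0.0, 0.002, 0.010, 0.040]},
    "WP-P":  {"velocity": [0.5, 1.0, 2.0, 4.0], "damage": [0.0, 0.005, 0.025, 0.090]},
}

def total_damage(impacts):
    """Sum per-impact damage interpolated from the catalog.

    impacts: list of (impact_type, velocity) pairs, as would be produced by
    a rigid-body kinematic simulation of one ground-motion episode.
    """
    total = 0.0
    for kind, velocity in impacts:
        entry = CATALOG[kind]
        total += np.interp(velocity, entry["velocity"], entry["damage"])
    return total
```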

  9. Seismic stability analysis of expanded MSW landfills using pseudo-static limit equilibrium method.

    PubMed

    Choudhury, Deepankar; Savoikar, Purnanand

    2011-02-01

    Capacity expansion of existing landfills is the most economical alternative to constructing new landfills where the cost of land is prohibitive. From the safety point of view, stability analyses of existing landfills expanded vertically and/or laterally are required for the different stages of construction, operation and closure of a landfill. In the present study, a pseudo-static limit equilibrium seismic stability analysis was performed for a typical side-hill type municipal solid waste (MSW) landfill expanded using an engineered berm. Seismic stability analyses were performed for the two critical cases, namely when the failure surface passes below the berm (under berm) and when the failure surface passes over the back slope of the berm (over berm). Closed-form solutions were developed for the upper-bound and lower-bound factor of safety and the yield acceleration of such slopes under both failure conditions. From parametric analyses it was observed that as the height of the berm increased, the factor of safety for both the over-berm and the under-berm failure conditions also increased. The average factor of safety and yield acceleration coefficient were determined, and the under-berm failure condition was found to become critical for back slopes steeper than 1.7H:1V. The average factor of safety decreased as both horizontal and vertical seismic accelerations increased. Comparisons between the present results and those in the literature for the static case showed good agreement, and the present results for the pseudo-static seismic case were found to be of particular importance.
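    A generic pseudo-static factor of safety for a planar slip surface, and the corresponding yield acceleration, can be sketched as below. This is a standard textbook formulation shown for illustration only, not the paper's closed-form berm solutions.

```python
import numpy as np
from scipy.optimize import brentq

def pseudo_static_fs(c, phi, beta, weight, length, kh, kv=0.0):
    """Factor of safety of a planar wedge under pseudo-static loading.

    c: cohesion (kPa), phi: friction angle (rad), beta: slip-plane dip (rad),
    weight: wedge weight per unit width (kN/m), length: slip-plane length (m),
    kh, kv: horizontal and vertical seismic coefficients.
    """
    normal = weight * ((1.0 - kv) * np.cos(beta) - kh * np.sin(beta))
    driving = weight * ((1.0 - kv) * np.sin(beta) + kh * np.cos(beta))
    return (c * length + normal * np.tan(phi)) / driving

def yield_acceleration(c, phi, beta, weight, length, kv=0.0):
    """Horizontal coefficient kh at which the factor of safety drops to 1."""
    return brentq(lambda kh: pseudo_static_fs(c, phi, beta, weight, length, kh, kv) - 1.0,
                  0.0, 1.0)
```

For a cohesionless slope this reduces to the familiar checks FS = tan(phi)/tan(beta) at kh = 0 and ky = tan(phi - beta).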

  10. Topological performance measures as surrogates for physical flow models for risk and vulnerability analysis for electric power systems.

    PubMed

    LaRocca, Sarah; Johansson, Jonas; Hassel, Henrik; Guikema, Seth

    2015-04-01

    Critical infrastructure systems must be both robust and resilient in order to ensure the functioning of society. To improve the performance of such systems, we often use risk and vulnerability analysis to find and address system weaknesses. A critical component of such analyses is the ability to accurately determine the negative consequences of various types of failures in the system. Numerous mathematical and simulation models exist that can be used to this end. However, there are relatively few studies comparing the implications of using different modeling approaches in the context of comprehensive risk analysis of critical infrastructures. In this article, we suggest a classification of these models, which span from simple topologically-oriented models to advanced physical-flow-based models. Here, we focus on electric power systems and present a study aimed at understanding the tradeoffs between simplicity and fidelity in models used in the context of risk analysis. Specifically, the purpose of this article is to compare performance estimates achieved with a spectrum of approaches typically used for risk and vulnerability analysis of electric power systems and evaluate if more simplified topological measures can be combined using statistical methods to be used as a surrogate for physical flow models. The results of our work provide guidance as to appropriate models or combinations of models to use when analyzing large-scale critical infrastructure systems, where simulation times quickly become insurmountable when using more advanced models, severely limiting the extent of analyses that can be performed.
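    The idea of statistically combining topological measures into a surrogate for a physical flow model can be illustrated with ordinary least squares. This is a deliberately simplified sketch; the article's actual statistical models and performance metrics may differ.

```python
import numpy as np

def fit_surrogate(topo_features, flow_consequence):
    """Least-squares fit: flow-model consequence ~ topological measures.

    topo_features: (n_scenarios, n_measures) matrix of topological metrics
    (e.g. largest-component size, efficiency loss) for failure scenarios;
    flow_consequence: (n_scenarios,) consequences from the physical flow
    model. Returns intercept plus coefficients.
    """
    X = np.column_stack([np.ones(len(topo_features)), topo_features])
    coef, *_ = np.linalg.lstsq(X, flow_consequence, rcond=None)
    return coef

def predict_surrogate(coef, topo_features):
    """Cheap surrogate prediction for scenarios too numerous to flow-model."""
    X = np.column_stack([np.ones(len(topo_features)), topo_features])
    return X @ coef
```

The surrogate is fitted on a subset of scenarios where the expensive flow model was run, then applied to the full scenario set.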

  11. Seismic geometric attribute analysis for fracture characterization: New methodologies and applications

    NASA Astrophysics Data System (ADS)

    Di, Haibin

    In 3D subsurface exploration, detection of faults and fractures from 3D seismic data is vital to robust structural and stratigraphic analysis, and great efforts have been made in the development and application of various seismic attributes (e.g. coherence, semblance, curvature, and flexure). However, the existing algorithms and workflows are not accurate and efficient enough for robust fracture detection, especially in naturally fractured reservoirs with complicated structural geometry and fracture networks. My Ph.D. research proposes the following work to enhance our capability for, and help improve the resolution of, fracture characterization and prediction. For the discontinuity attribute, previous methods have difficulty highlighting subtle discontinuities from seismic data in cases where the local amplitude variation has a non-zero mean. This study proposes implementing a gray-level transformation and the Canny edge detector for improved imaging of discontinuities. Specifically, the new process transforms seismic signals to be zero mean and helps amplify subtle discontinuities, leading to an enhanced visualization of structural and stratigraphic details. Applications to various 3D seismic datasets demonstrate that the new algorithm is superior to previous discontinuity-detection methods. Integrating both discontinuity magnitude and discontinuity azimuth helps define channels, faults and fractures better than the traditional similarity, amplitude gradient and semblance attributes. For the flexure attribute, the existing algorithm is computationally intensive and limited in lateral resolution for steeply-dipping formations. This study proposes a new and robust volume-based algorithm that evaluates the flexure attribute more accurately and efficiently. The algorithm first fits a cubic surface to the seismic data volumetrically, using a diamond 13-node grid cell, and then computes flexure from the spatial derivatives of the fitted surface. 
To avoid
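    The zero-mean transform plus edge detection described above can be approximated as follows. This is an illustrative sketch only: a local-mean removal stands in for the gray-level transformation, and a Sobel gradient stands in for the full Canny detector.

```python
import numpy as np
from scipy import ndimage

def discontinuity_attribute(section, window=9):
    """Edge-strength image of a 2-D seismic amplitude section.

    Removing the local mean makes the signal locally zero-mean, which
    amplifies subtle discontinuities before gradient-based edge detection.
    """
    zero_mean = section - ndimage.uniform_filter(section, size=window)
    gx = ndimage.sobel(zero_mean, axis=1)   # horizontal gradient
    gy = ndimage.sobel(zero_mean, axis=0)   # vertical gradient
    return np.hypot(gx, gy)
```

A sharp lateral amplitude step (a fault-like discontinuity) maps to a localized high in the returned attribute.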

  12. Seismic clusters analysis in North-Eastern Italy by the nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2016-04-01

    The main features of earthquake clusters in the Friuli Venezia Giulia Region (North Eastern Italy) are explored, with the aim of gaining new insights into local-scale patterns of seismicity in the area. The study is based on a systematic analysis of robustly and uniformly detected seismic clusters of small-to-medium magnitude events, as opposed to the selected clusters analyzed in earlier studies. To characterize the features of seismicity for FVG, we take advantage of updated information from local OGS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics, Centre of Seismological Research, since 1977. A preliminary reappraisal of the earthquake bulletins is carried out, in order to identify possible missing events and to remove spurious records (e.g. duplicates and explosions). The area of sufficient completeness is outlined; for this purpose, different techniques are applied, including a comparative analysis with global ISC data, which are available in the region for large and moderate size earthquakes. Various techniques are considered to estimate the average parameters that characterize earthquake occurrence in the region, including the b-value and the fractal dimension of the epicenter distribution. Specifically, besides the classical Gutenberg-Richter law, the Unified Scaling Law for Earthquakes (USLE) is applied. Using the updated and revised OGS data, a new formal method for the detection of earthquake clusters, based on the nearest-neighbor distances of events in the space-time-energy domain, is applied. The bimodality of the distribution of earthquake nearest-neighbor distances is used to decompose the seismic catalog into sequences of individual clusters and background seismicity. Accordingly, the method allows for a data-driven identification of main shocks (the largest-magnitude event in a cluster), foreshocks and aftershocks. Average robust estimates of the USLE parameters (particularly, b
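    The nearest-neighbor proximity in the space-time-energy domain can be sketched with a brute-force loop; the b-value and fractal-dimension parameters below are assumed placeholder values, and the proximity form follows the commonly used Zaliapin-style definition, not necessarily the exact one of this study.

```python
import numpy as np

def nearest_neighbor_proximity(t, x, y, mag, b=1.0, d_f=1.6, r_min=0.1):
    """Proximity eta_ij = dt * r^d_f * 10^(-b * m_i) to the nearest parent.

    t: event times (days), x/y: epicentral coordinates (km), mag: magnitudes.
    Returns (eta, parent): proximity and nearest-neighbor index per event
    (inf / -1 for events with no earlier neighbor).
    """
    n = len(t)
    eta = np.full(n, np.inf)
    parent = np.full(n, -1)
    for j in range(n):
        for i in range(n):
            dt = t[j] - t[i]
            if dt <= 0:
                continue  # only earlier events can be parents
            r = max(np.hypot(x[j] - x[i], y[j] - y[i]), r_min)
            prox = dt * r ** d_f * 10.0 ** (-b * mag[i])
            if prox < eta[j]:
                eta[j], parent[j] = prox, i
    return eta, parent
```

Thresholding the bimodal distribution of eta then separates clustered events (small eta) from background seismicity (large eta).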

  13. Application of differential analysis of VLF signals for seismic-ionospheric precursor detection from multiple receivers

    NASA Astrophysics Data System (ADS)

    Skeberis, Christos; Zaharis, Zaharias; Xenos, Thomas; Contadakis, Michael; Stratakis, Dimitrios; Tommaso, Maggipinto; Biagi, Pier Francesco

    2015-04-01

    This study investigates the application of differential analysis on VLF signals emitted from a single transmitter and received by multiple stations in order to filter and detect disturbances that can be attributed to seismic-ionospheric precursor phenomena. The cross-correlation analysis applied to multiple VLF signals provides a way of discerning the nature of a given disturbance and accounts for more widespread geomagnetic interferences compared to local precursor phenomena. For the purpose of this paper, data acquired in Thessaloniki (40.59N, 22.78E) and in Heraklion (35.31N, 25.10E) from the VLF station in Tavolara, Italy (ICV station, Lat. 40.923, Lon. 9.731) for a period of four months (September 2014 - December 2014) are used. The receivers have been developed by Elettronika Srl and are part of the International Network for Frontier Research on Earthquake Precursors (INFREP). A normalization process and an improved variant of the Hilbert-Huang transform are initially applied to the received VLF signals. The signals derived from the first two Intrinsic Mode Functions (IMF1 and IMF2) undergo a cross-correlation analysis and, in this way, time series from the two receivers can be compared. The efficacy of the processing method and the results produced by the proposed process are then discussed. Finally, results are presented along with an evaluation of the discrimination and detection capabilities of the method on disturbances of the received signals. Based on the results, the merits of the processing method are discussed, with a view to further improving it by using differential analysis to better classify different disturbances and, more importantly, to discriminate points of interest in the resulting spectra. This could provide an improved method of detecting disturbances attributed to seismic-ionospheric precursor phenomena and also contribute to a real-time method for correlating seismic activity with the observed disturbances.
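    The cross-correlation step, comparing IMF time series from two receivers, can be sketched as below. This is an illustration with synthetic data; the actual INFREP processing chain is more involved.

```python
import numpy as np
from scipy.signal import correlate

def max_crosscorr(a, b):
    """Normalized cross-correlation peak and its lag (samples).

    A high peak suggests a disturbance common to both receivers (e.g. a
    widespread geomagnetic interference); a low peak suggests a local one.
    """
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    cc = correlate(a, b, mode="full")
    lags = np.arange(-(len(b) - 1), len(a))
    k = int(np.argmax(np.abs(cc)))
    return cc[k], int(lags[k])
```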

  14. Statistical Analysis of Seismic Events Occurring on Piton de la Fournaise Volcano, La Réunion : Bringing Out Eruptive Precursors

    NASA Astrophysics Data System (ADS)

    Durand, V.; Le Bouteiller, P.; Mangeney, A.; Ferrazzini, V.; Kowalski, P.; Lauret, F.; Brunet, C.

    2015-12-01

    On Piton de la Fournaise volcano, La Réunion island, continuous seismic recordings allow the extraction of signals associated with rockfalls occurring inside the Dolomieu crater. Using the OVPF catalog, we have investigated these seismic signals in order to find how their characteristics relate to the physical characteristics of rockfalls. Here, we analyze such seismic signals from a statistical viewpoint. Data are first taken from an 8-month period that includes an eruption, January to August 2014. For all the seismic signals associated with rockfalls in this period, 14 seismic and physical attributes are retrieved, allowing us to apply a statistical method known as Principal Component Analysis (PCA). It is applied three times in this study: first with the 14 attributes, to highlight the main features that characterize the data, which we find to be duration and seismic energy, as well as waveform and frequency content. The second PCA, based on 6 attributes, leads to the definition of several physical types of rockfalls, using the k-means clustering method. Lastly, PCA and k-means clustering are applied to 6 different, easily-computed seismic attributes, and reveal a specific behavior of one cluster of events just before the June 20th eruption. Based on this finding, 15 easily-retrievable numerical attributes are defined from the specific cluster identified in the 2014 study, and tested on 2 other datasets: from January to July 2015, they detect the approach of the February 4th, May 17th and July 31st eruptions. From August to December 2010, our attributes show precursory variations a few days before the October 14th and December 9th eruptions. We highlight the increase in a specific type of seismic event shortly before an eruption; we believe these events have a volcano-tectonic source but are hard to distinguish from rockfalls in the seismic signals.
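    The PCA / k-means workflow can be sketched with standardized attributes, an SVD-based PCA, and a minimal deterministic 2-means. This is a generic illustration, not the OVPF code, and the "attribute" values below are synthetic.

```python
import numpy as np

def pca(X, n_comp=2):
    """Principal-component scores of standardized attributes via SVD."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_comp].T
    explained = S**2 / np.sum(S**2)   # fraction of variance per component
    return scores, explained[:n_comp]

def kmeans_two(points, n_iter=20):
    """Minimal 2-means on PCA scores, deterministically initialized
    at the extremes of the first principal component."""
    c = points[[np.argmin(points[:, 0]), np.argmax(points[:, 0])]]
    for _ in range(n_iter):
        d = np.linalg.norm(points[:, None, :] - c[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        c = np.array([points[labels == k].mean(axis=0) for k in range(2)])
    return labels

# Synthetic "rockfall attributes": two populations differing in
# duration and energy, plus one uninformative attribute.
rng = np.random.default_rng(0)
short_events = rng.normal([5.0, 1.0, 2.0], 0.3, size=(20, 3))
long_events = rng.normal([60.0, 9.0, 2.0], 0.3, size=(20, 3))
X = np.vstack([short_events, long_events])

scores, explained = pca(X)
labels = kmeans_two(scores)
```

The first component, dominated by the correlated duration/energy attributes, separates the two event types, mirroring how the study's clusters emerge in attribute space.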

  15. Spectral Analysis of Broadband Seismic Array Data, Tien Shan

    NASA Astrophysics Data System (ADS)

    Shamshy, S.; Pavlis, G. L.

    2003-12-01

    We used a spectral analysis method to examine amplitude variations of body waves recorded in the Tien Shan region of central Asia. We used broadband data from the Kyrgyz Network (KNET), the Kazakhstan Network (KZNET), and a set of temporary PASSCAL stations operated from 1997 to 2000 that we refer to as the Ghengis array. A spectral ratio method similar to that used by Wilson and Pavlis (2000) was employed, but with station AAK used as a reference instead of the array median. Spectral ratios were estimated for all teleseismic events, and for larger, intermediate-depth events from the Hindu-Kush region, for all three components of ground motion and for the total signal strength on all components. Results are visualized by maps of amplitude for various frequency bands and through the 4-D animation method introduced by Wilson and Pavlis (2000). Data from Hindu-Kush events showed amplitude variations of as much as a factor of 100 across the study area, with a strong frequency dependence. The largest variations were at the highest frequencies observed, near 15 Hz. Stations in the northwestern part of the Tien Shan array show little variation in amplitude relative to the reference station, AAK. In the central and eastern parts of the array, the amplitude estimates are significantly smaller at all frequencies. In contrast, for stations in the western Tien Shan near the Talas-Fergana Fault, and in the southern Tien Shan near the Tarim Basin, the amplitude values become much larger than at the reference site. The teleseismic data show a different pattern, with somewhat smaller overall amplitude variation at comparable frequencies. The northern part of the array again shows small variations relative to the reference station. There are some amplifications at the southern stations of the array, especially in the Tarim Basin. The higher-frequency observations that show large amplifications at stations in the Tarim Basin are readily explained by site effects due to the thick deposits of sediments
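    A basic station-to-reference spectral-ratio estimate, in the spirit of the method used here but as a bare-bones sketch, can be written as:

```python
import numpy as np

def spectral_ratio(trace, reference, dt, smooth=5):
    """Smoothed amplitude-spectral ratio of a station trace to a reference.

    trace, reference: equal-length time series; dt: sample interval (s).
    Returns (frequencies, ratio). Values > 1 indicate amplification
    relative to the reference site.
    """
    freqs = np.fft.rfftfreq(len(trace), dt)
    amp = np.abs(np.fft.rfft(trace))
    ref = np.abs(np.fft.rfft(reference))
    kernel = np.ones(smooth) / smooth            # moving-average smoothing
    amp = np.convolve(amp, kernel, mode="same")
    ref = np.convolve(ref, kernel, mode="same")
    return freqs, amp / np.maximum(ref, 1e-20)
```

Mapping such ratios per frequency band across the array reproduces the kind of amplitude-variation maps described above.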

  16. Seismic body wave separation in volcano-tectonic activity inferred by the Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, Paolo; De Lauro, Enza; De Martino, Salvatore; Falanga, Mariarosaria; Petrosino, Simona

    2015-04-01

    One of the main challenges in the volcano-seismological literature is to locate and characterize the source of volcano-tectonic seismic activity. This requires identifying at least the onsets of the main phases, i.e., the body waves. Many efforts have been made to solve the problem of a clear separation of P and S phases, both from a theoretical point of view and by developing numerical algorithms suitable for specific cases (see, e.g., Küperkoch et al., 2012). Recently, a robust automatic procedure has been implemented for extracting the prominent seismic waveforms from continuously recorded signals, thus allowing picking of the main phases. The intuitive notion of maximum non-Gaussianity is exploited by adopting techniques that involve higher-order statistics in the frequency domain, i.e., Convolutive Independent Component Analysis (CICA). This technique is successful in the case of blind source separation of convolutive mixtures. In a seismological framework, seismic signals can indeed be regarded as the convolution of a source function with the path, site and instrument responses. In addition, time-delayed versions of the same source exist, due to multipath propagation typically caused by reverberations from some obstacle. In this work, we focus on the Volcano-Tectonic (VT) activity at Campi Flegrei Caldera (Italy) during the 2006 ground uplift (Ciaramella et al., 2011). The activity was characterized by approximately 300 low-magnitude VT earthquakes (Md < 2; for the definition of duration magnitude, see Petrosino et al. 2008). Most of them were concentrated in distinct seismic sequences with hypocenters mainly clustered beneath the Solfatara-Accademia area, at depths ranging between 1 and 4 km b.s.l. The obtained results show a clear separation of P and S phases: the technique not only allows the identification of the S-P time delay, giving the timing of both phases, but also provides the independent waveforms of the P and S phases. This is an enormous

  17. Salton Seismic Imaging Project Line 7: Data and Analysis to Date

    NASA Astrophysics Data System (ADS)

    Goldman, M.; Fuis, G.; Catchings, R. D.; Rymer, M. J.; Bauer, K.; Driscoll, N. W.; Kent, G.; Harding, A. J.; Kell, A. M.; Hole, J. A.; Stock, J. M.

    2012-12-01

    The Salton Seismic Imaging Project (SSIP) is a large-scale, active- and passive-source seismic-imaging project designed to image the San Andreas Fault (SAF) and adjacent basins (Imperial and Coachella Valleys) in southernmost California. Data and preliminary results from many of the seismic profiles are reported elsewhere (Fuis et al., Catchings et al., Rymer et al., this meeting). Here, we focus on SSIP Line 7, one of four 2-D NE-SW-oriented seismic profiles that were acquired across the SAF, parts of the Coachella Valley, and/or the Salton Sea. Seismic sources for Line 7 include both land-based downhole explosive sources and airgun sources within the Salton Sea. Data were recorded by 189 Texan seismographs on land (50 m spacing), 102 channels of a multi-channel cabled recording system near the San Andreas fault on land (10 m spacing), and nine ocean-bottom seismographs (OBSs) within the Salton Sea (1.3 km spacing). The Texans and OBSs recorded both airgun and explosive sources, and the cable array recorded explosions only. Data from the Texan and multi-channel seismographs were organized as shot gathers, and the OBS data were arranged as receiver gathers. All data were merged into a single profile for analysis. The seismic profile is approximately 23 km long and crosses approximately normal to the SAF, but an approximately 2-km-long segment of the profile at the northeastern edge of the Salton Sea does not have either seismograph or seismic-source coverage due to limited OBS data. Because the gap in the seismic profile was within about 500 m of the surface trace of the SAF, imaging of the shallow part of the SAF was limited. First arrivals from all data sets were combined to develop a refraction tomography velocity image of the upper crust. From the surface to about 6 km depth, P-wave velocities range from about 2 km/s to about 6 km/s, with basement (~6 km/s) shallower northeast of the SAF. The SAF also marks the southwestern boundary of a relatively high

  18. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions, i.e. relations between process intensities and the extent of losses, derived from the analysis of historic hazard events and object-specific restoration values. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Our work therefore aims to extend the findings and models of present risk research toward an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively, with spatial resolution, using a large set of force transducers. The experimental tests are carried out with artificial vertical and skewed plates, which also include openings for material intrusion. 
Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  19. Attenuation of seismic waves obtained by coda waves analysis in the West Bohemia earthquake swarm region

    NASA Astrophysics Data System (ADS)

    Bachura, Martin; Fischer, Tomas

    2014-05-01

    with depth, where 1/Qc seems to be frequency independent in the depth range of the upper lithosphere. Lateral changes of 1/Qc were also observed: it decreases in the south-west direction from the Novy Kostel focal zone, where the attenuation is highest. Results from more advanced methods that allow separation of scattering and intrinsic loss show that intrinsic loss is the dominant mechanism of seismic wave attenuation in the region. Determination of the attenuation due to scattering appears ambiguous because of the small hypocentral distances available for the analysis, over which the effects of scattering in the frequency range from 1 to 24 Hz are not significant.
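    Coda Qc of the kind estimated here is commonly obtained by linearizing the single-scattering decay model; a minimal sketch, assuming the standard Aki-Chouet form A(t) ∝ t⁻¹ exp(−π f t / Qc), is:

```python
import numpy as np

def coda_qc(t, envelope, freq):
    """Estimate coda Q at centre frequency `freq` (Hz).

    Linearizing A(t) = A0 * t^-1 * exp(-pi*f*t/Qc) gives
    ln(A * t) = ln(A0) - (pi*f/Qc) * t, so Qc follows from the slope
    of a straight-line fit over the coda window.
    """
    slope, _intercept = np.polyfit(t, np.log(envelope * t), 1)
    return -np.pi * freq / slope
```

Repeating the fit in several frequency bands yields the frequency dependence (or independence) of 1/Qc discussed above.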

  20. Integrating hematite (U-Th)/He dating, microtextural analysis, and thermomechanical modeling to date seismic slip

    NASA Astrophysics Data System (ADS)

    McDermott, R.; Ault, A. K.; Evans, J. P.; Reiners, P. W.; Shuster, D. L.

    2015-12-01

    Linking petrologic and geochronologic evidence for seismicity in the rock record is challenging, yet critical for understanding slip mechanics in natural faults, structural histories, and modern seismic hazards. We couple hematite (U-Th)/He (HeHe) dating with microtextural analysis and thermomechanical modeling to decipher this record from locally iridescent, hematite-coated fault surfaces in the seismogenic Wasatch fault zone (WFZ), Utah. A prior study of one fault surface linked textural evidence for elevated temperatures with a pattern of HeHe dates to hypothesize that this surface preserves evidence of multiple seismic slip events. New scanning electron microscopy (SEM) and HeHe data from a larger sample suite test this hypothesis. The SEM images reveal the presence of <500 nm polygonal hematite crystals in some iridescent regions, suggesting co- to post-seismic hematite annealing and recrystallization at temperatures >800 °C. Fault surface samples yield dates of 3.8 ± 0.03 to 1.5 ± 0.1 Ma, with younger dates in iridescent regions. These results are younger than the 88.5 ± 15.0 Ma and 10.8 ± 0.8 Ma dates from veins associated with initial hematite mineralization, as well as new apatite (U-Th)/He dates of 4.0 ± 0.6 to 5.4 ± 1.1 Ma that constrain the footwall thermal history. Reproducible but statistically different HeHe dates from samples on the same fault surface are consistent with prior observations. Collectively, these observations suggest that hematite He dates record rapid cooling from localized shear heating at asperities to temperatures hot enough to reset the hematite He system. Models incorporate rate-dependent friction and half-space cooling to constrain shear zone temperature evolution. Results reveal that temperatures >800 °C are sufficient to reset hematite up to 200 μm from the fault surface, and that HeHe dates may represent patches of rate-strengthening friction during seismic slip. Ongoing work utilizes SEM to target aliquots with textural evidence for
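    The order of magnitude of heating away from a slip surface can be illustrated with the 1-D instantaneous plane heat-source solution. This is a generic conduction formula for illustration only; the study's thermomechanical models, which include rate-dependent friction, are more sophisticated, and the parameter values below are assumed.

```python
import numpy as np

def plane_source_temp_rise(x, t, q, rho_c=2.7e6, kappa=1.0e-6):
    """Temperature rise at distance x (m) from a fault plane, time t (s)
    after an instantaneous frictional heat release of q (J/m^2).

    rho_c: volumetric heat capacity (J/m^3/K), kappa: diffusivity (m^2/s).
    """
    return q / (rho_c * np.sqrt(4.0 * np.pi * kappa * t)) * np.exp(-x**2 / (4.0 * kappa * t))
```

The Gaussian spatial decay shows why thermal resetting is confined to a thin zone (here, tens to hundreds of micrometres) adjacent to the slip surface.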

  1. Dynamics of the Askja caldera landslide, July 2014, from seismic signal analysis

    NASA Astrophysics Data System (ADS)

    Schöpa, Anne; Burtin, Arnaud; Hovius, Niels; Green, Robert G.

    2016-04-01

    A voluminous landslide occurred at the Askja caldera in the Icelandic highlands on July 21st, 2014. The next day, flood marks of at least ten tsunami waves that had reached the northern shore of the caldera lake could be mapped out. The highest flood marks were found up to 60 m above the lake level, close to famous tourist spots, underlining the high hazard potential of the area. Since the landslide happened at night, no direct observations of the mass movement or of the subsequent tsunami waves in the caldera lake were made. We present the analysis of seismic data from a network of 58 seismic stations that recorded data during the event. The seismic data give valuable information on the triggering, initiation, timing, and propagation of the landslide, with additional details on precursory signals before, and oscillation waves in the caldera lake after, the main landslide. From the set of seismic waveforms, characteristic features were extracted that could be used for early-warning purposes. The seismic data reveal that the main slope failure along the southeastern caldera wall was a large, single event starting at 23:24 UTC. The main part of the energy was released in the first two minutes, followed by smaller events, before the background noise level was re-established some 40 minutes after the main failure. Subsequent mass movements, much lower in amplitude, occurred during the following hours. About 20 minutes before the main failure, the background noise level started to rise. Ground velocities were up to three times higher than the background level, with dominant frequencies between 2-4 Hz. The increase in background noise level is visible at stations up to 30 km away from the landslide area. This increase is followed by a prominent velocity drop five minutes before the main failure. The spatial distribution of the velocity decrease, centred on the detachment area of the landslide, has an elliptical outline with its long axis oriented NE-SW. 
This
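    A characteristic feature like the precursory rise in background noise level can be tracked with a simple moving RMS ratio. This is a schematic detector for illustration, not the network's actual early-warning algorithm; the window lengths are assumed.

```python
import numpy as np

def noise_level_ratio(signal, short_win, long_win):
    """Ratio of short-window RMS to the local long-window RMS,
    a crude indicator of a rising background noise level."""
    power = np.asarray(signal, float) ** 2
    kernel_s = np.ones(short_win) / short_win
    kernel_l = np.ones(long_win) / long_win
    sta = np.convolve(power, kernel_s, mode="same")   # short-term average
    lta = np.convolve(power, kernel_l, mode="same")   # long-term average
    return np.sqrt(sta / np.maximum(lta, 1e-20))
```

A sustained ratio above ~1 flags an elevated noise level, the kind of signal seen here some 20 minutes before the main failure.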

  2. Analysis of Seismic Anisotropy Across Central Anatolia by Shear Wave Splitting

    NASA Astrophysics Data System (ADS)

    Pamir, Dilekcan; Abgarmi, Bizhan; Arda Özacar, A.

    2014-05-01

    Central Anatolia holds the key to connecting theories about the ongoing tectonic escape, the African Plate subduction along the Cyprus Arc, and the indenter-style collision of the Arabian Plate along the Bitlis Suture. However, the shear wave splitting measurements needed to characterize seismic anisotropy are very sparse in the region. Recently, seismic data recorded by national seismic networks (KOERI, ERI-DAD) with dense coverage provided a unique opportunity to analyze the effect of the present slab geometry (slab tears, slab break-off) on mantle deformation and to test different models of anisotropy-forming mechanisms. In this study, the anisotropic structure beneath Central Anatolia is investigated via the splitting of SKS and SKKS phases recorded at 46 broadband seismic stations. Our measurements yielded 1171 well-constrained splitting measurements and 433 null results. Overall, the region displays NE-SW trending fast splitting directions and delay times on the order of 1 sec. On the other hand, a large number of stations that are spatially correlated with the Cyprus slab, Neogene volcanism and major tectonic structures present significant back-azimuthal variations in the splitting parameters that cannot be explained by one-layered anisotropy with horizontal symmetry. Thus, we have modeled the anisotropy with two-layered structures using a forward approach and identified NE-SW trending fast splitting directions with delay times close to 1 sec in the lower layer, and N-S and NW-SE trending fast splitting with limited time delays (0.1 - 0.3 sec) in the upper layer. The fast directions and delay times of the lower layer are similar to the one-layered anisotropy and are parallel or sub-parallel to the absolute plate motions, which favors the asthenospheric flow model
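    Splitting analysis of the rotation-correlation flavor can be sketched as a grid search over fast-axis azimuth and delay. This is an illustrative toy, not the measurement code used in the study; SKS analyses typically also involve windowing, error estimation, and null classification.

```python
import numpy as np

def splitting_grid_search(north, east, max_lag):
    """Find (fast azimuth in deg, delay in samples) maximizing the
    correlation between trial fast and slow components."""
    best = (0, 0, -1.0)
    n = len(north)
    for az in range(0, 180, 2):
        a = np.radians(az)
        fast = north * np.cos(a) + east * np.sin(a)
        slow = -north * np.sin(a) + east * np.cos(a)
        for lag in range(1, max_lag + 1):
            for f_seg, s_seg in ((fast[:n - lag], slow[lag:]),
                                 (fast[lag:], slow[:n - lag])):
                c = abs(np.corrcoef(f_seg, s_seg)[0, 1])
                if c > best[2]:
                    best = (az, lag, c)
    return best

# Synthetic split wave: fast axis 30 deg, delay 5 samples,
# source polarization 60 deg (all values assumed for the demo).
t = np.linspace(-1.0, 1.0, 200)
wavelet = np.exp(-(t / 0.1) ** 2)
delayed = np.roll(wavelet, 5)
phi0, pol = np.radians(30.0), np.radians(60.0)
f_amp, s_amp = np.cos(pol - phi0), np.sin(pol - phi0)
north = f_amp * wavelet * np.cos(phi0) - s_amp * delayed * np.sin(phi0)
east = f_amp * wavelet * np.sin(phi0) + s_amp * delayed * np.cos(phi0)
az, lag, c = splitting_grid_search(north, east, max_lag=10)
```

Note the inherent 90-degree ambiguity of the correlation criterion: the search recovers the fast axis modulo 90 degrees, which is one reason operational codes add further constraints.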

  3. Cluster analysis and relative relocation of mining-induced seismicity using HAMNET data

    NASA Astrophysics Data System (ADS)

    Wehling-Benatelli, S.; Becker, D.; Bischoff, M.; Friederich, W.; Meier, T.

    2012-04-01

Longwall mining activity in the Ruhr coal-mining district leads to mining-induced seismicity. For a detailed study, the seismicity of the single longwall panel S 109 beneath Hamm-Herringen in the eastern Ruhr area was monitored between June 2006 and July 2007. More than 7000 seismic events with magnitudes -1.7 ≤ ML ≤ 2.0 were located in this period. 70% of the events occurred in the vicinity of the moving longwall face. Moreover, the seismicity pattern shows spatial clustering of events at distances up to 500 m from the panel, which is related to remnant pillars of old workings and to tectonic features. Two sources with a common location and rock-failure mechanism are expected to show identical waveforms; hence, similar waveforms suggest similarity of source properties. Waveform similarity can be quantified by cross-correlation. Similarity matrices have been established and form the basis of the cluster analysis presented here. We compare two approaches for cluster definition: a single-linkage approach and excerpting clusters by visual inspection of the sorted similarity matrices. In the latter, clusters are found as areas of high inter-event similarity in the depicted matrix. In contrast, the single-linkage approach assigns an event to a cluster if the similarity threshold v_sl = 0.9 is exceeded for at least one other member. This method is more restrictive and, in general, leads to clusters with fewer members than visual inspection. Both methods exhibit clusters with the same properties. The largest clusters are built by low-magnitude events (around ML ≈ -0.6) directly at the longwall face at the mining level. Other clusters include events with magnitudes as large as ML,max = 1.8; their locations tend to lie above or below the mining level in load-bearing sandstone layers. Events accompanying mining show face-parallel, near-vertical fault planes, whereas more distant clusters have solutions typical of remnant-pillar failure with a medium dip angle. Relative relocation of the events
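The single-linkage rule described in this abstract can be sketched in a few lines: an event joins a cluster if its waveform similarity with at least one existing member exceeds the threshold v_sl = 0.9, so clusters are the connected components of the thresholded similarity graph. The 4x4 similarity matrix below is a toy example, not HAMNET data.

```python
# Minimal single-linkage clustering on a waveform-similarity matrix.
# Assumption: sim is symmetric with 1.0 on the diagonal; the values here
# are invented for illustration.

def single_linkage_clusters(sim, threshold=0.9):
    """Group event indices into clusters via single linkage."""
    n = len(sim)
    unvisited = set(range(n))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        stack, cluster = [seed], {seed}
        while stack:  # flood-fill the connected component above threshold
            i = stack.pop()
            linked = {j for j in unvisited if sim[i][j] >= threshold}
            unvisited -= linked
            cluster |= linked
            stack.extend(linked)
        clusters.append(sorted(cluster))
    return sorted(clusters)

sim = [
    [1.00, 0.95, 0.20, 0.10],
    [0.95, 1.00, 0.30, 0.15],
    [0.20, 0.30, 1.00, 0.92],
    [0.10, 0.15, 0.92, 1.00],
]
print(single_linkage_clusters(sim))  # -> [[0, 1], [2, 3]]
```

Lowering the threshold merges components, which mimics the less restrictive visual-inspection approach mentioned in the abstract.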

  4. FINITE ELEMENT ANALYSIS OF JNES/NUPEC SEISMIC SHEAR WALL CYCLIC AND SHAKING TABLE TEST DATA.

    SciTech Connect

    XU,J.; NIE, J.; HOFMAYER, C.; ALI, S.

    2007-04-12

This paper describes a finite element analysis to predict the JNES/NUPEC cyclic and shaking table RC shear wall test data, as part of a collaborative agreement between the U.S. NRC and JNES to study seismic issues important to the safe operation of commercial nuclear power plant (NPP) structures, systems and components (SSC). The analyses described in this paper were performed using ANACAP reinforced concrete models. The paper describes the ANACAP analysis models and discusses the analysis comparisons with the test data. The ANACAP capability for modeling nonlinear cyclic characteristics of reinforced concrete shear wall structures was confirmed by the close comparisons between the ANACAP analysis results and the JNES/NUPEC cyclic test data. Reasonable agreement between the analysis results and the test data was demonstrated for the hysteresis loops and the shear force orbits, in terms of both the overall shape and the cycle-to-cycle comparisons. An ANACAP simulation of the JNES/NUPEC shaking table test was also performed, which demonstrated that the ANACAP dynamic analysis with the concrete material model is able to capture the progressive degrading behavior of the shear wall indicated by the test data. The ANACAP analysis also predicted the incipient failure of the shear wall reasonably close to the actual failure declared for the test specimen. In summary, the analyses of the JNES/NUPEC cyclic and shaking table RC shear wall tests presented in this paper demonstrate the state-of-the-art analysis capability for determining the seismic capacity of RC shear wall structures.

  5. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

Losses from natural disasters continue to increase, mainly due to a poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science is responsible for not coping with the challenging changes of Exposures and their Vulnerability inflicted by a growing population and its concentration, etc., which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for the lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e., log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground-shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by application to the seismic region of the Greater Caucasus.
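The USLE relation quoted in the abstract is directly computable. The sketch below evaluates log N(M,L) = A - B·(M-6) + C·log L; the coefficient values are illustrative placeholders, not fitted parameters from the Greater Caucasus study.

```python
import math

# Unified Scaling Law for Earthquakes (USLE) as quoted in the abstract:
#   log10 N(M, L) = A - B*(M - 6) + C*log10(L)
# N(M, L): expected annual number of earthquakes of magnitude M within a
# seismically prone area of linear dimension L (km).
# A, B, C below are placeholder values for illustration only.

def usle_annual_rate(M, L, A=-1.0, B=1.0, C=1.0):
    """Expected annual earthquake rate at magnitude M in a cell of size L."""
    log_n = A - B * (M - 6.0) + C * math.log10(L)
    return 10.0 ** log_n

# A unit increase in magnitude lowers the rate by a factor of 10**B:
ratio = usle_annual_rate(5.0, 100.0) / usle_annual_rate(6.0, 100.0)
print(ratio)  # -> 10.0 for B = 1
```

The B term plays the role of the Gutenberg-Richter b-value, while C captures how the event count scales with the linear size of the area considered.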

  6. Vulnerability assessment of mining subsidence hazards.

    PubMed

    Deck, Olivier; Verdel, Thierry; Salmon, Romuald

    2009-10-01

Between 1996 and 1999, five mining subsidence events occurred in the iron-ore field in Lorraine, France, and damaged several hundred buildings. Because of the thousands of hectares of undermined areas, an assessment of the vulnerability of buildings and land is necessary for risk management. Risk assessment methods changed from the initial risk management decisions that took place immediately after the mining subsidence to the risk assessment studies that are currently under consideration. These changes reveal much about the complexity of the vulnerability concept and about the difficulties in developing simple and relevant methods for its assessment. The objective of this article is to present this process, suggest improvements on the basis of theoretical definitions of vulnerability, and give an operational example of vulnerability assessment in the seismic field. Vulnerability is divided into three components: weakness, stakes value, and resilience. The final improvements take these three components into account and constitute an original method of assessing the vulnerability of a city to subsidence.

  7. Quantitative analysis of seismic wave propagation anomalies in azimuth and apparent slowness at Deception Island volcano (Antarctica) using seismic arrays

    NASA Astrophysics Data System (ADS)

García Yeguas, A.; Almendros, J.; Abella, R.; Ibáñez, J. M.

    2011-02-01

We analyse shot data recorded by eight seismic arrays during an active-source seismic experiment carried out at Deception Island (Antarctica) in January 2005. For each source we estimate the apparent slowness and propagation azimuth of the first wave arrival. Since both source and receiver positions are accurately known, we are able to interpret the results in terms of the effect of the heterogeneities of the medium on wave propagation. The results show the presence of significant propagation anomalies. Nearby shots produce large apparent slowness values above 0.6 s km-1, while distant shots produce small values, down to about 0.15-0.20 s km-1. These values are different for each array, which shows the importance of the local structure under the receiver. The spatial distributions of apparent slowness are not radial, as we would expect in a flat-layered medium, and again these distributions differ from array to array. The azimuth anomalies, defined as the difference between the empirical estimates and the values expected in a 1-D model (i.e. the source-array directions), suggest ubiquitous wave front distortions. We have detected both positive and negative anomalies. For some shot-array geometries, azimuth anomalies are quite large, with values up to 60°. The distribution of the anomalies depends on the position of the array. Some of these features can be interpreted in terms of a shallow magma chamber and shallow rigid bodies imaged by high-resolution seismic tomography. However, several details remain unexplained. Further work is required, including modelling of synthetic wavefields on realistic models of Deception Island and/or apparent slowness vector tomography.
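The azimuth-anomaly definition used above (empirical estimate minus the geometric source-array direction expected in a 1-D model) can be sketched on local flat-earth coordinates. The coordinates and the observed azimuth below are a toy example, not experiment data.

```python
import math

# Azimuth anomaly = observed back azimuth minus the geometric source-array
# direction, wrapped to (-180, 180]. Coordinates are local (x = east,
# y = north, km); azimuths are degrees clockwise from north.

def expected_azimuth(array_xy, source_xy):
    """Geometric source direction seen from the array, in a 1-D (flat) model."""
    dx = source_xy[0] - array_xy[0]
    dy = source_xy[1] - array_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def azimuth_anomaly(observed_deg, array_xy, source_xy):
    """Signed difference between observed and expected azimuth, in degrees."""
    diff = observed_deg - expected_azimuth(array_xy, source_xy)
    return (diff + 180.0) % 360.0 - 180.0

# A shot due east of the array (expected azimuth 90°), with waves
# apparently arriving from N75°E:
print(azimuth_anomaly(75.0, (0.0, 0.0), (5.0, 0.0)))  # -> -15.0
```

The wrap to (-180, 180] keeps anomalies signed and comparable across shots, which is what makes the positive/negative anomaly patterns in the abstract meaningful.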

  8. Performance Analysis of Tandem-L Mission for Modeling Volcanic and Seismic Deformation Sources

    NASA Astrophysics Data System (ADS)

    Ansari, Homa; Goel, Kanika; Parizzi, Alessandro; Sudhaus, Henriette; Adam, Nico; Eineder, Michael

    2015-04-01

Although a great number of publications have focused on the application of InSAR to deformation source modeling, as well as on the development of different algorithms for this purpose, little investigation has been dedicated to the sensitivity analysis of InSAR in deformation source modeling. Our purpose is to address this issue by analyzing the reliability of InSAR in modeling the deformation sources of landslides and of seismic and volcanic activity, with special focus on L-band SAR measurements. The sensitivity analysis considers three commonly used geophysical models for subsidence, seismic and volcanic activity: the Gaussian subsidence bowl, the Okada model and the Mogi point source, respectively. In each case, the InSAR sensitivity is analytically formulated and its performance is investigated using simulated SAR data. The investigations are carried out using stochastic error propagation approaches to infer the precision of the models' parameters as well as their mutual covariance. The limiting factors in SAR interferometry are categorized into two groups and investigated separately in the sensitivity analysis: the first deals with the geometrical limits imposed by the side-looking geometry of SAR measurements, and the second focuses on the InSAR stochastic characteristics in the L band.
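One of the forward models named above, the Mogi point source, has a closed form for vertical surface displacement that is easy to sketch. The form below is the common elastic half-space expression; the depth, volume change and Poisson ratio are illustrative numbers, not parameters from the Tandem-L study.

```python
import math

# Mogi point source, vertical surface displacement in an elastic half-space:
#   u_z(r) = (1 - nu) * dV / pi * d / (r^2 + d^2)^(3/2)
# d: source depth (m), dV: volume change (m^3), nu: Poisson ratio,
# r: radial offset from the point above the source (m).
# Parameter values below are placeholders for illustration.

def mogi_uz(r, depth=3000.0, d_volume=1.0e6, nu=0.25):
    """Vertical surface displacement (m) at radial offset r from a Mogi source."""
    return (1.0 - nu) * d_volume / math.pi * depth / (r**2 + depth**2) ** 1.5

# Uplift peaks directly above the source and decays with offset:
print(mogi_uz(0.0) > mogi_uz(5000.0) > 0.0)  # -> True
```

A sensitivity analysis like the one in the abstract would differentiate such a forward model with respect to its parameters and propagate the InSAR measurement covariance through those derivatives.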

  9. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at

  10. Multi Canister Overpack (MCO) Handling Machine Independent Review of Seismic Structural Analysis

    SciTech Connect

    SWENSON, C.E.

    2000-09-22

The following separate reports and correspondence pertain to the independent review of the seismic analysis. The original analysis was performed by GEC-Alsthom Engineering Systems Limited (GEC-ESL) under subcontract to Foster-Wheeler Environmental Corporation (FWEC), the prime integration contractor to the Spent Nuclear Fuel Project for the Multi-Canister Overpack (MCO) Handling Machine (MHM). The original analysis was performed to the Design Basis Earthquake (DBE) response spectra using 5% damping, as required in specification HNF-S-0468, for the 90% Design Report in June 1997. The independent review was performed by Fluor-Daniel (Irvine) under a task separate from their scope as Architect-Engineer of the Canister Storage Building (CSB) in 1997. The comments were issued in April 1998. Later in 1997, the response spectra of the CSB were revised according to a new soil-structure interaction analysis; the response spectra for the MHM were revised accordingly, utilizing 7% damping in accordance with American Society of Mechanical Engineers (ASME) NOG-1, ''Rules for Construction of Overhead and Gantry Cranes (Top Running Bridge, Multiple Girder).'' The analysis was re-performed to check critical areas, but because manufacturing was underway, designs were not altered unless necessary. FWEC responded to SNF Project correspondence on the review comments in the two separate letters enclosed. The dispositions were reviewed and accepted. Attached are supplier source surveillance reports on the procedures and process used by the engineering group performing the analysis and structural design. All calculation and analysis results are contained in the MHM Final Design Report, which is part of Vendor Information File 50100. Subsequent to the MHM supplier engineering analysis, separate analyses for nuclear safety accident concerns used the electronic input data files provided by FWEC/GEC-ESL and are contained in document SNF-6248

  11. Analysis of the 23 June 2001 Southern Peru Earthquake Using Locally Recorded Seismic Data

    NASA Astrophysics Data System (ADS)

    Tavera, H.; Comte, D.; Boroschek, R.; Dorbath, L.; Portugal, D.; Haessler, H.; Montes, H.; Bernal, I.; Antayhua, Y.; Salas, H.; Inza, A.; Rodriguez, S.; Glass, B.; Correa, E.; Balmaceda, I.; Meneses, C.

    2001-12-01

The 23 June 2001, Mw=8.4 southern Peru earthquake ruptured the northern and central part of the rupture area of the previous large earthquake, which occurred on 13 August 1868 (Mw ~9). A detailed analysis of the aftershock sequence was possible due to the deployment of a temporary seismic network along the coast in the Arequipa and Moquegua districts, complementing the Peruvian permanent stations. The deployed temporary network included 10 short-period three-component stations from the U. of Chile-IRD-France and 7 broad-band seismic stations from the Instituto Geofísico del Perú. This network operated during the first weeks after the mainshock and recorded the major aftershocks, such as the largest one on 7 July 2001 (Mw=7.5), which defines the southern limit of the rupture area of the 2001 Peruvian earthquake. The majority of the aftershocks show thrust-fault focal mechanisms consistent with the average convergence direction of the subducting Nazca plate; however, normal faulting events, such as the 5 July 2001 (Mw=6.6) event, are also present in the aftershock sequence. The depth distribution of the events permitted a detailed definition of the Wadati-Benioff zone in the region. The segment between Ilo and Tacna did not participate in the rupture process of the 2001 southern Peru earthquake. Seismicity located near the Peruvian-Chilean political boundary was reliably determined using the data recorded by the northern Chile permanent network. Analysis of the mainshock and aftershock accelerograms recorded in Arica, northern Chile, is also included. The occurrence of the 1995 Antofagasta (Mw=8.0) and the 2001 southern Peru earthquakes suggests that the probability of a major earthquake in the northern Chile region has increased, considering that the previous large earthquake in this region happened in 1877 (Mw ~9) and that since then no earthquake with magnitude Mw>8 has occurred inside the estimated 1877 rupture area (between Arica and Antofagasta).

  12. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

    NASA Astrophysics Data System (ADS)

    Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

    2005-12-01

The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state of the art in PSHA using multiple expert opinions, has been fully applied only twice: first in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in the latter project, the objective of which was a comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10,000,000. Following the SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground-motion prediction was performed by combining several empirical ground-motion models within a logic-tree framework, with the weight on each logic-tree branch expressing the personal degree of belief of each ground-motion expert. In the present paper, we critically review the current state of ground-motion prediction methodology in PSHA, in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground-motion models exacts a huge price in the form of ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA for the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge, which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next
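The logic-tree combination described above reduces, at each ground-motion level, to a weighted mean over branches, with the weights encoding each expert's degree of belief. The branch weights and exceedance frequencies below are invented for illustration, not PEGASOS values.

```python
# Schematic logic-tree combination: each branch is an empirical ground-motion
# model giving an annual frequency of exceedance for some ground-motion
# level; branch weights express degree of belief and must sum to one.
# All numbers below are illustrative placeholders.

branches = [  # (weight, annual exceedance frequency from that model)
    (0.40, 1.2e-5),
    (0.35, 3.0e-5),
    (0.25, 8.0e-6),
]

total_weight = sum(w for w, _ in branches)
assert abs(total_weight - 1.0) < 1e-12  # logic-tree weights must sum to one

mean_exceedance = sum(w * f for w, f in branches)
print(mean_exceedance)  # weighted-mean hazard across branches, ~1.7e-5
```

In a full PSHA the same weighting is applied curve by curve, so fractile hazard curves (not just the mean) can also be extracted from the branch population.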

  13. Ice shelf structure derived from dispersion curve analysis of ambient seismic noise, Ross Ice Shelf, Antarctica

    NASA Astrophysics Data System (ADS)

    Diez, A.; Bromirski, P. D.; Gerstoft, P.; Stephen, R. A.; Anthony, R. E.; Aster, R. C.; Cai, C.; Nyblade, A.; Wiens, D. A.

    2016-05-01

    An L-configured, three-component short period seismic array was deployed on the Ross Ice Shelf, Antarctica during November 2014. Polarization analysis of ambient noise data from these stations shows linearly polarized waves for frequency bands between 0.2 and 2 Hz. A spectral peak at about 1.6 Hz is interpreted as the resonance frequency of the water column and is used to estimate the water layer thickness below the ice shelf. The frequency band from 4 to 18 Hz is dominated by Rayleigh and Love waves propagating from the north that, based on daily temporal variations, we conclude were generated by field camp activity. Frequency-slowness plots were calculated using beamforming. Resulting Love and Rayleigh wave dispersion curves were inverted for the shear wave velocity profile within the firn and ice to ˜150 m depth. The derived density profile allows estimation of the pore close-off depth and the firn-air content thickness. Separate inversions of Rayleigh and Love wave dispersion curves give different shear wave velocity profiles within the firn. We attribute this difference to an effective anisotropy due to fine layering. The layered structure of firn, ice, water and the seafloor results in a characteristic dispersion curve below 7 Hz. Forward modelling the observed Rayleigh wave dispersion curves using representative firn, ice, water and sediment structures indicates that Rayleigh waves are observed when wavelengths are long enough to span the distance from the ice shelf surface to the seafloor. The forward modelling shows that analysis of seismic data from an ice shelf provides the possibility of resolving ice shelf thickness, water column thickness and the physical properties of the ice shelf and underlying seafloor using passive-source seismic data.
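The water-column estimate mentioned above (a 1.6 Hz spectral peak interpreted as the resonance of the water layer) can be sketched with a quarter-wavelength resonance assumption, h = v_w / (4 f), using a nominal sound speed in sea water; the exact resonance condition and sound speed used in the study may differ.

```python
# Back-of-the-envelope water-layer thickness from a resonance peak,
# assuming a quarter-wavelength resonance: h = v_water / (4 * f_res).
# v_water = 1500 m/s is a nominal sea-water sound speed, not a study value.

def water_thickness(f_res_hz, v_water=1500.0):
    """Water-column thickness (m) implied by resonance frequency f_res_hz."""
    return v_water / (4.0 * f_res_hz)

print(water_thickness(1.6))  # -> 234.375 m of water below the ice shelf
```

The estimate is linear in the assumed sound speed, so a 5% uncertainty in v_water maps directly to a 5% uncertainty in thickness.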

  14. Analysis and modeling of high-resolution multicomponent seismic reflection data

    NASA Astrophysics Data System (ADS)

    Guy, Erich D.

The facts that seismic body-wave types are sensitive to different physical properties, that seismic sources radiate polarized waves, and that seismic receivers are sensitive to the polarization of scattered body waves and coherent noise mean that it is important to consider recording and analyzing different wave types and data components prior to high-resolution reflection surveys. In this dissertation, important aspects of elastic-wave propagation relevant to high-resolution multicomponent surveying have been analyzed experimentally and numerically, and methodologies have been tested and developed that will improve near-surface imaging and characterization. Factors affecting the ability of common-mode P- and S-wave reflection surveys to map features in the near surface are described and illustrated through analyses of experimental field data and modeling. It is demonstrated, through comparisons of known subsurface conditions and processed stacked sections, that combined P- and S-wave common-mode reflection information can allow a geologic sequence to be imaged more effectively than by using P- or S-wave reflection information alone. Near-surface mode-converted seismic reflection imaging potential was tested experimentally and evaluated through modeling. The modeling results demonstrate that the potential advantages of near-surface mode-conversion imaging can be realized in theory. Analyses of the acquired multicomponent data, however, demonstrate that mode-conversion imaging could not be accomplished in the field study area, due to the low amplitudes of events and the presence of noise in the field data. Analysis methods are presented that can be used for assessing converted-wave imaging potential in future reflection studies. Factors affecting the ability of SH-wave reflection measurements to effectively image near-surface interfaces and discontinuities are described.
An SH-wave reflection data analysis workflow is presented that provides a methodology for delineating

  15. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations to the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  16. Cyber Security for the Spaceport Command and Control System: Vulnerability Management and Compliance Analysis

    NASA Technical Reports Server (NTRS)

    Gunawan, Ryan A.

    2016-01-01

    With the rapid development of the Internet, the number of malicious threats to organizations is continually increasing. In June of 2015, the United States Office of Personnel Management (OPM) had a data breach resulting in the compromise of millions of government employee records. The National Aeronautics and Space Administration (NASA) is not exempt from these attacks. Cyber security is becoming a critical facet to the discussion of moving forward with projects. The Spaceport Command and Control System (SCCS) project at the Kennedy Space Center (KSC) aims to develop the launch control system for the next generation launch vehicle in the coming decades. There are many ways to increase the security of the network it uses, from vulnerability management to ensuring operating system images are compliant with securely configured baselines recommended by the United States Government.

  17. Thermal Analysis of the Vulnerability of the Spacesuit Battery Design to Short-Circuit Conditions (Presentation)

    SciTech Connect

    Kim, G. H.; Chaney, L.; Smith, K.; Pesaran, A.; Darcy, E.

    2010-04-22

    NREL researchers created a mathematical model of a full 16p-5s spacesuit battery for NASA that captures electrical/thermal behavior during shorts to assess the vulnerability of the battery to pack-internal (cell-external) shorts. They found that relocating the short from battery pack-external (experimental validation) to pack-internal (modeling study) causes substantial additional heating of cells, which can lead to cell thermal runaway. All three layers of the bank-to-bank separator must fail for the pack-internal short scenario to occur. This finding emphasizes the imperative of battery pack assembly cleanliness. The design is tolerant to pack-internal shorts when stored at 0% state of charge.

  18. RESILIENCE IN VULNERABLE POPULATIONS WITH TYPE 2 DIABETES MELLITUS AND HYPERTENSION: A SYSTEMATIC REVIEW AND META-ANALYSIS

    PubMed Central

    Pesantes, M. Amalia; Lazo-Porras, María; Abu Dabrh, Abd Moain; Avila-Ramirez, Jaime R.; Caycho, Maria; Villamonte, Georgina Y.; Sanchez-Perez, Grecia P.; Málaga, Germán; Bernabé-Ortiz, Antonio; Miranda, J. Jaime

    2015-01-01

Background Patients with chronic conditions and limited access to healthcare experience stressful challenges due to the burden of managing both their conditions and their daily life demands. Resilience provides a mechanism of adapting to stressful experiences. We conducted a systematic review and meta-analysis to synthesize the evidence about interventions to enhance resiliency in managing hypertension or type-2 diabetes in vulnerable populations, and to assess the efficacy of these interventions on clinical outcomes. Methods We searched multiple databases from inception through February 2015, including randomized controlled trials that enrolled patients with type-2 diabetes or hypertension. All interventions that targeted resilience in vulnerable populations were included. Data were synthesized to describe the characteristics and efficacy of resilience interventions. We pooled the total effects by calculating the standardized mean difference using the random-effects model. Results The final search yielded seventeen studies. All studies were conducted in the United States and generally targeted minority participants. Resiliency interventions used diverse strategies; discussion groups or workshops were the most common approach. Conclusions Interventions aimed at enhancing the resiliency of patients from vulnerable groups are diverse. Outcomes were not fully conclusive. There was some evidence that resilience interventions had a positive effect on HbA1c levels, but not blood pressure. The incorporation of resiliency-oriented interventions into the arsenal of prevention and management of chronic conditions appears to be an opportunity that remains to be better investigated and exploited, and there is a need to pursue further understanding of the core components of any intervention that claims to enhance resilience. PMID:26239007
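The pooling step named in the Methods (standardized mean differences combined under a random-effects model) can be sketched as follows. The per-study (mean, sd, n) entries are invented for illustration; they are not data from the review, and the DerSimonian-Laird estimator used here is one common choice of random-effects method, not necessarily the authors'.

```python
import math

# Standardized mean difference (Cohen's d) per study, pooled with a
# DerSimonian-Laird random-effects model. Study data below are invented.

def smd(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference and its approximate variance."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

def pooled_random_effects(effects):
    """DerSimonian-Laird pooled estimate from (d, var) pairs."""
    w = [1.0 / v for _, v in effects]
    d_fixed = sum(wi * di for wi, (di, _) in zip(w, effects)) / sum(w)
    q = sum(wi * (di - d_fixed) ** 2 for wi, (di, _) in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for _, v in effects]
    return sum(wi * di for wi, (di, _) in zip(w_star, effects)) / sum(w_star)

studies = [smd(7.2, 1.1, 40, 7.8, 1.2, 38),  # e.g. HbA1c, intervention vs control
           smd(7.0, 0.9, 55, 7.5, 1.0, 50),
           smd(7.4, 1.3, 30, 7.6, 1.2, 33)]
print(pooled_random_effects(studies))  # negative: intervention lowers HbA1c
```

Because the pooled value is a convex combination of the per-study effects, it always lies between the smallest and largest study-level SMD.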

  19. Seismic attribute analysis to enhance detection of thin gold-bearing reefs: South Deep gold mine, Witwatersrand basin, South Africa

    NASA Astrophysics Data System (ADS)

    Manzi, M. S. D.; Hein, K. A. A.; Durrheim, R.; King, N.

    2013-11-01

The gold-bearing Upper Elsburg Reef clastic wedge (UER) in the South Deep gold mine in the Witwatersrand basin (South Africa) hosts the highly auriferous basal conglomerate known as the Elsburg Conglomerate (EC) reef. The reef is less than 20 m thick and, together with the quartzite and conglomerate beds in the UER (1-120 m thick), lies below the seismic tuning thickness, or dominant quarter wavelength. These units are extremely difficult to identify on migrated seismic sections using traditional amplitude interpretation. In order to enhance the detection of the EC reef and its subcrop position against the overlying Ventersdorp Contact Reef (VCR), complex-trace (instantaneous) seismic attributes and volume attribute analysis were applied to prestack time migrated (PSTM) seismic sections. In particular, the instantaneous phase and paraphase allowed clear identification of the continuity of the EC reef and of the overlapping and interfering wavelets produced by the convergence of the VCR and the EC reef. In addition, these attributes increased confidence in the interpretation of the EC reef, in particular its offsets (faults) and its depth. A high correlation between the seismically determined depth of the EC reef and borehole intersections was observed, with several depth discrepancies below the vertical seismic resolution limit (~ 25 m). This information can now be incorporated into the current mine geological model, thus improving the resource evaluation of the Upper Elsburg Reef in the South Deep gold mine.
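The tuning thickness invoked above is the dominant quarter wavelength, λ/4 = v / (4·f_dom). The velocity and dominant frequency below are illustrative hard-rock values, not survey parameters from South Deep.

```python
# Seismic tuning thickness = dominant quarter wavelength:
#   lambda / 4 = v / (4 * f_dom)
# v: interval velocity (m/s), f_dom: dominant frequency (Hz).
# The example values are illustrative, not from the South Deep survey.

def tuning_thickness(v_mps, f_dom_hz):
    """Thinnest bed resolvable by conventional amplitude interpretation (m)."""
    return v_mps / (4.0 * f_dom_hz)

# e.g. 6000 m/s quartzite and a 60 Hz dominant frequency:
print(tuning_thickness(6000.0, 60.0))  # -> 25.0 m
```

A bed thinner than this, such as the < 20 m EC reef, falls below the vertical resolution limit, which is why the study turns to instantaneous-attribute analysis rather than amplitudes.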

  20. Structure of Suasselkä Postglacial Fault in northern Finland obtained by analysis of ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena

    2016-04-01

Understanding the inner structure of seismogenic faults and their ability to reactivate is particularly important for investigating the continental intraplate seismicity regime. In our study we address this problem using analysis of ambient seismic noise recorded by the temporary DAFNE array in the northern Fennoscandian Shield. The main purpose of the DAFNE/FINLAND passive seismic array experiment was to characterize the present-day seismicity of the Suasselkä post-glacial fault (SPGF), which was proposed as one potential target for the DAFNE (Drilling Active Faults in Northern Europe) project. The DAFNE/FINLAND array comprised an area of about 20 to 100 km and consisted of 8 short-period and 4 broad-band 3-component autonomous seismic stations installed in the close vicinity of the fault area. The array recorded continuous seismic data from September 2011 to May 2013. The recordings were analyzed in order to identify and locate natural earthquakes from the fault area and to discriminate them from blasts in the Kittilä Gold Mine. As a result, we found several dozen natural seismic events originating from the fault area, which proves that the fault is still seismically active. In order to study the inner structure of the SPGF we use cross-correlation of the ambient seismic noise recorded by the array. Analysis of the azimuthal distribution of noise sources demonstrated that, during the time interval under consideration, the distribution of noise sources is close to uniform. The continuous data were processed in several steps, including single-station data analysis, instrument response removal and time-domain stacking. The data were used to estimate empirical Green's functions between pairs of stations in the frequency band of 0.1-1 Hz and to calculate the corresponding surface wave dispersion curves. After that, S-wave velocity models were obtained by inversion of the dispersion curves using the Geopsy software. The results suggest that the area of
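The noise-processing chain summarized above can be sketched on synthetic data: cross-correlate simultaneous noise records at two stations and stack over many windows, so the stack converges toward the inter-station empirical Green's function (here, a 3-sample travel-time delay built into the synthetic records). Real workflows (e.g. with ObsPy) add filtering, whitening and instrument-response steps.

```python
import random

# Toy ambient-noise cross-correlation with time-domain stacking.
# Station B is modeled as recording the same noise as station A,
# delayed by 3 samples; the stacked correlation should peak at lag 3.

def cross_correlate(a, b, max_lag):
    """Correlation of a[i] with b[i + lag], for each lag in [-max_lag, max_lag]."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        out[lag] = sum(a[i] * b[i + lag]
                       for i in range(len(a))
                       if 0 <= i + lag < len(b))
    return out

random.seed(0)
stack = {lag: 0.0 for lag in range(-10, 11)}
for _ in range(20):                        # stack 20 noise windows
    a = [random.gauss(0.0, 1.0) for _ in range(200)]
    b = [0.0] * 3 + a[:-3]                 # B records A's noise delayed by 3 samples
    for lag, value in cross_correlate(a, b, 10).items():
        stack[lag] += value

best_lag = max(stack, key=stack.get)
print(best_lag)  # -> 3
```

The stacked peak lag is the inter-station travel time; repeating this for many station pairs yields the dispersion measurements that are inverted for velocity structure.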

  1. Analysis of the cross-correlation between water level and seismicity at Açu reservoir (Brazil)

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; do Nascimento, Aderson F.; Bezerra, Francisco H. R.; Ferreira, Joaquim M.

    2015-09-01

    The main objective of the study is the analysis of the cross-correlation between water level and seismicity recorded over 10 years at Açu reservoir (Brazil), in order to unravel the physical mechanisms driving the observed seismicity. The reservoir is a shallow (< 35 m deep) earth-filled dam located in a rifted crystalline basement. Robust power spectral densities of both the monthly water level and the earthquake counts show a significant yearly cycle, indicating a clear link between the two variables in terms of loading/unloading of the dam. The application of singular spectrum analysis (SSA) has made it possible to extract the significant component from each time series. The cross-correlation analysis confirms that the evolution of seismicity is mainly due to pore pressure diffusion, and that other processes related to fracture compaction (undrained response) are not strongly present in our data.
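    The lagged cross-correlation underlying such an analysis can be sketched on a toy monthly series; the data below are synthetic (a yearly water-level cycle and event counts that respond two months later), not the Açu records:

```python
import math

def xcorr(x, y, max_lag):
    """Normalized cross-correlation of two equal-length series at integer lags.
    Positive lag means y lags behind x (x leads)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += (x[i] - mx) * (y[j] - my)
        out[lag] = s / (sx * sy)
    return out

# Synthetic monthly series: water level with a yearly cycle, and event counts
# responding two months later (mimicking a pore-pressure diffusion delay).
months = range(120)  # 10 years
level = [math.sin(2 * math.pi * m / 12) for m in months]
counts = [math.sin(2 * math.pi * (m - 2) / 12) for m in months]

cc = xcorr(level, counts, 6)
best = max(cc, key=cc.get)
print(best)  # lag (in months) of maximum correlation
```

    The lag of the correlation peak is the apparent delay between loading and seismic response; in a real application both series would first be detrended and their significant SSA components extracted.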

  2. Observed Seismic Vulnerability of Italian Buildings

    SciTech Connect

    Rota, Maria; Magenes, Guido; Penna, Andrea; Strobbia, Claudio L.

    2008-07-08

    A very large database of post-earthquake building inspections, carried out after the main Italian events of the last 30 years, has been processed in order to derive fragility curves for 23 building typologies, mostly referring to masonry structures. The records (more than 91,000) of this very complete and homogeneous dataset have been converted into a single damage scale with five levels of damage, plus the case of no damage. For each affected municipality, a value of PGA and Housner Intensity (I{sub H}) has been evaluated using attenuation laws. Damage probability matrices have then been extracted. These experimental data have been fitted with lognormal fragility curves using an advanced nonlinear regression algorithm that also takes into account the relative reliability of each point via the bootstrap technique. The significant concentration of experimental data at low levels of ground motion, combined with the selected analytical expression, determines the peculiar shape of some of the curves, with a very steep initial branch followed by an almost horizontal branch for increasing values of ground motion. Explanations and possible solutions are discussed.
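    A lognormal fragility curve of the kind fitted here gives the probability of reaching or exceeding a damage state as a function of ground motion. A minimal sketch, with illustrative median and dispersion values rather than the fitted Italian parameters:

```python
import math

def lognormal_cdf(x, median, beta):
    """P(capacity <= x) for a lognormal fragility curve with a given
    median (in g) and logarithmic standard deviation beta."""
    if x <= 0.0:
        return 0.0
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

# Hypothetical parameters for one typology and damage state
# (illustrative values only, not the fitted Italian curves).
median_pga, beta = 0.30, 0.6

for pga in (0.05, 0.15, 0.30, 0.60):
    p = lognormal_cdf(pga, median_pga, beta)
    print(f"PGA={pga:.2f} g -> P(damage >= DS) = {p:.3f}")
```

    By construction the curve passes through 0.5 at the median; a small beta gives the steep initial branch described in the abstract, a large beta a flatter one.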

  3. Comparative analysis of seismic response characteristics of pile-soil-structure interaction system

    NASA Astrophysics Data System (ADS)

    Kong, Desen; Luan, Maotian; Wang, Weiming

    2006-01-01

    The study of the earthquake-resistant performance of a pile-soil-structure interaction system is a relatively complicated yet critically important issue in civil engineering practice. In this paper, a computational model and computation procedures for pile-supported structures, which can duly consider the pile-soil interaction effect, are established by the finite element method. Numerical implementation is carried out in the time domain. A simplified approximation for the seismic response analysis of pile-soil-structure systems is briefly presented. A comparative study is then performed for an engineering example, with numerical results computed respectively by the finite element method and the simplified method. The comparative analysis shows that the results obtained by the simplified method agree well with those achieved by the finite element method. The numerical results and findings offer instructive guidelines for earthquake-resistant analysis and design of pile-supported structures.

  4. Time-lapse seismic waveform modelling and attribute analysis using hydromechanical models for a deep reservoir undergoing depletion

    NASA Astrophysics Data System (ADS)

    He, Y.-X.; Angus, D. A.; Blanchard, T. D.; Wang, G.-L.; Yuan, S.-Y.; Garcia, A.

    2016-04-01

    Extraction of fluids from subsurface reservoirs induces changes in pore pressure, leading not only to geomechanical changes, but also to perturbations in seismic velocities and hence observable seismic attributes. Time-lapse seismic analysis can be used to estimate changes in subsurface hydromechanical properties and thus act as a monitoring tool for geological reservoirs. The ability to observe and quantify changes in fluid, stress and strain using seismic techniques has important implications for monitoring risk not only for petroleum applications but also for geological storage of CO2 and nuclear waste scenarios. In this paper, we integrate hydromechanical simulation results with rock physics models and full-waveform seismic modelling to assess time-lapse seismic attribute resolution for dynamic reservoir characterization and hydromechanical model calibration. The time-lapse seismic simulations use a dynamic elastic reservoir model based on a North Sea deep reservoir undergoing large pressure changes. The time-lapse seismic traveltime shifts and time strains calculated from the modelled and processed synthetic data sets (i.e. pre-stack and post-stack data) are in reasonable agreement with the true earth models, indicating the feasibility of the 1-D strain rock-physics transform and the time-lapse seismic processing methodology. Estimated vertical traveltime shifts for the overburden and the majority of the reservoir are within ±1 ms of the true earth model values, indicating that the time-lapse technique is sufficiently accurate for predicting overburden velocity changes and hence geomechanical effects. Characterization of deeper structure below the overburden becomes less accurate, where more advanced time-lapse seismic processing and migration are needed to handle the complex geometry and strong lateral induced velocity changes.
Nevertheless, both migrated full-offset pre-stack and near-offset post-stack data image the general features of both the overburden and
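    The traveltime-shift estimation at the core of such time-lapse analysis can be illustrated by cross-correlating a baseline and a monitor trace; the wavelet and the 3 ms shift below are synthetic stand-ins, not the North Sea data:

```python
import math

def ricker(t, f=25.0):
    """Ricker wavelet with dominant frequency f (Hz)."""
    a = (math.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * math.exp(-a)

dt = 0.001  # 1 ms sampling
n = 400
base = [ricker((i - 150) * dt) for i in range(n)]
shift_samples = 3  # a +3 ms traveltime shift imposed on the monitor trace
monitor = [ricker((i - 150 - shift_samples) * dt) for i in range(n)]

def best_lag(a, b, max_lag=10):
    """Integer lag maximizing the cross-correlation of b relative to a."""
    def cc(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a)) if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=cc)

lag = best_lag(base, monitor)
print(f"estimated shift: {lag * dt * 1000:.1f} ms")
```

    Real workflows refine this to sub-sample precision (e.g. by interpolating the correlation peak) and convert the vertical derivative of the shifts into time strain.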

  5. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they require minimal a priori information about the signals they deal with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to some extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted in the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently in use. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving case 1 is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can to some degree be approached with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the April 2012 Mw=8.6 Sumatra and March 2011 Mw=9.0 Tohoku earthquakes). The data were recorded by seismic arrays of the
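    FastICA itself iterates a fixed-point rule on whitened data; as a simplified stand-in for the same idea, the sketch below separates two synthetic non-Gaussian sources by grid-searching the unmixing rotation that maximizes non-Gaussianity (absolute excess kurtosis). All signals and the mixing angle are invented:

```python
import math
import random

random.seed(7)
N = 4000

# Two independent, non-Gaussian, unit-variance sources:
# s1 is Laplacian (super-Gaussian), s2 is uniform (sub-Gaussian).
s1 = [random.choice((-1, 1)) * random.expovariate(math.sqrt(2)) for _ in range(N)]
s2 = [random.uniform(-math.sqrt(3), math.sqrt(3)) for _ in range(N)]

phi = math.radians(35.0)  # "unknown" orthogonal mixing angle
x1 = [math.cos(phi) * a - math.sin(phi) * b for a, b in zip(s1, s2)]
x2 = [math.sin(phi) * a + math.cos(phi) * b for a, b in zip(s1, s2)]

def excess_kurtosis(v):
    n = len(v)
    m = sum(v) / n
    var = sum((u - m) ** 2 for u in v) / n
    return sum((u - m) ** 4 for u in v) / (n * var * var) - 3.0

def contrast(theta):
    """Non-Gaussianity of the two outputs of the unmixing rotation theta."""
    c, s = math.cos(theta), math.sin(theta)
    y1 = [c * a + s * b for a, b in zip(x1, x2)]
    y2 = [-s * a + c * b for a, b in zip(x1, x2)]
    return abs(excess_kurtosis(y1)) + abs(excess_kurtosis(y2))

# Grid-search the unmixing rotation that maximizes non-Gaussianity
best = max((math.radians(d) for d in range(90)), key=contrast)
print(f"recovered angle: {math.degrees(best):.0f} deg (true 35)")
```

    At the correct rotation each output reduces to a single independent source, so the summed |kurtosis| is maximal; a Gaussian mixture would defeat the method, which is exactly the non-Gaussianity assumption stated above.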

  6. Intraplate seismicity in Canada: Complex network analysis of spatio-temporal recurrences

    NASA Astrophysics Data System (ADS)

    Vasudevan, Kris; Eaton, David; Davidsen, Jörn

    2010-05-01

    Intraplate seismicity in certain regions of Ontario, Quebec, and Nunavut in Canada is the subject of this abstract. The reasons for the occurrence and periodicity of intraplate earthquakes are not as well understood as those of interplate earthquakes. Here, we undertake a complex network analysis with a view to extracting, from the physical structure of the network of recurrent seismic events in space and time, information that can provide insights concerning the causality of seismic events. To this end, we have identified five areas for our study, defined by the following ranges of latitude/longitude values: area 1: 45°-48°/74°-80°; area 2: 42°-45°/76°-81°; area 3: 51°-55°/77°-83°; area 4: 45°-57°/80°-98°; and area 5: 56°-70°/65°-95°. In this work, using a recently proposed definition of 'recurrences' based on record-breaking processes (Phys. Rev. E 77, 066107, 2008), we have constructed digraphs of the data extracted from the five areas (http://earthquakescanada.nrcan.gc.ca), with attributes drawn from the locations, times of occurrence, and magnitudes of the events. For quantitative insight into the digraphs of the recurring events in space and time, we have examined the probability distributions of space-interval and time-interval recurrences for different earthquake magnitudes, network properties such as the in-degree and out-degree distributions for different magnitudes, the clustering coefficient, and the degree correlations between a given event and its recurrences. Since there is uncertainty in the spatial locations of earthquakes, we have allowed for uncertainty in the recurrences as well, generating a new suite of digraphs for error analysis. Furthermore, to test for the presence of non-trivial spatiotemporal correlations and causal connections, we have carried out a series of Monte Carlo simulations by reshuffling the spatial locations and magnitudes of the earthquakes without altering the times of occurrence
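    The record-breaking definition of recurrences cited above (Phys. Rev. E 77, 066107, 2008) can be sketched on a toy catalog: a later event j is a recurrence of event i if it lies closer to i than every event between them in time. The events below are synthetic, not the Canadian data:

```python
import math

# Synthetic catalog: (time, x, y) with locations in km (illustrative only)
events = [
    (0, 0.0, 0.0),
    (1, 10.0, 0.0),
    (2, 4.0, 3.0),
    (3, 1.0, 1.0),
    (4, 8.0, 8.0),
    (5, 0.5, 0.2),
]

def dist(a, b):
    return math.hypot(a[1] - b[1], a[2] - b[2])

# Record-breaking recurrences: j is a recurrence of i if it is closer to i
# than every intermediate event; each such j breaks the distance record.
edges = []
for i, ei in enumerate(events):
    record = math.inf
    for j in range(i + 1, len(events)):
        d = dist(ei, events[j])
        if d < record:
            record = d
            edges.append((i, j))

out_deg = [sum(1 for a, _ in edges if a == i) for i in range(len(events))]
in_deg = [sum(1 for _, b in edges if b == i) for i in range(len(events))]
print(edges)
print("out-degrees:", out_deg, "in-degrees:", in_deg)
```

    The in- and out-degree distributions of this digraph are exactly the network properties examined in the abstract; location uncertainty can be propagated by jittering the coordinates and rebuilding the edge list.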

  7. Scalar and vector probabilistic seismic hazard analysis: application for Algiers City

    NASA Astrophysics Data System (ADS)

    Faouzi, Gherboudj; Nasser, Laouami

    2014-04-01

    This study deals with the application of probabilistic seismic hazard analysis (PSHA) to a rock site located in Algiers city. For this purpose, recent ground motion prediction equations developed worldwide for similar seismotectonic contexts are used through a logic tree in the PSHA framework; the obtained results clearly reflect the high seismicity of the considered region. Moreover, a deaggregation analysis is conducted to obtain the mean scenario in terms of magnitude and distance. In addition to scalar PSHA, a newer method named vector PSHA is also performed in this study. Based on multivariate probability theory, the software used in the scalar approach is modified to allow the application of this approach to a real site in Algiers city with a vector of two and three intensity-measure parameters. The results are presented in terms of the joint annual rate of exceeding several thresholds, such as PGA, PSA(T) at multiple vibration periods, peak ground velocity and Arias intensity, and a comparison between the results of PSHA and V-PSHA is made.
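    The scalar-PSHA bookkeeping can be sketched as a sum over scenario sources of occurrence rate times probability of exceedance under a lognormal ground-motion model; the rates and median PGAs below are illustrative placeholders, not calibrated to Algiers seismicity:

```python
import math

def p_exceed(a, median_g, beta=0.6):
    """P(PGA > a | scenario), lognormal GMPE scatter with log-std beta."""
    return 0.5 * (1.0 - math.erf(math.log(a / median_g) / (beta * math.sqrt(2.0))))

# Hypothetical scenario sources: (annual rate, median PGA in g at the site).
sources = [(0.05, 0.10), (0.01, 0.25), (0.002, 0.50)]

def annual_rate(a):
    """Total annual rate of exceeding PGA level a (sum over sources)."""
    return sum(rate * p_exceed(a, med) for rate, med in sources)

for a in (0.1, 0.2, 0.4):
    lam = annual_rate(a)
    # Poisson assumption: probability of at least one exceedance in 50 years
    p50 = 1.0 - math.exp(-lam * 50.0)
    print(f"PGA>{a:.1f} g: rate={lam:.5f}/yr, P(50 yr)={p50:.3f}")
```

    Vector PSHA generalizes the same sum to the joint rate of exceeding several intensity-measure thresholds simultaneously, which requires the multivariate (correlated) distribution of the measures rather than the marginal one used here.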

  8. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena to systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here seismic), and (2) a compact model of the relative motion of celestial bodies in space-time relative to the Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of celestial bodies. Based on aggregated data from the space and Earth sciences, systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  9. Nonlinear dynamic analysis of multi-base seismically isolated structures with uplift potential II: verification examples

    NASA Astrophysics Data System (ADS)

    Roussis, Panayiotis C.; Tsopelas, Panos C.; Constantinou, Michael C.

    2010-03-01

    The work presented in this paper serves as numerical verification of the analytical model developed in the companion paper for nonlinear dynamic analysis of multi-base seismically isolated structures. To this end, two numerical examples have been analyzed using the computational algorithm incorporated into program 3D-BASIS-ME-MB, developed on the basis of the newly-formulated analytical model. The first example concerns a seven-story model structure that was tested on the earthquake simulator at the University at Buffalo and was also used as a verification example for program SAP2000. The second example concerns a two-tower, multi-story structure with a split-level seismic-isolation system. For purposes of verification, key results produced by 3D-BASIS-ME-MB are compared to experimental results, or results obtained from other structural/finite element programs. In both examples, the analyzed structure is excited under conditions of bearing uplift, thus yielding a case of much interest in verifying the capabilities of the developed analysis tool.

  10. Investigation of the nonlinear seismic behavior of knee braced frames using the incremental dynamic analysis method

    NASA Astrophysics Data System (ADS)

    Sheidaii, Mohammad Reza; TahamouliRoudsari, Mehrzad; Gordini, Mehrdad

    2016-06-01

    In knee braced frames, the braces are attached to a knee element rather than to the intersection of beams and columns. This bracing system is widely used and preferred over other commonly used systems for reasons such as providing lateral stiffness together with adequate ductility, concentrating damage in the knee elements, and the convenience of repairing and replacing these elements after an earthquake. The lateral stiffness of this system is supplied by the bracing member, and the ductility of the frame is supplied through bending or shear yielding of the knee member. In this paper, the nonlinear seismic behavior of knee braced frame systems has been investigated using incremental dynamic analysis (IDA), and the effects of the number of stories in a building and of the length and moment of inertia of the knee member on the seismic behavior, elastic stiffness, ductility and probability of failure of these systems have been determined. In the incremental dynamic analysis, after plotting the IDA diagrams of the accelerograms, the collapse diagrams in the limit states are determined. These diagrams show that, for a constant knee length, the probability of collapse in the limit states increases as the moment of inertia is reduced, and that, for a constant knee moment of inertia, the probability of collapse in the limit states increases with increasing knee length.
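    The IDA procedure can be sketched on an elastoplastic single-degree-of-freedom oscillator: scale the input motion upward and record the peak response at each intensity level. The pulse, stiffness and yield values below are illustrative, not the knee-braced frame models of the paper:

```python
import math

def peak_drift(scale, m=1.0, k=158.0, uy=0.01, zeta=0.05, dt=0.002, T=6.0):
    """Peak displacement of an elastoplastic SDOF oscillator under a scaled
    half-sine acceleration pulse (explicit central-difference integration)."""
    c = 2.0 * zeta * math.sqrt(k * m)   # viscous damping coefficient
    fy = k * uy                          # yield force of the restoring spring
    u_prev, u, fs, peak = 0.0, 0.0, 0.0, 0.0
    for i in range(int(T / dt)):
        t = i * dt
        ag = scale * math.sin(math.pi * t / 0.5) if t < 0.5 else 0.0
        v = (u - u_prev) / dt
        a = (-m * ag - c * v - fs) / m
        u_next = 2.0 * u - u_prev + a * dt * dt
        # incremental elastoplastic update of the restoring force
        fs = max(-fy, min(fy, fs + k * (u_next - u)))
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# IDA: scale the record upward and track the peak response at each level
for s in (0.5, 1.0, 2.0, 4.0):
    print(f"scale={s:.1f}: peak drift = {peak_drift(s):.4f} m")
```

    Plotting intensity against peak drift over a suite of accelerograms gives the IDA curves; the intensity at which the curve flattens defines the collapse point used in the limit-state diagrams.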

  11. A Bayesian method to mine spatial data sets to evaluate the vulnerability of human beings to catastrophic risk.

    PubMed

    Li, Lianfa; Wang, Jinfeng; Leung, Hareton; Zhao, Sisi

    2012-06-01

    The vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can only be roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and to relate such influence to spatial distance. The knowledge- and data-based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider the model's uncertainty and use Bayesian model averaging and Occam's window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets for evaluating vulnerability.
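    A minimal illustration of the BN machinery: posterior inference by enumeration over a two-parent network. The structure and conditional probability tables below are invented for illustration, not the fitted model of the article:

```python
# Minimal discrete Bayesian network by enumeration.
# Structure: Intensity -> Casualty <- NearRoad (an ARCSM-like accessibility factor)
p_intensity = {"high": 0.3, "low": 0.7}
p_near_road = {True: 0.4, False: 0.6}
p_casualty = {  # P(casualty=yes | intensity, near_road); illustrative CPT
    ("high", True): 0.10,
    ("high", False): 0.30,
    ("low", True): 0.02,
    ("low", False): 0.05,
}

def posterior_intensity_given_casualty():
    """P(intensity | casualty=yes) by full enumeration over the network."""
    joint = {}
    for inten, pi in p_intensity.items():
        s = 0.0
        for near, pn in p_near_road.items():
            s += pi * pn * p_casualty[(inten, near)]
        joint[inten] = s
    z = sum(joint.values())          # normalizing constant P(casualty=yes)
    return {k: v / z for k, v in joint.items()}

post = posterior_intensity_given_casualty()
print(post)
```

    Real BN vulnerability models have many more nodes, so exact enumeration gives way to junction-tree or sampling inference, and Bayesian model averaging combines the posteriors of several candidate structures.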

  12. Seismic tipping analysis of a spent nuclear fuel shipping cask sitting on a crush pad

    SciTech Connect

    Uldrich, E.D.; Hawkes, B.D.

    1998-04-01

    A crush pad has been designed and analyzed to absorb the kinetic energy of a spent nuclear fuel shipping cask accidentally dropped into a 44-ft-deep cask unloading pool. Conventional analysis techniques available for evaluating a cask for tipping due to lateral seismic forces assume that the cask rests on a rigid surface. In this analysis, the cask (110 tons) sits on a stainless steel encased (0.25 in. top plate), polyurethane foam (4 ft. thick) crush pad. As the cask tends to rock due to horizontal seismic forces, the contact area between the cask and the crush pad is reduced, increasing the bearing stress and causing the pivoting corner of the cask to depress into the crush pad. As the crush pad depresses under the cask corner, the pivot point shifts from the corner toward the cask center, which facilitates rocking and potential tipping of the cask. Subsequent rocking of the cask may deepen the depression, further contributing to the likelihood of cask tip over. However, as the depression is created, the crush pad absorbs energy from the rocking cask. Potential tip over of the cask was evaluated by performing a non-linear, dynamic, finite element analysis with acceleration time history input. This time history analysis captured the effect of a deforming crush pad, and also eliminated conservatisms of the conventional approaches. For comparison purposes, this analysis was also performed with the cask sitting on a solid stainless steel crush pad. Results indicate that the conventional methods are quite conservative relative to the more exacting time history analysis. They also indicate that the rocking motion is less on the foam crush pad than on the solid stainless steel pad.
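    The conventional rigid-surface check that the article compares against reduces, in its quasi-static form, to a moment balance: rocking begins when the overturning moment m·a·h exceeds the restoring moment m·g·b. The cask dimensions below are assumed for illustration, not taken from the report:

```python
# Quasi-static tipping check for a rigid block on a rigid surface.
# b = half-width of the base, h = height of the centre of gravity;
# both values below are assumed, not from the analyzed cask.
g = 9.81          # m/s^2
half_width = 1.2  # m (assumed)
cg_height = 2.5   # m (assumed)

# Rocking initiates when lateral acceleration a satisfies m*a*h > m*g*b,
# i.e. a > g * b / h (mass cancels out of the balance).
a_rock = g * half_width / cg_height
print(f"lateral acceleration to initiate rocking: {a_rock:.2f} m/s^2 "
      f"({a_rock / g:.2f} g)")
```

    The time-history analysis in the article goes beyond this static threshold precisely because the deforming foam pad shifts the pivot point inward and absorbs energy, effects that the rigid-surface criterion cannot represent.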

  13. Facility Environmental Vulnerability Assessment

    SciTech Connect

    Van Hoesen, S.D.

    2001-07-09

    From mid-April through the end of June 2001, a Facility Environmental Vulnerability Assessment (FEVA) was performed at Oak Ridge National Laboratory (ORNL). The primary goal of this FEVA was to establish an environmental vulnerability baseline at ORNL that could be used to support the Laboratory planning process and place environmental vulnerabilities in perspective. The information developed during the FEVA was intended to provide the basis for management to initiate immediate, near-term, and long-term actions to respond to the identified vulnerabilities. It was expected that further evaluation of the vulnerabilities identified during the FEVA could be carried out to support a more quantitative characterization of the sources, evaluation of contaminant pathways, and definition of risks. The FEVA was modeled after the Battelle-supported response to the problems identified at the High Flux Beam Reactor at Brookhaven National Laboratory. This FEVA report satisfies Corrective Action 3A1 contained in the Corrective Action Plan in Response to Independent Review of the High Flux Isotope Reactor Tritium Leak at the Oak Ridge National Laboratory, submitted to the Department of Energy (DOE) ORNL Site Office Manager on April 16, 2001. This assessment successfully achieved its primary goal as defined by Laboratory management. The assessment team was able to develop information about sources and pathway analyses although the following factors impacted the team's ability to provide additional quantitative information: the complexity and scope of the facilities, infrastructure, and programs; the significantly degraded physical condition of the facilities and infrastructure; the large number of known environmental vulnerabilities; the scope of legacy contamination issues [not currently addressed in the Environmental Management (EM) Program]; the lack of facility process and environmental pathway analysis performed by the accountable line management or facility owner; and poor

  14. Seismic facies analysis of lacustrine system: Paleocene upper Fort Union Formation, Wind River basin, Wyoming

    SciTech Connect

    Liro, L.M.; Pardus, Y.C.

    1989-03-01

    The authors interpreted seismic reflection data, supported by well control, to reconstruct the stratigraphic development of Paleocene Lake Waltman in the Wind River basin of Wyoming. After dividing the upper Fort Union into eight seismic sequences, the authors mapped seismic attributes (amplitude, continuity, and frequency) within each sequence. Interpretation of the variation in seismic attributes allowed them to detail delta development and encroachment into Lake Waltman during deposition of the upper Fort Union Formation. These deltas are interpreted as high-energy, well-differentiated lobate forms with distinct clinoform morphology on seismic data. Prograding delta-front facies are easily identified on seismic data as higher amplitude, continuous events within the clinoforms. Seismic data clearly demonstrate the time-transgressive nature of this facies. Downdip of these clinoforms, homogeneous shales, as evidenced by low-amplitude, generally continuous seismic events, accumulated in an interpreted quiet, areally extensive lacustrine setting. Seismic definition of the lateral extent of this lacustrine facies is excellent, allowing them to effectively delineate changes in the lake morphology during deposition of the upper Fort Union Formation. Encasing the upper Fort Union lacustrine deposits are fluvial-alluvial deposits, interpreted from discontinuous, variable-amplitude seismic facies. The authors highlight the correlation of seismic facies data and interpretation to well log data in the Frenchie Draw field to emphasize the accuracy of depositional environment prediction from seismic data.

  15. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The aftereffects of an earthquake are more severe in an underdeveloped and densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper aims at comparing the various provisions for seismic analysis given in the building codes of different countries. This comparison will give an idea of where our country stands when it comes to safety against earthquakes. First, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Then, both the 1993 and 2010 editions of BNBC have been compared graphically with building codes of other countries, such as the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 yields the least base shear among all the codes. Factored base shear values of BNBC 2010 are found to have increased significantly relative to BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. The increase in the factor of safety against earthquakes imposed by the proposed BNBC 2010 code, through its higher base shear values, is therefore appreciable.
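    The plotted base shear/weight ratio comes from an equivalent-static formula of the generic form Cs = Z·I·C/R; the coefficients below are placeholders chosen to show the height dependence, not the actual BNBC, Indian or ASCE values:

```python
# Equivalent-static base shear ratio V/W = Cs for a generic code formula
# Cs = Z*I*C/R, with C falling off with the fundamental period T.
# All coefficient values are illustrative placeholders.
def base_shear_ratio(height_m, Z=0.15, I=1.0, R=5.0, S=1.2):
    T = 0.0466 * height_m ** 0.9            # empirical period estimate (RC frame)
    C = min(2.75, 1.25 * S / T ** (2.0 / 3.0))  # capped spectral coefficient
    return Z * I * C / R

for h in (10, 20, 40, 80):
    print(f"h={h:3d} m: V/W = {base_shear_ratio(h):.4f}")
```

    Repeating this calculation with each code's zone factor, importance factor, response reduction factor and spectral shape reproduces the kind of V/W-versus-height curves compared in the paper.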

  16. Time Lapse Storey Building Early Monitoring Based on Rapid Seismic Response Analysis in Indonesia

    NASA Astrophysics Data System (ADS)

    Julius, A. M.

    2015-12-01

    Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to growth in the number of structures instrumented with such systems. An equally important factor in this growth is the demand by stakeholders for rapid answers to important questions about the functionality or state of "health" of structures during and immediately after a seismic event. Consequently, this study aims to monitor a storey building based on seismic response, i.e. earthquake and tremor analysis, at short time lapse using accelerograph data. This study used a storey building (X) in Jakarta that suffered the effects of the Kebumen earthquake of January 25th 2014, the Pandeglang earthquake of July 9th 2014, and the Lebak earthquake of November 8th 2014. The tremors used in this study are those following these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), the A/V ratio, acceleration amplification and effective duration (te). The natural frequency (f0) and the peak of the H/V ratio were then determined using the H/V ratio method. The earthquake data processing results show that the values of peak ground motion, spectral response, A/V ratio and acceleration amplification increase with height, while the effective duration decreases. Tremor data processed one month after each earthquake show that the natural frequency of the building remains constant. The increase of peak ground motion, spectral response, A/V ratio and acceleration amplification, and the decrease of effective duration, with increasing building floor show that the building construction amplifies the shaking and is strongly influenced by local site effects. The constant value of the building's natural frequency shows that the building is still performing well. This
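    The peak-ground-motion parameters and the A/V ratio can be computed directly from an accelerogram; a sketch on a synthetic sine-burst record (not the Jakarta data):

```python
import math

dt = 0.01  # s, sampling interval
# Synthetic accelerogram: 2 Hz sine burst as a stand-in for a real record
acc = [2.0 * math.sin(2 * math.pi * 2.0 * i * dt) for i in range(500)]

# Trapezoidal integration of acceleration to velocity (zero initial velocity)
vel = [0.0]
for i in range(1, len(acc)):
    vel.append(vel[-1] + 0.5 * (acc[i - 1] + acc[i]) * dt)

pga = max(abs(a) for a in acc)   # peak ground acceleration, m/s^2
pgv = max(abs(v) for v in vel)   # peak ground velocity, m/s
print(f"PGA={pga:.3f} m/s^2, PGV={pgv:.3f} m/s, A/V={pga / pgv:.1f} 1/s")
```

    On real records a baseline correction and band-pass filter precede the integration; a second integration gives PGD, and the same trace recorded at successive floors yields the amplification-with-height profile discussed above.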

  17. Broadband analysis of landslides seismic signal : example of the Oso-Steelhead landslide and other recent events

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekstrom, G.

    2014-12-01

    Landslide failures on the scale of mountains are spectacular, dangerous, and spontaneous, making direct observations hard to obtain. Measurement of their dynamic properties during runout is a high research priority, but a logistical and technical challenge. Seismology has begun to help in several important ways. Taking advantage of broadband seismic stations, recent advances now allow: (i) the seismic detection and location of large landslides in near-real-time, even for events in very remote areas that might otherwise have remained undetected, such as the 2014 Mt La Perouse supraglacial failure in Alaska; (ii) inversion of long-period waves generated by large landslides to yield an estimate of the forces imparted by the bulk accelerating mass; and (iii) inference of the landslide mass, its center-of-mass velocity over time, and its trajectory. Key questions persist, such as: What can the short-period seismic data tell us about the high-frequency impacts taking place within the granular flow and along its boundaries with the underlying bedrock? And how does this seismicity relate to the bulk acceleration of the landslide and the long-period seismicity it generates? Our recent work on the joint analysis of short- and long-period seismic signals generated by past and recent events, such as the Bingham Canyon Mine and Oso-Steelhead landslides, provides new insights into these issues. Qualitative comparison between short-period signal features and kinematic parameters inferred from long-period surface wave inversion helps to refine interpretation of the source dynamics and to understand the different mechanisms generating the short-period wave radiation. Our new results also suggest that quantitative relationships can be derived from this joint analysis, in particular between the short-period seismic signal envelope and the inferred momentum of the center-of-mass. In the future, these quantitative relationships may help to constrain and calibrate parameters used in
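    One simple way to form the short-period signal envelope mentioned above is a moving-RMS smoother; the signal below is a synthetic amplitude-modulated carrier, not a real landslide record:

```python
import math

dt = 0.01  # s
# Synthetic signal: carrier whose Gaussian envelope mimics a mass that
# accelerates and then decelerates (illustrative only)
sig = [math.exp(-((i * dt - 5.0) / 2.0) ** 2) * math.sin(2 * math.pi * 5.0 * i * dt)
       for i in range(1000)]

def moving_rms(x, win):
    """Envelope estimate: RMS of x over a sliding window of win samples."""
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - win // 2), min(len(x), i + win // 2 + 1)
        seg = x[lo:hi]
        out.append(math.sqrt(sum(v * v for v in seg) / len(seg)))
    return out

env = moving_rms(sig, 100)  # 1 s smoothing window
i_peak = max(range(len(env)), key=env.__getitem__)
print(f"envelope peaks at t = {i_peak * dt:.2f} s")
```

    Comparing the timing and shape of such an envelope with the center-of-mass momentum from long-period inversion is, in simplified form, the joint analysis described in the abstract.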

  18. 3-D seismic analysis of a carbonate platform in the Molasse Basin - reef distribution and internal separation with seismic attributes

    NASA Astrophysics Data System (ADS)

    von Hartmann, Hartwig; Buness, Hermann; Krawczyk, Charlotte M.; Schulz, Rüdiger

    2012-10-01

    Carbonate platforms differ from clastic sedimentary environments in their greater heterogeneity, so that key horizons for mapping and compartmentalisation of the reservoir are generally missing. We show that different seismic attributes help to cope with these difficulties and to identify different carbonate facies within the platform. The Upper Jurassic carbonate platform in the Molasse Basin of Southern Germany is a main exploration target for hydrogeothermal projects. Knowledge of the distribution of the different carbonate facies within the platform, which is overprinted by faults, is important for realistic reservoir simulation. The platform, with an average thickness of 600 meters, was artificially divided into four layers of equal thickness. Within each layer the characteristic seismic pattern was visualized by different attributes (travel time mapping, spectral decomposition), additionally allowing for further depositional classification. Within the uppermost layer the coral reef distribution could be mapped. The reefs form several complexes of up to 12 square kilometres in size. The surrounding slope and trough areas are identified as well. Within the platform, the distribution of sponge reefs could be visualized. They either form amalgamations in distinct areas, or are spread as small singular structures with diameters of roughly less than a hundred meters. Comparing tectonic elements and reef distribution within the whole platform reveals that the early topography triggered the reef distribution, while these lithologic inhomogeneities later influenced the local shape of tectonic lineaments. The fault system that dominates the structural style in the area is visible in the different transformations but does not obscure the facies distribution, which had hindered former interpretations of the data set. In this way a reservoir model can now, for the first time, incorporate the reef distribution within the area.

  19. Analysis of the seismic wavefield in the Moesian Platform (Bucharest area)

    NASA Astrophysics Data System (ADS)

    Manea, Elena-Florinela; Hobiger, Manuel-Thomas; Michel, Clotaire; Fäh, Donat; Cioflan, Carmen-Ortanza

    2016-04-01

    Bucharest is located in the center of the Moesian Platform, in a large and deep sedimentary basin (450 km long, 300 km wide and in some places up to 20 km deep). During large earthquakes generated by the Vrancea seismic zone, located approximately 140 km to the north, the ground motion recorded in the Bucharest area is characterized by predominant long periods and large amplification. This phenomenon has been explained by the influence of both the source mechanism (azimuth and type of incident waves) and the mechanical properties of the local structure (geological layering and geometry). The main goal of our study is to better characterize and understand the seismic wave field produced by earthquakes in the area of Bucharest. We want to identify the contribution to the ground motion of different seismic surface waves, such as the ones produced at the edges of the large sedimentary basin or multipath interference waves (Airy phases of Love and Rayleigh waves). The data from a 35 km diameter array (URS experiment) installed by the National Institute for Earth Physics during 10 months in 2003 and 2004 in the urban area of Bucharest and adjacent zones were used. To perform the wave-field characterization of the URS array, the MUSIQUE technique was used. This technique consists of a combination of the classical MUSIC and the quaternion-MUSIC algorithms and analyzes the three-component signals of all sensors of a seismic array together in order to derive the Love and Rayleigh wave dispersion curves as well as the Rayleigh wave ellipticity curve. The analysis includes 20 regional earthquakes with Mw > 3 and 5 teleseismic events with Mw > 7 that have enough energy at low frequency (0.1 - 1 Hz), i.e. in the resolution range of the array. For all events, the greatest energy comes from the backazimuth of the source and the wave field is dominated by Love waves. 
The results of the array analyses clearly indicate a significant scattering corresponding to 2D or 3D effects in the

  20. Preliminary Analysis of the CASES GPS Receiver Performance during Simulated Seismic Displacements

    NASA Astrophysics Data System (ADS)

    De la Rosa-Perkins, A.; Reynolds, A.; Crowley, G.; Azeem, I.

    2014-12-01

    We explore the ability of a new GPS software receiver, called CASES (Connected Autonomous Space Environment Sensor), to measure seismic displacements in realtime. Improvements in GPS technology over the last 20 years allow for precise measurement of ground motion during seismic events. For example, GPS data have been used to calculate displacement histories at an earthquake's epicenter and fault slip estimations with great accuracy. This is supported by the ability to measure displacements directly using GPS, bypassing the double integration that accelerometers require, and by higher clipping limits than seismometers. The CASES receiver, developed by ASTRA in collaboration with Cornell University and the University of Texas, Austin, represents a new geodetic-quality software-based GPS receiver that measures ionospheric space weather in addition to the usual navigation solution. To demonstrate, in a controlled environment, the ability of the CASES receiver to measure seismic displacements, we simulated ground motions similar to those generated during earthquakes, using a shake box instrumented with an accelerometer and a GPS antenna. The accelerometer measured the box's actual displacement. The box moved on a manually controlled axis that underwent varied one-dimensional motions (from mm to cm) at different frequencies and amplitudes. The CASES receiver was configured to optimize the accuracy of the position solution. We quantified the CASES GPS receiver performance by comparing the GPS solutions against the accelerometer data using various statistical analysis methods. The results of these tests will be presented. The CASES receiver is designed with multiple methods of accessing the data in realtime, ranging from internet connection and Bluetooth to cell-phone and Iridium modems. 
Because the CASES receiver measures ionospheric space weather in addition to the usual navigation solution, CASES provides not only the seismic signal, but also the ionospheric space weather

  1. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

    multivariate posterior distribution. The novelty of our approach, and the major difference compared to the traditional semblance-spectrum velocity analysis procedure, is the calculation of the uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and estimate the uncertainty of the drilling targets.

  2. Vulnerability assessment at a national level in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N.; Arabidze, V.; Varazanashvili, O.; Gugeshashvili, T.

    2012-04-01

    Risk always exists where cities are built. Population growth in cities and urbanization in natural hazard-prone zones lead to infrastructure expansion. The goal of society is to construct infrastructure resistant to natural hazards and to minimize the expected losses. This is a complicated task, as there is always a knowledge deficit on real seismic hazard and vulnerability. Assessment of vulnerability is vital in risk analysis; vulnerability is defined in many different ways. The work presented here mostly deals with the assessment of infrastructure and population vulnerability at the national level in Georgia. This work was initiated by the NATO SFP project "Seismic Hazard and Risk Assessment for Southern Caucasus - Eastern Turkey Energy Corridors" and the two work packages WP4 (seismic risk) and WP5 (city scenarios) of the risk module of the EMME (Earthquake Model of the Middle East Region) project. The first step was the creation of databases (inventories) of elements at risk in GIS. The elements at risk were buildings, population, and pipelines. The inventories were studied and created in GIS for the following categories: building material, number of stories, number of entrances, condition of building, and building period; for pipelines, pipe type (continuous or segmented), material, and pipe diameter. Estimating the initial cost of buildings is very important for the assessment of economic losses. For this purpose an attempt was made and an algorithm for this estimation was prepared, taking the obtained inventory into account. Build quality, reliability and durability are of special importance to the corresponding state agencies and include different aesthetic, engineering, practical, social, technological and economical aspects. 
The necessity that all of these aspects satisfy existing normative requirements becomes evident as buildings and structures come into service

  3. Recommendations for damping and treatment of modeling uncertainty in seismic analysis of CANDU nuclear power plant

    SciTech Connect

    Usmani, S.A.; Baughman, P.D.

    1996-12-01

    The seismic analysis of the CANDU nuclear power plant is governed by the Canadian Standard series N289. However, the dynamic analysis of some equipment and systems, such as the CANDU reactor and fueling machine, must treat unique components not directly covered by the broad recommendations of these standards. This paper looks at the damping values and treatment of modeling uncertainty recommended by CSA N289.3, the current state of knowledge and expert opinion as reflected in several current standards, testing results, and the unique aspects of the CANDU system. Damping values are recommended for the component parts of the CANDU reactor and fueling machine system: reactor building, calandria vault, calandria, fuel channel, pressure tube, fueling machine and support structure. Recommendations for the treatment of modeling and other uncertainties are also presented.

  4. Network topology, Transport dynamics, and Vulnerability Analysis in River Deltas: A Graph-Theoretic Approach

    NASA Astrophysics Data System (ADS)

    Tejedor, A.; Foufoula-Georgiou, E.; Longjas, A.; Zaliapin, I. V.

    2014-12-01

    River deltas are intricate landscapes with complex channel networks that self-organize to deliver water, sediment, and nutrients from the apex to the delta top and eventually to the coastal zone. The natural balance of material and energy fluxes which maintains a stable hydrologic, geomorphologic, and ecological state of a river delta, is often disrupted by external factors causing topological and dynamical changes in the delta structure and function. A formal quantitative framework for studying river delta topology and transport dynamics and their response to change is lacking. Here we present such a framework based on spectral graph theory and demonstrate its value in quantifying the complexity of the delta network topology, computing its steady state fluxes, and identifying upstream (contributing) and downstream (nourishment) areas from any point in the network. We use this framework to construct vulnerability maps that quantify the relative change of sediment and water delivery to the shoreline outlets in response to possible perturbations in hundreds of upstream links. This enables us to evaluate which links (hotspots) and what management scenarios would most influence flux delivery to the outlets, paving the way of systematically examining how local or spatially distributed delta interventions can be studied within a systems approach for delta sustainability.
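The steady-state flux computation described in the abstract can be sketched with a toy example. The five-node network and the partition fractions below are hypothetical, purely for illustration; the paper's actual framework uses spectral graph theory on real delta channel networks:

```python
# Sketch: route a unit flux from the apex of a delta channel network to
# its shoreline outlets, given the fraction of flux each link carries.
import numpy as np

# Nodes: 0 = apex, 1-2 = internal junctions, 3-4 = shoreline outlets.
# P[i, j] = fraction of the flux at node i routed down link (i -> j).
P = np.zeros((5, 5))
P[0, 1], P[0, 2] = 0.6, 0.4   # apex splits 60/40
P[1, 3], P[1, 4] = 0.5, 0.5   # junction 1 splits evenly
P[2, 4] = 1.0                 # junction 2 feeds outlet 4 only

def node_fluxes(P, source=0):
    """Propagate a unit flux through an acyclic network whose nodes
    are already numbered in topological order (0..n-1)."""
    n = P.shape[0]
    flux = np.zeros(n)
    flux[source] = 1.0
    for i in range(n):
        flux += flux[i] * P[i]    # pass node i's flux to its children
    return flux

f = node_fluxes(P)
# Outlet 3 receives 0.6 * 0.5 = 0.30; outlet 4 receives 0.3 + 0.4 = 0.70.
```

A vulnerability map in the spirit of the abstract would then perturb each link's partition fraction in turn and record the relative change in the outlet fluxes `f[3]` and `f[4]`.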

  5. Shared vision, shared vulnerability: A content analysis of corporate social responsibility information on tobacco industry websites.

    PubMed

    McDaniel, Patricia A; Cadman, Brie; Malone, Ruth E

    2016-08-01

    Tobacco companies rely on corporate social responsibility (CSR) initiatives to improve their public image and advance their political objectives, which include thwarting or undermining tobacco control policies. For these reasons, implementation guidelines for the World Health Organization's Framework Convention on Tobacco Control (FCTC) recommend curtailing or prohibiting tobacco industry CSR. To understand how and where major tobacco companies focus their CSR resources, we explored CSR-related content on 4 US and 4 multinational tobacco company websites in February 2014. The websites described a range of CSR-related activities, many common across all companies, and no programs were unique to a particular company. The websites mentioned CSR activities in 58 countries, representing nearly every region of the world. Tobacco companies appear to have a shared vision about what constitutes CSR, due perhaps to shared vulnerabilities. Most countries that host tobacco company CSR programs are parties to the FCTC, highlighting the need for full implementation of the treaty, and for funding to monitor CSR activity, replace industry philanthropy, and enforce existing bans.

  6. Fractal simulation of urbanization for the analysis of vulnerability to natural hazards

    NASA Astrophysics Data System (ADS)

    Puissant, Anne; Sensier, Antoine; Tannier, Cécile; Malet, Jean-Philippe

    2016-04-01

    For the last 50 years, mountain areas have been affected by important land cover/use changes characterized by the decrease of pastoral activities, reforestation, and urbanization with the development of tourism activities and infrastructures. These natural and anthropogenic transformations have an impact on socio-economic activities, but also on the exposure of communities to natural hazards. In the context of the ANR project SAMCO, which aims at enhancing the overall resilience of societies to the impacts of mountain risks, the objective of this research was to help determine where to locate new residential developments for different scenarios of land cover/use (based on the Prelude European Project) for the years 2030 and 2050. The Planning Support System (PSS) called MUP-City, based on a fractal multi-scale modeling approach, is used because it allows taking into account local accessibility to some urban and rural amenities (Tannier et al., 2012). For this research, an experiment is performed on a mountain area in the French Alps (Barcelonnette Basin) to generate three scenarios of urban development with MUP-City at the scale of 1:10,000. The results are assessed by comparing the localization of residential developments with urban areas predicted by land cover and land use scenarios generated by cellular automata modelling (LCM and Dyna-CLUE) (Puissant et al., 2015). Based on these scenarios, the evolution of vulnerability is estimated.

  8. Mapping of active faults based on the analysis of high-resolution seismic reflection profiles in offshore Montenegro

    NASA Astrophysics Data System (ADS)

    Vucic, Ljiljana; Glavatovic, Branislav

    2014-05-01

    High-resolution seismic-reflection data analysis is considered an important tool for the mapping of active tectonic faults, since seismic exploration methods at varied scales can image subsurface structures of different depth ranges. Mapping of active faults for the offshore area of Montenegro was performed in the Petrel software, using a reflection database consisting of about 3,500 kilometers of 2D profiles and 311 square kilometers of 3D seismic data, acquired from 1979 to 2003. The Montenegro offshore area is influenced by recent tectonic activity with numerous faults, folds and overthrusts. Based on the analysis of the reflection profiles, a thrust fault system offshore Montenegro is revealed, parallel to the coast and extending up to 15 kilometers from the shoreline. In addition, a system of normal top-carbonate fault planes is mapped and characterized in the southern Adriatic, with a NE trend. The tectonic interpretation of the seismic reflection profiles in Montenegro points toward the existence of principally reverse tectonic forms in the carbonate sediments, covered by young Quaternary sandy sediments 1-3 kilometers thick. The reflection seismic data also indicate the active uplift of an evaporite dome along about 10 kilometers of coastline.

  9. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam

    2016-04-01

    Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous are induced seismicity and the rockbursts which follow strong mining seismic events. One of the most effective methods to reduce seismic activity is blasting in potentially hazardous mining panels. In this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an analysis of post-blasting events using full moment tensor (MT) inversion at the Rudna mine, Poland, using a dataset of signals recorded on the underground seismic network. We show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small double-couple (DC) components of the MT indicate that these events were provoked by the detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for common strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool in confirming the effectiveness of blasting for seismic hazard reduction in mining areas.
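The isotropic/DC distinction drawn in the abstract rests on a standard decomposition of the moment tensor into isotropic, double-couple (DC) and CLVD parts. The sketch below uses one common percentage convention and a hypothetical, mostly-isotropic (explosion-like) tensor; it is not the authors' data or code:

```python
# Sketch: decompose a symmetric moment tensor into ISO, DC and CLVD
# percentages (one common convention; others exist in the literature).
import numpy as np

def decompose_mt(M):
    iso = np.trace(M) / 3.0                   # isotropic part
    M_dev = M - iso * np.eye(3)               # deviatoric remainder
    ev = np.linalg.eigvalsh(M_dev)
    ev = ev[np.argsort(np.abs(ev))]           # |e1| <= |e2| <= |e3|
    eps = -ev[0] / abs(ev[2]) if ev[2] != 0 else 0.0   # CLVD ratio
    c_iso = abs(iso) / (abs(iso) + abs(ev[2]))
    c_clvd = (1 - c_iso) * 2 * abs(eps)
    c_dc = (1 - c_iso) * (1 - 2 * abs(eps))
    return 100 * c_iso, 100 * c_dc, 100 * c_clvd

# Hypothetical explosion-like source with a small shear component:
M = np.array([[2.0, 0.1, 0.0],
              [0.1, 2.0, 0.0],
              [0.0, 0.0, 2.2]])
iso_pct, dc_pct, clvd_pct = decompose_mt(M)
# A dominant ISO percentage with a small DC part is the signature the
# abstract associates with blast-provoked events.
```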

  10. Seismic analysis of the 4-meter telescope SST-GATE for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Dournaux, Jean-Laurent; Huet, Jean-Michel; Amans, Jean-Philippe; Dumas, Delphine; Blake, Simon; Sol, Hélène

    2014-07-01

    The Cherenkov Telescope Array (CTA) project aims to create a next generation Very High Energy (VHE) γ-ray telescope array, devoted to observation in a wide band of energy, from a few tens of GeV to more than 100 TeV. Two sites are foreseen to view the whole sky, with the main one in the Southern Hemisphere, where about 100 telescopes of three different classes, related to the specific energy region to be investigated, will be installed. Among these, the Small Size class of Telescopes (SSTs) are 4-meter telescopes devoted to the highest energy region, from 1 TeV to beyond 100 TeV. Some of the sites considered for CTA exhibit strong seismic constraints. At the Observatoire de Paris, we have designed a prototype of a Small Size Telescope named SST-GATE, based on the dual-mirror Schwarzschild-Couder optical formula, which had never before been implemented in the design of a Cherenkov telescope. The integration of this telescope on the site of the Observatoire de Paris is currently in progress. Technical solutions exist in the literature to protect structures from dynamic loads caused by earthquakes without increasing the mass and cost of the structure. This paper presents a state of the art of these techniques, keeping in mind that the operational performance of the telescope should not be compromised. The preliminary seismic analysis of SST-GATE performed by the finite element method is then described.

  11. Seismic joint analysis for non-destructive testing of asphalt and concrete slabs

    USGS Publications Warehouse

    Ryden, N.; Park, C.B.

    2005-01-01

    A seismic approach is used to estimate the thickness and elastic stiffness constants of asphalt or concrete slabs. The overall concept of the approach utilizes the robustness of the multichannel seismic method. A multichannel-equivalent data set is compiled from multiple time series recorded from multiple hammer impacts at progressively different offsets from a fixed receiver. This multichannel simulation with one receiver (MSOR) replaces the true multichannel recording in a cost-effective and convenient manner. A recorded data set is first processed to evaluate the shear wave velocity through a wave field transformation, normally used in the multichannel analysis of surface waves (MASW) method, followed by a Lamb-wave inversion. Then, the same data set is used to evaluate compression wave velocity from a combined processing of the first-arrival picking and a linear regression. Finally, the amplitude spectra of the time series are used to evaluate the thickness by following the concepts utilized in the Impact Echo (IE) method. Due to the powerful signal extraction capabilities ensured by the multichannel processing schemes used, the entire procedure for all three evaluations can be fully automated and results can be obtained directly in the field. A field data set is used to demonstrate the proposed approach.
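The compression-wave step, first-arrival picking followed by a linear regression, can be sketched as follows. The offsets, trigger delay, and velocity are synthetic values chosen to resemble a concrete slab, not data from the study:

```python
# Sketch: estimate the P-wave velocity of a slab from first-arrival
# picks at increasing offsets, via a linear fit t = x/Vp + t0.
import numpy as np

offsets = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])   # m (hypothetical)
Vp_true = 3500.0                                      # m/s, typical of concrete
t0 = 1e-4                                             # trigger delay (s)
picks = offsets / Vp_true + t0                        # synthetic pick times

# The slope of the regression line is the P-wave slowness 1/Vp;
# the intercept absorbs the trigger delay.
slope, intercept = np.polyfit(offsets, picks, 1)
Vp_est = 1.0 / slope
```

With real picks the regression residuals also give a quick quality check on the picking before the velocity is accepted.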

  12. Damage detection and quantification in a structural model under seismic excitation using time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Chan, Chun-Kai; Loh, Chin-Hsiung; Wu, Tzu-Hsiu

    2015-04-01

    In civil engineering, health monitoring and damage detection are typically carried out using a large number of sensors. Most methods require global measurements to extract the properties of the structure. However, some sensors, like LVDTs, cannot be used due to in-situ limitations, so that the global deformation remains unknown. An experiment is used to demonstrate the proposed algorithms: a one-story, two-bay reinforced concrete frame under weak and strong seismic excitation. In this paper, signal processing techniques and nonlinear identification are applied to the seismic response measurements of reinforced concrete structures subjected to different levels of earthquake excitation. Both modal-based and signal-based system identification and feature extraction techniques are used to study the nonlinear inelastic response of the RC frame, using either both input and output response data or output-only measurements. The signal-based damage identification method includes an enhanced time-frequency analysis of the acceleration responses and the estimation of permanent deformation directly from acceleration response data. Finally, local deformation measurements from a dense optical tracker are also used to quantify the damage of the RC frame structure.

  13. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.
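The attenuation-relationship step can be illustrated with a generic ground-motion calculation. The functional form and every coefficient below are hypothetical, not those of any published GMPE or of the relationships used in the study:

```python
# Sketch: a generic magnitude-distance attenuation relationship of the
# form ln(PGA) = a + b*M - c*ln(R + h), evaluated for an earthquake
# scenario. Coefficients are illustrative placeholders only.
import numpy as np

def ln_pga(M, R_km, a=-3.5, b=1.0, c=1.5, h=10.0):
    """ln(PGA in g) as a function of magnitude M and distance R (km)."""
    return a + b * M - c * np.log(R_km + h)

# Scenario akin to the abstract's: an M7 event ~30 km beneath the city.
pga = np.exp(ln_pga(7.0, 30.0))   # PGA in g

# Uncertainty is typically handled by evaluating several such
# relationships and/or adding the relationship's sigma in log space.
```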

  14. Analysis of the seismicity activity of the volcano Ceboruco, Nayarit, Mexico

    NASA Astrophysics Data System (ADS)

    Rodriguez-Ayala, N. A.; Nunez-Cornu, F. J.; Escudero, C. R.; Zamora-Camacho, A.; Gomez, A.

    2014-12-01

    Ceboruco is a stratovolcano located in the state of Nayarit, Mexico (104°30'31.25"W, 21°7'28.35"N, 2280 m a.s.l.). It is an active volcano, part of the Trans-Mexican Volcanic Belt; Nelson (1986) reports that its activity during the last 1000 years has averaged one eruption every 125 years or so, with the last eruption in 1870, and it currently shows fumarolic activity. In the past 20 years there has been an increase in the population and socio-economic activities around the volcano (Suárez Plascencia, 2013), for which reason the study of Ceboruco has become a necessity in several respects. Recent investigations of its seismicity (Rodríguez Uribe et al., 2013) have classified the Ceboruco earthquakes into four families considering their waveform and spectral features. The analysis presented here covers 57 days of seismicity between March and October 2012; in this period we located 97 events with clear P- and S-wave arrivals, registered on all three components at no fewer than three stations of the temporary Ceboruco volcano network.

  15. Analysis of Micro-Seismic Signals and Source Parameters of Eruptions Generated by Rapid Decompression of Volcanic Rocks

    NASA Astrophysics Data System (ADS)

    Arciniega-Ceballos, A.; Alatorre-Ibarguengoitia, M. A.; Scheu, B.; Dingwell, D. B.; Delgado Granados, H.

    2010-12-01

    , therefore the source time function of the system can be determined. We investigated the relationships between pressure and decompression time versus maximum amplitudes, the source duration, the decay of seismic waves and the counter-force, considering different combinations of gas and solids in the reservoir. From these relationships, the pre-eruption conduit state can be estimated in volcanic systems, and thus the ejection velocity can be calculated in order to evaluate the implications for hazard analysis. In addition, we discuss important considerations regarding the deduction of parametric scaling laws for volcanic explosions using field seismic data. This experimental approach and the high quality of seismic records allow us to obtain a direct measure of the source parameters of the physical mechanism and evaluate the viability of the theoretical single force model to quantify real volcanic eruptions.