Science.gov

Sample records for seismic vulnerability analysis

  1. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are commonly used to develop vulnerability functions for different elements at risk: empirical, analytical and expert estimation. The paper addresses the empirical methods of seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes reported in different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate the physical and economic vulnerability of different building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate the expected damage states of buildings and constructions for earthquakes according to the OSR-97B map (return period T = 1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as points at the centers of the unit sites. The indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in
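
    A minimal sketch in Python of the unit-site aggregation described above; the damage-state probabilities, the single MMSK-86 class label and the intensity-to-row mapping are invented for illustration and are not values from the paper:

      import numpy as np

      # Assumed damage-probability matrix for one MMSK-86 building class:
      # rows = intensity VII..IX, columns = damage states d1..d5.
      DPM = {
          "B": np.array([[0.35, 0.25, 0.10, 0.03, 0.01],
                         [0.20, 0.30, 0.25, 0.10, 0.05],
                         [0.05, 0.15, 0.30, 0.30, 0.15]]),
      }

      def cell_damage_index(intensity, counts_by_class):
          """Expected building counts per damage state for one unit site."""
          row = int(round(intensity)) - 7    # map intensity VII..IX to row 0..2
          total = np.zeros(5)
          for cls, n in counts_by_class.items():
              total += n * DPM[cls][row]
          return total

      # Each unit site is a dot at the cell center carrying its building counts;
      # city totals are obtained by summing the per-site indexes.
      cells = [(7.8, {"B": 120}), (8.4, {"B": 300}), (8.9, {"B": 75})]
      city_total = sum((cell_damage_index(i, c) for i, c in cells), np.zeros(5))
      print(city_total)        # expected counts in damage states d1..d5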

  2. Integrating Social impacts on Health and Health-Care Systems in Systemic Seismic Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.

    2012-04-01

    This paper presents a new method for modeling the health impacts of earthquake damage which allows key social impacts on individual health and on health-care systems to be integrated into quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and the severity of injuries from empirical data correlating building damage with casualties. The expected number of injured people (sorted by priority of emergency treatment) is combined with the post-earthquake loss of functionality of health-care facilities such as hospitals to estimate the impact on health-care systems. The aim here is to extend these models through a combined engineering and social science approach. Although social vulnerability is recognized as a key component of disaster consequences, it is seldom linked to the formal, quantitative seismic loss estimates of injured people that directly determine the demand on emergency health-care services. Yet there is a consensus that the factors affecting the vulnerability and post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation and employment, and that these factors can further aggravate health impacts. Similarly, there are different social influences on the performance of health-care systems after an earthquake, on the individual as well as the institutional level. To link social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and on health-care systems has been developed. We identified and tested appropriate social indicators for individual health impacts and for health-care impacts based on a literature review, using available European statistical data. The results will be used to
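
    The demand-versus-capacity logic in the abstract can be illustrated with a toy calculation; every number, triage label and field name below is an assumption made for this sketch, not a value from the paper:

      # Estimated injuries by treatment priority (hypothetical casualty model output).
      injured = {"T1_immediate": 120, "T2_delayed": 430, "T3_minor": 1800}

      # Post-earthquake hospital functionality (1.0 = fully functional).
      hospitals = [
          {"beds": 400, "functionality": 0.35},   # heavily damaged facility
          {"beds": 250, "functionality": 0.90},
      ]

      # Illustrative social aggravation factor, e.g. a high share of elderly
      # population raising the severity of health impacts.
      social_factor = 1.15

      demand = (injured["T1_immediate"] + injured["T2_delayed"]) * social_factor
      capacity = sum(h["beds"] * h["functionality"] for h in hospitals)
      print(f"demand={demand:.0f}, capacity={capacity:.0f}, "
            f"shortfall={max(0.0, demand - capacity):.0f} beds")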

  3. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in the value of damage caused by natural disasters, an alarming challenge of the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, fragile socio-ecological systems that are highly vulnerable to environmental change and that are responsible for important transformations of space, producing dysfunctions reflected in the state of natural variables (Parker and Mitchell, 1995; the OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic affecting urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of seismic hazard, using environmental, socio-economic and physical measurable variables in the framework of a spatial multi-criteria analysis (SMCA). The capital city of Romania was chosen for this approach based on its high vulnerability, due to explosive urban development and the advanced state of degradation of its buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field show that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006; etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs from natural disasters. Applying the SMCA method reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic
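
    A minimal sketch of the weighted-overlay core of such a spatial multi-criteria analysis, assuming criterion rasters already normalized to [0, 1] with 1 = most vulnerable; the criterion names, weights and hot-spot threshold are invented:

      import numpy as np

      rng = np.random.default_rng(0)
      shape = (100, 100)                     # stand-in for the city grid
      criteria = {
          "building_age":   rng.random(shape),
          "ground_shaking": rng.random(shape),
          "pop_density":    rng.random(shape),
          "emergency_dist": rng.random(shape),
      }
      weights = {"building_age": 0.35, "ground_shaking": 0.30,
                 "pop_density": 0.20, "emergency_dist": 0.15}   # sum to 1

      # Weighted linear combination of the normalized criteria.
      vulnerability = sum(weights[k] * criteria[k] for k in criteria)
      hot_spots = vulnerability > np.quantile(vulnerability, 0.9)  # top decile
      print(hot_spots.sum(), "hot-spot cells")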

  4. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in earthquake-prone areas worldwide and cause considerable social and economic losses. The high losses incurred in past destructive earthquakes have underscored the need to assess the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent invaluable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on the fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given as sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
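
    Damage-state fragility functions of the kind described here are conventionally written as lognormal CDFs of the demand parameter; the sketch below uses that standard form with invented medians and dispersions (these are not the study's values):

      import numpy as np
      from scipy.stats import norm

      def fragility(sd, median_sd, beta):
          """P(damage >= state | spectral displacement sd), lognormal CDF."""
          return norm.cdf(np.log(sd / median_sd) / beta)

      states = {"slight": (5.0, 0.7), "moderate": (12.0, 0.8),
                "extensive": (30.0, 0.9)}          # (median mm, dispersion)
      for name, (median, beta) in states.items():
          p20 = fragility(20.0, median, beta)      # probability at sd = 20 mm
          print(f"P(>= {name} | sd=20mm) = {p20:.2f}")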

  5. Effect of β on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    NASA Astrophysics Data System (ADS)

    Qing-hai, Feng; Wan-cheng, Yuan

    2010-05-01

    In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of structural parameters and the randomness of the seismic input should be considered. First, the distribution characteristics of structural capacity and seismic demand are obtained from incremental dynamic analysis (IDA) and pushover analysis; second, the vulnerability of the bridge is obtained using an artificial neural network (ANN) with Monte Carlo (MC) simulation, and a vulnerability curve for this bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, and a parametric analysis of the effect of β is performed. The curve reflects the overall bridge vulnerability from the viewpoint of total probability, and larger values of β are suggested in order to reduce the discreteness.
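
    A brief illustration of the β effect using the standard lognormal fragility form, in which β is the log-standard deviation: a larger β flattens the curve around the median. The median and β values are invented, and the lognormal form is the common convention rather than necessarily the paper's exact model:

      import numpy as np
      from scipy.stats import norm

      median = 0.4                    # median capacity intensity, g (assumed)
      for beta in (0.3, 0.5, 0.7):
          # Exceedance probability at two intensity levels (0.25 g and 0.60 g).
          p_lo = norm.cdf(np.log(0.25 / median) / beta)
          p_hi = norm.cdf(np.log(0.60 / median) / beta)
          print(f"beta={beta}: P(0.25g)={p_lo:.2f}  P(0.60g)={p_hi:.2f}")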

  6. Effect of beta on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    SciTech Connect

    Feng Qinghai; Yuan Wancheng

    2010-05-21

    In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of structural parameters and the randomness of the seismic input should be considered. First, the distribution characteristics of structural capacity and seismic demand are obtained from incremental dynamic analysis (IDA) and pushover analysis; second, the vulnerability of the bridge is obtained using an artificial neural network (ANN) with Monte Carlo (MC) simulation, and a vulnerability curve for this bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, and a parametric analysis of the effect of beta is performed. The curve reflects the overall bridge vulnerability from the viewpoint of total probability, and larger values of beta are suggested in order to reduce the discreteness.

  7. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

    Although the extreme nature of earthquakes has been known for millennia from the devastation many of them have caused, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increasing number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in the close vicinity of a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. While events of the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides, etc.), events of the second type occur mainly in economically less-developed countries, where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction and interpretation to assist in disaster risk analysis. Advanced disaster-risk communication skills should be developed to link scientists, emergency management authorities and the public. Natural, social, economic and political reasons leading to earthquake disasters will be discussed.

  8. A Methodology for Assessing the Seismic Vulnerability of Highway Systems

    SciTech Connect

    Cirianni, Francis; Leonardi, Giovanni; Scopelliti, Francesco

    2008-07-08

    Modern society is totally dependent on a complex and articulated infrastructure network of vital importance for the existence of the urban settlements scattered across the territory. These infrastructure systems, usually referred to as lifelines, are entrusted with numerous services and functions indispensable to normal urban and human activity. Lifeline systems represent an essential element in all urbanized areas subject to seismic risk. It is important that, in these zones, they be planned according to appropriate criteria based on two fundamental principles: a) determination of the best territorial localization, avoiding, as far as possible, the most dangerous places; b) application of construction technologies aimed at reducing vulnerability. It is therefore indispensable that any modern process of seismic risk assessment give the study of networks due consideration, integrated with the traditional analyses of buildings. The present paper moves in this direction, devoting particular attention to one kind of lifeline, the highway system, and proposing a methodology of analysis aimed at assessing the seismic vulnerability of the system.

  9. Evaluation Of The Seismic Vulnerability of Fortified Structures

    SciTech Connect

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-08

    In this paper a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, based on the processing of rather coarse information about the state, consistency and history of the building population considered. The procedure proves rather effective and able to produce reliable results despite the poor initial data.

  10. Seismic Vulnerability and Performance Level of confined brick walls

    SciTech Connect

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-08

    Interest among engineers and designers in displacement- and behavior-based design methods (performance-based design) has grown, given the importance of designing structures to resist dynamic loads such as earthquakes and the inability of conventional design to predict the nonlinear behavior of elements arising from the nonlinear properties of construction materials. Economical, easy to build, and made of readily accessible materials, masonry structures have increased enormously in villages, towns and cities. On the other hand, since Iran lies on the Alpide earthquake belt, it is necessary to study the behavior and seismic vulnerability of such structures. Different environmental, economic, social and cultural conditions, together with the locally available construction materials, have produced different types of structures. In this study, several confined walls were modeled in software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis appear satisfactory when compared with the values in ATC-40, FEMA documents and Iran's Standard No. 2800.

  11. Evaluation of socio-spatial vulnerability of citydwellers and analysis of risk perception: industrial and seismic risks in Mulhouse

    NASA Astrophysics Data System (ADS)

    Glatron, S.; Beck, E.

    2008-10-01

    Social vulnerability has been studied for years through sociological, psychological and economic approaches. Our contribution focuses on the perception and cognitive representations of risks by city dwellers living in a medium-sized urban area, namely Mulhouse (France). Perception, being part of the social vulnerability and resilience of a society facing disasters, influences the potential damage; for example, it leads to adequate or inadequate behaviour in an emergency. As geographers, we assume that the spatial relationship to danger or hazard can be an important factor of vulnerability, and that the spatial dimension is a challenging question both for better knowledge and for operational reasons (e.g. management of preventive information). We interviewed 491 people, inhabitants and workers, regularly distributed within the urban area, to better understand their opinions of hazards and safety measures. We designed and mapped a vulnerability index on the basis of their answers. The results show that social vulnerability depends on the type of hazard, and that distance to the source of danger influences vulnerability, especially for hazards with a precise location (industrial, for example). Moreover, the effectiveness of the information campaigns is doubtful, as people living close to hazardous industries (the target of specific preventive information) are, surprisingly, more vulnerable and less aware of industrial risk.

  12. Remote sensing techniques applied to seismic vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the accuracy and resolution of records of the Earth's surface, expanding the range of possible applications of these data. In this research, we have used such data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created from the gathered information for use in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights of different periods, using orthorectified images in both the visible and the infrared spectrum. The analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate building footprints and to characterize the constructions with attributes such as approximate date of construction, area, roof type and even building materials. To carry out the calculation, we have developed algorithms to compare images from different epochs, segment images, classify LiDAR data, and use the infrared data to remove vegetation or to compute roof surfaces with height, tilt and spectral fingerprint. In addition, the accuracy of our results has been validated with ground-truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca
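
    A minimal sketch of the vegetation-removal step, assuming co-registered red and near-infrared bands as NumPy arrays; the NDVI threshold of 0.3 and the array names are assumptions, not values from the paper:

      import numpy as np

      def ndvi_mask(red, nir, threshold=0.3):
          """Boolean mask, True where NDVI indicates vegetation."""
          ndvi = (nir - red) / np.maximum(nir + red, 1e-9)  # guard divide-by-zero
          return ndvi > threshold

      rng = np.random.default_rng(1)                 # stand-in for real bands
      red = rng.random((50, 50)).astype(np.float32)
      nir = rng.random((50, 50)).astype(np.float32)

      veg = ndvi_mask(red, nir)
      building_candidates = ~veg     # keep non-vegetated pixels for footprints
      print(f"{veg.mean():.0%} of pixels flagged as vegetation")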

  13. Approaches of Seismic Vulnerability Assessments in Near Real Time Systems

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2014-05-01

    Data on the seismic vulnerability of the existing building stock and other elements at risk are very important for near-real-time earthquake loss estimation by global systems. These data, together with information on regional peculiarities of seismic intensity attenuation and other factors, contribute greatly to the reliability of the strong-event consequences estimated in emergency mode. There are different approaches to developing vulnerability functions, of which the empirical one is most often used. It is based on analysis of the engineering consequences of past strong events for which well-documented descriptions of damage to different building types and other elements at risk are available for the earthquake-prone area under consideration. Where such data do not exist, information from macroseismic scales may be used. Any approach to developing vulnerability functions requires a proper classification of the buildings and structures under consideration. Different building classifications exist in national and international building codes, as well as in macroseismic scales. As a result, global systems such as Extremum and PAGER, as well as the GEM project, make use of non-unified information on building stock distribution worldwide. The paper addresses the issues of building classification and of city models expressed in terms of these classifications. The distribution of different building types in the Extremum and PAGER/GEM systems is analyzed for earthquake-prone countries. The comparison of city models revealed significant differences, which greatly influence earthquake loss estimates in emergency mode. The paper describes the practice of city model development making use of space images and web technology in social networks. It is proposed to use the G8 (and other) initiatives on open data and transparency to improve building stock distribution and global population databases.

  14. Seismic vulnerability and risk assessment of Kolkata City, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

    2014-04-01

    The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the eastern and northeastern region of India. In order to classify the seismic risk zones of Kolkata, we applied the seismic hazard exposure to the vulnerability components, namely land use/land cover, population density, building typology, age and height. We microzoned the seismic hazard of the city by integrating seismological, geological and geotechnical themes in GIS, which in turn were integrated with the vulnerability components in a logic-tree framework to estimate both the socio-economic and the structural risk of the city. In both risk maps, three broad zones have been demarcated as "severe", "high" and "moderate". A risk-free zone was also identified in the city. The damage distribution in the city due to the 1934 Bihar-Nepal earthquake of Mw 8.1 matches well with the risk regime. The design horizontal seismic coefficients for the city have been worked out for all the predominant periods, indicating the suitability of "A", "B" and "C" types of structures. The cumulative damage probabilities in terms of "slight", "moderate", "extensive" and "complete" have also been assessed for the four significant model building types, viz. RM2L, RM2M, URML and URMM, for each structural seismic risk zone in the city. Both the seismic hazard and risk maps are expected to play vital roles in earthquake disaster mitigation and management for the city of Kolkata.

  15. Key geophysical indicators of seismic vulnerability in Kingston, Jamaica

    NASA Astrophysics Data System (ADS)

    Brown, L. A.; Hornbach, M. J.; Salazar, W.; Kennedy, M.

    2012-12-01

    Kingston, the major city and hub of all commercial and industrial activity in Jamaica, has a history of moderate seismic activity; however, two significant (>Mw 6) earthquakes (1692 and 1907) caused major devastation resulting in thousands of casualties. Both the 1692 and 1907 events also triggered widespread liquefaction and tsunamis within Kingston Harbor. Kingston remains vulnerable to such earthquakes today because the city sits on 200-m to 600-m thick alluvial fan deposits adjacent to the Enriquillo-Plantain Garden Fault, the same fault system that ruptured in the 2010 Haiti earthquake. Recent GPS results suggest the potential for a Mw 7-7.5 earthquake near Kingston along the Enriquillo-Plantain Garden Fault Zone (EPGFZ), the dominant east-west trending fault through Jamaica. Whether active strands of the EPGFZ extend through downtown Kingston remains unclear; however, recent sonar mapping in Kingston Harbor shows evidence of active faulting, with offshore faults connecting to proposed active on-land fault systems that extend through populated areas of the city. Seismic "chirp" reflection data also show evidence of multiple recent (Holocene) submarine slide deposits in the harbor that may be associated with historic tsunamis. Using recently acquired chirp data and sediment cores, we are currently studying the recurrence interval of earthquake events. We also recently performed a microtremor survey to identify areas prone to earthquake-induced ground shaking throughout the city of Kingston and St. Andrew parish. Data were collected at 200 points with a lateral spacing of 500 metres between points. Our analysis shows significant variations in the fundamental frequency across the city, and the results clearly indicate areas of potential amplification, with areas surrounding Kingston Harbor (much of which has been built on reclaimed land) showing the highest potential for ground amplification. The microtremor analysis suggests several high-density urban areas as well as key
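
    A simplified sketch of the microtremor H/V (Nakamura-type) processing used to map site response: the fundamental frequency is picked at the peak of the smoothed horizontal-to-vertical spectral ratio. The signals here are synthetic; a real survey would use instrument-corrected records:

      import numpy as np

      fs = 100.0                                    # sampling rate, Hz
      t = np.arange(0, 120.0, 1.0 / fs)
      rng = np.random.default_rng(2)
      f0 = 1.8                                      # site frequency to recover, Hz
      h = np.sin(2 * np.pi * f0 * t) + rng.normal(0, 0.5, t.size)  # horizontal
      v = rng.normal(0, 0.5, t.size)                               # vertical

      def smooth(x, n=11):
          return np.convolve(x, np.ones(n) / n, mode="same")

      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
      hv = smooth(np.abs(np.fft.rfft(h))) / np.maximum(
          smooth(np.abs(np.fft.rfft(v))), 1e-9)

      band = (freqs > 0.5) & (freqs < 10.0)         # frequency band of interest
      print(f"picked f0 = {freqs[band][np.argmax(hv[band])]:.2f} Hz")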

  16. Rapid Assessment of Seismic Vulnerability in Palestinian Refugee Camps

    NASA Astrophysics Data System (ADS)

    Al-Dabbeek, Jalal N.; El-Kelani, Radwan J.

    Studies of historical and recorded earthquakes in Palestine demonstrate that damaging earthquakes occur frequently along the Dead Sea Transform, e.g. the earthquakes of 11 July 1927 (ML 6.2) and 11 February 2004 (ML 5.2). In order to reduce the seismic vulnerability of buildings and the losses in lives, property and infrastructure, an attempt was made to estimate the percentages of damage grades and losses at selected refugee camps: Al Ama`ri, Balata and Dhaishe. The vulnerability classes of the building structures were assessed according to the European Macroseismic Scale 1998 (EMS-98) and Federal Emergency Management Agency (FEMA) guidelines. The rapid assessment results showed that very heavy structural and non-structural damage will occur in the common buildings of the investigated refugee camps (many buildings will suffer damage of grades 4 and 5). The poor quality of the buildings in terms of design and construction, the lack of uniformity, the absence of spacing between buildings and the limited width of the roads will definitely increase the seismic vulnerability under the influence of moderate to strong (M 6-7) earthquakes in the future.

  17. Seismic vulnerability and risk assessment of Kolkata City, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

    2015-06-01

    The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the eastern and northeastern region of India. In order to classify the seismic risk zones of Kolkata we used seismic hazard exposures on the vulnerability components, namely land use/land cover, population density, building typology, age and height. We microzoned the seismic hazard of the city by integrating seismological, geological and geotechnical themes in GIS, which in turn are integrated with the vulnerability components in a logic-tree framework for the estimation of both the socioeconomic and structural risk of the city. In both risk maps, three broad zones have been demarcated as "severe", "high" and "moderate". A risk-free zone in the city has also been demarcated and termed "low". The damage distribution in the city due to the 1934 Bihar-Nepal earthquake of Mw = 8.1 matches satisfactorily with the demarcated risk regime. The design horizontal seismic coefficients for the city have been worked out for all the fundamental periods, indicating suitability for "A", "B" and "C" types of structures. The cumulative damage probabilities in terms of "none", "slight", "moderate", "extensive" and "complete" have also been assessed for the four predominant model building types, viz. RM2L, RM2M, URML and URMM, for each seismic structural risk zone in the city. Both the seismic hazard and risk maps are expected to play vital roles in earthquake disaster mitigation and management for the city of Kolkata.

  18. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Tóth, László; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Spötl, Christoph; Bednárik, Martin; Brimich, Ladislav; Hegymegi, Erika; Novák, Attila

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test probabilistic seismic hazard assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Long-term information can in principle be gained from intact stalagmites in natural caves. These formations have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that time period. Here we present such a stalagmite-based case study from the Little Carpathians of Slovakia. A specially shaped, intact and vulnerable stalagmite (IVSTM) in Plavecká priepast cave was examined in 2013. This IVSTM is suitable for estimating the upper limit of the horizontal peak ground acceleration generated by pre-historic earthquakes. The approach used in our study yields significant new constraints on the seismic hazard, as tectonic structures close to Plavecká priepast cave did not generate strong paleoearthquakes in the last few thousand years. This study is of particular importance because of its bearing on the seismic hazard of two nearby capitals: Vienna and Bratislava.
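
    The mechanical idea behind such stalagmite constraints can be sketched with a quasi-static cantilever estimate: the bending stress at the base of a cylindrical column of height H and diameter D under horizontal acceleration a is sigma = 4*rho*a*H^2/D, so reaching the tensile strength sigma_t gives a critical acceleration a_crit = sigma_t*D/(4*rho*H^2). The dimensions and strength below are invented, and the published studies use more refined models:

      RHO = 2700.0      # calcite density, kg/m^3
      G = 9.81          # standard gravity, m/s^2

      def critical_acceleration(height_m, diameter_m, sigma_t_pa):
          """Quasi-static base-bending estimate of the critical acceleration."""
          return sigma_t_pa * diameter_m / (4.0 * RHO * height_m**2)

      # Example: a 2 m tall, 6 cm diameter stalagmite, 1 MPa tensile strength.
      a_crit = critical_acceleration(2.0, 0.06, 1.0e6)
      print(f"a_crit = {a_crit:.2f} m/s^2 = {a_crit / G:.2f} g")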

  19. Seismic vulnerability study Los Alamos Meson Physics Facility (LAMPF)

    SciTech Connect

    Salmon, M.; Goen, L.K.

    1995-12-01

    The Los Alamos Meson Physics Facility (LAMPF), located at TA-53 of Los Alamos National Laboratory (LANL), features an 800 MeV proton accelerator used for nuclear physics and materials science research. As part of the implementation of DOE Order 5480.25, and in preparation for DOE Order 5480.28, a seismic vulnerability study of the structures, systems, and components (SSCs) supporting the beam line from the accelerator building through to the ends of the various beam stops at LAMPF has been performed. The study was accomplished using the SQUG GIP methodology to assess the capability of the various SSCs to resist an evaluation basis earthquake. The evaluation basis earthquake was selected from site-specific seismic hazard studies. The goals of the study were as follows: (1) identify SSCs which are vulnerable to seismic loads; and (2) ensure that those SSCs screened during the evaluation meet the performance goals required by DOE Order 5480.28. The first goal was met by applying the SQUG GIP methodology to those SSCs represented in the experience database. For those SSCs not represented in the database, information was gathered and a significant amount of engineering judgment applied to determine whether to screen the SSC or to classify it as an outlier. To assure that the performance goals required by DOE Order 5480.28 are met, modifications to the SQUG GIP methodology proposed by Salmon and Kennedy were used. The results of this study are presented in this paper.

  20. Safeguard Vulnerability Analysis Program (SVAP)

    SciTech Connect

    Gilman, F.M.; Dittmore, M.H.; Orvis, W.J.; Wahler, P.S.

    1980-06-23

    This report gives an overview of the Safeguard Vulnerability Analysis Program (SVAP) developed at Lawrence Livermore National Laboratory. SVAP was designed as an automated method of analyzing the safeguard systems at nuclear facilities for vulnerabilities relating to the theft or diversion of nuclear materials. SVAP addresses one class of safeguard threat: theft or diversion of nuclear materials by nonviolent insiders, acting individually or in collusion. SVAP is a user-oriented tool which uses an interactive input medium for preprocessing the large amounts of safeguards data. Its output includes concise summary data as well as detailed vulnerability information.

  1. Effect of URM infills on seismic vulnerability of Indian code designed RC frame buildings

    NASA Astrophysics Data System (ADS)

    Haldar, Putul; Singh, Yogendra; Paul, D. K.

    2012-03-01

    Unreinforced masonry (URM) is the most common partitioning material in framed buildings in India and many other countries. Although it is well known that under lateral loading the behavior and failure modes of frame buildings change significantly due to infill-frame interaction, the general design practice is to treat infills as nonstructural elements, and their stiffness, strength and interaction with the frame are often ignored, primarily because of difficulties in simulation and a lack of modeling guidelines in design codes. The Indian Standard, like many other national codes, does not provide explicit insight into the anticipated performance and associated vulnerability of infilled frames. This paper presents an analytical study of the seismic performance and fragility of Indian-code-designed RC frame buildings with and without URM infills. Infills are modeled as diagonal struts as per ASCE 41 guidelines, and various modes of failure are considered. The HAZUS methodology, together with nonlinear static analysis, is used to compare the seismic vulnerability of bare and infilled frames. The comparative study suggests that URM infills significantly increase the seismic vulnerability of RC frames, and that their effect needs to be properly incorporated in design codes.
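
    When infills are modeled as diagonal struts per ASCE 41 / FEMA 356, the equivalent strut width takes the form a = 0.175 (λ1·h_col)^-0.4 · r_inf. A sketch follows; the frame and infill properties are invented for illustration:

      import math

      def strut_width(E_me, t_inf, h_inf, L_inf, E_fe, I_col, h_col):
          """Equivalent diagonal strut width for a solid URM infill panel."""
          theta = math.atan(h_inf / L_inf)          # strut inclination
          r_inf = math.hypot(h_inf, L_inf)          # diagonal length
          lam1 = (E_me * t_inf * math.sin(2 * theta) /
                  (4.0 * E_fe * I_col * h_inf)) ** 0.25
          return 0.175 * (lam1 * h_col) ** -0.4 * r_inf

      # Illustrative values in N and mm: 230 mm brick infill in an RC frame.
      a = strut_width(E_me=4000.0, t_inf=230.0, h_inf=2800.0, L_inf=4000.0,
                      E_fe=25000.0, I_col=1.07e9, h_col=3100.0)
      print(f"equivalent strut width = {a:.0f} mm")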

  2. Sanitary Vulnerability of a Territorial System in High Seismic Areas

    NASA Astrophysics Data System (ADS)

    Teramo, A.; Termini, D.; de Domenico, D.; Marino, A.; Marullo, A.; Saccà, C.; Teramo, M.

    2009-12-01

    An evaluation procedure for the sanitary vulnerability of a territorial system falling within a high seismic risk area, related to the casualty-treatment capability of hospitals after an earthquake, is proposed. The study aims to highlight hospital criticalities for the arrangement of a prevention policy on the basis of territorial, demographic and sanitary analyses specific to a given area. This is the first step of a procedure for reading the territorial context within a damage scenario, addressed to verifying the preparedness of the territorial system for a sanitary emergency caused by either a natural or an anthropogenic disaster. The results of the surveys carried out on several sample areas of the Messina Province (Italy) are shown at different scales, evaluating the consistency of the damage scenario with the numbers of casualties, medical doctors and available beds for the implementation of an emergency sanitary circuit.

  3. Rapid assessment for seismic vulnerability of low and medium rise infilled RC frame buildings

    NASA Astrophysics Data System (ADS)

    Al-Nimry, Hanan; Resheidat, Musa; Qeran, Saddam

    2015-06-01

    An indexing method for rapid evaluation of the seismic vulnerability of infilled RC frame buildings in Jordan is proposed. The method aims at identifying low- and medium-rise residential buildings as safe or as in need of further detailed evaluation. Following a rapid visual screening, the building is assigned a Basic Capacity Index (BCI); five performance modifiers are then identified and multiplied by the BCI to arrive at the Capacity Index (CI) of the building. A Capacity Index lower than a limit CI value indicates that the screened building could experience moderate earthquake damage, whereas a higher value implies that minor damage, if any, would take place. To establish the basic evaluation parameters, forty RC frame buildings were selected, designed and analyzed using nonlinear static analysis incorporating the effect of infill walls. The effects of seismicity, local site conditions, horizontal irregularities (setbacks and re-entrant corners), vertical irregularities (soft story at ground floor level) and overhangs on the seismic performance of local buildings were examined. Assessment forms were designed and used to evaluate and rank 112 sample buildings. About 40% of the surveyed buildings were found to be in need of detailed evaluation to better define their seismic vulnerabilities.
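
    A minimal sketch of the screening arithmetic described above; the modifier names, scores, BCI value and limit are placeholders, since the paper defines its own calibrated values:

      def capacity_index(bci, modifiers):
          """CI = BCI multiplied by the five performance modifiers."""
          ci = bci
          for m in modifiers.values():
              ci *= m
          return ci

      building = {
          "soft_story": 0.75,      # soft ground story present
          "setbacks": 0.90,        # horizontal irregularity
          "re_entrant": 1.00,
          "overhangs": 0.85,
          "site_soil": 0.95,
      }
      CI_LIMIT = 1.0               # hypothetical screening threshold
      ci = capacity_index(bci=1.8, modifiers=building)
      verdict = "needs detailed evaluation" if ci < CI_LIMIT else "minor damage expected"
      print(f"CI = {ci:.2f} -> {verdict}")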

  4. Fault zone regulation, seismic hazard, and social vulnerability in Los Angeles, California: Hazard or urban amenity?

    NASA Astrophysics Data System (ADS)

    Toké, Nathan A.; Boone, Christopher G.; Arrowsmith, J. Ramón

    2014-09-01

    Public perception and regulation of environmental hazards are important factors in the development and configuration of cities. Throughout California, probabilistic seismic hazard mapping and geologic investigations of active faults have spatially quantified earthquake hazard. In Los Angeles, these analyses have informed earthquake engineering, public awareness, the insurance industry, and the government regulation of developments near faults. Understanding the impact of natural hazards regulation on the social and built geography of cities is vital for informing future science and policy directions. We constructed a relative social vulnerability index classification for Los Angeles to examine the social condition within regions of significant seismic hazard, including areas regulated as Alquist-Priolo (AP) Act earthquake fault zones. Despite hazard disclosures, social vulnerability is lowest within AP regulatory zones and vulnerability increases with distance from them. Because the AP Act requires building setbacks from active faults, newer developments in these zones are bisected by parks. Parcel-level analysis demonstrates that homes adjacent to these fault zone parks are the most valuable in their neighborhoods. At a broad scale, a Landsat-based normalized difference vegetation index shows that greenness near AP zones is greater than the rest of the metropolitan area. In the parks-poor city of Los Angeles, fault zone regulation has contributed to the construction of park space within areas of earthquake hazard, thus transforming zones of natural hazard into amenities, attracting populations of relatively high social status, and demonstrating that the distribution of social vulnerability is sometimes more strongly tied to amenities than hazards.

  5. Integrated Estimation of Seismic Physical Vulnerability of Tehran Using Rule Based Granular Computing

    NASA Astrophysics Data System (ADS)

    Sheikhian, H.; Delavar, M. R.; Stein, A.

    2015-08-01

    Tehran, the capital of Iran, is surrounded by the North Tehran fault, the Mosha fault and the Rey fault. This exposes the city to potentially huge earthquakes followed by dramatic human loss and physical damage, particularly as it contains a large number of non-standard constructions and aged buildings. Estimating the likely consequences of an earthquake facilitates the mitigation of these losses. Mitigation of earthquake fatalities may be achieved by promoting awareness of earthquake vulnerability and by implementing seismic vulnerability reduction measures. In this research, granular computing is applied, using generality and absolute support for rule extraction and coverage and entropy for rule prioritization. The rules are combined to form a granule tree that shows the order and relation of the extracted rules. In this way the seismic physical vulnerability is assessed, integrating the effects of the three major known faults. The effective parameters considered in the physical seismic vulnerability assessment are slope, seismic intensity, and the height and age of the buildings. Experts were asked to predict the seismic vulnerability of 100 randomly selected samples among more than 3000 statistical units in Tehran. The integrated experts' points of view serve as input to the granular computing. Non-redundant covering rules preserve consistency in the model, which resulted in 84% accuracy of the seismic vulnerability assessment, based on validation of the predictions for the test data against the expected vulnerability degree. The study concluded that granular computing is a useful method for assessing the effects of earthquakes in an earthquake-prone area.

  6. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    SciTech Connect

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-08

    Earthquake is a destructive phenomenon that shakes different parts of the earth every year and causes widespread destruction. Iran is one of the quake-prone, highly seismic parts of the world and suffers great material damage and loss of life every year, and schools are among the most important places to protect during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and since Tehran lies on faults, the instability of such buildings may cause irrecoverable material damage and, above all, loss of life; preventing this is therefore an urgent need. For this purpose, a number of the schools built during 67-78, mostly with steel braced-frame structures, were selected. First, by evaluating the selected samples, gathering information and performing a visual survey, the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and applied. Then, for quantitative evaluation, 3D computer models and nonlinear static analysis methods were used to re-evaluate a number of the qualitatively evaluated buildings, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a proper pattern for the seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  7. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    NASA Astrophysics Data System (ADS)

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-01

    Earthquake is a destructive phenomenon that shakes different parts of the earth every year and causes widespread destruction. Iran is one of the quake-prone, highly seismic parts of the world and suffers great material damage and loss of life every year, and schools are among the most important places to protect during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and since Tehran lies on faults, the instability of such buildings may cause irrecoverable material damage and, above all, loss of life; preventing this is therefore an urgent need. For this purpose, a number of the schools built during 67-78, mostly with steel braced-frame structures, were selected. First, by evaluating the selected samples, gathering information and performing a visual survey, the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and applied. Then, for quantitative evaluation, 3D computer models and nonlinear static analysis methods were used to re-evaluate a number of the qualitatively evaluated buildings, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a proper pattern for the seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  8. Use of expert judgment elicitation to estimate seismic vulnerability of selected building types

    USGS Publications Warehouse

    Jaiswal, K.S.; Aspinall, W.; Perkins, D.; Wald, D.; Porter, K.A.

    2012-01-01

    Pooling engineering input on earthquake building vulnerability through an expert judgment elicitation process requires careful deliberation. This article provides an overview of expert judgment procedures including the Delphi approach and the Cooke performance-based method to estimate the seismic vulnerability of a building category.

  9. Lunar seismic data analysis

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Dorman, H. J.

    1982-01-01

    The scientific data transmitted continuously from all ALSEP (Apollo Lunar Surface Experiment Package) stations on the Moon and recorded on instrumentation tapes at receiving stations distributed around the Earth were processed. The processing produced sets of computer-compatible digital tapes, from which various other data sets convenient for analysis were generated. The seismograms were read, various types of seismic events were classified, and the detected events were cataloged.

  10. Seismic Vulnerability Assessment Rest House Building TA-16-41

    SciTech Connect

    Cuesta, Isabel; Salmon, Michael W.

    2003-10-01

    The purpose of this report is to present the results of the evaluation completed on the Rest House Facility (TA-16-41) in support of hazard analysis for a Documented Safety Assessment (DSA). The Rest House facility has been evaluated to verify the structural response to seismic, wind, and snow loads in support of the DynEx DSA. The structural analyses consider the structure and the following systems and/or components inside the facility, as requested by facility management: cranes, the lightning protection system, and the fire protection system. The facility has been assigned to Natural Phenomena Hazards (NPH) Performance Category (PC) 3. The facility structure was evaluated to PC-3 criteria because it serves to confine hazardous material and, in the event of an accident, the facility cannot fail or collapse. Seismic-induced failure of the cranes, lightning protection, and fire protection systems, according to DOE-STD-1021-93 (Ref. 1), "may result in adverse release consequences greater than safety-class Structures, Systems, and Components (SSC) Evaluation Guideline limits but much less than those associated with PC-4 SSC." Therefore, these items will be evaluated to PC-3 criteria as well. This report presents the results of those analyses and makes recommendations to improve the seismic capacity of the systems and components cited above.

  11. Metadata for selecting or submitting generic seismic vulnerability functions via GEM's vulnerability database

    USGS Publications Warehouse

    Jaiswal, Kishor

    2013-01-01

    This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to give users guidance on limitations and applicability by documenting the modeling assumptions underlying each vulnerability or fragility function.

  12. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2013-09-01

    The objective of the study was to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information systems (GIS). To this end, the peak ground acceleration, slope and soil liquefaction layers were used to prepare a geotechnical map. The construction materials of the structures, the year of construction, their quality and the seismic resonance coefficient layers were defined as the major factors affecting the structural vulnerability of schools. The AHP method was then applied to assess the priority rank and weight of the criteria (layers) and the alternatives (classes) of each criterion through pairwise comparison at all levels. Finally, the geotechnical and structural spatial layers were overlaid to prepare the seismic vulnerability map of school buildings in Tehran city. The results indicated that in only 72 (about 3%) of the 2125 schools in the study area is the destruction rate very high, so that their reconstruction should be considered.
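
    A small sketch of the AHP weighting step: a Saaty pairwise-comparison matrix is reduced to priority weights via its principal eigenvector, with the consistency ratio as a sanity check. The comparison values are invented, not those elicited in the study:

      import numpy as np

      # Pairwise comparisons for three criteria, e.g. PGA, slope, liquefaction.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)              # principal eigenpair
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                             # normalized priority weights

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
      cr = ci / 0.58                           # random index RI = 0.58 for n = 3
      print("weights:", np.round(w, 3), " CR =", round(cr, 3))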

  13. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2014-04-01

    The objective of the current study is to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information system (GIS). To this end, the peak ground acceleration, slope, and soil liquefaction layers were utilized for developing a geotechnical map. Also, the construction materials of structures, age of construction, the quality, and the seismic resonance coefficient layers were defined as major factors affecting the structural vulnerability of school buildings. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion via pairwise comparison in all levels. Finally, the geotechnical and structural spatial layers were overlaid to develop the seismic vulnerability map of school buildings in Tehran. The results indicated that only in 72 (about 3%) out of 2125 school buildings of the study area will the destruction rate be very high and therefore their reconstruction should seriously be considered.

  14. A S.M.A.R.T. system for the seismic vulnerability mitigation of Cultural Heritages

    NASA Astrophysics Data System (ADS)

    Montuori, Antonio; Costanzo, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Minasi, Mario; Falcone, Sergio; La Piana, Carmelo; Stramondo, Salvatore; Casula, Giuseppe; Giovanna Bianchi, Maria; Fabrizia Buongiorno, Maria; Musacchio, Massimo; Doumaz, Fawzi; Ilaria Pannaccione Apa, Maria

    2016-04-01

    Both the assessment and the mitigation of seismic vulnerability in connection with cultural heritage monitoring are non-trivial issues, resting on knowledge of the structural and environmental factors potentially impacting the cultural heritage. A holistic approach is suitable for providing effective monitoring of cultural heritage within its surroundings at different spatial and temporal scales. On the one hand, analysis of the geometrical and structural properties of monuments is important for assessing their state of conservation, their response to external stresses, and anomalies related to natural and/or anthropogenic phenomena (e.g. the aging of materials, seismic stresses, vibrational modes). On the other hand, investigation of the surrounding area is relevant for assessing environmental properties and natural phenomena (e.g. landslides, earthquakes, subsidence, seismic response) and their impacts on the monuments. Within such a framework, a multi-disciplinary system has been developed, and is presented here, for the monitoring of cultural heritage for seismic vulnerability assessment and mitigation purposes. It merges geophysical investigations and modeling, in situ measurements, and multi-platform remote sensing sensors for the non-destructive and non-invasive multi-scale monitoring of historic buildings in a seismic-prone area. In detail, the system provides: a) long-term, regional-scale analysis of the buildings' environment through the integration of seismogenic analysis, airborne magnetic surveys, and space-borne Synthetic Aperture Radar (SAR) and multi-spectral sensors, which describe the sub-surface fault systems, the surface deformation processes and the land use of the regional-scale area on an annual time span; b) short-term, basin-scale analysis of a building's neighborhood through geological setting and geotechnical surveys, airborne Light Detection And Ranging (LiDAR) and ground-based SAR sensors. They

  15. Seismic vulnerability: theory and application to Algerian buildings

    NASA Astrophysics Data System (ADS)

    Mebarki, Ahmed; Boukri, Mehdi; Laribi, Abderrahmane; Farsi, Mohammed; Belazougui, Mohamed; Kharchi, Fattoum

    2014-04-01

    results to the observed damages. For pre-earthquake analysis, the methodology widely used around the world relies on prior calibration of the seismic response of structures under given expected scenarios. As the structural response is governed by the constitutive materials and structural typology as well as by the seismic input and soil conditions, the damage prediction depends intimately on the accuracy of the so-called fragility curve and response spectrum established for each type of structure (RC framed structures, confined or unconfined masonry, etc.) and soil (hard rock, soft soil, etc.). In the present study, the adaptation to Algerian buildings concerns the specific soil conditions as well as the structural dynamic response. The theoretical prediction of the expected damage is helpful for calibrating the methodology. Thousands (~3,700) of real structures, and the damage caused by the earthquake (Algeria, Boumerdes: Mw = 6.8, May 21, 2003), are considered for the a posteriori calibration and validation process. The theoretical predictions show the importance of the elastic response spectrum, the local soil conditions and the structural typology. Although the observed and predicted categories of damage are close, it appears that the existing form used for visual damage inspection still requires further improvement, in order to allow easy evaluation and identification of the damage level. These methods, coupled with databases and GIS tools, could be helpful to local and technical authorities during the post-earthquake evaluation process: real-time information on the damage extent at urban or regional scales, as well as on the extent of losses and the resources required for reconstruction, evacuation, strengthening, etc.

  16. Nuclear material production cycle vulnerability analysis

    SciTech Connect

    Bott, T.F.

    1996-07-01

    This paper discusses a method for rapidly and systematically identifying vulnerable equipment in a nuclear material or similar production process and ranking that equipment according to its attractiveness to a malevolent attacker. A multistep approach was used in the analysis. First, the entire production cycle was modeled as a flow diagram. This flow diagram was analyzed using graph-theoretical methods to identify processes in the production cycle and their locations. Models of the processes judged particularly vulnerable on the basis of the cycle analysis were then developed in greater detail, to identify equipment within each process that is vulnerable to intentional damage.
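
    A hedged sketch of the graph-theoretical screening step: model the production cycle as a graph and flag cut vertices (single points of failure) and high-betweenness processes as candidates for detailed modeling. The process names are placeholders, not taken from the report:

      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([
          ("receipt", "dissolution"), ("dissolution", "purification"),
          ("purification", "conversion"), ("conversion", "storage"),
          ("purification", "assay_lab"), ("assay_lab", "storage"),
      ])

      cut_vertices = set(nx.articulation_points(G))
      centrality = nx.betweenness_centrality(G)

      for node in sorted(centrality, key=centrality.get, reverse=True):
          flag = "  <- single point of failure" if node in cut_vertices else ""
          print(f"{node:12s} betweenness={centrality[node]:.2f}{flag}")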

  17. Review on Rapid Seismic Vulnerability Assessment for Bulk of Buildings

    NASA Astrophysics Data System (ADS)

    Nanda, R. P.; Majhi, D. R.

    2013-09-01

    This paper provides a brief overview of the rapid visual screening (RVS) procedures available in different countries, with a comparison among all the methods. Seismic evaluation guidelines from the USA, Canada, Japan, New Zealand, India, Europe, Italy and the UNDP, along with other methods, are reviewed from the perspective of their applicability to developing countries. The review shows clearly that some of the RVS procedures are unsuited for potential use in developing countries. It is expected that this comparative assessment of the various evaluation schemes will help to identify the most essential components of such a procedure for use in India and other developing countries, one that is not only robust and reliable but also easy to use with available resources. It appears that the Federal Emergency Management Agency (FEMA) 154 and New Zealand Draft Code approaches can be suitably combined to develop a transparent, reasonably rigorous and generalized procedure for the seismic evaluation of buildings in developing countries.

  18. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Bednárik, Martin; Brimich, Ladislav

    2015-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how to react to the natural hazard? Potential damages are huge, but infrastructure costs for addressing these hazards are huge as well. Furthermore, seismic hazard is only one of the many hazards facing society. Societal means need to be distributed in a reasonable manner, to assure that all of these hazards (natural as well as societal) are addressed appropriately. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test PSHA models is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle - if that exists. Such information would be very valuable, even if it concerned only a single site, namely that of a particularly sensitive infrastructure. Such a request may seem hopeless - but it is not. Long-term information can in principle be gained from intact stalagmites in natural caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We are focusing here on case studies in Austria, which has moderate seismicity, but a well-documented history of major earthquake-induced damage, e.g., Villach in 1348 and 1690, Vienna in 1590, Leoben in 1794, and Innsbruck in 1551, 1572, and 1589. Seismic intensities have reached levels up to 10. It is clearly important to know what "worst-case" damage to expect. We have identified sets of particularly sensitive stalagmites in the general vicinity of two major cities in
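
    The mechanical idea can be sketched with a simple quasi-static bound, treating the stalagmite as an intact vertical cantilevered cylinder (height H, diameter D, density rho, tensile strength sigma_t; all numbers below are assumptions for illustration, not measured values from the study):

    $$\sigma_{\max}=\frac{M}{W}=\frac{4\,\rho\,a\,H^{2}}{D}\quad\Longrightarrow\quad a_{\mathrm{crit}}=\frac{\sigma_{t}\,D}{4\,\rho\,H^{2}}$$

    For, say, sigma_t = 1 MPa, rho = 2600 kg/m^3, H = 1 m and D = 5 cm, this gives a_crit of roughly 4.8 m/s^2 (about 0.5 g): survival of such a slender intact stalagmite would bound the peak horizontal acceleration experienced at the cave over its age.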

  19. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and to assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. The analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of the seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed the beyond-design-basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) a classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) an assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) a nuclear safety design basis report that will state the seismic design requirements credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  20. Comparative Application of Capacity Models for Seismic Vulnerability Evaluation of Existing RC Structures

    SciTech Connect

    Faella, C.; Lima, C.; Martinelli, E.; Nigro, E.

    2008-07-08

    Seismic vulnerability assessment of existing buildings is one of the most common tasks in which structural engineers are currently engaged. Since it is often a preliminary step in approaching the issue of how to retrofit structures neither designed nor detailed for seismic loads, it plays a key role in the successful choice of the most suitable strengthening technique. In this framework, the basic information for both seismic assessment and retrofitting is related to the formulation of capacity models for structural members. Plenty of proposals, often contradictory from a quantitative standpoint, are currently available in the technical and scientific literature for defining structural capacity in terms of forces and displacements, possibly with reference to different parameters representing the seismic response. The present paper briefly reviews some of the capacity models for RC members and compares them with reference to two case studies assumed to be representative of a wide class of existing buildings.

  1. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing Rigid Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is invoked by setting parameter values in the Bulk Data.

  2. Generalized seismic analysis

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1993-01-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together under the heading of seismic inputs. Attempts have been made to cope with this problem over the years, and they usually have ended up with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at the base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points. He would then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity have to sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to Rigid Formats 9, 10, 11, and 12 under contract N60921-82-C-0128. This method was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.
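
    A minimal numerical sketch of the outsized-base-mass idea described above, with all values hypothetical: a 2-DOF system in which a very large base mass m0 is driven by the force F(t) = m0*a_g(t), so the base tracks the prescribed acceleration to within roughly m1/m0 relative error, while stresses near the base remain unreliable, exactly as the abstract notes.

    ```python
    import numpy as np

    m1, k, c = 1.0e3, 4.0e5, 2.0e2   # structure mass [kg], stiffness [N/m], damping [N.s/m]
    m0 = 1.0e6 * m1                  # artificial, outsized base mass (assumption)

    dt, n = 1.0e-4, 50000
    t = np.arange(n) * dt
    a_g = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t)   # prescribed base acceleration (hypothetical)

    u = np.zeros((n, 2))             # displacements [base, structure]
    v = np.zeros(2)
    for i in range(n - 1):
        f_link = k * (u[i, 0] - u[i, 1]) + c * (v[0] - v[1])
        acc = np.array([(m0 * a_g[i] - f_link) / m0,   # base: driven by F = m0 * a_g
                        f_link / m1])                  # structure: spring/damper force
        v = v + acc * dt             # semi-implicit Euler step
        u[i + 1] = u[i] + v * dt

    # The realized base acceleration differs from a_g only by f_link/m0 (tiny).
    ```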

  3. Soil depth mapping using seismic surface waves for the assessment of soil vulnerability to erosion.

    NASA Astrophysics Data System (ADS)

    Samyn, K.; Cerdan, O.; Grandjean, G.; Bitri, A.; Bernardie, S.; Ouvry, J. F.

    2009-04-01

    The purposes of the multidisciplinary DIGISOIL project are the integration and improvement of in situ and proximal technologies for the assessment of soil properties and soil degradation indicators. Foreseen developments concern sensor technologies, data processing and their integration into applications of (digital) soil mapping (DSM). Among the available techniques, the seismic one is, in this study, specifically tested for characterising soil vulnerability to erosion. The spectral analysis of surface waves (SASW) method is an in situ seismic technique used for evaluating stiffness (G) and the associated depth in layered systems. The method is based on the propagation of mechanically induced Rayleigh waves. By striking the ground surface with a hammer, seismic waves are generated, including surface Rayleigh waves. During their propagation, they are recorded by seismic receivers (geophone sensors) regularly spaced along a profile to produce a seismogram. The particularity of Rayleigh waves lies in the dependence of their velocity on frequency, a phenomenon called dispersion. A profile of Rayleigh wave velocity versus frequency, i.e., the dispersion curve, is calculated from each recorded seismogram before being inverted to obtain the vertical profile of shear-wave velocity. The soil stiffness can then easily be calculated from the shear velocity if the material density is estimated, and the soil stiffness as a function of depth can be obtained. This last information can be a good indicator for identifying the soil-bedrock limit. From a geometrical point of view, a SASW system adapted to soil characterisation is proposed in the DIGISOIL project. This system was tested for the digital mapping of the depth of loamy material in a catchment of the European loess belt. Parametric penetrometric studies are also conducted to verify the accuracy of the procedure and to evaluate its limitations. The depth to bedrock determined by this procedure can then be
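
    The stiffness step at the end of the inversion is a one-line relation, G = rho * Vs^2; a trivial sketch with an assumed density and an assumed inverted Vs profile (both values are illustrative, not DIGISOIL data):

    ```python
    # Shear modulus from inverted shear-wave velocity Vs and estimated density rho.
    rho = 1800.0          # loamy soil density [kg/m^3] (assumption)
    vs_profile = {0.5: 150.0, 1.5: 220.0, 3.0: 400.0}   # depth [m] -> Vs [m/s] (assumed)

    for depth, vs in vs_profile.items():
        G = rho * vs**2   # small-strain shear modulus [Pa]
        print(f"{depth} m: G = {G / 1e6:.0f} MPa")
    ```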

  4. Vulnerability

    NASA Technical Reports Server (NTRS)

    Taback, I.

    1979-01-01

    The discussion of vulnerability begins with a description of some of the electrical characteristics of fibers before defining how vulnerability calculations are done. The vulnerability results secured to date are presented. The discussion touches on post-exposure vulnerability. After a description of some shock hazard work now underway, the discussion leads into a description of the planned effort, and some preliminary conclusions are presented.

  5. Nuclear material production cycle vulnerability analysis. Revision.

    SciTech Connect

    Bott, T.F.

    1996-10-01

    This paper discusses a method for rapidly and systematically identifying vulnerable equipment in a nuclear material or similar production process and ranking that equipment according to its attractiveness to a malevolent attacker. A multi-step approach was used in the analysis. First, the entire production cycle was modeled as a flow diagram. This flow diagram was analyzed using graph theoretical methods to identify processes in the production cycle and their locations. Models of processes that were judged to be particularly vulnerable based on the cycle analysis then were developed in greater detail to identify equipment in that process that is vulnerable to intentional damage. The information generated by this analysis may be used to devise protective features for critical equipment. The method uses directed graphs, fault trees, and evaluation matrices. Expert knowledge of plant engineers and operators is used to determine the critical equipment and evaluate its attractiveness to potential attackers. The vulnerability of equipment can be ranked and sorted according to any criterion desired and presented in a readily grasped format using matrices.

  6. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. Seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years) according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. The 500 m by 500 m grid-based building data were selected for the evaluation, being capable of providing detailed information about the location, value and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling the seismic risk from hazard in an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
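
    The quoted return periods follow from the standard Poisson relation between the exceedance probability p over an exposure time t and the return period T (a generic identity, not specific to TEM):

    $$T=\frac{-t}{\ln(1-p)},\qquad \frac{-50}{\ln(0.90)}\approx 475\ \text{yr},\qquad \frac{-50}{\ln(0.98)}\approx 2475\ \text{yr}$$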

  7. Aircraft vulnerability analysis by modeling and simulation

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. MANPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet not all launched missiles hit their targets; the miss is attributable either to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  8. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  9. Vulnerability analysis methods for road networks

    NASA Astrophysics Data System (ADS)

    Bíl, Michal; Vodák, Rostislav; Kubeček, Jan; Rebok, Tomáš; Svoboda, Tomáš

    2014-05-01

    Road networks rank among the most important lifelines of modern society. They can be damaged by either random or intentional events. Roads are also often affected by natural hazards, the impacts of which are both direct and indirect. Whereas direct impacts (e.g. roads damaged by a landslide or due to flooding) are localized in close proximity to the natural hazard occurrence, the indirect impacts can entail widespread service disabilities and considerable travel delays. The change in flows in the network may affect the population living far from the places originally impacted by the natural disaster. These effects are possible primarily due to the intrinsic networked nature of this system. The consequences and extent of the indirect costs also depend on the set of road links which were damaged, because road links differ in terms of their importance. The more robust (interconnected) the road network is, the less time is usually needed to secure the serviceability of an area hit by a disaster. These kinds of networks also demonstrate a higher degree of resilience. Evaluating road network structures is therefore essential in any type of vulnerability and resilience analysis. There is a range of approaches used for evaluating the vulnerability of a network and for identifying the weakest road links. Only a few of them are, however, capable of simulating the impacts of the simultaneous closure of numerous links, which often occurs during a disaster. The primary problem is that in the case of a disaster, which usually has a large regional extent, the road network may become disconnected. The majority of the commonly used indices rely on direct computation of the shortest paths or times between OD (origin - destination) pairs and therefore cannot be applied when the network breaks up into two or more components. Since extensive break-ups often occur in cases of major disasters, it is important to study network vulnerability in these cases as well, so that appropriate
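
    One way to keep a vulnerability metric defined when the network splits, as discussed above, is an efficiency-style index in which unreachable pairs contribute zero; a minimal sketch with networkx on a toy grid network (the grid and the closed links are assumptions for illustration):

    ```python
    import networkx as nx

    # Toy road network: a 5x5 grid; efficiency stays finite even if it disconnects,
    # because node pairs with no path contribute 0 instead of an infinite distance.
    G = nx.grid_2d_graph(5, 5)
    baseline = nx.global_efficiency(G)

    damaged = G.copy()
    damaged.remove_edges_from([((2, j), (2, j + 1)) for j in range(4)])  # simultaneous closures
    impact = 1 - nx.global_efficiency(damaged) / baseline
    print(f"efficiency drop: {impact:.1%}")
    ```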

  10. Seismic vulnerability of dwellings at Sete Cidades Volcano (S. Miguel Island, Azores)

    NASA Astrophysics Data System (ADS)

    Gomes, A.; Gaspar, J. L.; Queiroz, G.

    2006-01-01

    Since the settlement of S. Miguel Island (Azores) in the XV century, several earthquakes have caused important human losses and severe damage on the island. The Sete Cidades Volcano area, located in the westernmost part of the island, has been struck by strong seismic crises of tectonic and volcanic origin, and major events reached a maximum historical intensity of IX (European Macroseismic Scale 1998) in this zone. Aiming to evaluate the impact of a future major earthquake, a field survey was carried out in ten parishes of Ponta Delgada County, located on the flanks of Sete Cidades Volcano and inside its caldera. A total of 7019 buildings were identified, of which 4351 were recognized as dwellings. The total number of inhabitants in the studied area is 11429. In this work, dwellings were classified according to their vulnerability to earthquakes (Classes A to F), using the structure types table of the EMS-98 adapted to the types of construction found in the Azores. It was concluded that 76% (3306) of the houses belong to Class A and 17% (740) to Class B, which are the classes of highest vulnerability. If the area is affected by a seismic event with intensity IX, it is estimated that 57% (2480) to 77% (3350) of the dwellings will partially or totally collapse and 15% (652) to 25% (1088) will need to be rehabilitated. In this scenario, considering the average number of inhabitants per house for each parish, 82% (9372) to 92% (10515) of the population will be affected. The number of deaths, injured and dislodged people will pose severe problems to the civil protection authorities and will cause social and economic disruption in the entire archipelago.

  11. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
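
    A minimal sketch of the path-finding idea (the attack states and effort weights below are invented for illustration; the patented method additionally matches attack templates against configuration files and attacker profiles):

    ```python
    import networkx as nx

    # Attack graph: nodes are attack states, edge weights are attacker effort;
    # the least-effort path approximates an "epsilon optimal path".
    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("outside", "dmz_host", 2.0),        # exploit public service (assumed effort)
        ("outside", "phished_user", 1.0),
        ("dmz_host", "internal_db", 4.0),
        ("phished_user", "internal_db", 3.5),
        ("internal_db", "goal", 1.5),
    ])
    path = nx.shortest_path(G, "outside", "goal", weight="weight")
    effort = nx.shortest_path_length(G, "outside", "goal", weight="weight")
    print(path, effort)   # ['outside', 'phished_user', 'internal_db', 'goal'] 6.0
    ```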

  12. Vulnerability.

    PubMed

    Cunha, Thiago; Garrafa, Volnei

    2016-04-01

    Collating the concepts of vulnerability through five regional perspectives on bioethics from the United States, Europe, Latin America, Africa, and Asia, this article proposes a means of integration between the different approaches in order to seek a theoretical and normative basis for the field of global bioethics. It argues that only through opening continuous, critical, and self-critical dialogue within the international bioethical community will it be possible to achieve a sufficiently global understanding of vulnerability that is capable of identifying the means needed for addressing the conditions that leave certain groups and individuals more susceptible to "wounding" than others. PMID:26957445

  13. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  14. Analysis of the ambient seismic noise at Bulgarian seismic stations

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Nikolova, Svetlana

    2010-05-01

    The Bulgarian National Seismological Network was modernized within a month in 2005. Broadband seismometers and 24-bit digital acquisition systems of type DAS 130-01, produced by RefTek Inc., with a dynamic range of more than 132 dB, were installed at the seismic stations of the existing analog network. In the present study the ambient seismic noise at Bulgarian National Digital Seismological Network (BNDSN) stations is evaluated. In order to compare the performance of the network against international standards, a detailed analysis of the seismic noise was performed using software and models applied in international practice. The method of McNamara and Buland was applied, and the software code PDFSA was used to determine the power spectral density (PSD) of the background noise and to evaluate the probability density function (PDF). The levels of the ambient seismic noise were determined, and the full range of factors influencing the quality of the data and the performance of a seismic station was analyzed. The estimated PSD functions were compared against the two models for high (NHNM) and low (NLNM) noise that are widely used in seismological practice for assessing the monitoring qualities of seismic stations. The mode PDFs are used to prepare annual, seasonal, diurnal and frequency analyses of the noise levels at BNDSN stations. The annual analysis shows that the noise levels at the northern Bulgarian stations are higher than those at the central and southern stations for the microseism periods (1 s - 7 s). This is well observable at stations PRV and PSN located near the Black Sea, and is due to the different geological conditions of the seismic stations as well. For the periods of "cultural" noise the power distribution depends on the type of noise sources and as a rule is related to human activities at or near the Earth's surface. Seismic stations MPE, VTS and MMB have the lowest mode noise levels, and the noisiest stations are PGB, PVL and JMB. The seasonal
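
    The PSD step can be sketched with standard tools; the trace below is synthetic white noise standing in for a real record (all values assumed), and a McNamara-and-Buland-style analysis would aggregate many such segment PSDs into a PDF:

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                                    # sampling rate [Hz] (assumption)
    rng = np.random.default_rng(0)
    accel = 1e-7 * rng.standard_normal(3600 * int(fs))   # 1 h of synthetic noise [m/s^2]

    f, pxx = welch(accel, fs=fs, nperseg=2**14)   # PSD in (m/s^2)^2/Hz
    psd_db = 10 * np.log10(pxx[1:])               # dB, as compared against NLNM/NHNM
    print(f[1], psd_db[0])
    ```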

  15. Seismic analysis of the LSST telescope

    NASA Astrophysics Data System (ADS)

    Neill, Douglas R.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) will be located on the seismically active Chilean mountain of Cerro Pachón. The accelerations resulting from seismic events produce the most demanding load cases the telescope and its components must withstand. Seismic ground accelerations were applied to a comprehensive finite element analysis (FEA) model which included the telescope, its pier and the mountain top. Response accelerations for specific critical components (camera and secondary mirror assembly) on the telescope were determined by applying seismic accelerations in the form of power spectral densities (PSD) to the FEA model. The PSDs were chosen based on the components' design lives. Survival-level accelerations were determined utilizing PSDs for seismic events with return periods 10 times the telescope's design life, which is equivalent to a 10% chance of occurrence over the lifetime. Since the telescope has a design life of 30 years, it was analyzed for a return period of 300 years. Operational-level seismic accelerations were determined using return periods of 5 times the lifetime. Since the seismic accelerations provided by the Chilean design codes are given in the form of peak spectral accelerations (PSA), a method to convert between the two forms was developed. The accelerations are also affected by the damping level. The LSST incorporates added damping to meet its rapid slew-and-settle requirements; this added damping also reduces the components' seismic accelerations. The analysis was repeated for the telescope pointing at the horizon and at zenith. Closed-form solutions were utilized to verify the results.
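
    The stated equivalence can be checked with the generic Poisson exceedance relation for a design life L and return period T (a textbook identity, not an LSST-specific formula):

    $$p=1-e^{-L/T}=1-e^{-30/300}\approx 0.095\approx 10\%$$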

  16. Q analysis on reflection seismic data

    NASA Astrophysics Data System (ADS)

    Wang, Yanghua

    2004-09-01

    Q analysis refers to the procedure for estimating Q directly from a reflection seismic trace. The conventional Q analysis method compares two seismic wavelets selected from different depth (or time) levels, but picking "clean" wavelets free from interference by other wavelets and noise in a reflection seismic trace is a real problem. Therefore, instead of analysing individual wavelets, I perform Q analysis using the Gabor transform spectrum, which reveals the frequency content changing with time in a seismic trace. I propose two Q analysis methods, based on the attenuation function and the compensation function, respectively, each of which may produce a series of average values of Q^-1 (inverse Q), averaged between the recording surface (or the water bottom) and the subsurface time samples. The latter is much more stable than the former. I then calculate the interval or layered values of Q^-1 by a constrained linear inversion, which produces a stable estimation of the interval-Q series.
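
    As a simplified stand-in for the Gabor-spectrum methods of the paper, the classic spectral-ratio idea shows how an average Q can be read off a log-spectral slope (all values below are synthetic):

    ```python
    import numpy as np

    # Spectral-ratio sketch: ln[A2(f)/A1(f)] = -pi*f*dt/Q + const, so Q follows
    # from the slope of the log ratio over the usable frequency band.
    Q_true, dt = 80.0, 0.5                  # target Q, travel-time separation [s]
    f = np.linspace(5.0, 60.0, 100)         # usable band [Hz]
    ratio = 1.3 * np.exp(-np.pi * f * dt / Q_true)   # 1.3 mimics frequency-independent gain
    slope, _ = np.polyfit(f, np.log(ratio), 1)
    print(-np.pi * dt / slope)              # ~80.0
    ```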

  17. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.

  18. A Novel Approach to Support Majority Voting in Spatial Group MCDM Using Density-Induced OWA Operator for Seismic Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    Being among the most frightening disasters, earthquakes frequently cause huge damage to buildings, facilities and human beings. Although predicting the characteristics of an earthquake seems to be impossible, its loss and damage are predictable in advance. Seismic loss estimation models tend to evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes, including the age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi-criteria decision making models have been proposed based on a single expert. The main objective of this paper is to propose a model which facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is providing a computational tool to measure the degree to which different experts support each other's opinions and to make a decision regarding this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that neglecting the experts who get lower degrees of support from others enables the decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
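
    The aggregation core of such a model is an ordered weighted averaging (OWA) operator; a minimal sketch follows (the expert scores and the majority-leaning weight vector are invented; in the paper the weights are density-induced rather than fixed):

    ```python
    import numpy as np

    # OWA: values are sorted in descending order and combined with position
    # weights (not criterion weights), so mid-ranked opinions can dominate.
    def owa(values, weights):
        v = np.sort(np.asarray(values, float))[::-1]
        w = np.asarray(weights, float)
        return float(np.dot(v, w / w.sum()))

    # Five experts' vulnerability scores for one urban block (hypothetical):
    scores = [0.82, 0.75, 0.40, 0.68, 0.71]
    print(owa(scores, [0.1, 0.2, 0.4, 0.2, 0.1]))  # majority-leaning weight vector
    ```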

  19. Global analysis of urban surface water supply vulnerability

    NASA Astrophysics Data System (ADS)

    Padowski, Julie C.; Gorelick, Steven M.

    2014-10-01

    This study presents a global analysis of urban water supply vulnerability in 71 surface-water supplied cities, with populations exceeding 750 000 and lacking source water diversity. Vulnerability represents the failure of an urban supply-basin to simultaneously meet demands from human, environmental and agricultural users. We assess a baseline (2010) condition and a future scenario (2040) that considers increased demand from urban population growth and projected agricultural demand. We do not account for climate change, which can potentially exacerbate or reduce urban supply vulnerability. In 2010, 35% of large cities are vulnerable as they compete with agricultural users. By 2040, without additional measures 45% of cities are vulnerable due to increased agricultural and urban demands. Of the vulnerable cities in 2040, the majority are river-supplied with mean flows so low (<1200 liters per person per day, l/p/d) that the cities experience 'chronic water scarcity' (<1370 l/p/d). Reservoirs supply the majority of cities facing individual future threats, revealing that constructed storage potentially provides tenuous water security. In 2040, of the 32 vulnerable cities, 14 would reduce their vulnerability by reallocating water through reduced environmental flows, and 16 would similarly benefit by transferring water from irrigated agriculture. Approximately half remain vulnerable under either potential remedy.

  1. Vulnerability analysis of interdependent infrastructure systems: A methodological framework

    NASA Astrophysics Data System (ADS)

    Wang, Shuliang; Hong, Liu; Chen, Xueguang

    2012-06-01

    Infrastructure systems such as power and water supplies make up the cornerstone of modern society which is essential for the functioning of a society and its economy. They become more and more interconnected and interdependent with the development of scientific technology and social economy. Risk and vulnerability analysis of interdependent infrastructures for security considerations has become an important subject, and some achievements have been made in this area. Since different infrastructure systems have different structural and functional properties, there is no universal all-encompassing 'silver bullet solution' to the problem of analyzing the vulnerability associated with interdependent infrastructure systems. So a framework of analysis is required. This paper takes the power and water systems of a major city in China as an example and develops a framework for the analysis of the vulnerability of interdependent infrastructure systems. Four interface design strategies based on distance, betweenness, degree, and clustering coefficient are constructed. Then two types of vulnerability (long-term vulnerability and focused vulnerability) are illustrated and analyzed. Finally, a method for ranking critical components in interdependent infrastructures is given for protection purposes. It is concluded that the framework proposed here is useful for vulnerability analysis of interdependent systems and it will be helpful for the system owners to make better decisions on infrastructure design and protection.

  2. Seismic analysis of piping with nonlinear supports

    SciTech Connect

    Barta, D.A.; Huang, S.N.; Severud, L.K.

    1980-01-01

    The modeling and results of nonlinear time-history seismic analyses for three sizes of pipelines restrained by mechanical snubbers are presented. Numerous parametric analyses were conducted to obtain sensitivity information which identifies the relative importance of the model and analysis ingredients. Special considerations for modeling the pipe clamps and the mechanical snubbers based on experimental characterization data are discussed. Comparisons are also given of seismic responses, loads and pipe stresses predicted by standard response spectra methods and the nonlinear time-history methods.

  3. Space Station Program threat and vulnerability analysis

    NASA Technical Reports Server (NTRS)

    Van Meter, Steven D.; Veatch, John D.

    1987-01-01

    An examination has been made of the physical security of the Space Station Program at the Kennedy Space Center in a peacetime environment, in order to furnish facility personnel with threat/vulnerability information. A risk-management approach is used to prioritize threat-target combinations that are characterized in terms of 'insiders' and 'outsiders'. Potential targets were identified and analyzed with a view to their attractiveness to an adversary, as well as to the consequentiality of the resulting damage.

  4. Verification of the data on critical facilities inventory and vulnerability for seismic risk assessment taking into account possible accidents

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Aleksander

    2015-04-01

    The paper contains the results of a recent study carried out by the Seismological Center of IGE, Russian Academy of Sciences, and the Extreme Situations Research Center within the Russian Academy of Sciences project "Theoretical and methodological basis for seismic risk assessment taking into account technological accidents at the local level; constructing the seismic risk maps for the Big Sochi City territory including the venue of Olympic Games facilities". The procedure of verifying the critical facilities inventory and vulnerability, which makes use of space images and web technologies in social networks, is presented. Numerical values of the criteria for accidents at fire- and chemically-hazardous facilities triggered by strong earthquakes are obtained. The seismic risk maps for the Big Sochi City territory, including the Olympic Games venue, were constructed taking into account new data on critical facilities obtained with panoramic photos of these facilities, high-resolution space images and web technologies. The obtained values of individual seismic risk taking into account secondary technological accidents exceed the values of seismic risk without secondary hazards by 0.5-1.0 × 10^-5 1/year (return period T = 500 years).

  5. Statistical Seismic Landslide Analysis: an Update

    NASA Astrophysics Data System (ADS)

    Lee, Chyi-Tyi

    2015-04-01

    Landslides are secondary or induced features, whose recurrence is controlled by the repetition of triggering events, such as earthquakes or heavy rainfall. This makes seismic landslide hazard analysis more complicated than ordinary seismic hazard analysis, and it requires a multi-stage analysis. First, susceptibility analysis is utilized to divide a region into successive classes. Then, it is necessary to construct a relationship between the probability of landslide failure and earthquake intensity for each susceptibility class in a region, or to find the probability-of-failure surface using the susceptibility value and earthquake intensity as independent variables for the study region. Next, hazard analysis for the exceedance probability of earthquake intensity is performed. Finally, a map of the spatial probability of landslide failure under a certain return-period earthquake is drawn. This study uses data for Chi-Chi earthquake induced landslides as the training data set to perform the susceptibility analysis and the probability-of-failure surface analysis. A regular probabilistic seismic hazard analysis is also conducted to map different return-period Arias intensities. Finally, a seismic landslide hazard map for the whole of Taiwan is provided.
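
    The probability-of-failure surface can be sketched as a logistic function of the two stated independent variables; the coefficients below are illustrative placeholders, not values fitted to the Chi-Chi inventory:

    ```python
    import numpy as np

    # Hypothetical logistic surface: susceptibility S (0-1) and Arias intensity
    # Ia [m/s] as independent variables; b0, b1, b2 are invented coefficients.
    b0, b1, b2 = -6.0, 5.0, 1.2

    def p_failure(S, Ia):
        z = b0 + b1 * S + b2 * np.log(Ia)
        return 1.0 / (1.0 + np.exp(-z))

    print(p_failure(0.8, 2.0))   # high susceptibility, strong shaking
    print(p_failure(0.2, 0.3))   # low susceptibility, weak shaking
    ```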

  6. A Methodology For Flood Vulnerability Analysis In Complex Flood Scenarios

    NASA Astrophysics Data System (ADS)

    Figueiredo, R.; Martina, M. L. V.; Dottori, F.

    2015-12-01

    Nowadays, flood risk management is gaining importance in order to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a key research topic. In this paper, we propose a methodology for large-scale analysis of flood vulnerability. The methodology is based on a GIS-based index, which considers local topography, terrain roughness and basic information about the flood scenario to reproduce the diffusive behaviour of floodplain flow. The methodology synthesizes the spatial distribution of index values into maps and curves, used to represent the vulnerability in the area of interest. Its application allows for considering different levels of complexity of flood scenarios, from localized flood defence failures to complex hazard scenarios involving river reaches. The components of the methodology are applied and tested in two floodplain areas in Northern Italy recently affected by floods. The results show that the methodology can provide an original and valuable insight into flood vulnerability variables and processes.

  7. Probabilistic seismic demand analysis of nonlinear structures

    NASA Astrophysics Data System (ADS)

    Shome, Nilesh

    Recent earthquakes in California have initiated improvement in current design philosophy, and at present the civil engineering community is working towards the development of performance-based earthquake engineering of structures. The objective of this study is to develop efficient but accurate procedures for probabilistic analysis of nonlinear seismic behavior of structures. The proposed procedures help the near-term development of seismic building assessments, which require an estimation of seismic demand at a given intensity level. We also develop procedures to estimate the probability of exceedance of any specified nonlinear response level due to future ground motions at a specific site. This is referred to as Probabilistic Seismic Demand Analysis (PSDA). The latter procedure prepares the way for the next-stage development of seismic assessments that consider the uncertainties in nonlinear response and capacity. The proposed procedures require structure-specific nonlinear analyses for a relatively small set of recorded accelerograms and (site-specific or USGS-map-like) seismic hazard analyses. We have addressed some of the important issues of nonlinear seismic demand analysis, which are the selection of records for structural analysis, the number of records to be used, the scaling of records, etc. Initially these issues are studied through nonlinear analysis of structures for a number of magnitude-distance bins of records. Subsequently we introduce regression analysis of response results against spectral acceleration, magnitude, duration, etc., which helps to resolve these issues more systematically. We illustrate the demand-hazard calculations through two major example problems: a 5-story and a 20-story SMRF building. Several simple, but quite accurate closed-form solutions have also been proposed to expedite the demand-hazard calculations. We find that vector-valued (e.g., 2-D) PSDA estimates demand hazard more accurately. This procedure, however, requires information about 2
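
    The regression step of PSDA is commonly a power-law fit of demand against spectral acceleration; a sketch on synthetic results (all numbers assumed) that also extracts the dispersion needed for the demand-hazard convolution:

    ```python
    import numpy as np

    # ln(demand) = a + b * ln(Sa) + error, fitted to nonlinear analysis results.
    rng = np.random.default_rng(1)
    sa = rng.uniform(0.1, 1.5, 30)                  # spectral accelerations [g] of records
    drift = np.exp(-3.0 + 1.0 * np.log(sa) + 0.3 * rng.standard_normal(30))

    b, a = np.polyfit(np.log(sa), np.log(drift), 1)
    sigma = np.std(np.log(drift) - (a + b * np.log(sa)))  # record-to-record dispersion
    print(a, b, sigma)
    ```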

  8. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  9. Malware Sandbox Analysis for Secure Observation of Vulnerability Exploitation

    NASA Astrophysics Data System (ADS)

    Yoshioka, Katsunari; Inoue, Daisuke; Eto, Masashi; Hoshizawa, Yuji; Nogawa, Hiroki; Nakao, Koji

    Exploiting vulnerabilities of remote systems is one of the fundamental behaviors of malware and determines their potential hazards. Understanding what kind of propagation tactics each malware uses is essential for incident response, because such information links directly with countermeasures such as writing a signature for an IDS. Although malware sandbox analysis has recently been studied intensively, little work has been done on securely observing vulnerability exploitation by malware. In this paper, we propose a novel sandbox analysis method for securely observing malware's vulnerability exploitation in a totally isolated environment. In our sandbox, we prepare two victim hosts. We first execute the sample malware on one of these hosts and then let it attack the other host, which is running multiple vulnerable services. As a simple realization of the proposed method, we have implemented a sandbox using Nepenthes, a low-interaction honeypot, as the second victim. Because Nepenthes can emulate a variety of vulnerable services, we can efficiently observe the propagation of sample malware. In the experiments, among 382 samples whose scan capabilities were confirmed, 381 samples successfully started exploiting vulnerabilities of the second victim. This indicates a certain level of feasibility of the proposed method.

  10. Vulnerability Analysis Considerations for the Transportation of Special Nuclear Material

    SciTech Connect

    Nicholson, Lary G.; Purvis, James W.

    1999-07-21

    The vulnerability analysis methodology developed for fixed nuclear material sites has proven to be extremely effective in assessing associated transportation issues. The basic methods and techniques used are directly applicable to conducting a transportation vulnerability analysis. The purpose of this paper is to illustrate that the same physical protection elements (detection, delay, and response) are present, although the response force plays a dominant role in preventing the theft or sabotage of material. Transportation systems are continuously exposed to the general public whereas the fixed site location by its very nature restricts general public access.

  11. Seismic Vulnerability Assessment Waste Characterization Reduction and Repackaging Building, TA-50-69

    SciTech Connect

    Sullivan, M. W.; Ruminer, J.; Cuesta, I.

    2003-02-02

    This report presents the results of the seismic structural analyses completed on the Waste Characterization Reduction and Repackaging (WCRR) Building in support of ongoing safety analyses. WCRR is designated TA-50-69 at Los Alamos National Laboratory, Los Alamos, New Mexico. The facility has been evaluated against Department of Energy (DOE) seismic criteria for Natural Phenomena Hazards (NPH) Performance Category II (PC 2). The seismic capacities of two subsystems within the WCRR building, the material handling glove box and the lift rack immediately adjacent to the glove box, are also documented, and the results are presented.

  12. Masonry Infilling Effect On Seismic Vulnerability and Performance Level of High Ductility RC Frames

    SciTech Connect

    Ghalehnovi, M.; Shahraki, H.

    2008-07-08

    In recent years researchers have preferred behavior-based design of structures to force-based design for the engineering and construction of earthquake-resistant structures; this method is named performance-based design. The main goal of this method is the design of structural members for a certain performance or behavior. On the other hand, in most buildings the load-bearing frames are infilled with masonry materials, which leads to considerable changes in the mechanical properties of the frames. Usually, however, the effect of infilling walls has been ignored in nonlinear analysis of structures because of the complexity of the problem and the lack of a simple rational solution. As a result, the lateral stiffness, strength, ductility and performance of the structure are computed with less accuracy. In this paper, using a smooth hysteretic model for masonry infills, several high-ductility RC frames (4 and 8 stories, with 1, 2 and 3 spans) designed according to the Iranian code are considered. They have been analyzed by the nonlinear dynamic method in two states, with and without infills. Their performance has then been determined with the criteria of ATC 40 and compared with the performance recommended in the Iranian seismic code (Standard No. 2800).

  13. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. On CRAN (Comprehensive R Archive Network, http://www.r-project.org/), currently available packages related to seismic analysis are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  14. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    NASA Astrophysics Data System (ADS)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must work with proper performance at all times, especially after disasters. The resilience of coastal utilities after earthquakes and tsunamis is of major importance for efficient and proper rescue and recovery operations soon after the disaster. Istanbul is a mega city with various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore, the vulnerability of the Yenikapi region of Istanbul to tsunamis and other marine hazards is an important issue. In this study, a methodology of vulnerability analysis under tsunami attack is proposed, with applications to the Yenikapi region. In the study, the high resolution (1 m) GIS database of Istanbul Metropolitan Municipality (IMM) is used and analyzed by GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructures in the study area are obtained for tsunami numerical modeling of the study area. GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region due to two different classifications, i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation

  15. Betweenness as a Tool of Vulnerability Analysis of Power System

    NASA Astrophysics Data System (ADS)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-06-01

    Complex network theory finds application in the analysis of power grids, as the two share some common characteristics, and it can be used to identify critical elements of a power network. Since the vulnerabilities of the individual elements determine the vulnerability of the network as a whole, in this paper the vulnerability of each element is studied using two complex network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix are included. The standard IEEE 57-bus system is studied using the above-mentioned indices, and the resulting conclusions are discussed.
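
    The purely topological index is directly available in standard graph libraries. A minimal R sketch using the igraph package on a hypothetical five-bus network, with edge weights standing in for electrical distance (extended betweenness, which also needs line limits and the PTDF matrix, is not shown):

      library(igraph)

      # Hypothetical 5-bus network; weights stand in for electrical distance
      edges <- data.frame(from   = c("B1", "B1", "B2", "B3", "B4"),
                          to     = c("B2", "B3", "B3", "B4", "B5"),
                          weight = c(1.0, 2.0, 1.0, 1.5, 1.0))
      g <- graph_from_data_frame(edges, directed = FALSE)

      # How many weighted shortest paths pass through each bus / line
      betweenness(g, weights = E(g)$weight)
      edge_betweenness(g, weights = E(g)$weight)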

  16. Vulnerability Analysis and Evaluation of Urban Road System in Tianjin

    NASA Astrophysics Data System (ADS)

    Liu, Y. Q.; Wu, X.

    In recent years, with the development of the economy, road construction in China has entered a period of rapid growth. The road transportation network has been expanding and the risk of disasters is increasing. In this paper we study the vulnerability of the urban road system in Tianjin. After analyzing the main risk factors for urban road system security, including road construction, road traffic and the natural environment, we propose an evaluation index for the vulnerability of the urban road system and establish the corresponding evaluation index system. Based on the results of the analysis and a comprehensive evaluation, we propose improvement measures and suggestions that may reduce the vulnerability of the road system and improve its safety and reliability.

  17. Reservoir permeability from seismic attribute analysis

    SciTech Connect

    Silin, Dmitriy; Goloshubin, G.; Silin, D.; Vingalov, V.; Takkand, G.; Latfullin, M.

    2008-02-15

    For a porous, fluid-saturated medium, Biot's theory of poroelasticity predicts movement of the pore fluid relative to the skeleton as a seismic wave propagates through the medium. This phenomenon opens an opportunity for investigating the flow properties of hydrocarbon-saturated reservoirs. It is well known that relative fluid movement becomes negligible at seismic frequencies if the porous material is homogeneous and well cemented; in this case the theory predicts far weaker seismic velocity dispersion and attenuation than is commonly observed. Based on Biot's theory, Helle et al. (2003) numerically demonstrated the substantial effects of heterogeneous permeability and saturation on both velocity and attenuation. Besides the fluid-flow effect, scattering effects (Gurevich et al., 1997) play a very important role in finely layered porous rocks with heterogeneous fluid saturation. We have used both the fluid-flow and scattering effects to derive a frequency-dependent seismic attribute that is proportional to fluid mobility, and we have applied it to the analysis of reservoir permeability.
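
    As a heavily simplified illustration of such an attribute: in low-frequency asymptotics of this kind, the reflection coefficient behaves approximately as R(w) ~ R0 + R1*sqrt(w), with R1 scaling with fluid mobility, so the slope of the reflection amplitude against sqrt(frequency) near the low end of the band can serve as a mobility proxy. The numbers below are synthetic, not field data:

      # Synthetic low-frequency reflection amplitudes; purely illustrative
      f    <- seq(5, 40, by = 5)    # Hz
      refl <- 0.12 + 0.015 * sqrt(f) + rnorm(length(f), sd = 1e-3)

      # Attribute = slope of |R| versus sqrt(f); larger ~ higher mobility
      fit <- lm(refl ~ sqrt(f))
      coef(fit)[2]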

  18. GIS modeling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran using multi-criteria decision-making techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2015-03-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damage related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers), and categorizing them into the four indicators of geotechnical, structural, social and physical distance (to needed facilities and from dangerous ones) provides a better and more accurate outcome. To this end, this paper uses the analytic hierarchy process to weigh the importance of criteria and alternatives, and a geographical information system to study the vulnerability of Tehran to an earthquake. The study focuses on the fact that Tehran is surrounded by three major active faults: Mosha, North Tehran and Rey. To determine the vulnerability comprehensively, three scenarios are developed. In each scenario, the seismic vulnerability of the different areas of Tehran is analyzed and classified into four levels: high, medium, low and safe. The results show that, with respect to seismic vulnerability, the Mosha, North Tehran and Rey faults make 6, 16 and 10% of Tehran highly vulnerable, respectively, while 34, 14 and 27% remains safe.
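
    In AHP, the criterion weights are the principal eigenvector of a pairwise-comparison matrix, checked for consistency. A compact R sketch with a hypothetical 3x3 comparison of three indicators (the study's actual comparison values are not reproduced here):

      # Hypothetical Saaty pairwise-comparison matrix (reciprocal by construction)
      A <- rbind(c(1,   3,   5),
                 c(1/3, 1,   2),
                 c(1/5, 1/2, 1))

      e <- eigen(A)
      w <- Re(e$vectors[, 1]); w <- w / sum(w)   # priority weights
      lambda_max <- Re(e$values[1])

      n  <- nrow(A)
      CI <- (lambda_max - n) / (n - 1)   # consistency index
      CR <- CI / 0.58                    # random index RI = 0.58 for n = 3
      round(w, 3); round(CR, 3)          # CR < 0.1 is conventionally acceptable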

  19. GIS modelling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran city using multi-criteria decision-making (MCDM) techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2014-09-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damage related to earthquake occurrence. Therefore, taking into account factors such as the peak ground acceleration (PGA) at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers), and categorizing them under the four indicators of geotechnical, structural, social and physical distance (to needed facilities and from dangerous ones) provides a better and more accurate outcome. To this end, the analytic hierarchy process (AHP) is used to determine the relative importance of criteria and alternatives, and a geographical information system (GIS) is used to study the vulnerability of the Tehran metropolitan area to an earthquake. The study focuses on the fact that Tehran is surrounded by three major active faults: Mosha, North Tehran and Rey. To determine the vulnerability comprehensively, three scenarios are developed. In each scenario, the seismic vulnerability of the different areas of Tehran is analysed and classified into four levels: high, medium, low and safe. The results show that, with respect to seismic vulnerability, the Mosha, North Tehran and Rey faults make 6, 16 and 10% of the Tehran area highly vulnerable, respectively, while 34, 14 and 27% remains safe.

  20. Vulnerability analysis for complex networks using aggressive abstraction.

    SciTech Connect

    Colbaugh, Richard; Glass, Kristin L.

    2010-06-01

    Large, complex networks are ubiquitous in nature and society, and there is great interest in developing rigorous, scalable methods for identifying and characterizing their vulnerabilities. This paper presents an approach for analyzing the dynamics of complex networks in which the network of interest is first abstracted to a much simpler, but mathematically equivalent, representation; the required analysis is performed on the abstraction; and the analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit vulnerability-preserving, finite-state abstractions, and we develop efficient algorithms for computing these abstractions. We then propose a vulnerability analysis methodology which combines these finite-state abstractions with formal analytics from theoretical computer science to yield a comprehensive vulnerability analysis process for networks of real-world scale and complexity. The potential of the proposed approach is illustrated with a case study involving a realistic electric power grid model, and also with brief discussions of biological and social network examples.

  1. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.

  2. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites for the surroundings of Katerloch cave, Austria

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Kalmár, János

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how should we react to this natural hazard, given that potential damage is huge and the infrastructure costs of addressing the hazard are huge as well? Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test probabilistic seismic hazard assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Such information would be very valuable even if it concerned only a single site. Long-term information can in principle be gained from intact stalagmites in natural karstic caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on a case study from the Katerloch cave close to the city of Graz, Austria. A specially shaped (candlestick-style: high, slim, and more or less cylindrical) intact and vulnerable stalagmite (IVSTM) in the Katerloch cave was examined in 2013 and 2014. This IVSTM is suitable for estimating an upper limit on the horizontal peak ground acceleration generated by prehistoric earthquakes. For this cave, extensive age information is available (e.g., Boch et al., 2006, 2010). The approach used in our study yields significant new constraints on seismic hazard, as the intactness of the stalagmites suggests that tectonic structures close to the Katerloch cave, in particular the Mur-Mürz fault, did not generate very strong paleoearthquakes in the last few thousand years.
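
    To first order, the critical acceleration of such a candlestick stalagmite can be estimated by treating it as an elastic cantilever of uniform circular cross-section under a horizontal inertial load: the base bending stress reaches the tensile strength sigma_t when a = sigma_t*r/(2*rho*H^2). This static sketch ignores dynamic amplification and irregular geometry, and the numbers are purely illustrative, not values from the study:

      # Idealized cylindrical cantilever: base bending stress = 2*rho*a*H^2/r
      a_crit <- function(sigma_t, rho, H, r) sigma_t * r / (2 * rho * H^2)

      # Illustrative inputs: tensile strength 1 MPa, density 2500 kg/m^3,
      # height 3 m, radius 2.5 cm
      a <- a_crit(sigma_t = 1e6, rho = 2500, H = 3, r = 0.025)
      c(a_ms2 = a, a_g = a / 9.81)   # about 0.56 m/s^2, i.e. roughly 0.06 g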

  3. Seismic Vulnerability Evaluations Within The Structural And Functional Survey Activities Of The COM Bases In Italy

    SciTech Connect

    Zuccaro, G.; Cacace, F.; Albanese, V.; Mercuri, C.; Papa, F.; Pizza, A. G.; Sergio, S.; Severino, M.

    2008-07-08

    The paper describes technical and functional surveys of COM (Mixed Operative Centre) buildings. This activity has been under way since 2005, with the contribution of both the Italian Civil Protection Department and the Regions involved. The project aims to evaluate the efficiency of COM buildings, checking not only structural, architectural and functional characteristics but also paying attention to the vulnerability of the surrounding building stock, road networks, railways, harbours, airports, the morphological and hydro-geological characteristics of the area, hazardous activities, etc. The first survey was performed in eastern Sicily, before the European Civil Protection exercise 'EUROSOT 2005'. Then, from 2006, a new survey campaign started in the Abruzzo, Molise, Calabria and Puglia Regions. The most important issue of the activity was the vulnerability assessment. This paper therefore deals with a more refined vulnerability evaluation technique, the SAVE methodology, developed in the first task of the SAVE project within the GNDT-DPC programme 2000-2002 (Zuccaro, 2005); the SAVE methodology has already been employed successfully in previous studies (e.g., the school buildings intervention programme at national scale; the list of strategic public buildings in Campania, Sicilia and Basilicata). In this paper, data elaborated by the SAVE methodology are compared with expert evaluations derived from direct inspections of COM buildings. This is a useful exercise for improving either the survey forms or the methodology for the quick assessment of vulnerability.

  4. Seismic Vulnerability Evaluations Within The Structural And Functional Survey Activities Of The COM Bases In Italy

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Albanese, V.; Cacace, F.; Mercuri, C.; Papa, F.; Pizza, A. G.; Sergio, S.; Severino, M.

    2008-07-01

    The paper describes technical and functional surveys of COM (Mixed Operative Centre) buildings. This activity has been under way since 2005, with the contribution of both the Italian Civil Protection Department and the Regions involved. The project aims to evaluate the efficiency of COM buildings, checking not only structural, architectural and functional characteristics but also paying attention to the vulnerability of the surrounding building stock, road networks, railways, harbours, airports, the morphological and hydro-geological characteristics of the area, hazardous activities, etc. The first survey was performed in eastern Sicily, before the European Civil Protection exercise "EUROSOT 2005". Then, from 2006, a new survey campaign started in the Abruzzo, Molise, Calabria and Puglia Regions. The most important issue of the activity was the vulnerability assessment. This paper therefore deals with a more refined vulnerability evaluation technique, the SAVE methodology, developed in the first task of the SAVE project within the GNDT-DPC programme 2000-2002 (Zuccaro, 2005); the SAVE methodology has already been employed successfully in previous studies (e.g., the school buildings intervention programme at national scale; the list of strategic public buildings in Campania, Sicilia and Basilicata). In this paper, data elaborated by the SAVE methodology are compared with expert evaluations derived from direct inspections of COM buildings. This is a useful exercise for improving either the survey forms or the methodology for the quick assessment of vulnerability.

  5. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  6. Natural Time Analysis of Seismicity: Recent Results

    NASA Astrophysics Data System (ADS)

    Varotsos, P.; Uyeda, S.; Sarlis, N. V.; Skordas, E. S.; Nagao, T.; Kamogawa, M.

    2013-12-01

    Natural time analysis, introduced almost a decade ago [1], may uncover novel dynamic features hidden in the time series of complex systems and has been applied [2] to diverse fields. For a time series comprising N events, the natural time for the occurrence of the k-th event of energy Qk is defined by χk = k/N, and the analysis studies the evolution of the pair (χk, pk), where pk = Qk/ΣQn is the normalized energy. In natural time analysis of seismicity, the variance κ1 of natural time χ weighted by pk, calculated from seismic catalogues, serves as an order parameter [2]. The Japanese seismic catalogue was analyzed in natural time by employing a sliding natural-time window of fixed length, comprising the number of events that would occur in a few months. This is a crucial time scale, since it corresponds to the average lead time of the observed Seismic Electric Signals (SES) activities [2]. The following results are obtained. First, the fluctuations of the order parameter of seismicity exhibit [3] a clearly detectable minimum approximately at the time of the initiation of the pronounced SES activity observed [4] almost two months before the onset of the volcanic-seismic swarm activity in 2000 in the Izu Island region, Japan. This is the first time that, before the occurrence of major earthquakes, anomalous changes have been found to appear almost simultaneously in two different geophysical observables. Second, these two phenomena were shown to be linked in space as well [3]. Third, minima of the order-parameter fluctuations of seismicity were observed [5] a few months before all shallow earthquakes of magnitude 7.6 or larger that occurred from 1 January 1984 to 11 March 2011 (the day of the M9 Tohoku earthquake) in the Japanese area. Among these minima, the minimum before the M9 Tohoku earthquake was the deepest. Additional recent results are presented which shed more light on the importance of the aforementioned minima for earthquake prediction purposes. [1] Varotsos, P. A
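
    From the definitions above, the order parameter is simply the pk-weighted variance of natural time, κ1 = Σ pk·χk² − (Σ pk·χk)². A direct R transcription, applied to a synthetic sequence of event energies:

      # kappa1: variance of chi_k = k/N weighted by p_k = Q_k / sum(Q)
      kappa1 <- function(Q) {
        N   <- length(Q)
        chi <- seq_len(N) / N
        p   <- Q / sum(Q)
        sum(p * chi^2) - sum(p * chi)^2
      }

      set.seed(1)
      Q <- rexp(200)^2   # synthetic event "energies"
      kappa1(Q)          # the natural-time literature reports ~0.070 at criticality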

  7. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    NASA Astrophysics Data System (ADS)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis is of major importance for efficient and proper rescue and recovery operations soon after a disaster, and vulnerability assessment of coastal areas under extreme events is of major importance for preparedness and the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities, such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly along the north coast of the Sea of Marmara in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for the Haydarpasa port in Istanbul have been presented previously by Cankaya et al. (2015) and Aytore et al. (2015) within the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. The Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Sea of Marmara from 28.77°E to 28.89°E. The high-resolution spatial dataset of the Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructure in the district are collated and utilized for tsunami numerical modeling and the subsequent vulnerability analysis. The tsunami parameters for deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability and resilience assessment parameters for the district are defined and scored by implementing a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (the level of flow depth from the simulations) and the TVA results at every location in the Bakirkoy district. The preliminary results are presented and discussed.

  8. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    SciTech Connect

    Kersulis, Jonas; Hiskens, Ian; Chertkov, Michael; Backhaus, Scott N.; Bienstock, Daniel

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind-generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind-forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  9. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
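
    Programs of this kind implement the classical Cornell-McGuire hazard integral: the annual rate of exceeding a ground-motion level a is nu * integral of P(A > a | m, r) f(m) dm, summed over sources, with a Poisson conversion to exceedance probability. A toy single-source R sketch with entirely hypothetical attenuation coefficients:

      # Toy hazard curve: one source at fixed distance; coefficients hypothetical
      nu    <- 0.2                  # mean annual rate of events with M >= mmin
      mmin  <- 5.0; mmax <- 7.5; beta <- 2.0   # truncated Gutenberg-Richter
      r     <- 30                   # source-site distance, km
      sd_ln <- 0.6                  # lognormal sigma of the attenuation model

      m  <- seq(mmin, mmax, length.out = 200)
      fm <- beta * exp(-beta * (m - mmin)) / (1 - exp(-beta * (mmax - mmin)))
      ln_med <- -4.0 + 0.9 * m - 1.2 * log(r)   # hypothetical ln(median PGA, g)

      haz <- function(a) {          # annual rate of exceeding level a (in g)
        p_exc <- 1 - pnorm((log(a) - ln_med) / sd_ln)
        nu * sum(p_exc * fm) * diff(m)[1]
      }
      a_grid <- c(0.05, 0.1, 0.2, 0.4)
      rates  <- sapply(a_grid, haz)
      cbind(a_g = a_grid, annual_rate = rates, p50yr = 1 - exp(-50 * rates))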

  10. Seismic Vulnerability Assessment for Montreal -An Application of HAZUS-MH4

    NASA Astrophysics Data System (ADS)

    Yu, Keyan

    2011-12-01

    Seismic loss estimation for Montreal, Canada, is performed for a 2%-in-50-years seismic hazard using the HAZUS-MH4 tool developed by the US Federal Emergency Management Agency. The software is adapted to accept a Canadian setting for the Montreal study region, which includes 522 census tracts. The accuracy of loss estimation with HAZUS depends on the quality and quantity of data collection and preparation. The data collected for the Montreal study region comprise: (1) the building inventory; (2) hazard maps for soil amplification, liquefaction and landslides; (3) population distribution at three different times of day; (4) census demographic information; and (5) synthetic ground-motion contour maps using three different ground-motion prediction equations. All these data are prepared and assembled into geodatabases compatible with the HAZUS software. The study estimates that roughly 5% of the building stock would be damaged in the 2%-in-50-years scenario, with direct economic losses of 1.4 billion dollars. The maximum number of casualties for this scenario corresponds to a time of occurrence of 2 pm and amounts to approximately 500 people injured. Epistemic uncertainty was considered by obtaining damage estimates for three attenuation functions developed for Eastern North America. The results indicate that loss estimates are highly sensitive to the choice of attenuation function, suggesting that epistemic uncertainty should be considered both in the definition of the hazard function and in loss-estimation methodologies. The next steps in the study should be to increase the size of the survey area to Greater Montreal, which includes more than 3 million inhabitants, and to perform more targeted studies for critical areas such as downtown Montreal and the south-eastern tip of Montreal. The current study was performed mainly for the built environment; the next phase will need to

  11. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that, for 2%-exceedance-in-50-years hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to the variability of PGA and 0.2 and 1.0 s Sa hazard is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground-motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M > 6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length.
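
    The Monte Carlo treatment amounts to drawing one branch per uncertain input on each pass, computing the hazard, and summarizing the spread. A schematic R sketch with hypothetical branch values and weights and a placeholder hazard function:

      # Schematic logic-tree sampling; all branch values and weights hypothetical
      set.seed(42)
      n     <- 5000
      mag   <- sample(c(7.3, 7.7, 8.1),  n, replace = TRUE, prob = c(.3, .4, .3))
      recur <- sample(c(250, 500, 1000), n, replace = TRUE, prob = c(.25, .5, .25))
      att   <- sample(1:3, n, replace = TRUE)   # attenuation-relation branch

      # Placeholder "hazard" response (not a real ground-motion model):
      pga <- exp(-3 + 0.8 * mag - 0.3 * log(recur) + c(-0.2, 0, 0.2)[att])

      c(mean = mean(pga), sd = sd(pga), COV = sd(pga) / mean(pga))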

  12. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya form a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974; and the 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. While much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach to identify spatial differences in seismicity, which can form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data. The earthquake catalogue covers a time span from 1930 to 2007 and
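
    Schematically, K-means can be run directly on event coordinates (optionally augmented with depth or magnitude). A minimal R sketch on synthetic epicenters:

      # K-means partitioning of synthetic epicenters into 4 spatial clusters
      set.seed(7)
      eq <- rbind(cbind(rnorm(80, 67, 0.5), rnorm(80, 30, 0.5)),
                  cbind(rnorm(80, 73, 0.5), rnorm(80, 34, 0.5)),
                  cbind(rnorm(80, 71, 0.5), rnorm(80, 36, 0.5)),
                  cbind(rnorm(80, 63, 0.5), rnorm(80, 25, 0.5)))
      colnames(eq) <- c("lon", "lat")

      km <- kmeans(eq, centers = 4, nstart = 25)
      table(km$cluster)
      plot(eq, col = km$cluster, pch = 16,
           main = "K-means seismicity partitions (synthetic)")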

  13. Analytical and Experimental Assessment of Seismic Vulnerability of Beam-Column Joints without Transverse Reinforcement in Concrete Buildings

    NASA Astrophysics Data System (ADS)

    Hassan, Wael Mohammed

    Beam-column joints in concrete buildings are key components for ensuring the structural integrity of building performance under seismic loading. Earthquake reconnaissance has reported the substantial damage that can result from inadequate beam-column joints; in some cases, failure of older-type corner joints appears to have led to building collapse. Since the 1960s, many advances have been made to improve the seismic performance of building components, including beam-column joints. New design and detailing approaches are expected to produce new construction that will perform satisfactorily during strong earthquake shaking. Much less attention has been focused on the beam-column joints of older construction, which may be seismically vulnerable. Concrete buildings constructed prior to the development of ductile detailing requirements in the 1970s normally lack joint transverse reinforcement. The available literature concerning the performance of such joints is relatively limited, but concerns about their performance exist. The current study aimed to improve understanding and assessment of the seismic performance of unconfined exterior and corner beam-column joints in existing buildings. An extensive literature survey was performed, leading to the development of a database of about a hundred tests. Study of the data enabled identification of the most important parameters and the effect of each parameter on seismic performance. The available analytical models and guidelines for assessing the strength and deformability of unconfined joints were surveyed and evaluated; in particular, the ASCE 41 existing-building document proved to be substantially conservative in estimating joint shear strength. Upon identifying deficiencies in these models, two new joint shear strength models, a bond capacity model, and two axial capacity models designed and tailored specifically for unconfined beam-column joints were developed. The proposed models correlated strongly with previous test results. In the laboratory testing phase of

  14. Structural reliability analysis and seismic risk assessment

    SciTech Connect

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for the safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability over the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with zero mean and a Kanai-Tajimi spectrum. The overall seismic hazard at the site, represented by a hazard curve, is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. Using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and earthquake ground acceleration are presented, and a fragility curve for PRA studies is constructed.
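
    The Kanai-Tajimi spectrum mentioned above is the standard filtered-white-noise model of ground acceleration, S(w) = S0*(wg^4 + 4*zg^2*wg^2*w^2) / ((wg^2 - w^2)^2 + 4*zg^2*wg^2*w^2). A short R sketch using firm-ground parameters often quoted in the literature (illustrative, not the paper's values):

      # Kanai-Tajimi one-sided PSD of ground acceleration
      S_KT <- function(w, S0, wg, zg) {
        S0 * (wg^4 + 4 * zg^2 * wg^2 * w^2) /
             ((wg^2 - w^2)^2 + 4 * zg^2 * wg^2 * w^2)
      }

      w <- seq(0.1, 60, by = 0.1)   # rad/s
      plot(w, S_KT(w, S0 = 0.01, wg = 15.6, zg = 0.6), type = "l",
           xlab = "omega (rad/s)", ylab = "S(omega)",
           main = "Kanai-Tajimi spectrum (illustrative parameters)")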

  15. A vulnerability analysis for a drought vulnerable catchment in South-Eastern Austria

    NASA Astrophysics Data System (ADS)

    Hohmann, Clara; Kirchengast, Gottfried; Birk, Steffen

    2016-04-01

    To detect uncertainties and thresholds in a drought-vulnerable region, we focus on a typical river catchment of the Austrian south-eastern Alpine forelands with good data availability, the Raab valley. This mid-latitude region in the south-east of the Austrian state of Styria (~47° N, ~16° E) exhibits a strong temperature increase over recent decades; the mean summer temperatures (June to August) in particular show a strong increase (~0.7 °C per decade) over the period 1971-2015 (Kabas et al., Meteorol. Z. 20, 277-289, 2011; pers. comm., 2015). The Styrian Raab valley, with a catchment size of 986 km², has already struggled with drought periods (e.g., the summers of 1992, 2001 and 2003). Thus, it is important to know what happens if warm and dry periods occur more frequently. We therefore analyze which sensitivities and related uncertainties exist, which thresholds might be crossed, and what the effects on the different components of the water balance equation are, in particular on runoff, soil moisture, groundwater recharge, and evapotranspiration. We use the mainly physics-based hydrological Water Flow and Balance Simulation Model (WaSiM), developed at ETH Zurich (Schulla, Diss., ETH Zurich, CH, 1997). The model is well established and widely used for hydrological modeling at a diversity of spatial and temporal resolutions. We choose a model setup that is as simple as possible but as complex as necessary to perform sensitivity studies on uncertainties and thresholds in the context of climate change. In order to assess the model performance under a wide range of conditions, calibration and validation are performed with a split sample of dry and wet periods. With the calibrated and validated model we perform a low-flow vulnerability analysis ("stress test") focused on drought-related conditions. We therefore simulate changes in weather and climate (e.g., 20% and 50% less precipitation, 2 °C and 5 °C higher temperature), changes in land use and

  16. Seismic vulnerability evaluation of axially loaded steel built-up laced members I: experimental results

    NASA Astrophysics Data System (ADS)

    Lee, Kangmin; Bruneau, Michel

    2008-06-01

    An experimental program was initiated to investigate the seismic performance of built-up laced steel brace members. Quasi-static testing of twelve typical steel built-up laced member (BLM) specimens was conducted. These were designed to span a range of parameters typically encountered for such members, based on a survey of the shapes and details that have historically been used. The specimens were subdivided into groups of three different cross-sectional shapes, namely a built-up I-shaped section and built-up box shapes buckling about the x or the y axis. Within each group, the global and local buckling slenderness ratios had kl/r values of 60 or 120 and b/t ratios of 8 or 16. The specific inelastic cyclic behavior germane to each specimen, and general observations on overall member hysteretic behavior as a function of the considered parameters, are reported. A companion paper (Lee and Bruneau 2008) investigates this observed response against predictions from analytical models, and examines the behavior from the perspective of system performance.

  17. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Justin Coleman; Piyush Sabharwall

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards such as flooding and earthquakes have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk, which is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Opportunity therefore exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for light water reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE-4. Opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.

  18. Nonlinear Seismic Analysis of Morrow Point Dam

    SciTech Connect

    Noble, C R; Nuss, L K

    2004-02-20

    This research and development project was sponsored by the United States Bureau of Reclamation (USBR), which is best known for the dams, power plants, and canals it constructed in the 17 western states. The mission statement of the USBR's Dam Safety Office, located in Denver, Colorado, is "to ensure Reclamation dams do not present unacceptable risk to people, property, and the environment." The Dam Safety Office does this by quickly identifying the dams which pose an increased threat to the public, and by quickly completing the related analyses in order to make decisions that will safeguard the public and associated resources. The research study described in this report constitutes one element of USBR's research and development work to advance their computational and analysis capabilities for studying the response of dams to strong earthquake motions. This project focused on the seismic response of Morrow Point Dam, which is located 263 km southwest of Denver, Colorado.

  19. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk from criminal hackers. Presently, most vulnerability research uses data from software vendors and the National Vulnerability Database (NVD). We propose an alternative path forward by grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data are always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
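
    Checking log-normal arrivals against exponential deletions is straightforward with maximum-likelihood fits. A sketch in R on synthetic data using MASS::fitdistr (a real analysis would use the parsed vulnerability feeds):

      library(MASS)   # fitdistr()

      set.seed(3)
      arrivals  <- rlnorm(500, meanlog = 2, sdlog = 0.8)   # synthetic daily counts
      deletions <- rexp(500, rate = 0.15)                  # synthetic daily counts

      fitdistr(arrivals,  "lognormal")$estimate    # meanlog, sdlog
      fitdistr(deletions, "exponential")$estimate  # rate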

  20. A transferable approach towards rapid inventory data capturing for seismic vulnerability assessment using open-source geospatial technologies

    NASA Astrophysics Data System (ADS)

    Wieland, M.; Pittore, M.; Parolai, S.; Zschau, J.

    2012-04-01

    Geospatial technologies are increasingly being used in pre-disaster vulnerability assessment and post-disaster impact assessment for different types of hazards. The use of remote sensing data in particular has been strongly promoted in recent years due to its capability of providing up-to-date information over large areas at comparatively low cost, with increasingly high spatial, temporal and spectral resolution. Despite its clear potential, a purely remote-sensing-based approach has its limitations, in that it is only capable of providing a bird's-eye view of the objects of interest. The use of omnidirectional imaging in addition can provide the necessary street view, which furthermore allows a rapid visual screening of a building's façade. In this context, we propose an integrated approach to rapid inventory data capturing for the assessment of the structural vulnerability of buildings in the case of an earthquake. Globally available low-cost data sources are preferred, and the tools are developed on an open-source basis to allow a high degree of transferability and usability. On a neighbourhood scale, satellite images of medium spatial but high temporal and spectral resolution are analysed to outline areas of homogeneous urban structure. Following a proportional allocation scheme, representative sample areas are selected for each urban structure type for a more detailed analysis of the building stock with high-resolution image data. On a building-by-building scale, a ground-based rapid visual survey is performed using an omnidirectional imaging system driven around in a car inside the identified sample areas. Processing of the acquired images allows extraction of vulnerability-related features of single buildings (e.g., building height, detection of soft storeys). An analysis of high-resolution satellite images provides further inventory features (e.g., footprint area, shape irregularity). Since we are dealing with information coming from

  1. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic initiating event (IE) analysis is one of the most important tasks controlling the level of effort and quality of a whole seismic probabilistic risk assessment (SPRA). The typical problems relate to the following aspects: how the internal PRA model and its complexity can be used; how to control the number of PRA components for which fragility evaluation should be performed; and, finally, how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions depend strongly on the possibility of improving the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  2. An Analysis of the Mt. Meron Seismic Array

    SciTech Connect

    Pasyanos, M E; Ryall, F

    2008-01-10

    We have performed a quick analysis of the Mt. Meron seismic array for monitoring regional seismic events in the Middle East. The Meron array is the only current array in the Levant and Arabian Peninsula and, as such, might be useful in contributing to event location, identification, and other analyses. Here, we provide a brief description of the array and a review of the travel-time and array analysis done to assess its performance.

  3. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Assessment Program (GSHAP), there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this particular type of region, seismic hazard assessment is a very hard task due to the limited amount of data available on the seismic sources, the earthquake catalogue, and ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in south-east Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. A couple of events with magnitude greater than 3 have been recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. Analysis of the S-wave Fourier spectra can help to better resolve source, path and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectrum; these magnitudes are of utmost importance for building a catalogue that is homogeneous in terms of moment magnitude. At the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to
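
    The step from the flat low-frequency part of the S-wave displacement spectrum to moment magnitude is a standard one: the plateau Omega0 gives the seismic moment through the far-field point-source formula, and Mw follows from Hanks and Kanamori (1979). A sketch in R in which the density, shear velocity, radiation-pattern and free-surface factors are illustrative assumptions, not values from the study:

      # M0 from the displacement plateau Omega0 (in m*s), far-field S wave:
      # M0 = 4*pi*rho*beta^3*R*Omega0 / (Rtp * fs)
      M0_from_plateau <- function(Omega0, R, rho = 2700, beta = 3500,
                                  Rtp = 0.55, fs = 2) {
        4 * pi * rho * beta^3 * R * Omega0 / (Rtp * fs)
      }

      Mw_from_M0 <- function(M0) (2 / 3) * (log10(M0) - 9.1)  # M0 in N*m

      M0 <- M0_from_plateau(Omega0 = 1e-6, R = 150e3)  # illustrative inputs
      c(M0_Nm = M0, Mw = Mw_from_M0(M0))               # about Mw 3.5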

  4. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  5. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade ('domino') effects. The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper through a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined, by means of a specifically developed code, in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
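
    In QRA practice, a probit function maps an intensity measure to a failure probability via Y = k1 + k2*ln(PGA) and P = Phi(Y - 5). A tiny R sketch with hypothetical coefficients for an anchored atmospheric tank (not the paper's fitted values):

      # Probit-style seismic fragility: Y = k1 + k2*ln(PGA), P = pnorm(Y - 5)
      p_fail <- function(pga_g, k1 = 6.5, k2 = 1.2) {  # hypothetical k1, k2
        pnorm(k1 + k2 * log(pga_g) - 5)
      }

      pga <- c(0.1, 0.2, 0.3, 0.5)   # g
      round(cbind(pga_g = pga, P_fail = p_fail(pga)), 3)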

  6. Stochastic seismic analysis in the Messina strait area

    SciTech Connect

    Cacciola, P.; Maugeri, N.; Muscolino, G.

    2008-07-08

    Since the 1908 Messina earthquake, significant progress has been made in the field of earthquake engineering. Seismic action is usually represented via the so-called elastic response spectrum, or alternatively by time histories of ground-motion acceleration. Due to the random nature of seismic action, alternative representations model it as a zero-mean Gaussian process fully defined by the so-called power spectral density function. The aim of this paper is a comparative study of the response of linearly behaving structures under the above representations of seismic action, using earthquakes recorded in the Messina strait area. In this regard, a handy method for determining the power spectral density function of recorded earthquakes is proposed. Numerical examples conducted on an existing space truss located in Torre Faro (Messina) show the effectiveness of the stochastic approach in coping with the seismic analysis of structures.
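
    A PSD estimate for a recorded accelerogram can be obtained with standard periodogram smoothing. A minimal base-R sketch on a synthetic record (a real application would use the paper's own method and actual recordings):

      # Smoothed-periodogram PSD of a synthetic accelerogram
      set.seed(11)
      dt  <- 0.01                          # s, sampling interval
      acc <- ts(rnorm(4096), deltat = dt)  # white-noise stand-in record

      psd <- spectrum(acc, spans = c(11, 11), plot = FALSE)  # Daniell smoothing
      plot(psd$freq, psd$spec, type = "l", log = "y",
           xlab = "frequency (Hz)", ylab = "PSD",
           main = "Smoothed periodogram (synthetic record)")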

  7. A graph-based system for network-vulnerability analysis

    SciTech Connect

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
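
    The shortest-path step works because multiplying independent per-arc success probabilities is equivalent to summing weights of -log(p). A small R sketch with igraph on a hypothetical four-node attack graph:

      library(igraph)

      # Hypothetical attack graph; arcs carry probabilities of success
      arcs <- data.frame(from = c("outside", "outside", "webserver", "db"),
                         to   = c("webserver", "db",    "db",        "admin"),
                         p    = c(0.8, 0.1, 0.6, 0.5))
      g <- graph_from_data_frame(arcs, directed = TRUE)
      E(g)$weight <- -log(E(g)$p)   # most likely path = weighted shortest path

      sp <- shortest_paths(g, from = "outside", to = "admin", output = "both")
      sp$vpath[[1]]                 # outside -> webserver -> db -> admin
      exp(-distances(g, "outside", "admin", mode = "out"))  # P(success) = 0.24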

  8. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  9. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing level of effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  10. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY TC; RINKER MW; CARPENTER BG; HENDRIX C; ABATT FG

    2009-01-15

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses. The original scope of the project was to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs; the thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). Although Milestone M-48-14 has been met, Revision 1 is being issued to address external review comments, with emphasis on changes in the modeling of the anchor bolts connecting the concrete dome and the steel primary tank. The work statement provided to M&D (PNNL 2003) required that a nonlinear soil-structure interaction (SSI) analysis be performed on the DSTs. The analysis is required to include the effects of sliding interfaces and fluid sloshing (fluid-structure interaction). SSI analysis has traditionally been treated by frequency-domain computer codes such as SHAKE (Schnabel et al. 1972) and SASSI (Lysmer et al. 1999a). Such frequency-domain programs are limited to the analysis of linear systems; because of the contact surfaces, the response of the DSTs to a seismic event is inherently nonlinear and consequently outside the range of applicability of the linear frequency-domain programs. That is, the nonlinear response of the DSTs to seismic excitation requires the use of a time-domain code. The capabilities and limitations of the commercial time-domain codes ANSYS® and MSC Dytran® for performing seismic SSI analysis of the DSTs, and the methodology required to perform the detailed seismic analysis of the DSTs, have been addressed in Rinker et al. (2006a). On the basis of the results reported in Rinker et al

  11. Seismic Damage Analysis of Aged Concrete Gravity Dams

    NASA Astrophysics Data System (ADS)

    Nayak, Parsuram; Maity, Damodar

    2013-08-01

    The design of a concrete gravity dam must provide the ability to withstand the seismic forces under which nonlinear behavior is expected. The nonlinear seismic response of a dam may change due to aging, as the concrete degrades because of environmental factors and mechanical loadings. The present study investigates the evolution of tensile damage in aged concrete gravity dams, which is necessary to estimate the safety of existing dams against future earthquake forces. The degraded material properties of the concrete with age, subjected to environmental factors and mechanical loadings, are determined by introducing an isotropic degradation index. A concrete damaged-plasticity model, which accounts for both compressive and tensile damage, is used to evaluate the nonlinear seismic response of the dam. Results show that the peak maximum principal stresses are reduced at the neck due to aging effects in the concrete. The neck region is observed to be the most vulnerable region for damage initiation in all cases of aged dams. The results also show severe damage to the structure at higher ages under seismic loading. The proposed method can help ensure the safety of dams during their entire design life, considering the environmental factors and mechanical loadings affecting the materials as they age.
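
    An isotropic degradation index of this kind typically scales stiffness and strength uniformly, e.g. E(d) = (1 - d)*E0. A short R sketch of how aged material inputs might be generated for such an analysis; the index-versus-age law below is purely illustrative:

      # Isotropic degradation: scale stiffness and tensile strength by (1 - d)
      aged_props <- function(age_yr, E0 = 30e9, ft0 = 3e6,
                             d_max = 0.3, tau = 80) {
        d <- d_max * (1 - exp(-age_yr / tau))   # illustrative ageing law
        c(age = age_yr, d = d, E = (1 - d) * E0, ft = (1 - d) * ft0)
      }

      t(sapply(c(0, 25, 50, 100), aged_props))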

  12. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist it in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. That appendix describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) ground motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinion and differing interpretations into the seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretation among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs in the revised regulation, 10 CFR 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches for characterizing the seismic events (magnitude and distance) that best represent the ground-motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
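
    The usual way to characterize such a controlling event is deaggregation: each magnitude-distance bin is weighted by its contribution to the exceedance rate, and the mean (or modal) magnitude and distance are reported. A schematic R sketch on a made-up contribution matrix:

      # Schematic deaggregation of the hazard at a target ground-motion level
      m <- c(5.5, 6.0, 6.5, 7.0)   # magnitude bins
      r <- c(10, 25, 50, 100)      # distance bins, km
      contrib <- matrix(c(0.02, 0.05, 0.03, 0.01,
                          0.06, 0.20, 0.10, 0.02,
                          0.05, 0.15, 0.12, 0.04,
                          0.02, 0.06, 0.05, 0.02),
                        nrow = 4, byrow = TRUE)   # rows = m, cols = r (made up)
      w <- contrib / sum(contrib)

      c(m_bar = sum(rowSums(w) * m),   # mean controlling magnitude
        r_bar = sum(colSums(w) * r))   # mean controlling distance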

  13. Detecting seismic activity with a covariance matrix analysis of data recorded on seismic arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N. M.; de Rosny, J.; Brenguier, F.; Landès, M.

    2016-03-01

    Modern seismic networks record ground motion continuously at the Earth's surface, providing dense spatial samples of the seismic wavefield. The aim of our study is to analyse these records with statistical array-based approaches to identify coherent time-series as a function of time and frequency. Using ideas drawn mainly from random matrix theory, we analyse the spatial coherence of the seismic wavefield from the width of the covariance matrix eigenvalue distribution. We propose a robust detection method that could be used for the analysis of weak and emergent signals embedded in background noise, such as volcanic or tectonic tremors and local microseismicity, without any prior knowledge about the studied wavefields. We apply our algorithm to the records of the seismic monitoring network of the Piton de la Fournaise volcano located on La Réunion Island and composed of 21 receivers with an aperture of ~15 km. This array recorded many teleseismic earthquakes as well as seismovolcanic events during the year 2010. We show that the analysis of the wavefield at frequencies smaller than ~0.1 Hz results in detection of the majority of teleseismic events from the Global Centroid Moment Tensor database. The seismic activity related to the Piton de la Fournaise volcano is well detected at frequencies above 1 Hz.
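
    The core statistic is compact enough to sketch. Below is a minimal NumPy illustration of the idea: form the array covariance (cross-spectral) matrix in a frequency band and summarize the spread of its eigenvalue distribution; a coherent wavefield concentrates energy in the first eigenvalue, while incoherent noise spreads it out. The band-averaging shortcut and the width statistic's normalization are simplifications, not the authors' exact processing.

      import numpy as np

      def coherence_width(traces, fs, fmin, fmax):
          """Width of the covariance-matrix eigenvalue distribution.

          traces: (n_receivers, n_samples) synchronized records.
          Small width -> one coherent source dominates the band;
          large width -> incoherent noise.
          """
          spectra = np.fft.rfft(traces, axis=1)
          freqs = np.fft.rfftfreq(traces.shape[1], d=1.0 / fs)
          X = spectra[:, (freqs >= fmin) & (freqs <= fmax)]
          cov = X @ X.conj().T / X.shape[1]      # band-averaged covariance
          w = np.linalg.eigvalsh(cov)[::-1]      # eigenvalues, descending
          w = np.clip(w, 0.0, None)
          w /= w.sum()
          return np.sum(np.arange(len(w)) * w)   # eigenvalue "center of mass"

      # 21 receivers, 60 s at 100 Hz: a coherent 0.05 Hz signal vs pure noise.
      rng = np.random.default_rng(0)
      t = np.arange(0, 60.0, 0.01)
      coherent = np.tile(np.sin(2 * np.pi * 0.05 * t), (21, 1))
      noise = rng.standard_normal((21, t.size))
      print(coherence_width(coherent + 0.1 * noise, 100.0, 0.02, 0.1))  # small
      print(coherence_width(noise, 100.0, 0.02, 0.1))                   # larger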

  14. Seismic analysis of reactor exhaust air filter compartment

    SciTech Connect

    Gong, Chung; Funderburk, E.L.; Jerrell, J.W.

    1990-09-24

    The Filter Compartment (FC) in this analysis is a generic reactor airborne activity confinement filter compartment which possesses all the essential physical and mechanical properties of the Savannah River Site (SRS) confinement filters of Reactor Buildings K, L, and P. The filters belong to the Airborne Activity Confinement System (AACS). These filters absorb a significant amount of radioactive effluents from the exhaust air. The seismic excitation is input indirectly from the output of the seismic analysis of the 105 exhaust stack building in the form of floor response spectra. However, the 105 exhaust stack building was analyzed for seismic motions defined by free-field ground response spectra with a ZPA (Zero Period Acceleration) of 0.2G for all three orthogonal components of ground motion and a shape consistent with USNRC Regulatory Guide 1.60. Based upon equivalent dynamic analysis of the FC, DuPont engineers suggested modifications to the existing FC with heavy I-section beams [1]. The scope of this "phase I" analysis, as requested by Seismic Engineering [2], is to carry out a "scoping analysis" comprising a frequency analysis and a response spectrum analysis of the FC with the DuPont-suggested conceptual modifications. Our suggestion was that the existing FC without conceptual modifications be analyzed first. However, the schedule urgency of the project, together with guidance from the previous seismic analysis, established the priority of analyzing the FC with modifications in the "phase I" calculations.

  15. Seismic refraction analysis: the path forward

    USGS Publications Warehouse

    Haines, Seth S.; Zelt, Colin; Doll, William

    2012-01-01

    Seismic Refraction Methods: Unleashing the Potential and Understanding the Limitations; Tucson, Arizona, 29 March 2012. A workshop focused on seismic refraction methods took place on 29 March 2012, in association with the 2012 Symposium on the Application of Geophysics to Engineering and Environmental Problems. This workshop was convened to assess the current state of the science and discuss paths forward, with a primary focus on near-surface problems but with an eye on all applications. The agenda included talks on these topics from a number of experts, interspersed with discussion, and a dedicated discussion period to finish the day. Discussion proved lively at times, and workshop participants delved into many topics central to seismic refraction work.

  16. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels at three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, the seismic hazard of most Algarve parishes is dominated by the seismicity located

  17. How Does Calcification Influence Plaque Vulnerability? Insights from Fatigue Analysis

    PubMed Central

    Wu, Baijian; Pei, Xuan; Li, Zhi-Yong

    2014-01-01

    Background. Calcification is commonly believed to be associated with cardiovascular disease burden. But whether or not calcifications have a negative effect on plaque vulnerability is still under debate. Methods and Results. Fatigue rupture analysis and the fatigue life were used to evaluate the rupture risk. An idealized baseline model containing no calcification was first built. Based on the baseline model, we investigated the influence of calcification on the rupture path and fatigue life by adding a circular calcification and changing its location within the fibrous cap area. Results show that 84.0% of calcified cases increase the fatigue life by up to 11.4%. For rupture paths more than 10D away from the calcification, the life change is negligible. Calcifications close to the lumen increase fatigue life more than those close to the lipid pool. Also, calcifications in the middle area of the fibrous cap increase fatigue life more than those in the shoulder area. Conclusion. Calcifications may play a positive role in plaque stability. The influence of a calcification exists only in a local area. Calcifications close to the lumen may have more influence than those close to the lipid pool. And calcifications in the middle area of the fibrous cap seemingly have more influence than those in the shoulder area. PMID:24955401

  18. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate-change-related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; we then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  19. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station in the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs prove the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider it as a desired signal.

  1. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The induced seismicity of geothermal systems during stimulation and fluid circulation can cover a wide range of effects, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so that possible impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels on different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant in time. We use a non-homogeneous seismicity model for modeling purposes, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average the peak ground motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed
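
    The adaptation is easy to make concrete. In the classical hazard integral, the exceedance rate of a ground-motion level is the seismicity rate times the magnitude-weighted probability that the predicted ground motion exceeds that level; making the rate and b-value time-dependent simply re-evaluates the integral per time window. The sketch below uses a truncated Gutenberg-Richter distribution and an invented placeholder GMPE; the coefficients, magnitude bounds, and rates are all illustrative, not the authors' model.

      import math

      def window_exceedance_prob(g, rate_t, b_t, m_min=1.5, m_max=4.5, dt_days=1.0):
          """Probability that ground motion exceeds level g (m/s^2) in one
          time window, given that window's seismicity rate (events/day)
          and b-value. Placeholder GMPE at a fixed source distance:
          median ln(GM) = -6.0 + 1.2*M, sigma_ln = 0.7.
          """
          beta = b_t * math.log(10.0)
          dm, m, nu = 0.1, m_min + 0.05, 0.0
          while m < m_max:
              # Truncated exponential Gutenberg-Richter mass of this bin.
              pm = (beta * math.exp(-beta * (m - m_min)) * dm
                    / (1.0 - math.exp(-beta * (m_max - m_min))))
              # Lognormal GMPE exceedance probability for this magnitude.
              z = (math.log(g) - (-6.0 + 1.2 * m)) / 0.7
              nu += rate_t * pm * 0.5 * math.erfc(z / math.sqrt(2.0))
              m += dm
          return 1.0 - math.exp(-nu * dt_days)

      # Same ground-motion level, two windows with different rate and b-value:
      print(window_exceedance_prob(0.2, rate_t=20.0, b_t=1.3))
      print(window_exceedance_prob(0.2, rate_t=60.0, b_t=0.9))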

  2. Seismic analysis for translational failure of landfills with retaining walls.

    PubMed

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In the seismic impact zone, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for the seismic analysis of translational failures of landfills with retaining walls. An approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on the translational failures of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases with increasing seismic coefficient, while it increases quickly with the minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients. A higher retaining wall is beneficial to the seismic stability of the landfill. Simply ignoring the retaining wall will lead to serious underestimation of the factor of safety. Besides, an approximate solution for the yield acceleration coefficient of the landfill is also presented based on the developed method. PMID:20541389
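
    The reported trends can be reproduced with a far simpler stand-in than the paper's three-part wedge: a single rigid block resting on a horizontal liner, loaded pseudo-statically. This one-wedge sketch is illustrative only and is not the authors' formulation.

      import math

      def factor_of_safety(phi_deg, kh):
          """Pseudo-static factor of safety of a rigid waste block on a
          horizontal liner: resisting force W*tan(phi) over driving
          force kh*W (a one-wedge stand-in, not the paper's method)."""
          return math.tan(math.radians(phi_deg)) / kh

      for kh in (0.05, 0.10, 0.20):
          print(f"kh={kh:.2f}  FS={factor_of_safety(12.0, kh):.2f}")
      # FS falls as the seismic coefficient grows and rises with the
      # liner friction angle, matching the trends reported above.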

  3. Seismic analysis for the Lanzhou fireball

    NASA Astrophysics Data System (ADS)

    Dailu, Rong; Yarong, Li

    2007-08-01

    In this paper, we derive the Lanzhou fireball's trajectory from seismic data according to the analytical method presented in Pujol et al. (2005, 2006). Making the same assumptions as Pujol et al., the position of the fireball's terminal burst has been determined using a relatively simple independent method. Both the trajectory and the position of the burst are roughly coincident.

  4. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria saw a great change after the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broadband, 2 very broadband, 47 short-period and 21 accelerometer sensors connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network began operating, a significant number of local, regional and teleseismic events have been located by the automatic processing, revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses and cultural noise are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Some other tools and programs allow the monitoring and maintenance of the entire electronic system, for example to check the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid determination of earthquakes, message alerts, moment tensor estimation, seismic source determination, shakemap calculation, etc. Compliance with international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to

  5. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.

  6. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
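
    For reference, the truncated exponential Gutenberg-Richter distribution mentioned above has a closed-form CDF, F(m) = (1 - exp(-β(m - m_min))) / (1 - exp(-β(m_max - m_min))) with β = b ln 10, which is easy to sketch; the parameter values below are illustrative.

      import math

      def gr_truncated_cdf(m, m_min, m_max, b):
          """CDF of the truncated exponential Gutenberg-Richter distribution."""
          beta = b * math.log(10.0)
          num = 1.0 - math.exp(-beta * (m - m_min))
          den = 1.0 - math.exp(-beta * (m_max - m_min))
          return num / den

      # Probability that an event (given m >= 4.0) does not exceed m = 6.0,
      # with illustrative values b = 1.0 and m_max = 7.5.
      print(gr_truncated_cdf(6.0, 4.0, 7.5, 1.0))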

  7. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY, T.C.

    2006-03-17

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project--DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The work statement provided to M&D (PNNL 2003) required that the seismic analysis of the DSTs assess the impacts of potentially non-conservative assumptions in previous analyses and account for the additional soil mass due to the as-found soil density increase, the effects of material degradation, additional thermal profiles applied to the full structure including the soil-structure response with the footings, the non-rigid (low frequency) response of the tank roof, the asymmetric seismic-induced soil loading, the structural discontinuity between the concrete tank wall and the support footing, and the sloshing of the tank waste. The seismic analysis considers the interaction of the tank with the surrounding soil and the effects of the primary tank contents. The DSTs and the surrounding soil are modeled as a system of finite elements. The depth and width of the soil incorporated into the analysis model are sufficient to obtain appropriately accurate analytical results. The analyses required to support the work statement differ from previous analyses of the DSTs in that the soil-structure interaction (SSI) model includes several (nonlinear) contact surfaces in the tank structure, and the contained waste must be modeled explicitly in order to capture the fluid-structure interaction behavior between the primary tank and contained

  8. A new passive seismic method based on seismic interferometry and multichannel analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Cheng, Feng; Xia, Jianghai; Xu, Yixian; Xu, Zongbo; Pan, Yudi

    2015-06-01

    We propose a new passive seismic method (PSM) based on seismic interferometry and multichannel analysis of surface waves (MASW) to meet the demand for increased investigation depth by acquiring surface-wave data in a low-frequency range (1 Hz ≤ f ≤ 10 Hz). We utilize seismic interferometry to sort common virtual source gathers (CVSGs) from ambient noise and analyze the obtained CVSGs to construct a 2D shear-wave velocity (Vs) map using the MASW. Standard ambient noise processing procedures were applied to the computation of cross-correlations. To enhance the signal-to-noise ratio (SNR) of the empirical Green's functions, a new weighted stacking method was implemented. In addition, we proposed a bidirectional shot mode based on the virtual source method to sort CVSGs repeatedly. The PSM was applied to two field data examples. For the test along the Han River levee, the results of the PSM were compared with the improved roadside passive MASW and the spatial autocorrelation method (SPAC). For the test in the Western Junggar Basin, the PSM was applied to a 70-km-long linear survey array with a prominent directional urban noise source, and a 60-km-long Vs profile reaching 1.5 km in depth was mapped. Further, a comparison of the dispersion measurements was made between the PSM and the frequency-time analysis (FTAN) technique to assess the accuracy of the PSM. These examples and comparisons demonstrate that the new method is efficient, flexible, and capable of studying near-surface velocity structures from seismic ambient noise.
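
    The interferometric step at the heart of such methods is short enough to sketch: cross-correlate simultaneous noise records at two receivers, and the result approximates the inter-receiver (empirical) Green's function. The sketch below uses one-bit normalization as a stand-in for the standard processing chain; the authors' weighted stacking scheme is not reproduced.

      import numpy as np

      def noise_crosscorrelation(a, b, max_lag):
          """Cross-correlate two ambient-noise traces (one-bit normalized).

          Returns lags (samples) and the correlation, whose causal and
          acausal sides approximate the inter-receiver Green's function.
          """
          a = np.sign(a - a.mean())  # one-bit normalization suppresses
          b = np.sign(b - b.mean())  # earthquakes and other transients
          n = len(a) + len(b) - 1
          nfft = 1 << (n - 1).bit_length()
          cc = np.fft.irfft(np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft)), nfft)
          cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))
          return np.arange(-max_lag, max_lag + 1), cc

      # Two receivers seeing the same noise with a 50-sample travel-time delay:
      rng = np.random.default_rng(1)
      src = rng.standard_normal(20000)
      lags, cc = noise_crosscorrelation(src[:-50], src[50:], max_lag=200)
      print(lags[np.argmax(cc)])  # peak at the imposed 50-sample delay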

  9. Weighted network analysis of earthquake seismic data

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit; Mukherjee, G.; Manna, S. S.

    2015-09-01

    Three different earthquake seismic data sets are used to construct earthquake networks following the prescription of Abe and Suzuki (2004). It has been observed that different links of this network appear with very different strengths. This prompted us to extend the study of earthquake networks by treating them as weighted networks. Several properties of the weighted network are found to be quite different from those of the unweighted counterpart.
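
    The Abe-Suzuki construction is simple to state in code: grid the region into cells, let each cell containing epicenters be a node, and link the cells of temporally consecutive events; the weighted variant counts how often each link recurs. A toy sketch, with the cell size and catalog invented for illustration:

      from collections import Counter

      def weighted_earthquake_network(catalog, cell=0.5):
          """Weighted earthquake network in the spirit of Abe and Suzuki (2004).

          catalog: time-ordered (lat, lon) epicenters. Each occupied grid
          cell is a node; an edge links the cells of consecutive events,
          and its weight counts how many times that transition occurs.
          """
          def node(lat, lon):
              return (int(lat // cell), int(lon // cell))

          edges = Counter()
          nodes = {node(lat, lon) for lat, lon in catalog}
          for (la1, lo1), (la2, lo2) in zip(catalog, catalog[1:]):
              u, v = node(la1, lo1), node(la2, lo2)
              if u != v:  # drop self-loops; keep inter-cell transitions
                  edges[frozenset((u, v))] += 1
          return nodes, edges

      toy = [(35.1, 139.2), (36.7, 140.0), (35.2, 139.3), (36.6, 140.1)]
      nodes, edges = weighted_earthquake_network(toy)
      print(len(nodes), dict(edges))  # 2 nodes, one link of weight 3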

  10. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by considering dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. As a first attempt, a 1D numerical model is developed that accounts for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived for instance from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of the soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each one characterized by an appropriate level of probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to investigate the spatial variability of soil properties as well.
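
    The Monte Carlo step can be illustrated independently of the full stochastic response-spectrum machinery: sample the uncertain soil parameters, push each realization through a response calculation, and read statistics off the ensemble. Below, a deliberately simple stand-in response (the fundamental frequency f0 = Vs/4H of a homogeneous layer) replaces the paper's 1D model, and the property distributions are illustrative.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 10_000

      # Uncertain soil properties (illustrative lognormal/normal models):
      vs = rng.lognormal(mean=np.log(250.0), sigma=0.2, size=n)  # shear-wave velocity, m/s
      h = rng.normal(loc=30.0, scale=3.0, size=n)                # layer thickness, m

      f0 = vs / (4.0 * h)  # fundamental frequency of a homogeneous layer, Hz

      # Ensemble statistics replace a single deterministic answer:
      for p in (16, 50, 84):
          print(f"f0 {p}th percentile: {np.percentile(f0, p):.2f} Hz")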

  11. Real Option Cost Vulnerability Analysis of Electrical Infrastructure

    NASA Astrophysics Data System (ADS)

    Prime, Thomas; Knight, Phil

    2015-04-01

    Critical infrastructure such as electricity substations are vulnerable to various geo-hazards that arise from climate change. These geo-hazards range from increased vegetation growth to increased temperatures and flood inundation. Of all the identified geo-hazards, coastal flooding has the greatest impact but, to date, has had a low probability of occurring. However, in the face of climate change, coastal flooding is likely to occur more often, as extreme water levels are experienced more frequently due to sea-level rise (SLR). Knowing what impact coastal flooding will have now and in the future on critical infrastructure such as electrical substations is important for long-term management. Using a flood inundation model, present-day and future flood events have been simulated, from 1-in-1-year events up to 1-in-10,000-year events. The modelling makes an integrated assessment of impact by using sea level and surge to simulate a storm tide. The geographical area the model covers is part of the northwest UK coastline, with a range of urban and rural areas. The ensemble of flood maps generated allows the identification of critical infrastructure exposed to coastal flooding. Vulnerability has been assessed using an Estimated Annual Damage (EAD) value. Sampling SLR annual probability distributions produces a projected "pathway" for SLR up to 2100. EAD is then calculated using a relationship derived from the flood model. Repeating the sampling process allows a distribution of EAD up to 2100 to be produced. These values are discounted to present-day values using an appropriate discount rate. If the cost of building and maintaining defences is also removed from this, a Net Present Value (NPV) of building the defences can be calculated. This distribution of NPV can be used as part of a cost modelling process involving real options. A real option is the right, but not the obligation, to undertake investment decisions. In terms of investment in critical infrastructure resilience this
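
    The EAD-to-NPV chain described above is straightforward to sketch. In the toy version below, the SLR paths, damage curve, discount rate, and defence cost are all placeholders rather than the study's calibrated relationships, and defences are assumed to avoid all flood damage.

      import numpy as np

      rng = np.random.default_rng(7)
      n_paths, years = 1000, 86         # e.g. 2015-2100 (illustrative)
      discount = 0.035                  # placeholder discount rate

      # Placeholder SLR pathways (m): a drifting random walk per year.
      slr = np.cumsum(rng.normal(0.005, 0.002, size=(n_paths, years)), axis=1)

      def ead(s):
          """Placeholder damage model: EAD grows with sea level."""
          return 1e6 * (1.0 + 8.0 * s) ** 2

      damages = ead(slr)                              # (n_paths, years)
      disc = (1.0 + discount) ** -np.arange(years)    # discount factors
      pv_damage = (damages * disc).sum(axis=1)        # PV of avoided damage

      defence_cost = 2.5e8                            # placeholder build+maintain PV
      npv = pv_damage - defence_cost                  # NPV of building defences
      print(f"P(NPV > 0) = {(npv > 0).mean():.2f}")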

  12. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2 and 10% POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was assigned a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
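
    The POE figures quoted above convert to return periods through the usual Poisson relation T = -t / ln(1 - P); for example, 2% and 10% in 50 years correspond to roughly 2475- and 475-year return periods.

      import math

      def return_period(poe, t_years):
          """Return period implied by a probability of exceedance over
          t_years, assuming Poissonian earthquake occurrence."""
          return -t_years / math.log(1.0 - poe)

      for poe in (0.02, 0.10):
          print(f"{poe:.0%} in 50 yr  ->  T = {return_period(poe, 50.0):,.0f} yr")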

  13. Development and implementation of a GIS-based tool for spatial modeling of seismic vulnerability of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, M.; Alesheikh, A. A.

    2012-12-01

    Achieving sustainable development in countries prone to earthquakes is possible when effective measures are taken to reduce vulnerability to earthquakes. In this context, damage assessment for hypothetical earthquakes and planning for disaster management are important issues. Having a computer tool capable of estimating the structural and human losses from earthquakes in a specific region may facilitate the decision-making process before and during disasters. Interoperability of this tool with widespread spatial analysis frameworks will expedite data transfer. In this study, the earthquake damage assessment (EDA) software tool is developed as an embedded extension within a GIS (geographic information system) environment for the city of Tehran, Iran. This GIS-based extension provides users with a familiar environment in which to estimate and observe the probable damage and fatalities of a deterministic earthquake scenario. The productivity of this tool is later demonstrated for southern Karoon parish, Region 10, Tehran. Three case studies for three active faults in the area, and a comparison of the results with other research, substantiated the reliability of this tool for additional earthquake scenarios.

  14. Vulnerability analysis for a drought Early Warning System

    NASA Astrophysics Data System (ADS)

    Angeluccetti, Irene; Demarchi, Alessandro; Perez, Francesca

    2014-05-01

    Early Warning Systems (EWS) for drought are often based on risk models that do not, or only marginally, take into account the vulnerability factor. The multifaceted nature of drought (hydrological, meteorological, and agricultural) gives rise to coexisting ways of measuring this phenomenon and its effects. This, together with the complexity of the impacts generated by this hazard, explains the current underdevelopment of drought EWS compared to those for other hazards. In Least Developed Countries, where drought events cause the highest numbers of affected people, correct monitoring and forecasting are considered essential. Existing early warning and monitoring systems for drought, produced at different geographic levels, provide only in a few cases an actual spatial model that tries to describe the cause-effect link between where the hazard is detected and where the impacts occur. Integrating vulnerability information into such systems would make it possible to better estimate the affected zones and livelihoods, improving the effectiveness of the hazard-related datasets and maps produced. In fact, the need for simplification and, in general, for direct applicability of scientific outputs is still a matter of concern for field experts and end-users of early warning products. Even if the surplus of hazard-related information produced right after catastrophic events has, in some cases, led to the creation of specific data-sharing platforms, the meaning conveyed and the usefulness of each product have not yet been addressed. The present work is an attempt to fill this gap, which is still an open issue for the scientific community as well as for the humanitarian aid world. The study aims at conceiving a simplified vulnerability model to embed into an existing EWS for drought, which is based on the monitoring of vegetation phenological parameters and the Standardized Precipitation Index, both produced using free satellite-derived datasets. The proposed vulnerability model includes (i) a

  15. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating the seismic safety of the pipeline engineering site. Different from a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.

  16. Seismic component fragility data base for IPEEE

    SciTech Connect

    Bandyopadhyay, K.; Hofmayer, C.

    1990-01-01

    Seismic probabilistic risk assessment or a seismic margin study will require a reliable data base of seismic fragility of various equipment classes. Brookhaven National Laboratory (BNL) has selected a group of equipment and generically evaluated the seismic fragility of each equipment class by use of existing test data. This paper briefly discusses the evaluation methodology and the fragility results. The fragility analysis results when used in the Individual Plant Examination for External Events (IPEEE) Program for nuclear power plants are expected to provide insights into seismic vulnerabilities of equipment for earthquakes beyond the design basis. 3 refs., 1 fig., 1 tab.

  17. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    SciTech Connect

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara; Vignoli, Andrea

    2008-07-08

    The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building is of historical interest and is subject to conservation restrictions. The investigation involves a preliminary phase concerning research into the historical documents and a second phase of in situ and laboratory tests to determine the mechanical characteristics of the masonry. This investigation was conceived in order to obtain the 'LC2 Knowledge Level' and to perform the non-linear pushover analysis according to the new Italian Standards for the seismic upgrading of existing masonry buildings.

  18. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that readers can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  19. Fractal Segmentation and Clustering Analysis for Seismic Time Slices

    NASA Astrophysics Data System (ADS)

    Ronquillo, G.; Oleschko, K.; Korvin, G.; Arizabalo, R. D.

    2002-05-01

    Fractal analysis has become part of the standard approach for quantifying the texture of gray-tone or colored images. In this research we introduce a multi-stage fractal procedure to segment, classify and measure the clustering patterns on seismic time slices from a 3-D seismic survey. Five fractal classifiers (c1)-(c5) were designed to yield standardized, unbiased and precise measures of the clustering of seismic signals. The classifiers were tested on seismic time slices from the AKAL field, Cantarell Oil Complex, Mexico. The generalized lacunarity (c1), fractal signature (c2), heterogeneity (c3), rugosity of boundaries (c4) and continuity or tortuosity (c5) of the clusters are shown to be efficient measures of the time-space variability of seismic signals. The Local Fractal Analysis (LFA) of time slices has proved to be a powerful edge-detection filter for detecting and enhancing linear features, like faults or buried meandering rivers. The local fractal dimensions of the time slices were also compared with the self-affinity dimensions of the corresponding parts of porosity logs. It is speculated that the spectral dimension of the negative-amplitude parts of the time slice yields a measure of connectivity between the formation's high-porosity zones, and correlates with overall permeability.
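
    The abstract names the classifiers but not their formulas; the box-counting estimate of a fractal dimension below illustrates the kind of texture measurement involved, not the authors' specific classifiers (c1)-(c5).

      import numpy as np

      def box_counting_dimension(image, threshold=0.5):
          """Box-counting fractal dimension of a thresholded 2D image."""
          binary = image > threshold
          n = min(binary.shape)
          sizes = [s for s in (2, 4, 8, 16, 32) if s <= n // 2]
          counts = []
          for s in sizes:
              trimmed = binary[: n - n % s, : n - n % s]
              boxes = trimmed.reshape(n // s, s, -1, s).any(axis=(1, 3))
              counts.append(boxes.sum())
          # Slope of log N(s) against log(1/s) estimates the dimension.
          return np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]

      rng = np.random.default_rng(3)
      print(box_counting_dimension(rng.random((256, 256))))  # ~2 for dense noise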

  20. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
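
    A minimal sketch of the propagation idea, using plain breadth-first search over a directed subsystem connectivity graph; the subsystem names and 1-5 ratings below are invented for illustration.

      from collections import deque

      # Directed connectivity between subsystems (illustrative).
      edges = {
          "power_bus": ["avionics", "pump_ctrl"],
          "pump_ctrl": ["coolant_loop"],
          "avionics": ["nav"],
          "coolant_loop": [],
          "nav": [],
      }
      hazard_severity = {"power_bus": 3}             # hazard sources, rated 1-5
      vulnerability = {"coolant_loop": 4, "nav": 2}  # vulnerable targets, rated 1-5

      def propagation_risks(edges, sources, targets):
          """Find hazard-vulnerability pairs joined by a connectivity path
          and score each pair as severity x vulnerability."""
          risks = []
          for src, sev in sources.items():
              seen, queue = {src}, deque([src])
              while queue:
                  node = queue.popleft()
                  if node in targets and node != src:
                      risks.append((src, node, sev * targets[node]))
                  for nxt in edges.get(node, []):
                      if nxt not in seen:
                          seen.add(nxt)
                          queue.append(nxt)
          return sorted(risks, key=lambda r: -r[2])

      for src, tgt, score in propagation_risks(edges, hazard_severity, vulnerability):
          print(f"{src} -> {tgt}: combined rating {score}")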

  1. Seismically induced relay chatter risk analysis for the Advanced Test Reactor

    SciTech Connect

    Khericha, S.T.; Calley, M.B.; Farmer, F.G.; Eide, S.A.; Ravindra, M.K.; Campbell, R.D.

    1992-12-31

    A seismic probabilistic risk assessment (PRA) was performed as part of the Level I PRA for the Department of Energy (DOE) Advanced Test Reactor (ATR) located at the Idaho National Engineering Laboratory (INEL). This seismic PRA included a comprehensive and efficient seismically-induced relay chatter risk analysis. The key elements to this comprehensive and efficient seismically-induced relay chatter analysis included (1) screening procedures to identify the critical relays to be evaluated, (2) streamlined seismic fragility evaluation, and (3) comprehensive seismic risk evaluation using detailed event trees and fault trees. These key elements were performed to provide a core fuel damage frequency evaluation due to seismically induced relay chatter. A sensitivity analysis was performed to evaluate the impact of including seismically-induced relay chatter events in the seismic PRA. The systems analysis was performed by EG&G Idaho, Inc. and the fragilities for the relays were developed by EQE Engineering Consultants.

  2. Seismic analysis of the Mirror Fusion Test Facility shielding vault

    SciTech Connect

    Gabrielsen, B.L.; Tsai, K.

    1981-04-01

    This report presents a seismic analysis of the vault in Building 431 at Lawrence Livermore National Laboratory, which houses the Mirror Fusion Test Facility. The shielding vault structure is approximately 120 ft long by 80 ft wide and is constructed of concrete blocks approximately 7 x 7 x 7 ft. The north and south walls are approximately 53 ft high and the east wall is approximately 29 ft high. These walls are supported on a monolithic concrete foundation that surrounds a 21-ft-deep open pit. Since the 53-ft walls appeared to present the greatest seismic problem, they were investigated first.

  3. Sideband analysis and seismic detection in a large ring laser

    NASA Astrophysics Data System (ADS)

    Stedman, G. E.; Li, Z.; Bilger, H. R.

    1995-08-01

    A ring laser unlocked by the Earth's Sagnac effect has attained a frequency resolution of 1 part in 3 × 10^21 and a rotational resolution of 300 prad. We discuss, both theoretically and experimentally, the sideband structure induced on the Earth-rotation spectral line in the microhertz-hertz region by frequency modulation associated with extra mechanical motion, such as seismic events. The relative sideband height is an absolute measure of the rotational amplitude of that Fourier component. An initial analysis is given of the ring laser record from the Arthur's Pass-Coleridge seismic event of 18 June 1994.
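
    The sideband-height statement follows from FM theory: a modulation index b produces a carrier of amplitude J0(b) and first sidebands of amplitude J1(b), so a measured sideband-to-carrier ratio can be inverted for b, which scales with the rotational amplitude of that Fourier component. A sketch of the inversion (SciPy assumed; the measured ratio is invented):

      from scipy.special import jv
      from scipy.optimize import brentq

      def modulation_index(sideband_over_carrier):
          """Invert the first-sideband/carrier amplitude ratio J1(b)/J0(b)
          for the FM modulation index b (valid below the first J0 zero)."""
          return brentq(lambda b: jv(1, b) / jv(0, b) - sideband_over_carrier,
                        1e-9, 2.0)

      # An invented measured ratio of 0.05; in the small-signal regime
      # J1(b)/J0(b) ~ b/2, so b should come out near 0.1.
      print(f"modulation index b = {modulation_index(0.05):.4f}")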

  4. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for structural upgrades with the help of added devices. The objective is the minimization of the structural displacement response. The application of this structural seismic response research is presented in a base-isolation example.

  5. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  6. Utilizing Semantic Big Data for realizing a National-scale Infrastructure Vulnerability Analysis System

    SciTech Connect

    Chinthavali, Supriya; Shankar, Mallikarjun

    2016-01-01

    Critical infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CI interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches to the vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) constructing and managing a CI network-of-networks graph, (2) performing analytic operations at scale, and (3) interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline for realizing a national-scale network-based vulnerability analysis system.

  7. Principal Component Analysis for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, Katharina; Jellinek, A. Mark

    2016-04-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we have developed a pattern recognition technique based on a combination of Principal Component Analysis and hierarchical clustering applied to volcano seismic spectra. This technique can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. Preliminary results from applying our method to volcanic tremor from a range of volcanoes including Kīlauea, Okmok, Pavlof, and Redoubt suggest that spectral patterns from Kīlauea and Okmok are similar, whereas at Pavlof and Redoubt spectra have their own, distinct patterns.
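
    A compact sketch of the named processing chain, PCA on a matrix of spectra followed by hierarchical clustering, using scikit-learn and SciPy on synthetic spectra; the dimensions, synthetic peaks, and cluster count are illustrative.

      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(5)
      freqs = np.linspace(0.1, 25.0, 200)

      def peak(f0, width):  # synthetic spectrum with one dominant peak
          return np.exp(-((freqs - f0) ** 2) / (2 * width ** 2))

      # 60 spectra drawn from two spectral "families" plus noise.
      spectra = np.vstack(
          [peak(2.0, 0.5) + 0.05 * rng.standard_normal(200) for _ in range(30)]
          + [peak(8.0, 1.0) + 0.05 * rng.standard_normal(200) for _ in range(30)]
      )

      # Reduce each spectrum to a few principal-component scores ...
      scores = PCA(n_components=3).fit_transform(spectra)
      # ... then cluster the scores hierarchically (Ward linkage).
      labels = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
      print(np.unique(labels[:30]), np.unique(labels[30:]))  # two clean clusters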

  8. Ground motion estimation and nonlinear seismic analysis

    SciTech Connect

    McCallen, D.B.; Hutchings, L.J.

    1995-08-14

    Site-specific predictions of the dynamic response of structures to extreme earthquake ground motions are a critical component of seismic design for important structures. With the rapid development of computationally based methodologies and powerful computers over the past few years, engineers and scientists now have the capability to perform numerical simulations of many of the physical processes associated with the generation of earthquake ground motions and dynamic structural response. This paper describes the application of a physics-based, deterministic, computational approach for the estimation of earthquake ground motions which relies on site measurements of frequently occurring small (i.e., M < 3) earthquakes. Case studies are presented which illustrate the application of this methodology for two different sites, and nonlinear analyses of a typical six-story steel frame office building are performed to illustrate the potential sensitivity of nonlinear response to site conditions and proximity to the causative fault.

  9. Seismic analysis applied to the delimiting of a gas reservoir

    SciTech Connect

    Ronquillo, G.; Navarro, M.; Lozada, M.; Tafolla, C.

    1996-08-01

    We present the results of correlating seismic models with petrophysical parameters and well logs to mark the limits of a gas reservoir in sand lenses. To fulfill the objectives of the study, we used a data processing sequence that included wavelet manipulation, complex trace attributes and pseudo-velocity inversion, along with several quality control schemes to ensure proper amplitude preservation. Based on the analysis and interpretation of the seismic sections, several areas of interest were selected for additional signal treatment as preconditioning for petrophysical inversion. Signal classification was performed to control the amplitudes along the horizons of interest and to obtain an indirect interpretation of lithologies. Additionally, seismic modeling was done to support the results obtained and to help integrate the interpretation. The study proved to be a good auxiliary tool in locating the probable extension of the gas reservoir in sand lenses.

  10. Elastic structure and seismicity of Donegal (Ireland): insights from passive seismic analysis

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, Nicola

    2016-04-01

    Ireland's crust is the result of a complex geological history, which began in the Palaeozoic with the oblique closure of the Iapetus Ocean and is probably still ongoing. In the northwestern portion of the island, the geology of Donegal was the subject of detailed geological investigation by many workers in the last century. The most widely represented rock types in Donegal are metasediments of Dalradian and Moinian age, invaded by several granites of Caledonian age (the so-called Donegal granite). Smaller, separate intrusions are present (e.g. Fanad Head). By contrast, it is widely accepted that the deep crustal structure of the northern portion of Ireland has been re-worked in more recent times. The several phases of lithospheric stretching associated with the opening of the Atlantic Ocean affected this portion of Ireland, with the extrusion of flood basalts. Moreover, the presence of a hot, low-density asthenospheric plume spreading from Iceland has been suggested, with the formation of a thick, high-velocity layer of magmatic underplated material at the base of the crust. Oddly, at present, Donegal is the only seismically active area in Ireland, with an average rate of one Mw=2-3 event every 3-4 years. In the last three years, passive seismic data have been recorded at 12 seismic stations deployed across the most seismically active area in Co. Donegal, with the aim of reconstructing the seismic structure down to upper-mantle depth and of locating the microseismic activity within the investigated volume. Both local and teleseismic events were recorded, giving the opportunity to integrate results from different techniques for seismic data analysis and to interpret them jointly together with surface geology and mapped fault traces. Local events have been used to constrain faulting volumes and focal mechanisms and to reconstruct low-resolution 3D Vp and Vp/Vs velocity models. Teleseismic events have been used to compute receiver function data

  11. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    NASA Astrophysics Data System (ADS)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises: how do we identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses? A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses, the quantitative indication of mechanisms, an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at
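
    The pattern-identification core of such an approach amounts to clustering standardized indicator profiles and inspecting the cluster centroids as the "typical patterns". A toy sketch with invented dryland indicators; the authors' indicator set, clustering method, and robustness checks are richer.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(11)

      # Toy indicator table: rows are regions, columns are vulnerability
      # indicators (e.g. aridity, soil degradation, income, isolation).
      regions = np.vstack([
          rng.normal([0.8, 0.7, 0.2, 0.6], 0.08, size=(40, 4)),  # pattern A
          rng.normal([0.3, 0.2, 0.7, 0.3], 0.08, size=(40, 4)),  # pattern B
          rng.normal([0.6, 0.5, 0.4, 0.9], 0.08, size=(40, 4)),  # pattern C
      ])

      scaler = StandardScaler().fit(regions)
      km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(regions))

      # Cluster centroids, back in indicator units, are the candidate
      # "typical patterns"; robustness would be probed by varying the
      # cluster number and the indicator set.
      print(np.round(scaler.inverse_transform(km.cluster_centers_), 2))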

  12. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained with the Geological Survey of Canada (GSC) modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly for Montreal, GMPEs are the main source of epistemic uncertainty with respect to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products like the UHS and deaggregation values by incorporating different new GMPEs. The epsilon (ε) parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilons, which account for the spectral shape of the ground motion time history, is also presented.
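
    Epsilon has a one-line definition: the number of logarithmic standard deviations separating the target ground motion from the GMPE median, ε = (ln SA_target - μ_lnSA) / σ_lnSA. A sketch with invented numbers:

      import math

      def epsilon(sa_target_g, median_g, sigma_ln):
          """Epsilon: departure of the target ground motion from the GMPE
          median, in units of the GMPE's log-standard deviation."""
          return (math.log(sa_target_g) - math.log(median_g)) / sigma_ln

      # Illustrative values: a UHS target of 0.30 g where a GMPE predicts
      # a median of 0.18 g with sigma_ln = 0.65.
      print(f"epsilon = {epsilon(0.30, 0.18, 0.65):.2f}")  # ~0.79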

  14. A watershed-based cumulative risk impact analysis: environmental vulnerability and impact criteria.

    PubMed

    Osowski, S L; Swick, J D; Carney, G R; Pena, H B; Danielson, J E; Parrish, D A

    2001-01-01

    Swine Concentrated Animal Feeding Operations (CAFOs) have received much attention in recent years. As a result, a watershed-based screening tool, the Cumulative Risk Index Analysis (CRIA), was developed to assess the cumulative impacts of multiple CAFO facilities in a watershed subunit. The CRIA formula calculates an index number based on: 1) the area of one or more facilities compared to the area of the watershed subunit, 2) the average of the environmental vulnerability criteria, and 3) the average of the industry-specific impact criteria. Each vulnerability or impact criterion is ranked on a 1 to 5 scale, with a low rank indicating low environmental vulnerability or impact and a high rank indicating high environmental vulnerability or impact. The individual criterion ranks, as well as the total CRIA score, can be used to focus the environmental analysis and facilitate discussions with industry, the public, and other stakeholders in the Agency decision-making process. PMID:11214349
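
    The abstract lists the three ingredients of the CRIA score but not their exact combination; the sketch below assumes, purely for illustration, that the area ratio is scaled by the two averaged 1-5 rankings. The actual formula and criteria should be taken from the paper.

```python
import numpy as np

def cria_index(facility_area, watershed_area, vulnerability_ranks, impact_ranks):
    """Illustrative CRIA-style score: an area ratio scaled by the averaged
    1-5 environmental-vulnerability and industry-impact ranks. The real
    combination used by Osowski et al. may differ."""
    area_term = facility_area / watershed_area
    vuln_term = np.mean(vulnerability_ranks)   # 1 = low, 5 = high vulnerability
    impact_term = np.mean(impact_ranks)        # 1 = low, 5 = high impact
    return area_term * vuln_term * impact_term

score = cria_index(
    facility_area=2.5, watershed_area=150.0,   # km^2, hypothetical
    vulnerability_ranks=[4, 3, 5, 2],          # e.g. soils, aquifer, slope, streams
    impact_ranks=[3, 4, 2],                    # e.g. waste volume, odor, runoff
)
print(f"CRIA-style index: {score:.3f}")
```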

  15. The application of seismic risk-benefit analysis to land use planning in Taipei City.

    PubMed

    Hung, Hung-Chih; Chen, Liang-Chun

    2007-09-01

    In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology that enables planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan--the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations and is intended to assist city planners in evaluating the appropriateness of their planning decisions. PMID:17714167

  16. Seismic Earth: Array Analysis of Broadband Seismograms

    NASA Astrophysics Data System (ADS)

    Levander, Alan; Nolet, Guust

    Seismology is one of the few means available to Earth scientists for probing the mechanical structure of the Earth's interior. The advent of modern seismic instrumentation at the end of the 19th century and its installation across the globe was shortly followed by mankind's first general understanding of the Earth's interior: the Croatian seismologist Andrija Mohorovičić discovered the crust-mantle boundary in central Europe in 1909, the German Beno Gutenberg determined the radius of the Earth's core in 1913, Great Britain's Sir Harold Jeffreys established its fluid character by 1926, and the Dane Inge Lehmann discovered the solid inner core in 1936. It is notable that seismology, even in its earliest days, was an international science. Unlike much of the Earth sciences, seismology has its roots in physics, notably optics (many university seismology programs are, or initially were, attached to meteorology, astronomy, or physics departments), and draws from the literatures of imaging systems and statistical communications theory developed by, or employed in, astronomy, electrical engineering, medicine, ocean acoustics, and nondestructive materials testing. Seismology has close ties to petrophysics and mineral physics, the measurements of these disciplines being compared to infer the chemical and physical structure of the Earth's interior.

  17. Bayesian probability analysis for acoustic-seismic landmine detection

    NASA Astrophysics Data System (ADS)

    Xiang, Ning; Sabatier, James M.; Goggans, Paul M.

    2002-11-01

    Landmines buried in the subsurface induce distinct changes in the seismic vibration of the ground surface when an acoustic source insonifies the ground. A scanning laser Doppler vibrometer (SLDV) senses the acoustically induced seismic vibration of the ground surface in a noncontact, remote manner. The SLDV-based acoustic-to-seismic coupling technology exhibits significant advantages over conventional sensors due to its capability for detecting both metal and nonmetal mines and its stand-off distance. The seismic vibration data scanned by the SLDV are preprocessed to form images. The detection of landmines relies primarily on an analysis of the target amplitude, size, shape, and frequency range. A parametric model has been established [Xiang and Sabatier, J. Acoust. Soc. Am. 110, 2740 (2001)] to describe the amplified surface vibration velocity induced by buried landmines within an appropriate frequency range. This model incorporates vibrational amplitude, size, position of landmines, and the background amplitude into a model-based analysis process in which Bayesian target detection and parameter estimation have been applied. Based on recent field measurement results, the landmine detection procedure within a Bayesian framework will be discussed. [Work supported by the United States Army Communications-Electronics Command, Night Vision and Electronic Sensors Directorate.]
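
    To make the Bayesian step concrete, here is a grid-based sketch of posterior estimation for a single parameter (the amplified surface-velocity amplitude over a buried target) under an assumed Gaussian noise model; it is a toy stand-in, not the authors' published parametric model.

```python
import numpy as np

# Grid-based Bayesian estimate of one model parameter: the amplified
# surface-velocity amplitude over a buried target, assuming Gaussian
# measurement noise. All numbers are illustrative placeholders.
amplitudes = np.linspace(0.0, 5.0, 501)          # candidate target amplitudes
prior = np.ones_like(amplitudes)                 # flat prior
measured = np.array([2.9, 3.1, 3.3, 2.8])        # hypothetical SLDV scan values
sigma = 0.4                                      # assumed noise std

log_like = -0.5 * np.sum((measured[:, None] - amplitudes[None, :])**2, axis=0) / sigma**2
posterior = prior * np.exp(log_like - log_like.max())
posterior /= posterior.sum()                     # normalise over the grid

map_amp = amplitudes[np.argmax(posterior)]
print(f"MAP amplitude estimate: {map_amp:.2f}")
```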

  18. Wind and seismic analysis for liquid-level gauge support

    SciTech Connect

    Ziada, A.H.

    1994-12-07

    A wind and seismic analysis was performed for the liquid-level gauge installation support stand. The analysis includes the stand and footing only. All of these supports are classified as safety class 3; the analysis was based on safety class 2 requirements for conservatism. Conventional hand calculations were performed to evaluate the stresses and overturning of the structure. The results and recommendations appear in Section 2.0. The configuration and loadings are discussed in Section 3.0; the analysis and evaluation appear in Section 4.0; and the detailed analysis is documented in Appendix A.

  19. An integrated analysis of controlled- and passive source seismic data

    NASA Astrophysics Data System (ADS)

    Rumpfhuber, Eva-Maria

    This dissertation consists of two parts: a study using passive-source seismic data, and one using the dataset from a large-scale refraction/wide-angle reflection seismic experiment as the basis for an integrated analysis. The goal of the dissertation is the integration of the two different datasets and a combined interpretation of the results of the "Continental Dynamics of the Rocky Mountains" (CD-ROM) 1999 seismic experiment. I have determined the crustal structure using four different receiver-function methods applied to data collected from the northern transect of the CD-ROM passive seismic experiment. The resulting migrated image and crustal thickness determinations confirm and refine prior crustal thickness measurements based on the CD-ROM and Deep Probe datasets. The new results show a very strong lower crustal layer (LCL) with variable thickness beneath the Wyoming Province. In addition, I was able to show that it terminates at 42° latitude and to provide a seismic tie between the CD-ROM and Deep Probe seismic experiments, so that they represent a continuous N-S transect extending from New Mexico into Alberta, Canada. This new tie is particularly important because it occurs close to a major tectonic boundary, the Cheyenne belt, between an Archean craton and a Proterozoic terrane. The controlled-source seismic dataset was analyzed with the aid of forward modeling and inversion to establish a two-dimensional velocity and interface model of the area. I have developed a picking strategy that helps identify the seismic phases and improves the quality and quantity of the picks. In addition, I was able to pick and identify S-wave phases, which allowed me to establish an independent S-wave model, and hence the Vp/Vs and Poisson's ratios. The final velocity and interface model was compared to prior results, and the results were jointly interpreted with the receiver function results. Thanks to the integration of the controlled-source and receiver function
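
    The S-wave model mentioned above yields the Vp/Vs ratio, from which Poisson's ratio follows directly for an isotropic elastic medium; a small sketch with hypothetical crustal velocities:

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (isotropic elastic medium)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# Hypothetical crustal values: Vp = 6.2 km/s, Vs = 3.6 km/s -> Vp/Vs ~ 1.72
print(f"Vp/Vs = {6.2 / 3.6:.2f}, sigma = {poissons_ratio(6.2, 3.6):.3f}")
```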

  20. Assessing the Performance of a Classification-Based Vulnerability Analysis Model.

    PubMed

    Wang, Tai-ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-09-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment of the performance of the classification model (in terms of accuracy and confidence in the assignments) is thus in order. Three different approaches are considered to this aim: (i) a model-retrieval-based approach, (ii) the bootstrap method, and (iii) the leave-one-out cross-validation technique. The analyses are presented with reference to an illustrative case study involving the vulnerability assessment of nuclear power plants. PMID:25487957
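
    Of the three approaches, leave-one-out cross-validation is the simplest to sketch: each example is held out in turn, the classifier is rebuilt on the rest, and the fraction of correct held-out predictions estimates accuracy. The stand-in nearest-neighbour classifier below is illustrative only; the article itself uses MR-Sort.

```python
import numpy as np

def loo_accuracy(X, y, fit_predict):
    """Leave-one-out cross-validation: train on all examples but one,
    predict the held-out example, and report the fraction correct."""
    hits, n = 0, len(y)
    for i in range(n):
        mask = np.arange(n) != i
        hits += fit_predict(X[mask], y[mask], X[i]) == y[i]
    return hits / n

def nearest_neighbour(X_train, y_train, x):
    # Stand-in classifier for illustration; the article uses MR-Sort.
    return y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))]

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))                 # 30 systems, 4 vulnerability criteria
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # hypothetical 2-class labels
print(f"LOO accuracy: {loo_accuracy(X, y, nearest_neighbour):.2f}")
```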

  1. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2013-10-01

    Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine the capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into three main different
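
    A composite indicator of this kind is, at its core, a weighted aggregation of normalised components; the sketch below assumes equal weights over the four DVI components, which may differ from the published weighting.

```python
import numpy as np

def composite_dvi(components, weights=None):
    """Aggregate min-max normalised components (0 = least, 1 = most
    vulnerable) into a single score by weighted averaging. Equal weights
    are assumed here; the published DVI weighting may differ."""
    components = np.asarray(components, dtype=float)
    if weights is None:
        weights = np.ones_like(components) / components.size
    return float(np.dot(weights, components))

# Hypothetical country: natural capital, economic capacity,
# human/civic resources, infrastructure & technology (already normalised).
print(f"DVI = {composite_dvi([0.8, 0.6, 0.7, 0.5]):.2f}")
```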

  2. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples and its neighboring area constitute one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). Each seismic event has been associated with an active tectonic structure. Furthermore, a set of active faults located around the study area, well known from geological investigations but not associated with any recorded earthquake, and capable of shaking the city, has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes into the probabilistic seismic hazard analysis. This strategy allows us, on one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
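
    One simple way to see how past shaking can update a probabilistic hazard model is a conjugate Gamma-Poisson scheme: a prior annual exceedance rate (from the hazard model) is updated with historical exceedance counts, and posterior probabilities are computed for exposure times like those named above. This is a schematic stand-in, not the multi-model Bayesian scheme of the study; all numbers are invented.

```python
import numpy as np

# Conjugate Gamma-Poisson update of an intensity-exceedance rate: the prior
# encodes a hazard model's annual rate of shaking above some intensity at a
# site; historical felt reports update it. Values are illustrative only.
prior_rate, prior_weight = 0.02, 50.0     # prior mean 0.02 /yr, ~50 yr of pseudo-data
alpha0, beta0 = prior_rate * prior_weight, prior_weight

n_exceed, years = 3, 300.0                # 3 exceedances in 300 yr of macroseismic records
alpha, beta = alpha0 + n_exceed, beta0 + years

post_rate = alpha / beta                  # posterior mean annual rate
for t in (5, 10, 50):                     # exposure times, as in the abstract
    p = 1.0 - np.exp(-post_rate * t)      # Poisson probability of >= 1 exceedance
    print(f"P(exceedance in {t:2d} yr) = {p:.3f}")
```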

  3. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    NASA Astrophysics Data System (ADS)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for the PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high
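
    The non-Poissonian point can be made concrete with a nonhomogeneous Poisson process, where the event rate tracks the evolving pore-pressure field; the probability of at least one event in a window follows from integrating the time-varying rate. The rate history below is invented purely for illustration.

```python
import numpy as np

def prob_at_least_one(rate_fn, t0, t1, n=1001):
    """P(>=1 event in [t0, t1]) for a nonhomogeneous Poisson process:
    1 - exp(-integral of rate(t) dt), integrated by the trapezoid rule."""
    t = np.linspace(t0, t1, n)
    r = rate_fn(t)
    integral = np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(t))
    return 1.0 - np.exp(-integral)

# Hypothetical rate that grows with the pressure front during injection,
# then decays after injection stops at t = 2 yr (events/yr above some M).
rate = lambda t: np.where(t < 2.0, 0.5 * t, 1.0 * np.exp(-(t - 2.0)))
print(f"P(event in first 2 yr) = {prob_at_least_one(rate, 0.0, 2.0):.3f}")
print(f"P(event in yr 2-4)     = {prob_at_least_one(rate, 2.0, 4.0):.3f}")
```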

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. Seismic and hydroacoustic analysis relevant to MH370

    SciTech Connect

    Stead, Richard J.

    2014-07-03

    The vicinity of the Indian Ocean is searched for open and readily available seismic and/or hydroacoustic stations that might have recorded a possible impact of MH370 with the ocean surface. Only three stations are identified: the IMS hydrophone arrays H01 and H08, and the Geoscope seismic station AIS. Analysis of the data from these stations shows an interesting arrival on H01 that suffers some interference from an Antarctic ice event, large-amplitude repeating signals at H08 that obscure any possible arrivals, and large-amplitude chaotic noise at AIS that precludes analysis at the higher frequencies of interest. The results are therefore rather inconclusive but may point to a more southerly impact location within the overall Indian Ocean search region. The results would be more useful if they could be combined with other data that are not readily available.

  6. Improved seismic data analysis tool in hydrogeophysical applications

    NASA Astrophysics Data System (ADS)

    Scholtz, P.

    2003-04-01

    Several geophysical measurement techniques exist for studying the near-surface environment. Seismic methods are widely and successfully used to aid the solution of different geological tasks. Unfortunately, the financial resources for environment-related efforts are limited, hence it is vital to extract the most information from our geophysical field data. Hydrogeological investigations require special accuracy and resolution from the applied seismic methods. A dispersion analysis tool will be presented that is insensitive to inaccuracies, works under noisy conditions and can separate close arrivals. We show a wavelet-transformation-based method in which the choice of an appropriate basic wavelet improves the quality of the results. Applying this analysis technique to the inversion of frequency-velocity functions of waveguiding sequences, or to mapping near-surface inhomogeneities by group-traveltime tomography, will yield more reliable physical parameters for hydrogeologists.

  7. Order parameter analysis of seismicity of the Mexican Pacific coast

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, A.; Flores-Márquez, E. L.

    2013-05-01

    The natural time domain has been shown to be an important tool for extracting relevant information hidden in time series of complex systems that is not easily obtainable by standard analysis methods. By assuming that tectonism is a complex system and that earthquakes are similar to a phase transition, it is possible to define an order parameter for seismicity in the context of the natural time domain. In this work we analyze the statistical features of the order parameter (OP) computed for the Mexican seismic catalog spanning 1974 to 2012. We found that in four out of the six regions the pdf of the order parameter fluctuations is similar to that reported earlier by other authors, but in two of these regions noticeable differences are identified. Also, except for Michoacán, the scaled pdfs of all regions collapse onto a universal curve with non-Gaussian tails.
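
    In natural time analysis, the order parameter is commonly taken as the variance kappa_1 of natural time weighted by the normalised event energies; a compact sketch with a hypothetical magnitude sequence (a generic illustration of the quantity, not the paper's computation):

```python
import numpy as np

def kappa1(energies):
    """Variance of natural time, kappa_1 = <chi^2> - <chi>^2, where the
    k-th of N events has natural time chi_k = k/N and weight p_k
    proportional to its released energy."""
    q = np.asarray(energies, dtype=float)
    p = q / q.sum()
    chi = np.arange(1, len(q) + 1) / len(q)
    return np.sum(p * chi**2) - np.sum(p * chi)**2

# Hypothetical sequence: event energies from magnitudes via E ~ 10^(1.5 M).
mags = np.array([4.1, 3.8, 4.5, 4.0, 5.2, 3.9, 4.3])
print(f"kappa_1 = {kappa1(10.0 ** (1.5 * mags)):.4f}")
```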

  8. Analysis of Vulnerability Around The Colima Volcano, MEXICO

    NASA Astrophysics Data System (ADS)

    Carlos, S. P.

    2001-12-01

    The Colima volcano is located in the western part of the Trans-Mexican Volcanic Belt, in the central portion of the Colima Rift Zone, between the Mexican states of Jalisco and Colima. Since January 1998 the volcano has shown new activity, which has been characterized by two stages: the first was an effusive phase that began on 20 November 1998 and finished by the middle of January 1999. On 10 February 1999 a great explosion at the summit marked the beginning of an explosive phase; this implies that the eruptive process changed from an effusive to an explosive one. Suárez-Plascencia et al. (2000) present hazard maps for ballistic projectiles, ashfalls and lahars for this scenario. This work presents the evaluation of vulnerability in the areas identified as hazardous in the maps for ballistics, ashfalls and lahars, based on the economic elements located in the middle and lower sections of the volcanic edifice, such as agriculture, forestry, agroindustries and communication lines (highways, power, telephone, railroad, etc.). The method is based on Geographic Information Systems, using digital cartography at scale 1:50,000, digital orthophotos from the Instituto Nacional de Estadística, Geografía e Informática, and SPOT and Landsat satellite images from 1997 and 2000 in bands 1, 2 and 3. The land use maps obtained for 1997 and 2000 were compared with the land use map reported by Suárez in 1992. From these maps, a 5 percent increase of the sugar cane and corn cultivation areas was observed relative to 1990 (1225.7 km2), along with a decrease of the forest surface, moving the agricultural limits uphill, and some agave cultivation appearing on the northwest and north hillslopes of the Nevado de Colima. This increase of the agricultural surface results in greater economic activity in the area, which also increases the vulnerability to the different volcanic products emitted during this phase of activity. The degradation of the soil by the

  9. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined, a governing structure was organized, and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  10. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  11. Seismicity analysis in Indonesia region from high precision hypocenter location

    NASA Astrophysics Data System (ADS)

    Nugraha, Andri; Shiddiqi, Hasbi; Widiyantoro, Sri; Ramdhan, Mohamad; Wandono, Wandono

    2015-04-01

    As a tectonically complex region, Indonesia has a high seismicity rate related to subduction and collision as well as strike-slip faulting. High-precision earthquake locations, obtained with an adequate relocation method and a proper velocity model, are necessary for seismicity analysis. We used nearly 25,000 earthquakes relocated with the double-difference method. In our relocation process, we employed teleseismic, regional, and local P-wave arrival times. Furthermore, we employed regional-global nested velocity models that take into account the subducting slab in the study region, using a 3D model for the area inside Indonesia and a 1D model for the area outside. Relocation results show shifted hypocenters that are generally perpendicular to the trench. Beneath the western Sunda arc, the Wadati-Benioff Zone (WBZ) extends to a depth of about 300 km and depicts a gently dipping slab. The WBZ beneath the eastern Sunda arc extends deeper, to about 500 km, and depicts a steep slab geometry. In the Sunda-Banda transition zone, we found anomalously low seismicity beneath the oceanic-continental transition region. The WBZ of the severely curved Banda arc extends to a depth of about 600 km and depicts a two-slab model. In the Molucca collision zone, seismicity clearly depicts the two opposing slabs of the Molucca Sea plate, i.e. to the east and to the west. Around the Sulawesi region, most earthquakes are related to the north Sulawesi trench and depict a subducted slab beneath the northern part of the island. In the Sumatra region, we identified a seismic gap in the WBZ between 70 km and 150 km depth. Seismicity gaps are also detected beneath particular regions, e.g. the Mentawai region, and along several parts of the subducted slab. A seismic gap in the WBZ is also detected beneath the eastern Sunda arc, but deeper than in the Sumatra region, i.e. at depths of 150 km to 250 km. Furthermore, we used global centroid moment tensor catalog data available for earthquakes with magnitude 5.0 or greater. In general, focal mechanism

  12. Interdependent networks: vulnerability analysis and strategies to limit cascading failure

    NASA Astrophysics Data System (ADS)

    Fu, Gaihua; Dawson, Richard; Khoury, Mehdi; Bullock, Seth

    2014-07-01

    Network theory is increasingly employed to study the structure and behaviour of social, physical and technological systems, including civil infrastructure. Many of these systems are interconnected, and the interdependencies between them allow disruptive events to propagate across networks, enabling damage to spread far beyond the immediate footprint of disturbance. In this research we experiment with a model to characterise the configuration of interdependencies in terms of direction, redundancy, and extent, and we analyse the performance of interdependent systems under a wide range of possible coupling modes. We demonstrate that networks with directed dependencies are less robust than those with undirected dependencies, and that the degree of redundancy in inter-network dependencies can have a differential effect on robustness depending on the directionality of the dependencies. As interdependencies between many real-world systems exhibit these characteristics, it is likely that many such systems operate near their critical thresholds. The vulnerability of an interdependent network is shown to be reducible in a cost-effective way, either by optimising inter-network connections or by hardening high-degree nodes. The results improve understanding of the influence of interdependencies on system performance and provide insight into how to mitigate associated risks.
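
    The roles of directionality and redundancy can be illustrated with a toy cascade over directed dependency links between two networks: with non-redundant coupling the loss of any supplier is fatal, while redundant coupling lets a node survive as long as one supplier survives. This is a sketch of the coupling idea only, not the paper's model.

```python
def cascade(dependencies, initial_failures, redundant=False):
    """Propagate failures across directed dependency links. With
    redundant=True a node survives while at least one of its suppliers
    survives; otherwise the loss of any supplier is fatal."""
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for node, suppliers in dependencies.items():
            if node in failed:
                continue
            down = [s in failed for s in suppliers]
            if all(down) if redundant else any(down):
                failed.add(node)
                changed = True
    return failed

dependencies = {                                  # node -> nodes it depends on
    ("water", 0): [("power", 0)],                 # single supplier
    ("water", 1): [("power", 0), ("power", 1)],   # redundant supply
}
seed = {("power", 0)}
print("non-redundant:", sorted(cascade(dependencies, seed)))
print("redundant:    ", sorted(cascade(dependencies, seed, redundant=True)))
```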

  13. Primary component analysis method and reduction of seismicity parameters

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Ma, Qin-Zhong; Lin, Ming-Zhou; Wu, Geng-Feng; Wu, Shao-Chun

    2005-09-01

    In this paper, a primary component analysis is performed using 8 seismicity parameters: earthquake frequency N (Ml≥3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value and D-value, which reflect the characteristics of the magnitude, time and space distribution of seismicity from different perspectives. Using the primary component analysis method, a synthesis parameter W reflecting the anomalous features of the magnitude, time and space distribution of earthquakes can be obtained. Generally, there is some correlation among the 8 parameters, but their variations differ in different periods, and earthquake prediction based on these individual parameters does not perform very well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS≥5.8) that occurred in North China, which indicates that the synthesis parameter W can better reflect the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the primary component analysis method are also discussed.
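
    A generic sketch of how such a synthesis parameter can be computed: standardise the 8 parameter time series and project them onto the first principal component (here via SVD). The random input below stands in for real seismicity parameter series; this is an illustration of the technique, not the paper's exact procedure.

```python
import numpy as np

def synthesis_parameter(params):
    """Project standardised seismicity parameters onto the first
    principal component to obtain a single synthesis time series."""
    z = (params - params.mean(axis=0)) / params.std(axis=0)   # standardise columns
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    return z @ vt[0]                                          # scores on PC 1

rng = np.random.default_rng(42)
months, n_params = 120, 8          # e.g. N, b, eta, A(b), Mf, Ac, C, D
X = rng.normal(size=(months, n_params))
W = synthesis_parameter(X)
print(f"W: mean = {W.mean():.2f}, std = {W.std():.2f}, length = {W.size}")
```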

  14. Noise analysis of the seismic system employed in the northern and southern California seismic nets

    USGS Publications Warehouse

    Eaton, J.P.

    1984-01-01

    The seismic networks have been designed and operated to support recording on Develocorders (less than 40 dB dynamic range) and analog magnetic tape (about 50 dB dynamic range). The principal analysis of the records has been based on Develocorder films, and background earth noise levels have been adjusted to be about 1 to 2 mm p-p on the film readers. Since the traces are separated by only 10 to 12 mm on the reader screen, they become hopelessly tangled when signal amplitudes on several adjacent traces exceed 10 to 20 mm p-p. Thus, the background noise level is hardly more than 20 dB below the level of the largest readable signals. The situation is somewhat better on tape playbacks, but the high level of background noise, set to accommodate processing from film records, effectively limits the range of maximum signal to background earth noise on high-gain channels to a little more than 30 dB. Introduction of the PDP 11/44 seismic data acquisition system has increased the potential dynamic range of recorded network signals to more than 60 dB. To make use of this increased dynamic range we must evaluate the characteristics and performance of the seismic system. In particular, we must determine whether the electronic noise in the system is, or can be made, sufficiently low that background earth noise levels can be lowered significantly to take advantage of the increased dynamic range of the digital recording system. To come to grips with the complex problem of system noise, we have carried out a number of measurements and experiments to evaluate critical components of the system as well as to determine the noise characteristics of the system as a whole.

  15. Earthquakes, vulnerability and disaster risk: Georgia case

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Askan, Aysegul

    2015-04-01

    The Republic of Georgia, located on the east coast of the Black Sea, is prone to multiple natural hazards, the most dangerous and devastating of which are strong earthquakes. This work issues a call for advance planning and action to reduce natural disaster risks, notably seismic risk, through the investigation of vulnerability and seismic hazard for Georgia. Ground motion prediction equations are essential for several purposes, ranging from seismic design and analysis to probabilistic seismic hazard assessment. Seismic hazard maps were calculated based on a modern approach of selecting and ranking global and regional ground motion prediction equations for the region. We have also applied the host-to-target method in two regions of Georgia with different source mechanisms. According to the tectonic regime of the target areas, two different regions were chosen as host regions: one is the North Anatolian Fault zone in Turkey, with a dominant strike-slip source mechanism, while the other is Tabas in Iran, with mostly events of reverse mechanism. We performed stochastic finite-fault simulations in both host and target areas and employed the hybrid-empirical method as introduced and outlined in Campbell (2003). An initial hybrid-empirical ground motion model was developed for PGA and SA at selected periods for Georgia, and its coefficients were used in probabilistic seismic hazard assessment. An intensity-based vulnerability study was completed for Georgian buildings. Finally, probabilistic seismic risk assessments in terms of structural damage and casualties were calculated. This methodology gave predictions of damage and casualties for a given probability of recurrence, based on a probabilistic seismic hazard model, population distribution, inventory, and vulnerability of buildings.

  16. Decision analysis framework for evaluating CTBT seismic verification options

    SciTech Connect

    Judd, B.R.; Strait, R.S.; Younker, L.W.

    1986-09-01

    This report describes a decision analysis framework for evaluating seismic verification options for a Comprehensive Test Ban Treaty (CTBT). In addition to providing policy makers with insights into the relative merits of different options, the framework is intended to assist in formulating and evaluating political decisions - such as responses to evidence of violations - and in setting research priorities related to the options. To provide these broad analytical capabilities to decision makers, the framework incorporates a wide variety of issues. These include seismic monitoring capabilities, evasion possibilities, evidence produced by seismic systems, US response to the evidence, the dependence between US and Soviet decision-making, and the relative values of possible outcomes to the US and the Soviet Union. An added benefit of the framework is its potential use to improve communication about these CTBT verification issues among US experts and decision makers. The framework has been implemented on a portable microcomputer to facilitate this communication through demonstration and rapid evaluation of alternative judgments and policy choices. The report presents the framework and its application in four parts. The first part describes the decision analysis framework and the types of analytical results produced. In the second part, the framework is used to evaluate representative seismic verification options. The third part describes the results of sensitivity analyses that determine the relative importance of the uncertainties or subjective judgments that influence the evaluation of the options. The fourth (and final) part summarizes conclusions and presents implications of the sample analytical results for further research and for policy-making related to CTBT verification. The fourth section also describes the next steps in the development and use of the decision analysis framework.

  17. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences but in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. Particularly in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies of small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling, one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability correlates with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently approved Special Emergency and Civil Protection Plan (PEERS) is based on a seismic intensity map and considers only the resident population from the census as a proxy for human exposure. In the present work we map and analyze the spatio-temporal distribution of
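
    The core of the argument is that exposure is the product of the people actually present and the hazard at their location, evaluated per time period; a minimal sketch with a hypothetical three-cell zone (all numbers invented):

```python
import numpy as np

# Hypothetical 3-cell zone: exposure = people present x hazard weight,
# evaluated separately for daytime and nighttime population surfaces.
# Residential (census) counts alone would reproduce only the "night" row.
hazard = np.array([0.9, 0.5, 0.2])                 # relative seismic intensity weight
population = {
    "night (residential)": np.array([12000, 30000, 8000]),
    "day (work/school)":   np.array([45000, 9000, 6000]),
}
for period, pop in population.items():
    exposure = float(np.dot(pop, hazard))
    print(f"{period:>20s}: weighted exposure = {exposure:,.0f}")
```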

  18. Vulnerability of Thai rice production to simultaneous climate and socioeconomic changes: a double exposure analysis

    NASA Astrophysics Data System (ADS)

    Sangpenchan, R.

    2011-12-01

    This research explores the vulnerability of Thai rice production to simultaneous exposure to climate and socioeconomic change -- so-called "double exposure." Both processes influence Thailand's rice production system, but the vulnerabilities associated with their interactions are unknown. To understand this double exposure, I adopt a mixed-method, qualitative-quantitative analytical approach consisting of three phases of analysis involving a Vulnerability Scoping Diagram, a Principal Component Analysis, and the EPIC crop model, using proxy datasets collected from secondary data sources at provincial scales. The first and second phases identify key variables representing each of the three dimensions of vulnerability -- exposure, sensitivity, and adaptive capacity -- indicating that the greatest vulnerability in the rice production system occurs in households and areas with high exposure to climate change, high sensitivity to climate and socioeconomic stress, and low adaptive capacity. In the third phase, the EPIC crop model simulates rice yields under future climate change projected by the CSIRO and MIROC climate models. Climate-change-only scenarios project yield decreases of 10% from current productivity during 2016-2025 and 30% during 2045-2054. Scenarios applying both climate change and improved technology and management practices show that a 50% increase in rice production is possible, but this requires strong collaboration between sectors to advance agricultural research and technology, and strong adaptive capacity in the rice production system characterized by well-developed social capital, social networks, financial capacity, infrastructure, and household mobility at the local scale. The vulnerability assessment and the climate and crop adaptation simulations used here provide useful information for decision makers developing vulnerability reduction plans in the face of concurrent climate and socioeconomic change.

  19. Social vulnerability assessment using spatial multi-criteria analysis (SEVI model) and the Social Vulnerability Index (SoVI model) - a case study for Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Armaş, I.; Gavriş, A.

    2013-06-01

    In recent decades, the development of vulnerability frameworks has enlarged research in the natural hazards field. Despite progress in developing vulnerability studies, more remains to be investigated regarding the quantitative approach and the conceptual explanation of the social component. At the same time, some disaster-prone areas receive limited attention. Among these, Romania's capital city, Bucharest, is the most earthquake-prone capital in Europe and the tenth most earthquake-prone in the world. This location is used to assess two multi-criteria methods for aggregating complex indicators: the social vulnerability index (SoVI model) and the spatial multi-criteria social vulnerability index (SEVI model). Using data from the 2002 census, we reduce the indicators through a factor-analytical approach to create the indices and examine whether they bear any resemblance to the known vulnerability of Bucharest through an exploratory spatial data analysis (ESDA). This is a critical issue that may provide a better understanding of social vulnerability in the city and appropriate information for authorities and stakeholders to consider in their decision making. The study emphasizes that social vulnerability is an urban process that has increased in post-communist Bucharest, raising the concern that the population at risk lacks the capacity to cope with disasters. The assessment of the indices indicates a significant and similar clustering pattern of the census administrative units, with an overlap between the areas affected by high social vulnerability. Our proposed SEVI model shows sensitivity to adjustment, which is useful for assessing the accuracy of expert opinion.

  20. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, the calendar variable was substituted by a corresponding astronomical one, and the epoch superposition method was applied. In essence, the method consists in differentiating the bodies of information on volcanic eruptions (over the period 1900 to 1977) and seismic events (1900-1994) with respect to the values of the astronomical parameters corresponding to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of the distribution of volcanic eruptions and violent earthquakes in the fields of the Earth's orbital movement parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activities. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for the GIS monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established: a relationship has been found between the distribution of the repose period probability density and the duration of the period. 3

  1. Seismic Noise Analysis and Reduction through Utilization of Collocated Seismic and Atmospheric Sensors at the GRO Chile Seismic Network

    NASA Astrophysics Data System (ADS)

    Farrell, M. E.; Russo, R. M.

    2013-12-01

    The installation of Earthscope Transportable Array-style geophysical observatories in Chile expands open-data seismic recording capabilities in the southern hemisphere by nearly 30%, and has nearly tripled the number of seismic stations providing freely available data in southern South America. Through the use of collocated seismic and atmospheric sensors at these stations we are able to analyze how local atmospheric conditions generate seismic noise, which can degrade data in seismic frequency bands at stations in the 'roaring forties' (S latitudes). Seismic vaults that are climate-controlled and insulated from the local environment are now employed throughout the world in an attempt to isolate seismometers from as many noise sources as possible. However, this is an expensive solution that is neither practical nor possible for all seismic deployments; moreover, the increasing number and scope of temporary seismic deployments has resulted in the collection and archiving of terabytes of seismic data that are affected to some degree by natural seismic noise sources such as wind and atmospheric pressure changes. Changing air pressure can result in a depression and subsequent rebound of Earth's surface - which generates low-frequency noise in seismic frequency bands - and even moderate winds can apply enough force to ground-coupled structures or to the surface above the seismometers themselves, resulting in significant noise. The 10 stations of the permanent Geophysical Reporting Observatories (GRO Chile), jointly installed during 2011-12 by IRIS and the Chilean Servicio Sismológico, include instrumentation in addition to the standard three seismic components. These stations, spaced approximately 300 km apart along the length of the country, continuously record a variety of atmospheric data including infrasound, air pressure, wind speed, and wind direction. The collocated seismic and atmospheric sensors at each station allow us to analyze both datasets together, to
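
    A standard first step in such an analysis is the magnitude-squared coherence between the pressure and seismic channels; high coherence at long periods flags pressure-driven ground noise. The sketch below uses synthetic stand-in data rather than GRO Chile records.

```python
import numpy as np
from scipy.signal import coherence

# Magnitude-squared coherence between a barometric pressure channel and a
# vertical seismic channel; high coherence at low frequencies would flag
# pressure-driven ground noise. Synthetic stand-in data, 1 Hz sampling.
fs, hours = 1.0, 24
t = np.arange(int(hours * 3600)) / fs
rng = np.random.default_rng(7)
pressure = np.sin(2 * np.pi * t / 1800.0) + 0.3 * rng.normal(size=t.size)
seismic = 0.5 * pressure + rng.normal(size=t.size)   # partially pressure-coupled

f, cxy = coherence(pressure, seismic, fs=fs, nperseg=4096)
band = f < 0.01                                      # long-period band
print(f"mean coherence below 10 mHz: {cxy[band].mean():.2f}")
```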

  2. SeismicWaveTool: Continuous and discrete wavelet analysis and filtering for multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Rosa-Herranz, J. L.; Rosa-Cintas, S.; Martinez-Espla, J. J.

    2013-01-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of multichannel seismic data. The considered time-frequency transforms include the continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform. The developed approaches provide a fast and precise time-frequency examination of the seismograms at different frequency bands. Moreover, filtering methods for noise, transients or even baseline removal are implemented. The primary motivation is to support seismologists with a user-friendly and fast program for wavelet analysis, providing practical and understandable results.

    Program summary
    Program title: SeismicWaveTool
    Catalogue identifier: AENG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 611072
    No. of bytes in distributed program, including test data, etc.: 14688355
    Distribution format: tar.gz
    Programming language: MATLAB (MathWorks Inc.) version 7.8.0.347 (R2009a) or higher. Wavelet Toolbox is required.
    Computer: Developed on a MacBook Pro. Tested on Mac and PC. No computer-specific optimization was performed.
    Operating system: Any supporting MATLAB (MathWorks Inc.) v7.8.0.347 (R2009a) or higher. Tested on Mac OS X 10.6.8, Windows XP and Vista.
    Classification: 13.
    Nature of problem: Numerous research works have developed a great number of free or commercial wavelet-based software packages, which provide specific solutions for the analysis of seismic data. On the other hand, standard toolboxes, packages or libraries, such as the MathWorks' Wavelet Toolbox for MATLAB, offer command line functions and interfaces for the wavelet analysis of one-component signals. Thus, software usually is focused on very specific problems

  3. Exploring drought vulnerability in Africa: an indicator based analysis to be used in early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2014-05-01

    We propose a composite drought vulnerability indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level for four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into the following geographical regions: the Mediterranean coast of Africa; the Sahel region and the Horn of Africa; the Serengeti and the Eastern Miombo woodlands in eastern Africa; the western part of the Zambezi Basin, the southeastern border of the Congo Basin, and the belt of Fynbos in the Western Cape province of South Africa. The results of the DVI at the country level were

  4. Cluster Computing For Real Time Seismic Array Analysis.

    NASA Astrophysics Data System (ADS)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used to study the volcanic microtremor and long-period events that are critical for obtaining information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 dual-processor Intel Pentium III PCs working at 550 MHz, and has 4 gigabytes of RAM. It runs under the Linux operating system. The developed analysis software package is based on the MUltiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the Internet and graphical applications for continuously displaying the processing results. The system has been tested with a dataset collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real-time continuous acquisition system has been simulated by
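
    The kernel of the processing is narrowband MUSIC: form the sensor covariance matrix from data snapshots, split off the noise subspace, and scan a slowness grid for the steering vector most orthogonal to it. The sketch below runs on synthetic data with an invented array geometry and is only a schematic of the technique, not the cluster's Fortran implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
xy = rng.uniform(-0.5, 0.5, size=(8, 2))            # sensor coordinates (km)
f = 2.0                                             # analysis frequency (Hz)
s_true = np.array([0.25, -0.15])                    # true slowness (s/km)

def steering(s):
    """Phase delays across the array for a plane wave with slowness s."""
    return np.exp(-2j * np.pi * f * (xy @ s))

# Simulate snapshots: plane wave with random phases plus noise, then
# form the sample covariance matrix.
n_snap = 200
signals = np.exp(2j * np.pi * rng.uniform(size=n_snap))
data = np.outer(steering(s_true), signals)
data += 0.2 * (rng.normal(size=data.shape) + 1j * rng.normal(size=data.shape))
R = data @ data.conj().T / n_snap

# Noise subspace: all eigenvectors except the one signal eigenvector.
w, v = np.linalg.eigh(R)                            # eigenvalues ascending
En = v[:, :-1]

# Scan the slowness grid for the MUSIC spectrum peak.
grid = np.linspace(-0.5, 0.5, 101)
best, best_p = None, -1.0
for sx in grid:
    for sy in grid:
        a = steering(np.array([sx, sy]))
        p = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
        if p > best_p:
            best, best_p = (sx, sy), p
print(f"estimated slowness: ({best[0]:.2f}, {best[1]:.2f}) s/km")
```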

  5. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether or not the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar
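
    Source-type interpretation of a moment tensor usually starts from an eigenvalue decomposition into isotropic, CLVD and double-couple parts. The sketch below follows one common convention (e.g. Vavrycuk, 2001); other conventions exist, so the fractions are illustrative rather than the authors' exact procedure.

```python
import numpy as np

def source_type_fractions(M):
    """Split a moment tensor into isotropic (ISO), CLVD and double-couple
    (DC) fractions from its eigenvalues, using one common convention."""
    eig = np.sort(np.linalg.eigvalsh(M))[::-1]        # m1 >= m2 >= m3
    m_iso = eig.sum() / 3.0
    dev = eig - m_iso                                 # deviatoric eigenvalues
    m_max = max(abs(dev[0]), abs(dev[2]))             # largest deviatoric magnitude
    eps = 0.0 if m_max == 0 else -dev[1] / m_max      # CLVD parameter
    c_iso = m_iso / (abs(m_iso) + m_max)
    c_clvd = 2.0 * eps * (1.0 - abs(c_iso))
    c_dc = 1.0 - abs(c_iso) - abs(c_clvd)
    return c_iso, c_clvd, c_dc

# Mostly isotropic source (explosion-like) with a small deviatoric part:
M = np.diag([2.0, 1.8, 1.6])
iso, clvd, dc = source_type_fractions(M)
print(f"ISO = {iso:+.2f}, CLVD = {clvd:+.2f}, DC = {dc:.2f}")
```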

  6. Understanding North Texas Seismicity: A Joint Analysis of Seismic Data and 3D Pore Pressure Modeling

    NASA Astrophysics Data System (ADS)

    DeShon, H. R.; Hornbach, M. J.; Ellsworth, W. L.; Oldham, H. R.; Hayward, C.; Stump, B. W.; Frohlich, C.; Olson, J. E.; Luetgert, J. H.

    2014-12-01

    In November 2013, a series of earthquakes began along a mapped ancient fault system near Azle, Texas. The Azle events are the third felt earthquake sequence in the Fort Worth (Barnett Shale) Basin since 2008, and several production and injection wells in the area are drilled to depths near the recent seismic activity. Understanding if and/or how injection and removal of fluids in the crystalline crust reactivates faults has important implications for seismology, the energy industry, and society. We assessed whether the Azle earthquakes were induced using a joint analysis of the earthquake data, subsurface geology and fault structure, and 3D pore pressure modeling. Using a 12-station temporary seismic deployment, we have recorded and located >300 events large enough to be recorded on multiple stations, and thousands of events during periods of swarm activity. High-resolution locations and focal mechanisms indicate that events occurred on NE-SW trending, steeply dipping normal faults associated with the southern end of the Newark East Fault Zone, with hypocenters between 2 and 8 km depth. We considered multiple causes that might have changed stress along this system. Earthquakes resulting from natural processes, though perhaps unlikely in this historically inactive region, can be neither ruled out nor confirmed due to lack of information on the natural stress state of these faults. Analysis of lake and groundwater variations near Azle showed that no significant stress changes occurred prior to or during the earthquake sequence. In contrast, analysis of pore-pressure models shows that the combination of formation water production and wastewater injection near the fault could have caused pressure increases that induced earthquakes on near-critically stressed faults.

  7. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and improved data availability have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc. Scientists have begun to understand the role of uncertainty in seismic hazard analysis. However, a significant problem remains: how to handle the existing uncertainty. The same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California in the conventional way. In this study, the standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which directly indicates less uncertainty.
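
    The mechanism is easy to illustrate: instead of a hard class boundary, each site belongs to every class with a membership between 0 and 1. Below is a minimal sketch with triangular membership functions; the Vs30 breakpoints are illustrative placeholders, not the study's calibrated values:

        import numpy as np

        def triangular(x, a, b, c):
            """Triangular membership function: 0 at a, 1 at b, 0 at c."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        # Hypothetical fuzzy site classes defined on average shear-wave velocity (m/s);
        # breakpoints are illustrative, not the study's values.
        classes = {
            "soft":  (0.0, 180.0, 360.0),
            "stiff": (180.0, 360.0, 760.0),
            "rock":  (360.0, 760.0, 1500.0),
        }

        vs30 = 400.0  # a site near the stiff/rock border
        memberships = {name: triangular(vs30, *abc) for name, abc in classes.items()}
        # Instead of forcing the site into one class, each class contributes with its
        # membership weight, which smooths the artificial jump at class borders.
        print(memberships)  # {'soft': 0.0, 'stiff': 0.9, 'rock': 0.1}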

  8. Analysis of embedded waste storage tanks subjected to seismic loading

    SciTech Connect

    Zaslawsky, M.; Sammaddar, S.; Kennedy, W.N.

    1991-01-01

    At the Savannah River Site, High Activity Wastes are stored in carbon steel tanks that sit within reinforced concrete vaults. These soil-embedded tank/vault structures are approximately 80 ft. in diameter and 40 ft. deep. The tanks were studied to identify the essential governing variables and to reduce the problem to the smallest number of governing cases, optimizing the analysis effort without introducing excessive conservatism. The problem reduced to a limited number of soil-structure interaction and fluid (tank contents)-structure interaction cases. It was theorized that soil-structure interaction (SSI) would substantially reduce the input, but also that tank-to-tank proximity could result in (re)amplification of the input. To determine the governing seismic input motion, the three-dimensional SSI code SASSI was used. Significant among the issues relative to waste tanks is the determination of fluid response and tank behavior as a function of tank contents viscosity. Tank seismic analyses and studies have been based on low-viscosity fluids (water), for which the behavior is quite well understood. Typical wastes (salts, sludge), which are highly viscous, have not been the subject of studies to understand the effect of viscosity on seismic response. The computer code DYNA3D was used to study how viscosity alters tank wall pressure distribution and tank base shear and overturning moments. A parallel hand calculation was performed using standard procedures. Conclusions based on the study provide insight into the quantification of the reduction of seismic inputs by soil-structure interaction for a "soft" soil site.

  9. Analysis of the seismic origin of landslides: examples from the New Madrid seismic zone

    USGS Publications Warehouse

    Jibson, R.W.; Keefer, D.K.

    1993-01-01

    By analyzing two landslides in the New Madrid seismic zone, we develop an approach for judging whether a landslide or group of landslides of unknown origin was more likely to have formed as a result of earthquake shaking or under aseismic conditions. The two landslides analyzed are representative of two groups of landslides that previous research on the geomorphology and regional distribution of landslides in this region indicates may have been triggered by the 1811-1812 New Madrid earthquakes. Slope-stability models of aseismic conditions show that neither landslide is likely to have formed aseismically, even under unrealistically high ground-water conditions. Our analysis yields a general relationship between Newmark landslide displacement, earthquake shaking intensity, and the critical acceleration of a landslide. -from Authors
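
    The Newmark quantity the authors relate to shaking intensity is the cumulative displacement of a rigid block that slides whenever ground acceleration exceeds its critical acceleration. A minimal sketch, assuming a rigid block and one-directional sliding (the input pulse and parameter values are illustrative):

        import numpy as np

        def newmark_displacement(acc, dt, ac):
            """Newmark rigid sliding-block displacement.
            acc: ground acceleration series (m/s^2); dt: sample interval (s);
            ac: critical (yield) acceleration of the block (m/s^2)."""
            v = 0.0   # relative velocity of the block
            d = 0.0   # accumulated downslope displacement
            for a in acc:
                if a > ac or v > 0.0:      # a sliding episode is active
                    v += (a - ac) * dt     # relative acceleration integrates to velocity
                    v = max(v, 0.0)        # the block cannot slide uphill
                    d += v * dt            # velocity integrates to displacement
            return d

        # Example: a decaying 2 Hz pulse train against ac = 0.98 m/s^2 (about 0.1 g)
        dt = 0.005
        t = np.arange(0, 10, dt)
        acc = 3.0 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)
        print(newmark_displacement(acc, dt, ac=0.98))  # displacement in meters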

  10. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. .

    2005-07-01

    Several of the new-generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOB, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than standard practice would normally allow.

  11. Seismic vulnerability assessment of a steel-girder highway bridge equipped with different SMA wire-based smart elastomeric isolators

    NASA Astrophysics Data System (ADS)

    Hedayati Dezfuli, Farshad; Shahria Alam, M.

    2016-07-01

    Shape memory alloy wire-based rubber bearings (SMA-RBs) possess enhanced energy dissipation capacity and self-centering properties compared to conventional RBs. The performance of different types of SMA-RBs with different wire configurations has been studied in detail. However, their reliability in isolating structures has not been thoroughly investigated. The objective of this study is to analytically explore the effect of SMA-RBs on the seismic fragility of a highway bridge. Steel-reinforced elastomeric isolators are equipped with SMA wires and used to isolate the bridge. Results revealed that SMA wires with superelastic behavior and re-centering capability can increase the reliability of the bearing and the bridge structure. It was observed that at the collapse level of damage, the bridge isolated by SMA-HDRB has the lowest fragility. Findings also showed that equipping the NRB with SMA wires decreases the possibility of damage in the bridge, while replacing the HDRB with SMA-HDRB, or the LRB with SMA-LRB, increases the failure probability of the system at the slight, moderate, and extensive limit states.
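
    Fragility statements of this kind are usually expressed as lognormal curves giving the probability of reaching a damage state at a given intensity measure. A minimal sketch; the median and dispersion values below are illustrative, not the fitted parameters of this study:

        import numpy as np
        from scipy.stats import norm

        def fragility(im, theta, beta):
            """Lognormal fragility: probability of reaching or exceeding a damage
            state given intensity measure `im` (e.g., PGA in g).
            theta: median capacity; beta: lognormal standard deviation."""
            return norm.cdf(np.log(im / theta) / beta)

        # Hypothetical collapse fragilities for two isolator variants
        pga = np.linspace(0.05, 2.0, 50)
        p_collapse_hdrb = fragility(pga, theta=1.2, beta=0.50)
        p_collapse_sma_hdrb = fragility(pga, theta=1.5, beta=0.45)  # lower fragility

    A larger median capacity theta shifts the curve to the right, which is exactly what "lowest fragility at the collapse level" means in the abstract.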

  12. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xioabing

    1999-08-03

    The clustering techniques prove to be much more effective for the New Mexico data than for the Wyoming data, apparently because the New Mexico mines are closer and consequently the signal-to-noise ratios (SNRs) for those events are higher. To verify this hypothesis we experiment with adding Gaussian noise to the New Mexico data to simulate data from more distant sites. Our results suggest that clustering techniques can be very useful for identifying small anomalous events if at least one good recording is available, and that the only reliable way to improve clustering results is to process the waveforms to improve SNR. For events with good SNR that do have strong grouping, cluster analysis will reveal the inherent groupings regardless of the choice of clustering method.
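
    Waveform clustering of this kind is typically built on a dissimilarity of one minus the peak normalized cross-correlation, fed to hierarchical linkage. A minimal sketch with scipy; the cut threshold and placeholder data are illustrative:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def correlation_distance(waveforms):
            """Pairwise dissimilarity 1 - max normalized cross-correlation between
            event waveforms (rows); the lag search absorbs small pick errors."""
            n = len(waveforms)
            d = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    a = waveforms[i] - waveforms[i].mean()
                    b = waveforms[j] - waveforms[j].mean()
                    cc = np.correlate(a, b, mode="full")
                    cc /= (np.linalg.norm(a) * np.linalg.norm(b))
                    d[i, j] = d[j, i] = 1.0 - cc.max()
            return d

        wf = np.random.randn(12, 2000)      # placeholder event waveforms
        D = correlation_distance(wf)
        # linkage expects the condensed (upper-triangle) distance vector
        Z = linkage(D[np.triu_indices(12, k=1)], method="average")
        groups = fcluster(Z, t=0.3, criterion="distance")  # cut dendrogram at 1-CC = 0.3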

  13. Seismic Fragility Analysis of a Degraded Condensate Storage Tank

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y-S.; Kim, M.K.; Choi, I-K.

    2011-05-16

    The Korea Atomic Energy Research Institute (KAERI) and Brookhaven National Laboratory are conducting a collaborative research project to develop seismic capability evaluation technology for degraded structures and components in nuclear power plants (NPPs). One of the goals of this collaborative endeavor is to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The essential part of this collaboration is aimed at achieving a better understanding of the effects of aging on the performance of SSCs, and ultimately on the safety of NPPs. A recent search of degradation occurrences of structures and passive components (SPCs) showed that the rate of aging-related degradation in NPPs was not significantly large, but is increasing as the plants get older. The slow but increasing rate of degradation of SPCs can potentially affect the safety of the older plants and become an important factor in decision making amid the current trend of extending the operating license period of plants (e.g., in the U.S. from 40 years to 60 years, and potentially even to 80 years). The condition and performance of major aged NPP structures such as the containment contribute to the life span of a plant. A frequent misconception is that such a low degradation rate of SPCs may not pose significant risk to plant safety. However, under low-probability high-consequence initiating events, such as large earthquakes, SPCs that have slowly degraded over many years could affect plant safety, and these effects need to be better understood. As part of the KAERI-BNL collaboration, a condensate storage tank (CST) was analyzed to estimate its seismic fragility capacities under various postulated degradation scenarios. CSTs have been shown to have a significant impact on the seismic core damage frequency of a nuclear power plant. The seismic fragility capacity of the CST was developed

  14. An analysis of short-to-medium-range seismic attenuation tests using a multilayered viscoelastic seismic propagation model

    NASA Astrophysics Data System (ADS)

    Carnes, B. L.; Lundien, J. R.

    1984-11-01

    This study was conducted to provide a database from which to draw conclusive results on the efficiency of seismic wave propagation in natural terrain and the resolution and fidelity of multiple-frequency signals, and to supplement data for validation of theoretical models of seismic wave propagation. An extensive test program was conducted at White Sands Missile Range, New Mexico, using an electrohydraulic vibrator, an impulse loader, and a vehicle as sources of seismic waves over a 5-m to 1-km range, and using explosive seismic sources over a 1- to 10-km range. Results are presented for discrete-frequency vibration tests (1-120 Hz), tone burst tests (1-120 Hz), random noise vibration tests, and background noise tests for the vehicle, impulse, and explosive sources. Analysis of the data has been performed to correlate frequency, amplitude, range, and other signal characteristics with model predictions for future tests. This study relates the dispersion and attenuation of seismic waves, frequency resolution, and wind and background noise to the refinement of the WES seismic propagation model.

  15. Seismic margin review of the Maine Yankee Atomic Power Station: Fragility analysis

    SciTech Connect

    Ravindra, M. K.; Hardy, G. S.; Hashimoto, P. S.; Griffin, M. J.

    1987-03-01

    This Fragility Analysis is the third of three volumes for the Seismic Margin Review of the Maine Yankee Atomic Power Station. Volume 1 is the Summary Report of the first trial seismic margin review. Volume 2, Systems Analysis, documents the results of the systems screening for the review. The three volumes are part of the Seismic Margins Program initiated in 1984 by the Nuclear Regulatory Commission (NRC) to quantify seismic margins at nuclear power plants. The overall objectives of the trial review are to assess the seismic margins of a particular pressurized water reactor, and to test the adequacy of this review approach, quantification techniques, and guidelines for performing the review. Results from the trial review will be used to revise the seismic margin methodology and guidelines so that the NRC and industry can readily apply them to assess the inherent quantitative seismic capacity of nuclear power plants.

  16. Detection, Measurement, Visualization, and Analysis of Seismic Crustal Deformation

    NASA Technical Reports Server (NTRS)

    Crippen, R.; Blom, R.

    1995-01-01

    Remote sensing plays a key role in the analysis of seismic crustal deformation. Recently, radar interferometry has been used to measure one dimension of earthquake strain fields at a resolution of centimeters. Optical imagery is useful in measuring strain fields in both geographic dimensions down to 1/20 of the pixel size, and will soon be capable of higher resolution. Fault motion can also be detected by visual observation from space imagery and from aerial photographs.

  17. Kinematic Seismic Rupture Parameters from a Doppler Analysis

    NASA Astrophysics Data System (ADS)

    Caldeira, Bento; Bezzeghoud, Mourad; Borges, José F.

    2010-05-01

    The radiation emitted from extended seismic sources, especially when the rupture propagates in a preferred direction, presents spectral deviations that depend on the observation location. This effect, absent for point sources and known as directivity, is manifested as an increase in the frequency and amplitude of seismic waves when the rupture propagates toward the seismic station, and a decrease in frequency and amplitude when it propagates in the opposite direction. The directivity model that supports the method is a Doppler analysis based on a kinematic source model of rupture and wave propagation through a structural medium with spherical symmetry [1]. A unilateral rupture can be viewed as a sequence of shocks produced along certain paths on the fault. According to this model, the seismic record at any point on the Earth's surface contains a signature of the rupture process that originated the recorded waveform. Calculating the rupture direction and velocity by a general Doppler equation (the goal of this work) from a dataset of common time-delays read from waveforms recorded at different distances around the epicenter requires the normalization of the measurements to a standard value of slowness. This normalization involves a non-linear inversion that we solve numerically using an iterative least-squares approach. The performance of the technique was evaluated through a set of synthetic and real applications. We present the application of the method to four real case studies, the following earthquakes: Arequipa, Peru (Mw = 8.4, June 23, 2001); Denali, AK, USA (Mw = 7.8, November 3, 2002); Zemmouri-Boumerdes, Algeria (Mw = 6.8, May 21, 2003); and Sumatra, Indonesia (Mw = 9.3, December 26, 2004). The results obtained from the dataset of the four earthquakes agree, in general, with the values presented by other authors using different methods and data. [1] Caldeira B., Bezzeghoud M, Borges JF, 2009; DIRDOP: a directivity approach to determining
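
    The underlying frequency shift can be summarized by a standard directivity relation (a textbook form, not necessarily the exact equation of [1]): for a unilateral rupture of length L, rupture velocity v_r, and phase velocity c, the apparent source duration seen at angle theta from the rupture direction is

        \tau(\theta) = \frac{L}{v_r}\left(1 - \frac{v_r}{c}\cos\theta\right),
        \qquad f_c(\theta) \propto \frac{1}{\tau(\theta)}

    so the apparent corner frequency f_c rises toward the rupture direction and falls away from it, which is the spectral signature the time-delay inversion exploits.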

  18. Seismic Slope Stability Analysis: Gurpinar (Istanbul) as a Case History

    NASA Astrophysics Data System (ADS)

    Ozcep, Ferhat; Erol, Engin; Saracoglu, Fatih; Haliloglu, Mustafa

    2010-05-01

    Slope failures triggered by earthquakes are one of the most important soil problems. In this study, a dynamic (earthquake) slope stability analysis was carried out in the Gurpinar area. To this end, in situ tests (SPT) were carried out and laboratory samples were obtained from 6 boreholes (maximum depth 50.0 m) to determine soil classification and strength characteristics. Moreover, geophysical studies (seismic refraction and MASW) were also carried out in the area to estimate the structure and strength characteristics of the slope down to 50.0 m. All data obtained in the field and laboratory were used to construct the mechanical and structural (geometrical) model of the slope. To solve the slope stability problem, three soil slope models were considered for the area. For the dynamic case, a seismic hazard analysis was carried out in the region to estimate the earthquake acceleration. In the end, while no problem appeared under static conditions/loads, slope stability problems appeared with increasing earthquake acceleration. A geotechnical slope improvement project was proposed for the study area.
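
    The qualitative result (stable statically, unstable as the seismic coefficient grows) can be reproduced with the simplest pseudostatic model, an infinite slope with a horizontal inertial force. A sketch under that assumption; the soil parameters below are illustrative, not the Gurpinar values:

        import numpy as np

        def pseudostatic_fs(c, phi, gamma, z, beta, kh):
            """Pseudostatic factor of safety for an infinite slope (no seepage).
            c: cohesion (kPa); phi: friction angle (deg); gamma: unit weight (kN/m^3);
            z: failure-plane depth (m); beta: slope angle (deg); kh: horizontal
            seismic coefficient (fraction of g)."""
            b = np.radians(beta)
            w = gamma * z  # weight of the soil column per unit area
            # Stresses on the slip plane, including the horizontal inertial force
            tau = w * np.sin(b) * np.cos(b) + kh * w * np.cos(b) ** 2
            sigma = w * np.cos(b) ** 2 - kh * w * np.sin(b) * np.cos(b)
            return (c + sigma * np.tan(np.radians(phi))) / tau

        # FS drops below 1 as the seismic coefficient grows
        for kh in (0.0, 0.1, 0.2, 0.3):
            print(kh, round(pseudostatic_fs(c=10, phi=30, gamma=19, z=5, beta=30, kh=kh), 2))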

  19. 77 FR 69509 - Combining Modal Responses and Spatial Components in Seismic Response Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... COMMISSION Combining Modal Responses and Spatial Components in Seismic Response Analysis AGENCY: Nuclear... Components in Seismic Response Analysis'' as an administratively changed guide in which there are minor... response analysis of nuclear power plant structures, systems, and components that are important to...

  1. Seismic analysis of base-isolated liquid storage tanks

    NASA Astrophysics Data System (ADS)

    Shrimali, M. K.; Jangid, R. S.

    2004-08-01

    Three analytical studies of the seismic response of base-isolated, ground-supported cylindrical liquid storage tanks under recorded earthquake ground motion are presented. The continuous liquid mass of the tank is modelled as lumped masses referred to as the sloshing mass, the impulsive mass and the rigid mass. Firstly, the seismic response of isolated tanks is obtained using the modal superposition technique and compared with the exact response to study the effects of non-classical damping. The comparison of results for different tank aspect ratios and bearing stiffness and damping values indicates that the effects of non-classical damping are insignificant, implying that the response of isolated liquid storage tanks can be accurately obtained by modal analysis with the classical damping approximation. The second investigation involves the analysis of base-isolated liquid storage tanks using the response spectrum method, in which the peak response of the tank in different modes is obtained for the specified response spectrum of earthquake motion and combined using different combination rules. The results indicate that the peak response obtained by the response spectrum method matches the corresponding exact response well. However, a specific combination rule should be used for better estimation of the various response quantities of the isolated tanks. Finally, closed-form expressions for the modal parameters of base-isolated liquid storage tanks are derived and compared with the exact values. A simplified approximate method is also proposed to evaluate the seismic response of isolated tanks. The response obtained from this approximate method was found to be in good agreement with the exact response.
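
    The combination step in the response spectrum method is easy to illustrate. Below is a minimal sketch of the SRSS rule applied to peak modal responses (the numbers are illustrative); for closely spaced modes a correlation-aware rule such as CQC is the usual refinement, which is the kind of rule-dependence the abstract alludes to:

        import numpy as np

        def srss(peaks):
            """Square-root-of-sum-of-squares combination of modal peak responses."""
            peaks = np.asarray(peaks, dtype=float)
            return np.sqrt(np.sum(peaks ** 2))

        # Hypothetical peak modal base shears (MN) for the sloshing, impulsive and
        # rigid lumped masses, read off a design response spectrum:
        modal_shear = [0.8, 4.2, 1.1]
        print(srss(modal_shear))  # combined peak base shear estimate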

  2. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating probabilistic seismic hazard curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs based on empirical attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in PSHA will rely on the use of more physics-based waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies, including high performance computing and grid-based scientific workflows, in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERFs), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200 km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one, or more, Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are
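
    Whatever the IMR, the hazard-curve arithmetic is the same: sum rupture rates weighted by the probability that each rupture exceeds a given shaking level. A minimal sketch; the rupture list and lognormal parameters are illustrative, and a simulation-based IMR would supply the medians in place of an attenuation relationship:

        import numpy as np
        from scipy.stats import norm

        def hazard_curve(im_levels, ruptures):
            """Annual exceedance rate of ground-motion levels at one site.
            ruptures: list of (annual_rate, median_im, sigma_ln) triples."""
            lam = np.zeros_like(im_levels, dtype=float)
            for rate, med, sig in ruptures:
                # P(IM > x | rupture), lognormal ground-motion variability
                p_exceed = 1.0 - norm.cdf(np.log(im_levels / med) / sig)
                lam += rate * p_exceed
            return lam

        ims = np.linspace(0.01, 2.0, 100)               # spectral acceleration (g)
        rups = [(0.01, 0.35, 0.6), (0.002, 0.8, 0.55)]  # illustrative rupture set
        lam = hazard_curve(ims, rups)
        p50 = 1.0 - np.exp(-lam * 50.0)                 # 50-year exceedance probability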

  3. Seismic fragility analysis of buried steel piping at P, L, and K reactors

    SciTech Connect

    Wingo, H.E.

    1989-10-01

    Analysis of seismic strength of buried cooling water piping in reactor areas is necessary to evaluate the risk of reactor operation because seismic events could damage these buried pipes and cause loss of coolant accidents. This report documents analysis of the ability of this piping to withstand the combined effects of the propagation of seismic waves, the possibility that the piping may not behave in a completely ductile fashion, and the distortions caused by relative displacements of structures connected to the piping.

  4. Mining-induced seismicity in faulted geologic structures: An analysis of seismicity-induced slip potential

    NASA Astrophysics Data System (ADS)

    Swanson, P. L.

    1992-09-01

    Relationships between the locations of mining-induced seismic events, local fault structure, and mine geometry were examined in a deep hard-rock mine in northern Idaho. Stopes experiencing rock bursts and other large seismic events were found to fall into two structural regimes: the “Silver Vein” and the “N48°W Trend,” a steeply dipping plane of seismic activity that is subparallel to major local steeply dipping faults which bound blocky structures. The N48°W Trend also intersects a shaft that was seriously damaged when fault gouge was expelled into the opening during a 3-month period of high seismic energy release. Models of stress interaction are used to support the hypothesis that mining-induced deformation was mobilized along a 1.5 km length of the N48°W Trend. Specifically, numerical models are used to simulate the rupture of seismic events and to estimate the induced changes in the quasi-static stress field. A Coulomb failure criterion is used with these results to estimate the spatial variation in the potential for slip on planes parallel to local faulting. Increases in the potential for slip on fault planes subparallel to the N48°W Trend are consistent with activation of deformation along its 1.5 km length. For events with constant seismic moment, stress drop is shown to be far more important than source dimension in elevating slip potential along the observed plane of seismic activity.
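
    The slip-potential measure in such studies is typically the change in Coulomb failure stress resolved on a receiver fault. A one-line sketch of that criterion; the sign convention and numbers are illustrative:

        def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.6):
            """Change in Coulomb failure stress on a receiver fault plane.
            d_tau: shear stress change resolved in the slip direction (MPa, positive
            in the direction of fault slip); d_sigma_n: normal stress change (MPa,
            positive = unclamping); mu_eff: effective friction coefficient."""
            return d_tau + mu_eff * d_sigma_n

        # Illustrative: an excavation adds 0.5 MPa of shear load and unclamps the
        # fault by 0.2 MPa -> slip potential rises by 0.62 MPa.
        print(coulomb_stress_change(0.5, 0.2))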

  5. MSNoise: A framework for Continuous Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien

    2016-04-01

    MSNoise is an open and free Python package, known as the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well-maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand), and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data to focus, e.g., on fault zones, intraplate Europe, geothermal exploitation, or Antarctica. We first present the general workings of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity provides a new potential to easily test new algorithms for each processing step. For example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (the default is linear stacking, averaging), or dv/v estimation (the default is the moving-window cross-spectrum "MWCS", the so-called "doublet" method), etc. We present the last major evolution of MSNoise from a single workflow (data archive to dv/v) to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, using MSNoise as a "cross-correlation" toolbox and demystifying surface wave tomography! Finally, the poster will be a meeting point for all those using or willing to use MSNoise, to meet the developer, exchange ideas and wishes!
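
    The heart of the workflow, frequency-domain cross-correlation of two noise traces followed by linear stacking, fits in a few lines of numpy. The sketch below is a toy stand-in for that step, not MSNoise's actual API:

        import numpy as np

        def daily_cross_correlation(tr1, tr2, fs, max_lag_s=120.0):
            """One cross-correlation of two pre-processed noise traces, computed
            in the frequency domain, trimmed to +/- max_lag_s around zero lag."""
            n = len(tr1) + len(tr2) - 1
            nfft = 1 << (n - 1).bit_length()            # next power of two
            X1, X2 = np.fft.rfft(tr1, nfft), np.fft.rfft(tr2, nfft)
            cc = np.fft.irfft(X1 * np.conj(X2), nfft)
            cc = np.roll(cc, nfft // 2)                 # put zero lag in the middle
            half = int(max_lag_s * fs)
            mid = nfft // 2
            return cc[mid - half : mid + half + 1]

        # Linear stacking (the MSNoise default) simply averages many daily
        # correlations, strengthening the coherent Green's function:
        # ref = np.mean(np.vstack(daily_ccs), axis=0)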

  6. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore areas of Ventura, Los Angeles and Orange counties, for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions, and for the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL) in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities and for 69 nuclear plant sites in the Eastern United States for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternative spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997) and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area, but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  7. Seismic elastic-plastic time history analysis and reliability study of quayside container crane

    NASA Astrophysics Data System (ADS)

    Jin, Yulong; Li, Zengguang

    2010-06-01

    The quayside container crane is a huge steel structure and the major equipment used for handling containers at modern ports. To validate the safety and reliability of the crane under seismic loads, an elastic-plastic time history analysis under rare seismic intensity is carried out in addition to the conventional analysis. An idealized finite element (FEM) elastic-plastic mechanical model of the quayside container crane is presented using the ANSYS code. Furthermore, according to elastic-plastic time history analysis theory, the deformation, stress, and damage pattern of the structure under rare seismic intensity are investigated. Based on the above analysis, a reliability model established according to reliability theory, together with a seismic reliability analysis based on Monte Carlo simulation, is applied to the practical analysis. The results show that the overall structure of the quayside container crane is generally unstable under rare seismic intensity, and the structure needs to be reinforced.

  8. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB), and in particular central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that includes potential ground motions induced by earthquakes. The region is juxtaposed with major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards to large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large-magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as
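
    The essence of site-based stochastic simulation is to shape windowed Gaussian noise with a target Fourier spectrum. A toy version under simple assumptions (an omega-squared source spectrum and kappa site attenuation; all parameter values and the envelope form are illustrative, not the study's calibrated model):

        import numpy as np

        def stochastic_ground_motion(duration, fs, f0=2.0, kappa=0.04, seed=0):
            """Minimal stochastic accelerogram: enveloped Gaussian noise shaped in
            the frequency domain by an omega-squared spectrum and a kappa filter."""
            rng = np.random.default_rng(seed)
            n = int(duration * fs)
            t = np.arange(n) / fs
            # Quick-rise, exponential-decay envelope
            env = t ** 2 * np.exp(-t / (0.2 * duration))
            noise = rng.standard_normal(n) * env / env.max()
            f = np.fft.rfftfreq(n, 1 / fs)
            spec = np.fft.rfft(noise)
            # Omega-squared acceleration spectrum with corner f0, kappa attenuation
            shape = (f ** 2 / (1 + (f / f0) ** 2)) * np.exp(-np.pi * kappa * f)
            acc = np.fft.irfft(spec * shape, n)
            return t, acc / np.abs(acc).max()  # normalized accelerogram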

  9. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    SciTech Connect

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess, in a staged manner, the vulnerability of safeguards systems to insiders. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and the assumptions used for the analysis.

  10. Latest development in seismic texture analysis for subsurface structure, facies, and reservoir characterization: A review

    SciTech Connect

    Gao, Dengliang

    2011-03-01

    In exploration geology and geophysics, seismic texture is still a developing concept that is not yet widely known, although quite a number of different algorithms have been published in the literature. This paper provides a review of seismic texture concepts and methodologies, focusing on the latest developments in seismic amplitude texture analysis, with particular reference to the gray level co-occurrence matrix (GLCM) and the texture model regression (TMR) methods. The GLCM method evaluates spatial arrangements of amplitude samples within an analysis window using a matrix (a two-dimensional histogram) of amplitude co-occurrence. The matrix is then transformed into a suite of texture attributes, such as homogeneity, contrast, and randomness, which provide the basis for seismic facies classification. The TMR method uses a texture model as a reference to discriminate among seismic features based on a linear, least-squares regression analysis between the model and the data within an analysis window. By implementing customized texture model schemes, the TMR algorithm has the flexibility to characterize subsurface geology for different purposes. A texture model with a constant phase is effective at enhancing the visibility of seismic structural fabrics, a texture model with a variable phase is helpful for visualizing seismic facies, and a texture model with variable amplitude, frequency, and size is instrumental in calibrating seismic to reservoir properties. Preliminary test case studies in the very recent past have indicated that the latest developments in seismic texture analysis have added to the existing amplitude interpretation theories and methodologies. These and future developments in seismic texture theory and methodologies will hopefully lead to a better understanding of the geologic implications of the seismic texture concept and to an improved geologic interpretation of reflection seismic amplitude
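
    The GLCM computation itself is compact. A minimal numpy sketch for one offset and the three attributes named above; the window size, level count, and offset are illustrative choices:

        import numpy as np

        def glcm_horizontal(img, levels=16):
            """Gray-level co-occurrence matrix for a horizontal offset of one sample.
            img: 2D amplitude window, quantized to `levels` gray levels."""
            edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
            q = np.digitize(img, edges)                 # values in 0..levels-1
            m = np.zeros((levels, levels))
            for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
                m[a, b] += 1
            m += m.T                                    # count both directions
            return m / m.sum()                          # normalize to probabilities

        def texture_attributes(p):
            """Homogeneity, contrast and randomness (entropy) from a normalized GLCM."""
            i, j = np.indices(p.shape)
            homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
            contrast = np.sum(p * (i - j) ** 2)
            entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            return homogeneity, contrast, entropy

        # Sliding a small window over a seismic section and mapping these
        # attributes yields texture volumes for facies classification.
        win = np.random.randn(32, 32)      # placeholder amplitude window
        print(texture_attributes(glcm_horizontal(win)))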

  11. Seismic catalog condensation with applications to multifractal analysis of South Californian seismicity

    NASA Astrophysics Data System (ADS)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2014-05-01

    The latest advances in instrumentation have increased station coverage and lowered event detection thresholds. This has resulted in a vast increase in the number of located events each year. The abundance of data comes as a double-edged sword: while it facilitates more robust statistics and provides better confidence intervals, it also paralyzes computations whose execution times grow exponentially with the number of data points. In this study, we present a novel method that assesses the relative importance of each data point and reduces the size of datasets while preserving their information content. For a given seismic catalog, the goal is to express the same spatial probability density distribution with fewer data points. To achieve this, we exploit the fact that seismic catalogs are not optimally encoded. This coding deficiency is the result of sequential data entry, where new events are added without taking previous ones into account. For instance, if several events with identical parameters occur at the same location, they could be grouped together rather than occupying as much memory space as if they were distinct events. Following this reasoning, the proposed condensation methodology is implemented by grouping all events according to their overall variance; starting from the group with the highest variance (worst location uncertainty), each event is sampled by a number of sample points, and these points are then used to determine which better-located events can express these probable locations with a higher likelihood. Based on these likelihood comparisons, weights from poorly located events are successively transferred to better-located ones. As a result of the process, a large portion of the events (~30%) ends up with zero weight (being fully represented by events that increased their weights), while the information content (i.e., the sum of all weights) remains preserved. The resulting condensed catalog not only provides

  12. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and its associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use, and adaptive capacity. Two global grids were incorporated first: (1) the LandScanTM global population distribution; and (2) the frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on the WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. The analysis has yielded country-level scores and maps displaying the ranking of exposure scores (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  13. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case studies, based on real data such as reactor core inventory and estimations from known accidents, for a number of typical meteorological conditions and different accident scenarios. Then, using these simulations and a population database as input data, the GIS analysis reveals the administrative units at highest risk with respect to the mean individual and collective doses received by the population. Two alternative methods were then suggested to assess the probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples), based on socio-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The two latter probabilities were calculated by atmospheric trajectory models and statistical methods over many years. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow the development of a common methodological approach for the complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation, etc. The approach was capable of producing risk/vulnerability maps of the Nordic countries and of revealing the most vulnerable provinces with respect to the radiation risk sites.

  14. Geotechnical stability analysis, fragility of structures and velocity of movement to assess landslides vulnerability

    NASA Astrophysics Data System (ADS)

    Cuanalo, O.; Bernal, E.; Polanco, G.

    2014-09-01

    Landslides are geohazards that pose potential risks to life and property; these phenomena usually cause disasters when they occur in densely populated communities such as those that inhabit mountainous and steep regions. Hazard and vulnerability are parameters determined by probabilistic mathematical analysis, with values between 0 and 1. When there are no records or not enough information regarding historical occurrences of the phenomenon under study in a specific area (as in several mountainous regions of Mexico inhabited by ethnic groups), one has the disadvantage of not being able to perform a statistical analysis to properly evaluate the hazard or the vulnerability. To solve this problem, this paper presents a proposal for evaluating the physical and functional vulnerability of the elements at risk from two fundamental aspects: (a) the exposure level (EL), and (b) the expected damage degree (EDD). The first of these factors is determined by the severity index (SI) and the safety factor from geotechnical stability analysis (SFgeo); the second by the construction type (degree of fragility of the structures) and the velocity the landslide may have. Tables, graphs, and equations proposed by the authors are included for evaluating the aforementioned parameters.

  15. Signal-to-noise ratio application to seismic marker analysis and fracture detection

    NASA Astrophysics Data System (ADS)

    Xu, Hui-Qun; Gui, Zhi-Xian

    2014-03-01

    Seismic data with high signal-to-noise ratios (SNRs) are useful in reservoir exploration. To obtain high-SNR seismic data, significant effort is required to achieve noise attenuation in seismic data processing, which is costly in material, human, and financial resources. We introduce a method for improving the SNR of seismic data, in which the SNR is calculated using a frequency-domain method. Furthermore, we optimize and discuss the critical parameters and the calculation procedure. We applied the proposed method to real data and found that the SNR is high at the seismic marker and low in the fracture zone. Consequently, this can be used to extract detailed information about fracture zones that are inferred by structural analysis but not observed in conventional seismic data.
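
    A common frequency-domain SNR definition compares the power spectrum of a signal window with that of a preceding noise window, averaged over a band of interest; the paper's exact formulation may differ. A minimal sketch:

        import numpy as np

        def snr_frequency_domain(signal_win, noise_win, fs, band=(10.0, 60.0)):
            """SNR (dB) from power spectra of equal-length signal and noise windows,
            averaged over a frequency band."""
            assert len(signal_win) == len(noise_win)
            f = np.fft.rfftfreq(len(signal_win), 1 / fs)
            ps = np.abs(np.fft.rfft(signal_win)) ** 2   # signal power spectrum
            pn = np.abs(np.fft.rfft(noise_win)) ** 2    # noise power spectrum
            sel = (f >= band[0]) & (f <= band[1])
            return 10.0 * np.log10(ps[sel].mean() / pn[sel].mean())

    Mapping this value trace by trace along an interpreted horizon is what makes high-SNR markers and low-SNR fracture zones stand out.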

  16. Seismic source models for probabilistic hazard analysis of Georgia (Southern Caucasus)

    NASA Astrophysics Data System (ADS)

    Javakhishvili, Z.; Godoladze, T.; Gamkrelidze, E.; Sokhadze, G.

    2014-12-01

    A seismic source model is one of the main components of probabilistic seismic hazard analysis. The active faults and tectonics of Georgia (Southern Caucasus) have been investigated in numerous scientific studies. The Caucasus consists of different geological structures with complex interactions. The major structures trend WNW-ESE, and focal mechanisms indicate primarily thrust faults striking parallel to the mountains. It is a part of the Alpine-Himalayan collision belt and is well known for its high seismicity. Although the geodynamic activity of the region, caused by the convergence of the Arabian and Eurasian plates at a rate of several cm/year, is well established, different tectonic models have been proposed to explain the seismic process in the region. The recent model of seismic sources for the Caucasus derives from seismotectonic studies performed in Georgia in the framework of different international projects. We have analyzed previous studies and recent investigations on the basis of new seismic data (spatial distribution, moment tensor solutions, etc.), GPS, and other data. As a result, a database of seismic source models was compiled. Seismic sources are modeled as lines representing the surface projection of active faults, or as wide areas (source zones) where earthquakes can occur randomly. Each structure or zone was quantified on the basis of different parameters. Recent experience in the harmonization of cross-border structures was used. As a result, a new seismic source model of Georgia (Southern Caucasus) for hazard analysis was created.

  17. Storey building early monitoring based on rapid seismic response analysis

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Sunardi, Bambang; Rudyanto, Ariska

    2016-05-01

    Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to growth in the number of structures instrumented with such systems. An equally important factor in this growth is the demand by stakeholders for rapid answers to important questions about the functionality or state of "health" of structures during and immediately after seismic events. Consequently, this study aims to monitor a storey building based on seismic response, i.e., earthquake and tremor analysis at short time lapse, using accelerograph data. This study used a storey building (X) in Jakarta that experienced the effects of the Kebumen earthquake of January 25th, 2014, the Pandeglang earthquake of July 9th, 2014, and the Lebak earthquake of November 8th, 2014. The tremors used in this study are those recorded after these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), the A/V ratio, acceleration amplification, and effective duration (te). The natural frequency (f0) and the peak of the H/V ratio were then determined using the H/V ratio method. The earthquake data processing results show that the peak ground motion, response spectra, A/V ratio and acceleration amplification increase with height, while the effective duration gives a different view of the building dynamics, because for the Kebumen earthquake the duration shows the highest energy on the highest floor, but for the Pandeglang and Lebak earthquakes on the lowest floor. The tremor data processing results for one month after each earthquake show the natural frequency of the building at a constant value. The increase of peak ground motion, response spectra, A/V ratio and acceleration amplification, and the decrease of effective duration, with increasing building floor show that the building construction supports the
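
    The H/V step is straightforward to sketch: smooth the amplitude spectra of the two horizontal components, combine them, and divide by the vertical; the peak frequency estimates f0. A minimal numpy version (the window handling and smoothing choices are ours, not necessarily the authors'):

        import numpy as np

        def hv_ratio(e, n, z, fs, smooth=11):
            """Horizontal-to-vertical spectral ratio from equal-length
            three-component records; returns frequencies, H/V curve, and f0."""
            spec = lambda x: np.abs(np.fft.rfft(x * np.hanning(len(x))))
            h = np.sqrt(spec(e) * spec(n))           # geometric mean of horizontals
            v = spec(z)
            k = np.ones(smooth) / smooth             # simple moving-average smoothing
            hv = np.convolve(h, k, "same") / np.convolve(v, k, "same")
            f = np.fft.rfftfreq(len(z), 1 / fs)
            return f, hv, f[hv[1:].argmax() + 1]     # skip the DC bin for the peak

    A stable f0 from tremor windows recorded month after month, as reported above, is the simplest indicator that the structure has not softened.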

  18. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    SciTech Connect

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele

    2008-07-08

    Analysis of the seismic performance of a solid waste landfill generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should be different. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, which requires the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strength of liner materials, due to the effect of inertial forces in the refuse mass. In this paper, the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. The numerical results indicate that the seismic response of a landfill can vary significantly with reasonable variations in waste properties, fill heights, site conditions, and design rock motions.

  19. Seismic hazard analysis for the NTS spent reactor fuel test site

    SciTech Connect

    Campbell, K.W.

    1980-05-02

    An experiment is being directed at the Nevada Test Site to test the feasibility of storing spent fuel from nuclear reactors in geologic media. As part of this project, an analysis of the earthquake hazard was prepared. This report presents the results of this seismic hazard assessment. Two distinct components of the seismic hazard were addressed: vibratory ground motion and surface displacement. (ACR)

  20. An analysis of a seismic reflection from the base of a gas hydrate zone, offshore Peru

    USGS Publications Warehouse

    Miller, J.J.; Lee, M.W.; Von Huene, R.

    1991-01-01

    Seismic reflection data recorded near ODP Site 688, offshore Peru, exhibit a persistent bottom-simulating reflector (BSR) at a depth corresponding to the theoretical base of the gas hydrate stability field. To carry out a quantitative analysis of the BSR, the seismic data were reprocessed using signature deconvolution and true amplitude recovery techniques. Results indicate the BSR is laterally discontinuous. -from Authors

  1. Landscape Vulnerability Analysis from Historic Lower Mississippi River Flood in 2011

    NASA Astrophysics Data System (ADS)

    Goodwell, A. E.; Zhu, Z.; Dutta, D.; Greenberg, J.; Kumar, P.; Garcia, M. H.; Rhoads, B. L.; Parker, G.; Berretta, D.; Holmes, R. R.

    2012-12-01

    This study presents the results of a landscape vulnerability analysis of the Birds Point-New Madrid Floodway in southeastern Missouri. The U.S. Army Corps of Engineers intentionally inundated 500 square kilometers of agricultural floodplain in May 2011 as an emergency flood control measure. We use pre-flood (2005) and post-flood (2011) high-resolution Lidar data to establish the landscape impact of the levee breach on the floodplain. The Lidar DEMs were corrected for flight-line errors using a Fourier filtering technique, and then subtracted to obtain a differential DEM of erosion and deposition patterns. We use soil erosion characteristics, AVIRIS remote sensing data, and 2D floodplain modeling to analyze the three components of vulnerability: sensitivity, exposure, and adaptive capacity. HydroSed2D (Liu, Landry and García 2008), a 2D flow model, is implemented to simulate flow depths and speeds, i.e., flood exposure, over the entire floodway, as well as over smaller sections at increased resolution using a nested grid. We classify woody vegetation based on AVIRIS remote sensing data, and represent vegetated regions in the model through varying values of Manning's n coefficient. Soil erodibility, vegetation, topography, and flow characteristics are compared with the observed landscape changes within the floodplain. Overall, the floodway showed remarkable resilience to an extreme flood event. When compared with levee breaches on similar rivers in other floods, the lack of newly deposited sediment is noticeable, and likely attributable to the presence of a substantial riparian corridor between the main channel of the Mississippi River and the floodway. Although many meander scars indicating former channels of the Mississippi River are apparent in the topography, only one, known as O'Bryan Ridge, experienced high volumes of erosion and deposition due to the flooding. The vulnerability analysis supports the hypothesis that this high impact is due to a combination of vulnerability

  2. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
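
    For readers who want a scriptable equivalent of this kind of workflow, the sketch below uses ObsPy (not part of SeismicCanvas) to fetch a waveform through the same IRIS FDSN web services and apply a comparable demean/detrend/taper/filter chain; the station and time window are arbitrary examples.

```python
from obspy.clients.fdsn import Client
from obspy import UTCDateTime

# Fetch a waveform via IRIS FDSN web services, then apply a basic
# processing chain (demean, detrend, taper, band-pass filter).
client = Client("IRIS")
t0 = UTCDateTime("2011-03-11T05:46:24")      # 2011 Tohoku origin time
st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

st.detrend("demean")
st.detrend("linear")
st.taper(max_percentage=0.05, type="hann")
st.filter("bandpass", freqmin=0.01, freqmax=0.5, corners=4, zerophase=True)
st.plot()
```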

  3. Seismic signature analysis for discrimination of people from animals

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Mehmood, Asif; Sabatier, James M.

    2013-05-01

    Cadence analysis has been the main focus for discriminating between the seismic signatures of people and animals. However, cadence analysis fails when multiple targets are generating the signatures. We analyze the mechanism of human walking and the signature generated by a human walker, and compare it with the signature generated by a quadruped. We develop Fourier-based analysis to differentiate the human signatures from the animal signatures. We extract a set of basis vectors to represent the human and animal signatures using non-negative matrix factorization, and use them to separate and classify both targets. Grazing animals such as deer and cows often produce sporadic signals as they move from patch to patch of grass, and one must characterize these signals to differentiate them from the signature generated by a horse steadily walking along a path. These differences in the signatures are used in developing a robust algorithm to distinguish the signatures of animals from humans. The algorithm is tested on real data collected in a remote area.
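
    The basis-extraction idea can be sketched with an off-the-shelf non-negative matrix factorization applied to a magnitude spectrogram. The cadence frequencies, window sizes, and component count below are illustrative assumptions on a synthetic stand-in trace, not the paper's settings.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import NMF

# Learn non-negative spectral bases from a magnitude spectrogram;
# footstep-like and quadruped-like signals load on different bases.
fs = 1000.0
t = np.arange(0, 20, 1 / fs)
# Synthetic stand-ins: ~2 Hz human cadence vs ~3.3 Hz quadruped gait.
human = (np.sin(2 * np.pi * 2.0 * t) > 0.99).astype(float)
animal = (np.sin(2 * np.pi * 3.3 * t) > 0.995).astype(float)
trace = np.convolve(human + animal, np.hanning(50), mode="same")
trace += 0.01 * np.random.default_rng(1).normal(size=t.size)

f, tt, S = spectrogram(trace, fs=fs, nperseg=256, noverlap=192)
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(S)        # spectral basis vectors (freq x comp)
H = model.components_             # activations over time (comp x time)
print("basis matrix:", W.shape, "activations:", H.shape)
```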

  4. Letter report seismic shutdown system failure mode and effect analysis

    SciTech Connect

    KECK, R.D.

    1999-09-01

    The Supply Ventilation System Seismic Shutdown ensures that the 234-5Z building supply fans, the dry air process fans, and the vertical development calciner are shut down following a seismic event. This analysis evaluates the failure modes and determines their effects.

  5. China's water resources vulnerability: A spatio-temporal analysis during 2003-2013

    NASA Astrophysics Data System (ADS)

    Cai, J.; Varis, O.; Yin, H.

    2015-12-01

    The present highly serious situation of China's water environment and aquatic ecosystems has arisen in the context of its stunning socioeconomic development over the past several decades. Therefore, an analysis with high spatio-temporal resolution of the vulnerability assessment of water resources (VAWR) in China is urgently needed. However, to our knowledge, the temporal analysis of VAWR has not yet been addressed. Consequently, we performed, for the first time, a comprehensive spatio-temporal analysis of China's water resources vulnerability (WRV), using a composite index approach with an array of aspects highlighting key challenges that China's water resources system is nowadays facing. During our study period of 2003-2013, the political weight of China's integrated water resources management increased continuously. Hence, it is essential, based on the historical socioeconomic changes influenced by water-environment policy making and implementation, to reveal China's WRV in order to pinpoint key challenges to the healthy functioning of its water resources system. The water resources system in the North and Central Coast appeared more vulnerable than that in Western China. China's water use efficiency has grown substantially over the study period, as have water supply and sanitation coverage. In contrast, water pollution has worsened remarkably in most parts of China, as have water scarcity and shortage in the most stressed parts of the country. This spatio-temporal analysis implies that the key challenges to China's water resources system are rooted not only in the geographical mismatch between socioeconomic development (e.g. water demand) and water resources endowments (e.g. water resources availability), but also in the intertwinement between socioeconomic development and national strategic policy making.
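
    The composite index approach itself is straightforward to prototype. The sketch below shows one generic construction (min-max normalization, direction adjustment, weighted aggregation); the indicator names, weights, and data are hypothetical, not those of the study.

```python
import numpy as np

def composite_vulnerability(indicators, weights, invert=None):
    """Generic composite-index sketch: min-max normalize each indicator
    across spatial units, invert indicators where higher raw values
    mean LOWER vulnerability, then aggregate with normalized weights.
    indicators: array of shape (n_units, n_indicators)."""
    x = np.asarray(indicators, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    norm = (x - lo) / np.where(hi > lo, hi - lo, 1.0)
    if invert is not None:
        norm[:, invert] = 1.0 - norm[:, invert]
    w = np.asarray(weights, dtype=float)
    return norm @ (w / w.sum())

# Hypothetical indicators: water stress, pollution load, use efficiency
# (efficiency is inverted: higher efficiency -> lower vulnerability).
data = [[0.8, 0.6, 0.7], [0.3, 0.9, 0.4], [0.5, 0.2, 0.9]]
print(composite_vulnerability(data, weights=[1, 1, 1], invert=[2]))
```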

  6. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed, and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard analysis is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is presented, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  7. Detecting Seismic Activity with a Covariance Matrix Analysis of Data Recorded on Seismic Arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N.; de Rosny, J.; Brenguier, F.

    2014-12-01

    Modern seismic networks record ground motion continuously all around the world, with very broadband and high-sensitivity sensors. The aim of our study is to apply statistical array-based approaches to the processing of these records. We use methods brought mainly from random matrix theory to give a statistical description of seismic wavefields recorded at the Earth's surface. We estimate the array covariance matrix and explore the distribution of its eigenvalues, which contains information about the coherency of the sources that generated the studied wavefields. With this approach, we can distinguish between signals generated by isolated deterministic sources and the "random" ambient noise. We design an algorithm that uses the distribution of the array covariance matrix eigenvalues to detect signals corresponding to coherent seismic events. We investigate the detection capacity of our method at different scales and in different frequency ranges by applying it to the records of two networks: (1) the seismic monitoring network operating on the Piton de la Fournaise volcano at La Réunion island, composed of 21 receivers with an aperture of ~15 km, and (2) the transportable component of the USArray, composed of ~400 receivers with ~70 km inter-station spacing.
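
    A minimal version of the eigenvalue-based detector can be written directly in NumPy. The sketch below estimates a cross-spectral covariance matrix by window averaging and compares its normalized eigenvalue spectrum for a coherent plane wave versus uncorrelated noise; the array size, band, and window counts are illustrative.

```python
import numpy as np

def covariance_eigenvalues(records, fs, band=(1.0, 10.0), nwin=16):
    """Array-covariance sketch: estimate the network covariance matrix
    in a frequency band by averaging over sub-windows and return its
    normalized eigenvalues (descending). One coherent source
    concentrates energy in few eigenvalues; ambient noise spreads it.
    records: array of shape (n_stations, n_samples)."""
    n_sta, n_samp = records.shape
    win = n_samp // nwin
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    cov = np.zeros((n_sta, n_sta), dtype=complex)
    for k in range(nwin):
        seg = records[:, k * win:(k + 1) * win]
        spec = np.fft.rfft(seg * np.hanning(win), axis=1)[:, sel]
        cov += spec @ spec.conj().T        # cross-spectral accumulation
    lam = np.linalg.eigvalsh(cov / nwin)
    lam = lam[::-1]
    return lam / lam.sum()

# Coherent signal across 8 stations vs independent noise:
rng = np.random.default_rng(2)
sig = np.sin(2 * np.pi * 5.0 * np.arange(4096) / 100.0)
coherent = np.tile(sig, (8, 1)) + 0.1 * rng.normal(size=(8, 4096))
noise = rng.normal(size=(8, 4096))
print("coherent:", covariance_eigenvalues(coherent, 100.0)[:3])
print("noise:   ", covariance_eigenvalues(noise, 100.0)[:3])
```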

  8. Application and Validation of a GIS Model for Local Tsunami Vulnerability and Mortality Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.

    2015-12-01

    The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally a certain tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. Hence, the model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agree well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as identify the main risk drivers. The research leading to these results has received funding from CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of
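
    The hazard-exposure-vulnerability convolution at the heart of such a model reduces, per grid cell, to a product of scenario likelihood, population present, and a depth- and building-class-dependent fatality probability. The sketch below illustrates that structure only; the logistic depth-fatality curves and class parameters are placeholders, not the calibrated curves of the GIS model.

```python
import numpy as np

def expected_fatalities(flow_depth, population, building_class,
                        p_scenario=1.0):
    """Hazard x exposure x vulnerability per grid cell for one tsunami
    scenario. The depth-fatality curves are illustrative logistic
    placeholders: weaker classes reach 50% fatality at lower depth."""
    d50 = {"light": 2.0, "masonry": 3.0, "rc": 4.5}[building_class]
    p_killed = 1.0 / (1.0 + np.exp(-2.0 * (flow_depth - d50)))
    return p_scenario * population * p_killed

depths = np.array([0.5, 2.0, 6.0])   # max flow depth per cell (m)
pop = np.array([120, 80, 40])        # people present per cell
print(expected_fatalities(depths, pop, "masonry"))
```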

  9. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators

    PubMed Central

    Beccari, Benjamin

    2016-01-01

    related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised, on average, only 12% of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. Discussion: A number of potential limitations of the present state of practice, and how these might impact decision makers, are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development. PMID:27066298

  10. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is based on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important for understanding the nature of risks and for making correct, ethical decisions. Most risk analysts today recognize the new nature of modern risks: we face coherent or systemic risks, whose realization leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates in the complicated nature of the heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, whose analysis requires multi-source data and sophisticated tools. The formal basis for the analysis of this type of risk has been developed during the last 5-7 years, but issues of social fairness, ethics, and education require further development. One aspect of the analysis of social issues in risk management is studied in this paper. A formal algorithm for the quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological base and algorithm, it is possible to obtain a regularized spatial-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The analysis of disaster data demonstrates that about half of direct disaster damage might be caused by social factors: education, experience, and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: the relation between education, age, experience, and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that over a wide range of scales, education determines risk perception and thus the vulnerability of societies, but at the local level there are important heterogeneities: land-use and urbanization structure substantially influence vulnerability. The way to

  11. Reservoir lithofacies analysis using 3D seismic data in dissimilarity space

    NASA Astrophysics Data System (ADS)

    Bagheri, M.; Riahi, M. A.; Hashemi, H.

    2013-06-01

    Seismic data interpretation is one of the most important steps in exploration seismology. Seismic facies analysis (SFA) with emphasis on lithofacies can be used to extract more information about structures and geology, which enhances seismic interpretation. Facies analysis is based on unsupervised and supervised classification using seismic attributes. In this paper, supervised classification by a support vector machine using well logs and seismic attributes is applied. Dissimilarity is employed as a new measuring space, after which classification is carried out. Often, SFA is carried out in a feature space in which each dimension stands for a seismic attribute. Different facies show considerable class overlap in the feature space; hence, high classification error values are reported. Therefore, decreasing class overlap before classification is a necessary step. To achieve this goal, a dissimilarity space is initially created. As a result of the definition of the new space, the class overlap between objects (seismic samples) is reduced and hence the classification can be done reliably. This strategy increases the accuracy of classification, and a more trustworthy lithofacies analysis is attained. For applying this method, 3D seismic data from an oil field in Iran were selected, and the results obtained by a support vector classifier (SVC) in dissimilarity space are presented, discussed, and compared with the SVC applied in the conventional feature space.
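
    The dissimilarity-space construction is easy to reproduce with standard tools: re-represent each sample by its distances to a set of prototype samples and train the SVC on that representation. The sketch below uses synthetic two-facies data; the prototype count and kernel settings are illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic placeholder data: two "lithofacies" in a 6-attribute space.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (200, 6)), rng.normal(1.5, 1, (200, 6))])
y = np.array([0] * 200 + [1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Dissimilarity representation: distances to 30 random prototypes.
prototypes = X_tr[rng.choice(len(X_tr), 30, replace=False)]
D_tr = pairwise_distances(X_tr, prototypes)
D_te = pairwise_distances(X_te, prototypes)

clf = SVC(kernel="rbf", C=10.0).fit(D_tr, y_tr)
print("accuracy in dissimilarity space:", clf.score(D_te, y_te))
```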

  12. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  13. Pembina Cardium CO2-EOR monitoring project: Integrated surface seismic and VSP time-lapse seismic analysis

    NASA Astrophysics Data System (ADS)

    Alshuhail, A. A.

    2009-12-01

    In the Pembina field in west-central Alberta, Canada, approximately 40,000 tons of supercritical CO2 was injected into the 1650 m deep, 20 m thick upper-Cretaceous Cardium Fm. between March 2005 and 2007. A time-lapse seismic program was designed and incorporated into the overall measurement, monitoring and verification program. The objectives were to track the CO2 plume within the reservoir, and to evaluate the integrity of storage. Fluid replacement modeling predicts a decrease in the P-wave velocity and bulk density in the reservoir by about 4% and 1%, respectively. Synthetic seismograms show subtle reflectivity changes at the Cardium Fm. and a traveltime delay at the later high-amplitude Viking event of less than 1 ms. The time-lapse datasets, however, show no significant anomalies in the P-wave seismic data that can be attributed to supercritical CO2 injected into the Cardium Fm. (Figure 1). The converted-wave (P-S) data, on the other hand, showed small traveltime anomalies. The most coherent results were those obtained by the fixed-array VSP dataset (Figure 2) due to higher frequency bandwidth and high signal to noise ratio. The amplitude and traveltime changes observed in the VSP dataset are small but are consistent in magnitude with those predicted from rock physics modeling. The analysis suggests that the inability to clearly detect the CO2 plume in surface seismic data is likely due to the CO2 being contained in thin permeable sandstone members of the Cardium Formation. The seismic signature of the Cardium Fm. in this area may also be degraded by multiples and strong attenuation involving the shallow Ardley coals. However, the lack of 4D seismic changes above the reservoir indicates that the injected CO2 is not migrating through the caprock into shallower formations.
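
    Fluid replacement modeling of this kind is classically done with Gassmann substitution. A compact sketch follows; the mineral, dry-frame, and fluid moduli are assumed round numbers for a generic sandstone, not the study's calibrated values, so the predicted velocity change will differ from the ~4% quoted above.

```python
import numpy as np

def gassmann_vp(k_dry, mu, k_min, phi, k_fl, rho):
    """Isotropic Gassmann fluid substitution: saturated bulk modulus,
    then Vp. Moduli in GPa, density in g/cc, so Vp comes out in km/s."""
    k_sat = k_dry + (1.0 - k_dry / k_min) ** 2 / (
        phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2)
    return np.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

# Illustrative sandstone numbers (assumed, not measured):
k_min, k_dry, mu, phi = 37.0, 12.0, 11.0, 0.15
vp_brine = gassmann_vp(k_dry, mu, k_min, phi, k_fl=2.6, rho=2.35)
vp_co2 = gassmann_vp(k_dry, mu, k_min, phi, k_fl=0.08, rho=2.29)
print(f"Vp brine {vp_brine:.2f} km/s -> CO2 {vp_co2:.2f} km/s "
      f"({100 * (vp_co2 / vp_brine - 1):.1f}%)")
```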

  14. Analysis of seismic noise recorded by temporary seismic array near the Pyhäsalmi underground mine in Finland

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena; Narkilahti, Janne; Nevalainen, Jouni

    2016-04-01

    The Pyhäsalmi mine is an underground copper and zinc mine located in central Finland. It is one of the oldest and deepest underground mines in Europe, in which ore is excavated from a depth of about 1450 m. Due to the large amount of heavy machinery, the mine itself is a source of strong seismic and acoustic noise. This continuous noise creates a problem for high-resolution active source seismic experiments. That is why in our study we investigated the opportunity to use this seismic noise for studying the structure of the uppermost crust. For this we installed 24 3-component DSU-SA MEMS seismic sensors with the autonomous RAUD eX data acquisition units produced by Sercel Ltd. along a 10 km long line crossing the mine area. The array recorded continuous seismic data from 29.10.2013 to 1.11.2013 with a sampling rate of 500 sps. The continuous data for the 5-day period were processed in several steps including single station data analysis, pre-filtering and time-domain stacking. The processed data set was used to estimate empirical Green's functions (EGF) between pairs of stations in the frequency band of 1-100 Hz. We developed our own procedure for stacking EGF in the time domain and, as a result, we were able to extract not only Rayleigh waves, but also refracted P-waves. Finally, we calculated surface wave dispersion curves and solved inversion problems for surface waves and refracted waves. In our paper we concentrate mainly on the details of our data processing routine and its influence on the quality of the extracted EGFs. The study is a part of the SEISLAB project funded by the European Regional Development Fund (ERDF), the Council of Oulu Region (Finland) and Pyhäsalmi Mine Oy.
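
    The EGF workflow (window, normalize, cross-correlate, stack in the time domain) can be sketched compactly. The version below uses one-bit normalization and a toy pair of records with a known delay; the window lengths and normalization choice are illustrative, not the exact SEISLAB processing chain.

```python
import numpy as np

def egf_from_noise(rec_a, rec_b, fs, win_s=60.0, max_lag_s=5.0):
    """Ambient-noise EGF sketch: split two continuous records into
    windows, one-bit normalize (suppresses transients such as mine
    blasts), cross-correlate each window, and stack in the time
    domain. Returns lags (s) and the stacked correlation."""
    win = int(win_s * fs)
    max_lag = int(max_lag_s * fs)
    n_win = min(len(rec_a), len(rec_b)) // win
    stack = np.zeros(2 * max_lag + 1)
    for k in range(n_win):
        a = np.sign(rec_a[k * win:(k + 1) * win])   # one-bit normalization
        b = np.sign(rec_b[k * win:(k + 1) * win])
        full = np.correlate(a, b, mode="full")      # length 2*win - 1
        mid = win - 1                               # zero-lag index
        stack += full[mid - max_lag:mid + max_lag + 1]
    lags = np.arange(-max_lag, max_lag + 1) / fs
    return lags, stack / n_win

# Toy test: common random source, delayed by 0.8 s at station B.
rng = np.random.default_rng(4)
src = rng.normal(size=60_000)                       # 10 min at 100 sps
rec_a = src + 0.5 * rng.normal(size=src.size)
rec_b = np.roll(src, 80) + 0.5 * rng.normal(size=src.size)
lags, c = egf_from_noise(rec_a, rec_b, fs=100.0)
print("peak lag (s):", lags[np.argmax(c)])          # expect ~ -0.8
```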

  15. A probabilistic method for evaluation of seismic amplification at a regional scale - A case study in some high seismic risk areas of the Northern Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Delle Donne, Dario; Ripepe, Maurizio; Lacanna, Giorgio; Marchetti, Emanuele; Fabbroni, Pierangelo; Baglione, Massimo; D'Intinosante, Vittorio

    2014-05-01

    Seismic amplification caused by local geological conditions plays an important role in seismic risk assessment. The main parameters controlling seismic amplification are the shear wave velocities of the shallow sub-surface (Vs) and the thickness of soft sediments (h). However, the shear wave velocity profile is usually known only sparsely and cannot be measured over large areas. In this study we propose a method that integrates data from surface geological maps with data from subsurface seismo-stratigraphic well-logs, and aims to estimate seismic amplification over large areas (~100 km2) through a probabilistic approach. The methodology we developed is characterized by the following steps: 1. analysis of the geological framework and definition of Seismic Units; 2. 1-D seismic modeling of each Seismic Unit; 3. probability analysis of seismic amplification. The probability function of seismic amplification for each Seismic Unit is calculated for all possible combinations of the expected values of Vs and thickness (h). We apply this approach to seismic areas in the Northern Apennines (Italy). Finally, the results of this analysis have been validated against seismic amplification measurements using local and regional earthquakes and macro-seismic data. The comparison between the amplification predicted by this probabilistic approach and the measured seismic amplification shows general agreement. This work is not intended as an alternative to the standard methodologies for calculating site effects, but offers a new approach to identify areas that are potentially more vulnerable.
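
    Step 3 (probability analysis) amounts to propagating the joint uncertainty in Vs and h through a site-response relation. A Monte Carlo sketch under strongly simplifying assumptions (uniform Vs and h distributions, quarter-wavelength resonance f0 = Vs/4h, impedance-ratio amplification) is shown below; none of the numbers come from the study.

```python
import numpy as np

# Monte Carlo over the uncertain Vs and thickness h of one Seismic
# Unit. The distributions and bedrock values are illustrative.
rng = np.random.default_rng(5)
n = 100_000
vs = rng.uniform(180.0, 360.0, n)        # soft-sediment Vs (m/s)
h = rng.uniform(5.0, 40.0, n)            # sediment thickness (m)
vs_rock, rho_rock, rho_sed = 800.0, 2400.0, 1900.0

f0 = vs / (4.0 * h)                                   # fundamental frequency
amp = np.sqrt((rho_rock * vs_rock) / (rho_sed * vs))  # impedance-ratio proxy

for a in (1.5, 2.0, 2.5):
    print(f"P(amplification > {a}) = {(amp > a).mean():.2f}")
print(f"median f0 = {np.median(f0):.1f} Hz")
```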

  16. Array analysis methods for detection, classification and location of seismic sources: a first evaluation for aftershock analysis using dense temporary post-seismic array network

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Satriano, C.; Vilotte, J.; Bernard, P.

    2012-12-01

    Detection, separation, classification and location of distributed non-stationary seismic sources in broadband noisy environments is an important problem in seismology, in particular for monitoring the high-level post-seismic activity following large subduction earthquakes, like the off-shore Maule (Mw 8.8, 2010) earthquake in Central Chile. Multiple seismic arrays, and local antennas, distributed over a region allow exploiting the frequency-selective coherence of the signals that arrive at widely-separated array stations, leading to improved detection, convolutive blind source separation, and location of distributed non-stationary sources. We present here first results on the investigation of time-frequency adaptive array analysis techniques for detection and location of broadband distributed seismic events recorded by the dense temporary seismic network (International Maule Aftershock Deployment, IMAD) installed for monitoring the high-level seismic activity following the 27 February 2010 Maule earthquake (Mw 8.8). This seismic network is characterized by a large aperture, with variable inter-station distances, combined with a high level of distributed near- and far-field seismic source activity and noise. For this study, we first extract from the post-seismic network a number of seismic arrays distributed over the region covered by this network. A first aspect is devoted to the detection, classification and separation of passive distributed seismic sources. We investigate a number of narrow and wide band signal analysis methods both in time and time-frequency domains for energy arrival detection and tracking, including time-adaptive higher order statistics, e.g. kurtosis, and multiband band-pass filtering, together with adaptive time-frequency transformation and extraction techniques. We demonstrate that these techniques provide superior resolution and robustness compared with classical STA/LTA techniques, in particular in the case of distributed sources with potential signal
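
    Of the characteristic functions mentioned, the time-adaptive kurtosis is the simplest to illustrate: impulsive onsets make the local amplitude distribution heavy-tailed, so a sliding-window kurtosis spikes at arrivals where stationary Gaussian noise stays near zero. A toy sketch (window length and synthetic trace are arbitrary choices):

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_cf(trace, fs, win_s=1.0):
    """Sliding-window excess kurtosis characteristic function;
    values near 0 for Gaussian noise, large at impulsive onsets."""
    win = int(win_s * fs)
    n = len(trace) - win
    cf = np.empty(n)
    for i in range(n):
        cf[i] = kurtosis(trace[i:i + win])
    return cf

# Toy trace: Gaussian noise with one impulsive arrival at t = 10 s.
fs = 100.0
rng = np.random.default_rng(6)
trace = rng.normal(size=int(30 * fs))
onset = int(10 * fs)
trace[onset:onset + 50] += 8.0 * np.exp(-np.arange(50) / 10.0)
cf = kurtosis_cf(trace, fs)
print("window start of max kurtosis near t =", np.argmax(cf) / fs, "s")
```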

  17. Sampling and Analysis Plan Waste Treatment Plant Seismic Boreholes Project.

    SciTech Connect

    Brouns, Thomas M.

    2007-07-15

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the Saddle Mountains Basalt, up to three new deep rotary boreholes through the Saddle Mountains Basalt and sedimentary interbeds, and one corehole through the Saddle Mountains Basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials (ASTM) standards, Hanford Site procedures, and other guidance to be followed for data collection activities. Revision 3 incorporates all interim change notices (ICN) that were issued to Revision 2 prior to completion of sampling and analysis activities for the WTP Seismic Boreholes Project. This revision also incorporates changes to the exact number of samples submitted for dynamic testing as directed by the U.S. Army Corps of Engineers. Revision 3 represents the final version of the SAP.

  18. Digital Cartographic Models as Analysis Support in Multicriterial Assessment of Vulnerable Flood Risk Elements

    NASA Astrophysics Data System (ADS)

    Nichersu, Iulian; Mierla, Marian; Trifanov, Cristian; Nichersu, Iuliana; Marin, Eugenia; Sela, Florentina

    2014-05-01

    In the last 20 years there has been an increase in the frequency of extreme weather and hydrological events. This increase raises the need to research the risk from extreme events that have a large impact on the environment. This paper presents a method for analyzing the elements vulnerable to the risk of an extreme hydrological event, more precisely a flood; the method also uses the LiDAR point cloud. The risk concept has two main components: hazard (represented by the frequency of occurrence and the intensity of the flood) and vulnerability (represented by the elements vulnerable to the flood). The area studied in the present paper is situated in the south-east of Europe (Romania, Danube Delta). The digital cartographic models were produced using the LiDAR data obtained within the CARTODD project. The high-resolution digital cartographic models consist of three components: a digital terrain model (DTM), a digital elevation model (DEM) and elevation classes (EC). To complete the information of the three models, orthophotos in the visible (VIS) and infrared (IR) spectrum were also used. The digital terrain model gives information on the altitude of the terrain and, indirectly, on the flood hazard, taking into account the high resolution of the final product. The digital elevation model supplies information on the terrain surface plus the altitude of each object on it, and leads to the third model, the elevation classes model. We present here three categories of applications of point cloud analyses in flood risk assessment: building assessment, endangered species mentioned in Annex 1 of the European Habitats Directive, and morphologic/habitat damages. Pilot case studies of these applications are: Sulina town; endangered species such as Osmoderma eremita, Vipera ursinii and Spermophilus citellus; and Sireasa Polder. For Sulina town were assessed the man-made elements vulnerable to

  19. Regional analysis of earthquake occurrence and seismic energy release

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1980-01-01

    The historic temporal variations in earthquake occurrence and seismic energy release on a regional basis throughout the world were studied. The regionalization scheme employed divided the world into large areas based either on seismic and tectonic considerations (Flinn-Engdahl scheme) or geographic (longitude and latitude) criteria. The data set is the worldwide earthquake catalog of the National Geophysical Solar-Terrestrial Data Center. An apparent relationship exists between the maximum energy released in a limited time within a seismic region and the average or background energy per year averaged over a long time period. In terms of average or peak energy release, the most seismically active regions of the world during the 50 to 81 year period ending in 1977 were the Japan, Andean South America, and Alaska-Aleutian Arc regions. The year to year fluctuations in regional seismic energy release are greater, by orders of magnitude, than the corresponding variations in the world-wide seismic energy release. The b values of seismic regions range from 0.7 to 1.4 where earthquake magnitude is in the range 6.0 to 7.5.
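
    For readers unfamiliar with b-values: b is the slope of the Gutenberg-Richter frequency-magnitude relation log10 N = a - bM, commonly estimated with the Aki maximum-likelihood formula. A minimal sketch on a synthetic catalog follows; the completeness magnitude and bin width are illustrative.

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's correction
    for magnitudes binned at width dm; uses events with M >= m_c."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1.0 above M 6.0:
rng = np.random.default_rng(7)
m_c = 6.0
mags = m_c + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
print(f"estimated b = {b_value_mle(np.round(mags, 1), m_c):.2f}")
```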

  20. Optimization strategies for the vulnerability analysis of the electric power grid.

    SciTech Connect

    Meza, Juan C.; Pinar, Ali; Lesieutre, Bernard; Donde, Vaibhav

    2009-03-01

    Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, to enable vulnerability analyses of real-sized problems with more than 10,000 power lines.

  1. Optimization Strategies for the Vulnerability Analysis of the Electric Power Grid

    SciTech Connect

    Pinar, A.; Meza, J.; Donde, V.; Lesieutre, B.

    2007-11-13

    Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, to enable vulnerability analyses of real-sized problems with more than 10,000 power lines.
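
    The Laplacian correspondence invoked in these two records can be illustrated on a toy graph: the Fiedler eigenvector of an admittance-weighted Laplacian splits the network into weakly coupled halves, and the few lines crossing the split are natural candidates for a small, damaging cut. The 6-bus example below is illustrative, not one of the paper's benchmark grids.

```python
import numpy as np

# Toy grid: two tightly coupled clusters joined by weak inter-ties.
edges = {(0, 1): 8.0, (1, 2): 9.0, (0, 2): 7.0,   # cluster A
         (3, 4): 8.0, (4, 5): 9.0, (3, 5): 7.0,   # cluster B
         (2, 3): 1.0, (0, 5): 0.5}                # weak inter-ties
n = 6
L = np.zeros((n, n))
for (i, j), w in edges.items():
    L[i, j] -= w
    L[j, i] -= w
    L[i, i] += w
    L[j, j] += w

vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]          # eigenvector of second-smallest eigenvalue
side = fiedler > 0
cut = [e for e in edges if side[e[0]] != side[e[1]]]
print("algebraic connectivity:", round(vals[1], 3))
print("candidate vulnerable lines:", cut)
```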

  2. Seismic data interpretation using the Hough transform and principal component analysis

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Martin, R.; Ávila-Carrera, R.; Rodríguez-Castellanos, A.

    2011-03-01

    In this work two novel image processing techniques are applied to detect and delineate complex salt bodies from seismic exploration profiles: the Hough transform and principal component analysis (PCA). It is well recognized by the geophysical community that the lack of resolution and poor structural identification in seismic data recorded at sub-salt plays represent severe technical and economical problems. Under such circumstances, seismic interpretation based only on the human eye is inaccurate. Additionally, petroleum field development decisions and production planning depend on good-quality seismic images that generally are not feasible in salt tectonics areas. In spite of this, morphological erosion, region growing and, especially, a generalization of the Hough transform (closely related to the Radon transform) are applied to build parabolic shapes that are useful in the idealization and recognition of salt domes from 2D seismic profiles. In a similar way, PCA is also used to identify shapes associated with complex salt bodies in seismic profiles extracted from 3D seismic data. To show the validity of the new set of seismic results, comparisons between both image processing techniques are exhibited. The main contribution of this work is to provide seismic interpreters with new semi-automatic computational tools. The novel image processing approaches presented here may be helpful in the identification of diapirs and other complex geological features from seismic images. Conceivably, in the near future, a new branch of seismic attributes could be recognized by geoscientists and engineers based on the encouraging results reported here.
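
    As a concrete illustration of the parabola-building step, a brute-force generalized Hough transform fits in a few lines: each edge point votes for all (curvature, vertex) hypotheses consistent with it, and the accumulator maximum recovers the parabola. The grid resolutions and synthetic flank below are arbitrary choices, not the paper's parameterization.

```python
import numpy as np

def hough_parabola(points, a_vals, x0_vals, y0_vals):
    """Minimal generalized Hough transform for parabolas
    y = y0 + a * (x - x0)^2: every point votes for all (a, x0)
    hypotheses, binning the implied vertex height y0."""
    acc = np.zeros((len(a_vals), len(x0_vals), len(y0_vals)), dtype=int)
    for (x, y) in points:
        for ia, a in enumerate(a_vals):
            for ix, x0 in enumerate(x0_vals):
                y0 = y - a * (x - x0) ** 2
                iy = np.searchsorted(y0_vals, y0)
                if 0 <= iy < len(y0_vals):
                    acc[ia, ix, iy] += 1
    return acc

# Toy "salt-dome flank": noisy samples of y = 50 + 0.02 (x - 40)^2.
rng = np.random.default_rng(8)
xs = rng.uniform(0, 80, 300)
pts = np.c_[xs, 50 + 0.02 * (xs - 40) ** 2 + rng.normal(0, 0.5, 300)]
a_vals = np.linspace(0.005, 0.04, 8)
x0_vals = np.arange(0, 81, 2.0)
y0_vals = np.arange(0, 101, 1.0)
acc = hough_parabola(pts, a_vals, x0_vals, y0_vals)
ia, ix, iy = np.unravel_index(acc.argmax(), acc.shape)
print("best fit: a=%.3f x0=%.0f y0=%.0f"
      % (a_vals[ia], x0_vals[ix], y0_vals[iy]))
```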

  3. Discrimination of porosity and fluid saturation using seismic velocity analysis

    DOEpatents

    Berryman, James G.

    2001-01-01

    The method of the invention is employed for determining the state of saturation in a subterranean formation using only seismic velocity measurements (e.g., shear and compressional wave velocity data). Seismic velocity data collected from a region of the formation of like solid material properties can provide relatively accurate partial saturation data derived from a well-defined triangle plotted in a (ρ/μ, λ/μ)-plane. When the seismic velocity data are collected over a large region of a formation having both like and unlike materials, the method first distinguishes the like materials by initially plotting the seismic velocity data in a (ρ/λ, μ/λ)-plane to determine regions of the formation having like solid material properties and porosity.

  4. Shallow seismic surface waves analysis across a tectonic fault

    NASA Astrophysics Data System (ADS)

    Gazdova, R.; Vilhelm, J.; Kolinsky, P.

    2011-12-01

    When performing a seismic survey of a shallow medium, we record wave motion which can be excited by a sledge hammer blow on the ground surface. The recorded wave motion is a complex combination of different types of waves, propagating directly from the source to the receiver, reflecting from velocity boundaries, passing through multiple layers or forming dispersive surface waves. We can use all of these wave types to identify the structure of the medium. In the presented contribution, we deal with the interpretation of surface waves. In contrast with body waves, the surface wave velocity is frequency-dependent. This property is called dispersion, and the dependence of the velocity on the frequency is known as the dispersion curve. The measured dispersion of the surface waves can be used to assess the structural velocity distribution in the layered medium through which the waves propagate. We analyze surface waves recorded within the geophysical survey of the paleoseismological trench site over the Hluboka tectonic fault, Czech Republic, Central Europe. Surface waves in the frequency range 15-70 Hz were recorded by three-component geophones with an active (sledge hammer) source. Group velocities are analyzed by the program SVAL, which is based on the multiple filtering technique. It is a standard method of Fourier-transform-based frequency-time analysis. The spectrum of each record is multiplied by weighting functions centered at many discrete frequencies. Five local envelope maxima of all quasiharmonic components obtained by the inverse Fourier transform are found and their propagation times determined. These maxima are assigned to different modes of direct surface waves as well as to possible reflected, converted and multipathed modes. Filtered fundamental modes at pairs of geophones are correlated and phase velocities of surface waves are computed from the delays of propagation times of all quasiharmonic components. From the dispersion curves the shear wave
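
    The multiple filtering technique lends itself to a compact sketch: Gaussian filters centered at discrete frequencies, envelopes from the analytic signal, and group velocity from the travel time of the envelope maximum. The filter width, frequencies, and synthetic dispersive wavetrain below are illustrative, not SVAL's actual settings.

```python
import numpy as np

def mft_group_velocity(trace, fs, distance, freqs, alpha=50.0):
    """Multiple filtering sketch: Gaussian filters centered at each
    frequency, applied to the positive-frequency (analytic) spectrum;
    the envelope is its magnitude and U = distance / time of the
    envelope maximum."""
    n = len(trace)
    spec = np.fft.fft(trace)
    f = np.fft.fftfreq(n, d=1.0 / fs)
    velocities = []
    for fc in freqs:
        gauss = np.exp(-alpha * ((f - fc) / fc) ** 2) * (f > 0)
        env = np.abs(np.fft.ifft(2.0 * spec * gauss))  # narrow-band envelope
        t_max = np.argmax(env) / fs
        velocities.append(distance / t_max if t_max > 0 else np.nan)
    return np.array(velocities)

# Toy dispersive wavetrain over 200 m: lower frequencies arrive first.
fs, dist = 500.0, 200.0
t = np.arange(0.0, 2.0, 1.0 / fs)
trace = np.zeros_like(t)
for fc, u in [(20.0, 400.0), (40.0, 300.0), (60.0, 250.0)]:
    arrival = dist / u
    trace += (np.exp(-((t - arrival) * 8.0) ** 2)
              * np.cos(2 * np.pi * fc * (t - arrival)))
print(mft_group_velocity(trace, fs, dist,
                         np.array([20.0, 40.0, 60.0])).round(0))
```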

  5. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  6. Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne

    2013-08-01

    Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, for which penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The newly developed software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) is based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis, and the potential for subsequent design optimizations with respect to micrometeoroid and space debris (MM/SD) impact risk.

  7. Seismicity and earthquake hazard analysis of the Teton-Yellowstone region, Wyoming

    NASA Astrophysics Data System (ADS)

    White, Bonnie J. Pickering; Smith, Robert B.; Husen, Stephan; Farrell, Jamie M.; Wong, Ivan

    2009-11-01

    hypocenters, unified magnitudes, and seismotectonic analysis helped refine the characterization of the background seismicity that was used as input into a probabilistic seismic hazard analysis. Our results reveal that the highest seismic hazard is associated with the Teton fault because of its high slip-rate of approximately 1.3 mm/yr, compared to the highest rate of 1.4 mm/yr in southern Yellowstone on the Mt. Sheridan fault. This study demonstrates that the Teton-Yellowstone area is among the regions of highest seismic hazard in the western U.S.

  8. Earthquake Cluster Analysis for Turkey and its Application for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake clusters are an important element in general seismology and also for the application in seismic hazard assessment. In probabilistic seismic hazard assessment, the occurrence of earthquakes is often linked to an independent Monte Carlo process, following a stationary Poisson model. But earthquakes are dependent and constrained, especially in terms of earthquake swarms, fore- and aftershocks or even larger sequences as observed for the Landers sequence in California or the Darfield-Christchurch sequence in New Zealand. For earthquake catalogues, the element of declustering is an important step to capture earthquake frequencies by avoiding a bias towards small magnitudes due to aftershocks. On the other hand, declustered catalogues for independent probabilistic seismic activity will underestimate the total number of earthquakes by neglecting dependent seismicity. In this study, the effect of clusters on probabilistic seismic hazard assessment is investigated in detail. To capture the features of earthquake clusters, a uniform framework for earthquake cluster analysis is introduced using methodologies of geostatistics and machine learning. These features represent important cluster characteristics like cluster b-values, temporal decay, rupture orientations and many more. Cluster parameters are mapped in space using kriging. Furthermore, a detailed data analysis is undertaken to provide magnitude-dependent relations for various cluster parameters. The acquired features are used to introduce dependent seismicity within stochastic earthquake catalogues. In addition, the development of smooth seismicity maps based on historic databases is in general biased to the more complete recent decades. A filling methodology is introduced which will add dependent seismicity in catalogues where none has been recorded to avoid the above mentioned bias. As a case study, Turkey has been chosen due to its inherent seismic activity and well-recorded data coverage. Clustering
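
    For context, the declustering step discussed above is often done with space-time windows. The sketch below implements a Gardner-Knopoff-style window method using the widely cited 1974 window fits; the tiny catalog is fabricated for illustration and the implementation is simplified (flat-earth distances, largest events processed first).

```python
import numpy as np

def decluster_gk(times, mags, lats, lons):
    """Gardner-Knopoff-style window declustering sketch: an event is
    flagged as dependent if it falls inside the space-time window of
    a larger event. Window sizes follow the classic Gardner & Knopoff
    (1974) fits (distance in km, time in days)."""
    t = np.asarray(times, float)
    m_all = np.asarray(mags, float)
    la = np.asarray(lats, float)
    lo = np.asarray(lons, float)
    independent = np.ones(t.size, dtype=bool)
    for i in np.argsort(m_all)[::-1]:          # largest magnitudes first
        if not independent[i]:
            continue
        m = m_all[i]
        d_km = 10 ** (0.1238 * m + 0.983)
        t_days = (10 ** (0.032 * m + 2.7389) if m >= 6.5
                  else 10 ** (0.5409 * m - 0.547))
        dt = t - t[i]
        dx = 111.3 * (lo - lo[i]) * np.cos(np.radians(la[i]))  # km, flat earth
        dy = 111.3 * (la - la[i])
        dep = ((dt > 0) & (dt <= t_days)
               & (np.hypot(dx, dy) <= d_km) & (m_all <= m))
        dep[i] = False
        independent &= ~dep
    return independent

# Fabricated catalog: M6 mainshock, two aftershocks, one distant event.
times = [0.0, 1.0, 10.0, 5.0]
mags = [6.0, 4.0, 4.5, 5.0]
lats = [38.0, 38.05, 38.1, 41.0]
lons = [35.0, 35.05, 34.9, 30.0]
print(decluster_gk(times, mags, lats, lons))   # -> [ True False False  True]
```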

  9. Pre-stack-texture-based reservoir characteristics and seismic facies analysis

    NASA Astrophysics Data System (ADS)

    Song, Cheng-Yun; Liu, Zhi-Ning; Cai, Han-Peng; Qian, Feng; Hu, Guang-Min

    2016-03-01

    Seismic texture attributes are closely related to seismic facies and reservoir characteristics and are thus widely used in seismic data interpretation. However, information is mislaid in the stacking process when traditional texture attributes are extracted from post-stack data, which is detrimental to complex reservoir description. In this study, pre-stack texture attributes are introduced; these attributes not only precisely depict the lateral continuity of waveforms between different reflection points but also reflect amplitude-versus-offset, anisotropy, and heterogeneity in the medium. Owing to their strong ability to represent stratigraphy, a pre-stack-data-based seismic facies analysis method is proposed using the self-organizing map algorithm. This method is tested on wide azimuth seismic data from China, and the advantages of pre-stack texture attributes in the description of lateral stratum changes are verified, in addition to the method's ability to reveal anisotropy and heterogeneity characteristics. The pre-stack texture classification results effectively distinguish different seismic reflection patterns, thereby providing reliable evidence for use in seismic facies analysis.
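
    The self-organizing-map step can be prototyped with the third-party MiniSom package (an assumption here; the authors do not state their implementation). The attribute vectors below are synthetic stand-ins for pre-stack texture attributes, and the map size and training settings are arbitrary.

```python
import numpy as np
from minisom import MiniSom   # third-party SOM package (pip install minisom)

# Cluster synthetic stand-in attribute vectors onto a small map;
# each map node becomes a candidate seismic facies label.
rng = np.random.default_rng(9)
# Three synthetic "facies" in a 5-attribute space:
attrs = np.vstack([rng.normal(c, 0.3, (300, 5)) for c in (0.0, 1.0, 2.0)])

som = MiniSom(4, 4, input_len=5, sigma=1.0, learning_rate=0.5,
              random_seed=0)
som.train_random(attrs, num_iteration=2000)

# Facies label for each trace = flattened index of its winning node.
labels = np.array([som.winner(v)[0] * 4 + som.winner(v)[1] for v in attrs])
print("facies counts per SOM node:", np.bincount(labels, minlength=16))
```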

  10. Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis

    NASA Astrophysics Data System (ADS)

    Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.

    2015-12-01

    Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence, in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts on structures, taken as penetration distance into structures, are shown to follow a power-law relationship with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for structural analysis were developed using lumped-mass 2-D rock fall runout models with 1-m resolution lidar elevation data. Model inputs were based on calibrated surface parameters from mapped runout paths of 198 additional boulder runouts. Terrestrial lidar scans and structure from motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and impacting boulders. Combining velocity distributions from the 2-D analysis and high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and the resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
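
    Fitting the reported power-law relationship is a one-line regression in log-log space. The sketch below fits d = c·E^k to fabricated energy-penetration pairs; the exponent, prefactor, and scatter are placeholders, not the Port Hills values.

```python
import numpy as np

# Fit penetration distance d = c * E^k by linear regression in
# log-log space, on fabricated stand-in data (32 impacts).
rng = np.random.default_rng(10)
energy = 10 ** rng.uniform(1, 4, 32)                  # impact energy (kJ)
pen_true = 0.05 * energy ** 0.7                       # assumed power law
penetration = pen_true * rng.lognormal(0.0, 0.3, 32)  # multiplicative scatter

k, log_c = np.polyfit(np.log10(energy), np.log10(penetration), 1)
print(f"fitted exponent k = {k:.2f}, prefactor c = {10 ** log_c:.3f}")
```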