Sample records for probabilistic hazard maps

  1. First USGS urban seismic hazard maps predict the effects of soils

    USGS Publications Warehouse

    Cramer, C.H.; Gomberg, J.S.; Schweig, E.S.; Waldron, B.A.; Tucker, K.

    2006-01-01

    Probabilistic and scenario urban seismic hazard maps have been produced for Memphis, Shelby County, Tennessee, covering a six-quadrangle area of the city. The nine probabilistic maps are for peak ground acceleration and 0.2 s and 1.0 s spectral acceleration and for 10%, 5%, and 2% probability of being exceeded in 50 years. Six scenario maps for these three ground motions have also been generated for both an M7.7 and an M6.2 earthquake on the southwest arm of the New Madrid seismic zone ending at Marked Tree, Arkansas. All maps include the effect of local geology. Relative to the national seismic hazard maps, the effect of the thick sediments beneath Memphis is to decrease 0.2 s probabilistic ground motions by 0-30% and increase 1.0 s probabilistic ground motions by ~100%. Probabilistic peak ground accelerations remain at levels similar to the national maps, although the ground-motion gradient across Shelby County is reduced and ground motions are more uniform within the county. The M7.7 scenario maps show ground motions similar to the 5%-in-50-year probabilistic maps. As an effect of local geology, both the M7.7 and M6.2 scenario maps show a more uniform seismic ground-motion hazard across Shelby County than scenario maps with constant site conditions (i.e., NEHRP B/C boundary).

  2. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct assessment of probabilistic seismic hazard and risk maps is the first step toward advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed within the EMME (Earthquake Model of the Middle East) project, one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground-motion data. Because of this, an initial hybrid empirical ground-motion model was developed for PGA and SA at selected periods, and its coefficients were applied in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in earlier seismic hazard assessments and that the present normative seismic hazard map needs careful recalculation.

  3. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can cause significant damage within urban communities by damaging structures, obstructing lifeline routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments on a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear-wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probability of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
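
    The pixel-level calculation described above can be sketched compactly: each bin of the seismic hazard curve contributes its occurrence rate to the exceedance rate of a displacement threshold whenever the rigid-block displacement predicted for that PGA bin exceeds the threshold. In the Python sketch below, the hazard curve, yield acceleration, and threshold are invented; the displacement regression uses coefficients from one published Newmark-type model (Jibson, 2007), which need not match the study's calibrated code.

    ```python
    import math

    # Hypothetical hazard curve: annual rate of exceeding each PGA level (g)
    pga_levels = [0.1, 0.2, 0.4, 0.8]
    annual_rate = [0.02, 0.008, 0.002, 0.0004]

    ac = 0.15  # assumed yield (critical) acceleration of the slope, in g

    def displacement_m(amax, ac):
        """Newmark-type displacement regression (Jibson, 2007 coefficients)."""
        if amax <= ac:
            return 0.0
        ratio = ac / amax
        log_d_cm = 0.215 + 2.341 * math.log10(1.0 - ratio) - 1.438 * math.log10(ratio)
        return 10.0 ** log_d_cm / 100.0  # centimetres -> metres

    threshold = 0.1  # displacement threshold, m
    rate = 0.0
    for i, pga in enumerate(pga_levels):
        upper = annual_rate[i + 1] if i + 1 < len(annual_rate) else 0.0
        bin_rate = annual_rate[i] - upper  # occurrence rate of PGA in this bin
        if displacement_m(pga, ac) > threshold:
            rate += bin_rate
    print(f"P(displacement > {threshold} m in 50 yr) ~ {1 - math.exp(-rate * 50):.3f}")
    ```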

  4. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and the major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20-km reach downstream of the Shihmen Reservoir in Taiwan, where a dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam-break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam-break model and the hydrosystem model to develop probabilistic flood inundation maps. Various numbers of uncertain variables can be considered in these models, and the variability of the outputs can be quantified. Probabilistic flood inundation maps for dam-break-induced floods can thus be developed, with consideration of output variability, using the widely used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared, and are expected to provide new physical insight in support of evaluating areas flooded by reservoir failure.
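
    As a rough illustration of how a moment method propagates roughness uncertainty into a hydraulic output, the sketch below applies a first-order second-moment approximation to Manning's equation for a wide channel; the discharge, slope, and Manning's n statistics are invented, and this is a generic stand-in rather than the authors' perturbance moment implementation.

    ```python
    import math

    q = 20.0       # unit-width discharge, m^2/s (assumed dam-break outflow)
    S = 0.001      # channel slope (assumed)
    n_mean = 0.035 # Manning's roughness, mean (assumed)
    n_std = 0.007  # Manning's roughness, standard deviation (assumed)

    def depth(n):
        """Normal depth of a wide rectangular channel from Manning's equation."""
        return (n * q / math.sqrt(S)) ** 0.6

    # First-order (Taylor) propagation: Var(h) ~ (dh/dn)^2 * Var(n)
    dn = 1e-6
    dh_dn = (depth(n_mean + dn) - depth(n_mean - dn)) / (2 * dn)
    print(f"depth ~ {depth(n_mean):.2f} m +/- {abs(dh_dn) * n_std:.2f} m (1 sigma)")
    ```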

  5. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.

  6. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    USGS Publications Warehouse

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low (<1% of the area expected to liquefy) in the uplands to severe (>60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated ground deformation.

  7. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.

  8. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  9. Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A

    USGS Publications Warehouse

    Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.

    2010-01-01

    The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is the approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments infilling ancient bedrock topography that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.

  10. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city has grown rapidly in population and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V, at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed by a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps were then generated with the same 2D hydrodynamic model of the city. In a final step, the fluvial and pluvial scenarios are combined assuming independence of the events, and the combined scenarios are transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
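
    The final combination step rests on a simple identity: if the fluvial and pluvial pathways are independent, the probability of flooding from either is the complement of neither occurring. A one-line sketch with invented probabilities:

    ```python
    p_fluvial = 0.02  # annual exceedance probability, fluvial (assumed)
    p_pluvial = 0.05  # annual exceedance probability, pluvial (assumed)

    # P(either) = 1 - P(neither), valid only under independence
    p_combined = 1.0 - (1.0 - p_fluvial) * (1.0 - p_pluvial)
    print(f"combined annual flood probability: {p_combined:.4f}")  # 0.0690
    ```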

  11. Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

    USGS Publications Warehouse

    Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.

    2007-01-01

    We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
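
    The annual probabilities quoted above follow from the standard Poisson relation between an exceedance probability P in T years and an annual rate, lambda = -ln(1 - P) / T. The short check below reproduces the three quoted values (0.000404, 0.001026, and 0.0021):

    ```python
    import math

    T = 50.0
    for P in (0.02, 0.05, 0.10):
        lam = -math.log(1.0 - P) / T
        print(f"{P:.0%} in {T:.0f} yr -> annual probability {lam:.6f}, "
              f"return period ~ {1.0 / lam:.0f} yr")
    ```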

  12. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling have shown that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on theories of decision-making under uncertainty, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
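
    The core VOI idea can be shown with a toy single-cell example: compare the expected cost of the best action under the prior flood probability against the expected cost under perfect information; the difference is the value of resolving the uncertainty. All probabilities and costs below are invented:

    ```python
    p_flood = 0.3            # flood probability for one cell, read off a PFM
    cost_protect = 40.0      # cost of the mitigation action (assumed units)
    loss_if_flooded = 100.0  # loss if unprotected and the cell floods

    # Best action under the prior: protect, or accept the expected loss
    prior_cost = min(cost_protect, p_flood * loss_if_flooded)

    # With perfect information, pay for protection only when a flood occurs
    perfect_info_cost = p_flood * min(cost_protect, loss_if_flooded)

    voi = prior_cost - perfect_info_cost
    print(f"VOI = {voi:.1f}")  # high VOI flags cells where information matters
    ```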

  13. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record, which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily the distribution of faults, have been employed only to a minor extent, because most such data have not yet been interpreted with earthquake hazard evaluation in mind.

    The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures.

    Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map.

    The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage, but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking, and Algermissen included recurrence curves for various parts of the country in the paper accompanying his map.

    The first probabilistic hazard maps covering portions of the United States were by Milne and Davenport (1969a). Recently, Wiggins, Hirshberg and Bronowicki (1974) prepared a probabilistic map of maximum particle velocity and Modified Mercalli intensity for the entire United States. The maps are based on an analysis of the historical seismicity. In general, geological data were not incorporated into the development of the maps.

  14. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    PubMed

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in a higher accuracy of responses. Possibly due to familiarity, participants reported the four-color design as their favorite, and it also yielded the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving, judging, feeling, and acting approach in processing probabilistic hazard information for tornadoes.

  15. Working towards a clearer and more helpful hazard map: investigating the influence of hazard map design on hazard communication

    NASA Astrophysics Data System (ADS)

    Thompson, M. A.; Lindsay, J. M.; Gaillard, J.

    2015-12-01

    Globally, geological hazards are communicated using maps. In traditional hazard mapping practice, scientists analyse data about a hazard, and then display the results on a map for stakeholder and public use. However, this one-way, top-down approach to hazard communication is not necessarily effective or reliable. The messages which people take away will be dependent on the way in which they read, interpret, and understand the map, a facet of hazard communication which has been relatively unexplored. Decades of cartographic studies suggest that variables in the visual representation of data on maps, such as colour and symbology, can have a powerful effect on how people understand map content. In practice, however, there is little guidance or consistency in how hazard information is expressed and represented on maps. Accordingly, decisions are often made based on subjective preference, rather than research-backed principles. Here we present the results of a study in which we explore how hazard map design features can influence hazard map interpretation, and we propose a number of considerations for hazard map design. A series of hazard maps were generated, with each one showing the same probabilistic volcanic ashfall dataset, but using different verbal and visual variables (e.g., different colour schemes, data classifications, probabilistic formats). Following a short pilot study, these maps were used in an online survey of 110 stakeholders and scientists in New Zealand. Participants answered 30 open-ended and multiple choice questions about ashfall hazard based on the different maps. Results suggest that hazard map design can have a significant influence on the messages readers take away. For example, diverging colour schemes were associated with concepts of "risk" and decision-making more than sequential schemes, and participants made more precise estimates of hazard with isarithmic data classifications compared to binned or gradational shading. Based on such findings, we make a number of suggestions for communicating hazard using maps. Most importantly, we emphasise that multiple meanings may be taken away from a map, and this may have important implications in a crisis. We propose that engaging with map audiences in a two-way dialogue in times of peace may help prevent miscommunications in the event of a crisis.

  16. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimate of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels by using SEISRISK III software.

  17. First Volcanological-Probabilistic Pyroclastic Density Current and Fallout Hazard Map for Campi Flegrei and Somma Vesuvius Volcanoes.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2005-05-01

    Integrated volcanological-probabilistic approaches have been used to simulate pyroclastic density currents and fallout and to produce hazard maps for the Campi Flegrei and Somma Vesuvius areas. On the basis of analyses of all types of pyroclastic flows, surges, secondary pyroclastic density currents and fallout events that have occurred in the volcanological history of the two volcanic areas, and the evaluation of the probability of each type of event, matrices of input parameters for numerical simulation have been constructed. The multi-dimensional input matrices include the main parameters controlling pyroclast transport, deposition and dispersion, as well as the set of possible eruptive vents used in the simulation program. The probabilistic hazard maps provide, for each point of the Campanian area, the yearly probability of being affected by a given event of a given intensity and the resulting damage. Probabilities of a few events per thousand years are typical of most areas within a range of about 10 km around the volcanoes, including Naples. The results provide constraints for the emergency plans in the Neapolitan area.

  18. Assessment of Coastal Communities' Vulnerability to Hurricane Surge under Climate Change via Probabilistic Map - A Case Study of the Southwest Coast of Florida

    NASA Astrophysics Data System (ADS)

    Feng, X.; Shen, S.

    2014-12-01

    The US coastline, over the past few years, has been overwhelmed by major storms including Hurricanes Katrina (2005), Ike (2008), Irene (2011), and Sandy (2012). Supported by a growing and extensive body of evidence, a majority of research agrees that hurricane activity has been enhanced by climate change. However, the precise prediction of hurricane-induced inundation remains a challenge. This study proposed a probabilistic inundation map based on a Statistically Modeled Storm Database (SMSD) to assess the probabilistic coastal inundation risk of Southwest Florida for a near-future (20-year) scenario considering climate change. This map was processed through a Joint Probability Method with Optimal Sampling (JPM-OS), developed by Condon and Sheng in 2012, and accompanied by the high-resolution storm surge modeling system CH3D-SSMS. The probabilistic inundation map shows a 25.5-31.2% increase in spatially averaged inundation height compared to an inundation map for the present-day scenario. To estimate climate change impacts on coastal communities, socioeconomic analyses were conducted using both the SMSD-based probabilistic inundation map and the present-day inundation map. Combined with 2010 census data and 2012 parcel data from the Florida Geographic Data Library, the differences in economic loss between the near-future and present-day scenarios were used to generate an economic exposure map at the census block group level to reflect coastal communities' exposure to climate change. The results show that climate-change-induced inundation increases have significant economic impacts. Moreover, the impacts are not equally distributed among different social groups given their social vulnerability to hazards. Social vulnerability indices at the census block group level were obtained from the Hazards and Vulnerability Research Institute. The demographic and economic variables in the index represent a community's adaptability to hazards. Local Moran's I was calculated to identify clusters of highly exposed and vulnerable communities. The economic-exposure cluster map was overlaid with the social-vulnerability cluster map to identify communities with low adaptive capability but high exposure. The result provides decision makers with an intuitive tool to identify the most susceptible communities for adaptation.
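
    Local Moran's I, used above to locate clusters of exposed and vulnerable communities, is straightforward to compute from a vector of areal values and a spatial weights matrix. A minimal sketch with an invented contiguity structure, not the study's census geography:

    ```python
    import numpy as np

    x = np.array([2.0, 8.0, 9.0, 1.0, 7.5])  # e.g., exposure by block group
    W = np.array([[0, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0],
                  [0, 1, 0, 1, 1],
                  [0, 0, 1, 0, 1],
                  [0, 0, 1, 1, 0]], dtype=float)  # binary contiguity (assumed)
    W /= W.sum(axis=1, keepdims=True)             # row-standardise weights

    z = x - x.mean()
    m2 = (z ** 2).sum() / len(x)
    local_I = (z / m2) * (W @ z)  # Anselin's (1995) local Moran statistic
    print(np.round(local_I, 2))   # large positive values flag like-valued clusters
    ```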

  19. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part I: Propagation Modelling and Tsunami Hazard Assessment at the Shoreline

    NASA Astrophysics Data System (ADS)

    Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip

    2013-09-01

    Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.

  20. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods used to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for the spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic, reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and the range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches to portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
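
    Per pixel, the probabilistic maps described here reduce to the fraction of Monte-Carlo runs in which that pixel is inundated. A compact sketch with a synthetic ensemble standing in for the hydraulic-model runs:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_runs, ny, nx = 500, 4, 4
    # boolean stand-in for 500 modelled inundation extents on a 4x4 grid
    ensemble = rng.random((n_runs, ny, nx)) < 0.35

    p_inundation = ensemble.mean(axis=0)  # fraction of runs each pixel is wet
    print(np.round(p_inundation, 2))
    # a deterministic regulatory map would instead be one binary extent
    ```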

  21. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

    The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and distances.
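
    The logic-tree step mentioned above amounts, in its simplest form, to a weighted average of branch hazard curves (annual rates of exceeding each ground-motion level). A minimal sketch with invented branch curves and weights:

    ```python
    import numpy as np

    pga = np.array([0.1, 0.2, 0.4, 0.8])  # ground-motion levels, g
    branch_curves = {                     # assumed branch results
        "model_A": np.array([2e-2, 6e-3, 1e-3, 1e-4]),
        "model_B": np.array([3e-2, 9e-3, 2e-3, 3e-4]),
    }
    weights = {"model_A": 0.6, "model_B": 0.4}  # branch weights sum to 1

    mean_curve = sum(weights[k] * branch_curves[k] for k in branch_curves)
    for g, rate in zip(pga, mean_curve):
        print(f"PGA {g:.1f} g: weighted annual exceedance rate {rate:.2e}")
    ```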

  22. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
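
    The Green's-function summation described here is linear superposition: pre-computed unit-slip subfault waveforms are scaled by slip and summed. A synthetic sketch in which the waveforms and slip values are stand-ins, not real pre-computed Green's functions:

    ```python
    import numpy as np

    t = np.linspace(0.0, 3600.0, 721)  # one hour of samples at 5 s

    def unit_waveform(delay_s, period_s=900.0):
        """Toy unit-slip subfault waveform: a delayed sine wave."""
        w = np.sin(2.0 * np.pi * (t - delay_s) / period_s)
        w[t < delay_s] = 0.0
        return w

    subfaults = [unit_waveform(d) for d in (300.0, 420.0, 600.0)]
    slip = [1.5, 3.0, 0.5]  # metres of slip on each subfault (assumed)

    composite = sum(s * w for s, w in zip(slip, subfaults))
    print(f"peak amplitude: {composite.max():.2f} (arbitrary units)")
    ```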

  23. Seismic hazard maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard in Haiti along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  24. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated as relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  25. Long-term multi-hazard assessment for El Misti volcano (Peru)

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto

    2014-02-01

    We propose a long-term probabilistic multi-hazard assessment for El Misti volcano, a composite cone located less than 20 km from Arequipa, the second largest Peruvian city, a rapidly expanding economic centre classified by UNESCO as a World Heritage site. We apply the Bayesian Event Tree code for Volcanic Hazard (BET_VH) to produce probabilistic hazard maps for the predominant volcanic phenomena that may affect the c. 900,000 people living around the volcano. The methodology accounts for the natural variability displayed by volcanoes in their eruptive behaviour, such as different types/sizes of eruptions and possible vent locations. For this purpose, we treat probabilistically several model runs for some of the main hazardous phenomena (lahars, pyroclastic density currents (PDCs), tephra fall and ballistic ejecta) together with data from past eruptions at El Misti (tephra fall, PDCs and lahars) and at other volcanoes (PDCs). The hazard maps, although neglecting possible interactions among phenomena or cascade effects, have been produced with a homogeneous method and refer to a common time window of 1 year. The probability maps reveal that only the north and east suburbs of Arequipa are exposed to all volcanic threats except ballistic ejecta, which are limited to the uninhabited but touristic summit cone. The probability of pyroclastic density currents reaching the recently expanding urban areas and the city along ravines is around 0.05%/year, similar to the probability obtained for roof-critical tephra loading during the rainy season. Lahars represent by far the most probable threat (around 10%/year) because at least four radial drainage channels can convey them approximately 20 km away from the volcano across the entire city area during heavy rain episodes, even without an eruption. The Río Chili Valley represents the major concern for city safety owing to the probable cascading effect of combined threats: PDCs and rockslides, dammed-lake break-outs and subsequent lahars or floods. Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.
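
    The event-tree logic behind BET_VH-style estimates can be illustrated by multiplying conditional probabilities along the branches of the tree and summing over vents and eruption sizes. All node probabilities below are invented for illustration:

    ```python
    p_eruption = 0.01                       # P(eruption within 1 yr), assumed
    p_vent = {"summit": 0.7, "flank": 0.3}  # P(vent | eruption)
    p_size = {"small": 0.8, "large": 0.2}   # P(size | eruption, vent)
    p_reach = {                             # P(phenomenon reaches the point | ...)
        ("summit", "small"): 0.05, ("summit", "large"): 0.30,
        ("flank", "small"): 0.10, ("flank", "large"): 0.50,
    }

    p_hit = p_eruption * sum(
        p_vent[v] * p_size[s] * p_reach[(v, s)] for v in p_vent for s in p_size
    )
    print(f"annual probability the point is affected: {p_hit:.5f}")
    ```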

  26. Seismotectonic Map of Afghanistan and Adjacent Areas

    USGS Publications Warehouse

    Wheeler, Russell L.; Rukstales, Kenneth S.

    2007-01-01

    Introduction This map is part of an assessment of Afghanistan's geology, natural resources, and natural hazards. One of the natural hazards is from earthquake shaking. One of the tools required to address the shaking hazard is a probabilistic seismic-hazard map, which was made separately. The information on this seismotectonic map has been used in the design and computation of the hazard map. A seismotectonic map like this one shows geological, seismological, and other information that previously had been scattered among many sources. The compilation can show spatial relations that might not have been seen by comparing the original sources, and it can suggest hypotheses that might not have occurred to persons who studied those scattered sources. The main map shows faults and earthquakes of Afghanistan. Plate convergence drives the deformations that cause the earthquakes. Accordingly, smaller maps and text explain the modern plate-tectonic setting of Afghanistan and its evolution, and relate both to patterns of faults and earthquakes.

  27. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible, but not yet observed, flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, the uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claims data on a small catchment (the downstream Argens).

  28. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion.

  29. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being put to appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which warrants the reliability of the analysis. By analysing in some detail seismicity in the Vrancea region, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g., the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  10. Probabilistic Hazard Estimation at a Densely Urbanised Area: the Neaples Volcanoes

    NASA Astrophysics Data System (ADS)

    de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Claudia, T.

    2005-12-01

    The Neapolitan volcanic area (Southern Italy), including Vesuvius, the Campi Flegrei caldera and the island of Ischia, is among the highest-risk volcanic areas in the world: more than 2 million people live within about 10 km of an active volcanic vent. Such extreme risk calls for accurate methodologies to quantify it probabilistically, considering all the available volcanological information as well as modelling results. Simple hazard maps based on the observation of deposits from past eruptions suffer from the major problem that the eruptive history generally samples a very limited number of possible outcomes, and are thus of little use for estimating event probabilities in the area. This work describes a methodology that makes the best use, in a Bayesian sense, of volcanological data and modelling results to compute probabilistic hazard maps for multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic flow hazard, has been improved and extended here to also compute fall-out hazard. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, provides, for the first time, a global picture of the areal distribution of the main hazards from multi-vent explosive eruptions. Joint consideration of the hazard contributions from all three volcanic areas yields new insight into the volcanic hazard distribution, with strong implications for urban and emergency planning in the area.

  11. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping of transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab, a commercial language by The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years; recently, we released ZMAP v.6. The poster will introduce the features of ZMAP, focusing specifically on those related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps combining the stationary background hazard with aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
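
    One quantity ZMAP maps routinely is the Gutenberg-Richter b-value. ZMAP itself is written in Matlab; the sketch below shows the underlying estimator in Python for illustration, using Aki's (1965) maximum-likelihood formula with Utsu's correction for binned magnitudes and the Shi and Bolt (1982) standard error (our function names, not ZMAP's API):

```python
import math

def b_value_mle(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value, with Utsu's correction
    for magnitudes binned at interval dm, above completeness m_c."""
    m = [x for x in mags if x >= m_c]
    if len(m) < 2:
        raise ValueError("need at least two events above m_c")
    mean_m = sum(m) / len(m)
    b = math.log10(math.e) / (mean_m - (m_c - dm / 2.0))
    # Shi & Bolt (1982) standard error of the b-value
    var_mean = sum((x - mean_m) ** 2 for x in m) / (len(m) * (len(m) - 1))
    sigma_b = 2.30 * b * b * math.sqrt(var_mean)
    return b, sigma_b

# e.g. b, db = b_value_mle(catalog_magnitudes, m_c=1.5)
```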

  12. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

    Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
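
    The hazard-map construction pairs stochastic sampling of vent positions with a flow model. The toy sketch below illustrates only the Monte Carlo logic, substituting a crude radial-runout proxy for the paper's topography-aware flow model; all names, inputs and the proxy itself are our illustrative assumptions:

```python
import math
import random

def pdc_invasion_probability(vents, runout_sampler, grid, n_trials=10000):
    """Monte Carlo sketch of a conditional PDC invasion map.

    vents         : list of ((x, y), weight) possible vent positions
    runout_sampler: function returning a random runout distance (km),
                    a crude stand-in for the simplified flow model
    grid          : list of (x, y) pixel centres
    Returns {pixel: probability of invasion, given an eruption}.
    """
    hits = {p: 0 for p in grid}
    positions = [v for v, _ in vents]
    weights = [w for _, w in vents]
    for _ in range(n_trials):
        (vx, vy) = random.choices(positions, weights=weights, k=1)[0]
        r = runout_sampler()
        for (px, py) in grid:
            if math.hypot(px - vx, py - vy) <= r:  # radial-invasion proxy
                hits[(px, py)] += 1
    return {p: n / n_trials for p, n in hits.items()}
```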

  13. Probabilistic Seismic Hazard Maps for Ecuador

    NASA Astrophysics Data System (ADS)

    Mariniere, J.; Beauval, C.; Yepes, H. A.; Laurence, A.; Nocquet, J. M.; Alvarado, A. P.; Baize, S.; Aguilar, J.; Singaucho, J. C.; Jomard, H.

    2017-12-01

    A probabilistic seismic hazard study is conducted for Ecuador, a country facing high seismic hazard from both megathrust subduction earthquakes and shallow crustal moderate-to-large earthquakes. Building on the knowledge produced in recent years in historical seismicity, earthquake catalogs, active tectonics, geodynamics, and geodesy, several alternative earthquake recurrence models are developed. An area source model is first proposed, based on the seismogenic crustal and inslab sources defined in Yepes et al. (2016); a slightly different segmentation is proposed for the subduction interface with respect to Yepes et al. (2016). Three earthquake catalogs are used to account for the numerous uncertainties in the modeling of frequency-magnitude distributions. The hazard maps obtained highlight several source zones enclosing fault systems that exhibit low seismic activity, not representative of the geological and/or geodetic slip rates. Consequently, a fault model is derived, including faults with an earthquake recurrence model inferred from geological and/or geodetic slip-rate estimates. The geodetic slip rates on the set of simplified faults are estimated from a GPS horizontal velocity field (Nocquet et al. 2014); assumptions on the aseismic component of the deformation are required. Combining these alternative earthquake models in a logic tree, and using a set of selected ground-motion prediction equations adapted to Ecuador's different tectonic contexts, a mean hazard map is obtained. Hazard maps corresponding to the 16th and 84th percentiles are also derived, highlighting the zones where the uncertainties on the hazard are highest.
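
    The mean and percentile maps follow from weighting the logic-tree branches. A minimal numpy sketch of that combination step, assuming each branch supplies a hazard map on a common site grid (array shapes and names are our assumptions):

```python
import numpy as np

def logic_tree_stats(branch_maps, weights, percentiles=(16, 50, 84)):
    """Combine hazard maps from logic-tree branches.

    branch_maps : array (n_branches, n_sites), e.g. PGA at 475 yr
    weights     : branch weights, summing to 1
    Returns the weighted-mean map and weighted-percentile maps.
    """
    maps = np.asarray(branch_maps, dtype=float)
    w = np.asarray(weights, dtype=float)
    mean_map = w @ maps
    # weighted percentiles, site by site (inverse-CDF convention)
    order = np.argsort(maps, axis=0)
    sorted_maps = np.take_along_axis(maps, order, axis=0)
    cum_w = np.cumsum(w[order], axis=0)
    pct_maps = {}
    for p in percentiles:
        idx = np.argmax(cum_w >= p / 100.0, axis=0)
        pct_maps[p] = sorted_maps[idx, np.arange(maps.shape[1])]
    return mean_map, pct_maps
```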

  14. Great Balls of Fire: A probabilistic approach to quantify the hazard related to ballistics - A case study at La Fossa volcano, Vulcano Island, Italy

    NASA Astrophysics Data System (ADS)

    Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

    2016-10-01

    We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBPs) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe the ballistic trajectories of VBPs, accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps of exceeding given thresholds of kinetic energy at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally sized pixels or on zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10⁻² % probability of occurrence of VBP impacts with kinetic energies ≤ 10⁴ J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, with half of the building stock having a ≥ 2.5 × 10⁻³ % probability of roof perforation.
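
    The trajectory physics underneath such models: with drag, the acceleration is a = g - (rho_air * Cd * A / (2 m)) * |v| * v, integrated until impact. Below is a minimal flat-ground sketch with a constant drag coefficient, whereas GBF varies Cd with the flow regime and accounts for topography; all parameter values are illustrative:

```python
import math

def ballistic_trajectory(v0, angle_deg, mass, diameter,
                         cd=0.6, rho_air=1.2, dt=0.01, g=9.81):
    """Euler-integrate a ballistic projectile with aerodynamic drag.

    Returns (range_m, impact_kinetic_energy_J) over flat ground.
    """
    area = math.pi * (diameter / 2.0) ** 2
    k = 0.5 * rho_air * cd * area / mass
    theta = math.radians(angle_deg)
    x, z = 0.0, 0.0
    vx, vz = v0 * math.cos(theta), v0 * math.sin(theta)
    while True:
        speed = math.hypot(vx, vz)
        ax = -k * speed * vx
        az = -g - k * speed * vz
        x, z = x + vx * dt, z + vz * dt
        vx, vz = vx + ax * dt, vz + az * dt
        if z < 0.0 and vz < 0.0:  # crossed back below launch level
            return x, 0.5 * mass * speed ** 2

# e.g. a 0.2 m dense block launched at 150 m/s and 45 degrees:
# print(ballistic_trajectory(150.0, 45.0, mass=11.0, diameter=0.2))
```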

  15. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA) and the utility of hazard maps (e.g., Stein et al., 2011; Hanks et al., 2012) has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the areas and time windows that hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and to the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002 to the present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, the testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to test only with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps accurately portray seismic hazard and whether discrepancies are localized or systemic.
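
    The time-independence assumption is what allows a short observation window to test a 50-year map: the expected fraction of exceeded sites scales with the observation time through the Poisson model. A sketch of the resulting fraction-of-area comparison, in the spirit of Ward (1995), assuming independent sites (names are ours):

```python
import math

def area_based_test(mapped_gm, observed_max_gm, p_map=0.10,
                    t_map=50.0, t_obs=15.0):
    """Fraction-of-area test of a PSHA map.

    mapped_gm       : map value (e.g. PGA) at each test site
    observed_max_gm : maximum shaking observed at the same sites over
                      t_obs years (e.g. from ShakeMap interpolations)
    Returns (observed_fraction, expected_fraction) of exceeded sites.
    """
    n = len(mapped_gm)
    exceeded = sum(o > m for m, o in zip(mapped_gm, observed_max_gm))
    observed_fraction = exceeded / n
    # time independence lets a 50-yr map be tested with a shorter record
    rate = -math.log(1.0 - p_map) / t_map      # annual exceedance rate
    expected_fraction = 1.0 - math.exp(-rate * t_obs)
    return observed_fraction, expected_fraction
```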

  16. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008

    USGS Publications Warehouse

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.

    2009-01-01

    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazards Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop, and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles using the cone penetration test, standard penetration test, down-hole shear wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public roll-out in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, similar to the firm-rock site values of the 2002 USGS National Seismic Hazard Maps. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI): the probability that LPI is greater than 5 (that is, that there is a high potential for liquefaction) for a M7.7 New Madrid-type event is only 20-30 percent. Within the river basin, most of the region has high LPI, with an 80-100 percent probability that LPI is greater than 5 for a New Madrid-type event.

  17. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.

  18. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
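
    The two-layer structure of such hazard curves, aleatory variability within each ensemble member and epistemic uncertainty across members, can be sketched as follows for a single grid point; the array shapes and names are our illustrative assumptions, not PyBetVH's API:

```python
import numpy as np

def absolute_hazard_curves(load_samples, p_eruption, thresholds,
                           percentiles=(10, 50, 90)):
    """Sketch of BET_VH-style absolute hazard curves at one grid point.

    load_samples : array (n_members, n_sims) of tephra loads (kPa);
                   members encode epistemic uncertainty, simulations
                   encode aleatory variability
    p_eruption   : probability of an eruption in the chosen time frame
    Returns {percentile: exceedance probabilities per threshold}.
    """
    loads = np.asarray(load_samples, dtype=float)
    # conditional exceedance per member, then scale by P(eruption)
    cond = np.stack([(loads >= t).mean(axis=1) for t in thresholds], axis=1)
    absolute = p_eruption * cond          # (n_members, n_thresholds)
    return {p: np.percentile(absolute, p, axis=0) for p in percentiles}
```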

  19. Effect of time dependence on probabilistic seismic-hazard maps and deaggregation for the central Apennines, Italy

    USGS Publications Warehouse

    Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.

    2009-01-01

    We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters, α, of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults to account for earthquakes that cannot be correlated with known geologic structural segmentation.
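
    For reference, the Brownian passage time density with mean recurrence mu and aperiodicity alpha is f(t) = sqrt(mu / (2 pi alpha^2 t^3)) * exp(-(t - mu)^2 / (2 mu alpha^2 t)), and time dependence enters through the conditional probability of rupture in the next interval given the elapsed quiet time. A minimal numerical sketch (our function names; the example values are illustrative, not the paper's):

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage time density (mean mu, aperiodicity alpha)."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

def bpt_conditional_prob(t_elapsed, dt, mu, alpha, n=20000):
    """P(rupture in next dt years | quiet for t_elapsed years),
    by trapezoidal integration of the BPT density."""
    def cdf(t_end):
        if t_end <= 0.0:
            return 0.0
        h = t_end / n
        s = 0.5 * (bpt_pdf(1e-9, mu, alpha) + bpt_pdf(t_end, mu, alpha))
        s += sum(bpt_pdf(i * h, mu, alpha) for i in range(1, n))
        return s * h
    f_t = cdf(t_elapsed)
    return (cdf(t_elapsed + dt) - f_t) / (1.0 - f_t)

# e.g. mu=1400 yr, alpha=0.5, 600 yr since the last event, next 50 yr:
# print(bpt_conditional_prob(600.0, 50.0, 1400.0, 0.5))
```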

  20. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  1. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  2. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geology and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  3. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2014-09-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible flood situations, including those that have not yet occurred, with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and for loss estimations. Thus, the uncertainties in the deterministic estimation of a single event's loss are known before a probabilistic event set is simulated. To capture at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Réassurance) claim database have shown that approximately 45% of the insured flood losses are located inside floodplains and 45% outside; the remaining 10% is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: a generation of fictive river flows based on the historical records of the river gauge network, and a generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (the downstream Argens).
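
    The combination of the two independent event generators into a single loss distribution can be sketched as follows, reducing each peril to an annual event rate and a per-event loss sampler; this is a deliberately simplified stand-in for the paper's full hazard-to-damage chain, and all names and numbers are illustrative:

```python
import math
import random

def poisson(lam):
    """Poisson variate via Knuth's algorithm."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def annual_loss_distribution(overflow, runoff, n_years=10000):
    """Combine two independent perils, each given as
    (annual_rate, loss_sampler), into an annual-loss distribution."""
    losses = []
    for _ in range(n_years):
        total = 0.0
        for rate, sample_loss in (overflow, runoff):
            for _ in range(poisson(rate)):  # events of this peril this year
                total += sample_loss()
        losses.append(total)
    return sorted(losses)

# e.g. the annual loss exceeded on average once in 200 years:
# losses = annual_loss_distribution(
#     (0.8, lambda: random.lognormvariate(12, 1)),
#     (1.5, lambda: random.lognormvariate(11, 1)))
# print(losses[int(len(losses) * (1 - 1 / 200))])
```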

  4. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, differing methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is a proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then use the best-fitting GMPEs and site conditions in our seismic hazard assessments. Based on these databases and the selected GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.

  5. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

    A national standard for engineering design for tsunami effects has not existed before, and this significant risk is mostly ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure as part of tsunami preparedness, and will also have significance as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models, ensuring that probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison with state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  6. Introduction: Hazard mapping

    USGS Publications Warehouse

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  7. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    USGS Publications Warehouse

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2013-01-01

    Popocatépetl is one of Mexico’s most active volcanoes, threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene–Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl’s reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars; however, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano, and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the “Ochre Pumice” Plinian eruption (4965 ¹⁴C yr BP). FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.

  8. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    NASA Astrophysics Data System (ADS)

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2014-01-01

    Popocatépetl is one of Mexico's most active volcanoes, threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl's reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars; however, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano, and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the "Ochre Pumice" Plinian eruption (4965 ¹⁴C yr BP). FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.

  9. Probabilistic seismic hazard assessment of southern part of Ghana

    NASA Astrophysics Data System (ADS)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2018-05-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, along with seismic hazard assessment results for six cities. The seismic hazard map was prepared for peak ground acceleration with a 10% probability of exceedance in 50 years. The input parameters used for the hazard computations were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covers a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence; hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study: Allen (for southern and eastern Australia), Silva et al. (for central and eastern North America), and Campbell and Bozorgnia and Chiou and Youngs (both for worldwide active-shallow-crust regions). The logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships, and the OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.

  10. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    A seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area, available from recent studies, is used for the zonation of area sources. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each grid cell by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probabilities of exceedance in 50 years was calculated for all grid points for a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at the rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the third-generation building code of Ethiopia.

  11. Challenges in making a seismic hazard map for Alaska and the Aleutians

    USGS Publications Warehouse

    Wesson, R.L.; Boyd, O.S.; Mueller, C.S.; Frankel, A.D.; Freymueller, J.T.

    2008-01-01

    We present a summary of the data and analyses leading to the revision of the time-independent probabilistic seismic hazard maps of Alaska and the Aleutians. These maps represent a revision of existing maps based on newly obtained data, and reflect best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States, and will be proposed for adoption in future revisions to the International Building Code. We present example maps for peak ground acceleration, 0.2 s spectral amplitude (SA), and 1.0 s SA at a probability level of 2% in 50 years (annual probability of 0.000404). In this summary, we emphasize issues encountered in preparation of the maps that motivate or require future investigation and research.

  12. Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian

    Slip rate is used to estimate the earthquake recurrence relationship, which has the greatest influence on the hazard level. We examine the slip-rate contribution to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimate is sensitive to the fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For a specific site, we find that the seismic hazard estimate for Sukabumi is between 0.4904 and 0.8465 g, with an uncertainty between 0.0847 and 0.2389 g and a COV between 17.7% and 29.8%.
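
    A minimal sketch of such a slip-rate sensitivity analysis, propagating a slip-rate distribution into the recurrence rate of a characteristic earthquake by moment balancing; the rigidity, magnitude, fault area and slip-rate values are illustrative placeholders, not the paper's Cimandiri parameters:

```python
import math
import random
import statistics

def slip_rate_sensitivity(slip_mm_yr=4.0, sigma=1.0, m_char=7.0,
                          fault_area_km2=900.0, n=10000):
    """Monte Carlo sketch: slip-rate uncertainty -> recurrence-rate COV."""
    mu_rigidity = 3.0e10                    # Pa, typical crustal rigidity
    m0_char = 10 ** (1.5 * m_char + 9.05)   # Hanks-Kanamori moment, N*m
    rates = []
    for _ in range(n):
        s = max(random.gauss(slip_mm_yr, sigma), 0.01) * 1e-3  # m/yr
        moment_rate = mu_rigidity * fault_area_km2 * 1e6 * s   # N*m/yr
        rates.append(moment_rate / m0_char)  # characteristic events / yr
    mean_r = statistics.fmean(rates)
    cov = statistics.stdev(rates) / mean_r
    return mean_r, cov

# print(slip_rate_sensitivity())  # e.g. ~3e-3 events/yr and its COV
```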

  13. Seismic probabilistic tsunami hazard: from regional to local analysis and use of geological and historical observations

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.

    2016-12-01

    Site-specific probabilistic tsunami hazard analyses demand very high computational effort, which is often reduced by introducing approximations in the tsunami sources and/or the tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily introduce significant bias into the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at a regional scale, a common practice is to propagate tsunami waves in deep water (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes; maximum wave heights at the coast are then empirically extrapolated, saving a significant amount of computational time. Moving to the local scale, however, such assumptions no longer hold, and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a touristic and commercial area along the south-eastern coast of Sicily, Italy. The procedure consists in using the outcomes of a regional SPTHA as input for a two-step filtering method that selects and substantially reduces the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high-resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analysing, comparing and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for the selected target areas are particularly rich compared with the scarce and heterogeneous data sets usually available elsewhere; they can therefore serve as valuable benchmarks for testing and strengthening the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 projects ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389), and the INGV-DPC Agreement.

  14. Seismic Sources and Recurrence Rates as Adopted by USGS Staff for the Production of the 1982 and 1990 Probabilistic Ground Motion Maps for Alaska and the Conterminous United States

    USGS Publications Warehouse

    Hanson, Stanley L.; Perkins, David M.

    1995-01-01

    The construction of a probabilistic ground-motion hazard map for a region follows a sequence of analyses beginning with the selection of an earthquake catalog and ending with the mapping of calculated probabilistic ground-motion values (Hanson and others, 1992). An integral part of this process is the creation of the sources used for the calculation of earthquake recurrence rates and ground motions. These sources consist of areas and lines that represent geologic or tectonic features and faults. After the design of the sources, it is necessary to arrange the coordinate points in a particular order compatible with the input format of the SEISRISK-III program (Bender and Perkins, 1987). Source zones are usually modeled as point-rupture sources. Where applicable, linear rupture sources are modeled with articulated lines, representing known faults, or with a field of parallel lines, representing a generalized distribution of hypothetical faults. Based on the distribution of earthquakes throughout the individual source zones (or a collection of several sources), earthquake recurrence rates are computed for each of the sources, and minimum and maximum magnitudes are assigned. From 1978 to 1980, several conferences were held by the USGS to solicit information on regions of the United States for the purpose of creating source zones for the computation of probabilistic ground motions (Thenhaus, 1983). As a result of these regional meetings and previous work in the Pacific Northwest (Perkins and others, 1980), the California continental shelf (Thenhaus and others, 1980), and the Eastern outer continental shelf (Perkins and others, 1979), a consensus set of source zones was agreed upon and subsequently used to produce a national ground motion hazard map for the United States (Algermissen and others, 1982). In this report and on the accompanying disk we provide a complete list of the area and line sources used for the 1982 and later 1990 seismic hazard maps for the conterminous U.S. and Alaska. These source zones are represented in the input format required by the hazard program SEISRISK-III, and they include the attenuation table and several other input parameter lines normally found at the beginning of an input data set for SEISRISK-III.

  15. Deaggregation of Probabilistic Ground Motions in the Central and Eastern United States

    USGS Publications Warehouse

    Harmsen, S.; Perkins, D.; Frankel, A.

    1999-01-01

    Probabilistic seismic hazard analysis (PSHA) is a technique for estimating the annual rate of exceedance of a specified ground motion at a site due to known and suspected earthquake sources. The relative contributions of the various sources to the total seismic hazard are determined as a function of their occurrence rates and their ground-motion potential. The separation of the exceedance contributions into bins whose base dimensions are magnitude and distance is called deaggregation. We have deaggregated the hazard analyses for the new USGS national probabilistic ground-motion hazard maps (Frankel et al., 1996). For points on a 0.2° grid in the central and eastern United States (CEUS), we show color maps of the geographical variation of mean and modal magnitudes (M̄, M̂) and distances (D̄, D̂) for ground motions having a 2% chance of exceedance in 50 years. These maps are displayed for peak horizontal acceleration and for spectral response accelerations of 0.2, 0.3, and 1.0 sec. We tabulate M̄, D̄, M̂, and D̂ for 49 CEUS cities for 0.2- and 1.0-sec response. Thus, these maps and tables are PSHA-derived estimates of the potential earthquakes that dominate seismic hazard at short and intermediate periods in the CEUS. The contribution to hazard of the New Madrid and Charleston sources dominates over much of the CEUS: for 0.2-sec response, over 40% of the area; for 1.0-sec response, over 80% of the area. For 0.2-sec response, D̄ ranges from 20 to 200 km; for 1.0 sec, from 30 to 600 km. For sites influenced by New Madrid or Charleston, D̄ is less than the distance to these sources, and M̄ is less than the characteristic magnitude of these sources, because the averaging takes into account the effect of smaller-magnitude and closer sources. On the other hand, D̂ is directly the distance to New Madrid or Charleston, and M̂ for 0.2- and 1.0-sec response corresponds to the dominating source over much of the CEUS. For some cities in the North Atlantic states, short-period seismic hazard is apt to be controlled by local seismicity, whereas intermediate-period (1.0 sec) hazard is commonly controlled by regional seismicity, such as that of the Charlevoix seismic zone.
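
    A toy sketch of the binning arithmetic, assuming the per-bin (magnitude, distance) contributions to the exceedance rate are already computed: the mean values are rate-weighted averages, while the modal values come from the single largest bin, which is why M̄ can fall below the characteristic magnitude while M̂ matches the dominating source (names and numbers are ours):

```python
def deaggregate(contributions):
    """Given per-bin contributions as (magnitude, distance_km, rate),
    return mean (M_bar, D_bar) and modal (M_hat, D_hat) values."""
    total = sum(r for _, _, r in contributions)
    m_bar = sum(m * r for m, _, r in contributions) / total
    d_bar = sum(d * r for _, d, r in contributions) / total
    m_hat, d_hat, _ = max(contributions, key=lambda c: c[2])
    return m_bar, d_bar, m_hat, d_hat

# A distant characteristic source plus nearer small sources: the mode
# picks the dominating source, the mean is pulled toward the small ones.
bins = [(7.7, 250.0, 6e-4), (5.5, 40.0, 2e-4), (6.0, 80.0, 2e-4)]
print(deaggregate(bins))
# m_bar ~ 6.92, d_bar ~ 174 km; m_hat = 7.7, d_hat = 250 km
```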

  16. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) is a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime: (1) a 1D unsteady hydrodynamic model of river channel and floodplain flow between the dikes, (2) a probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) a 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Because the coupled 1D and 2D models are unsteady, the dependence between hydraulic loads at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping, and slope instability caused by seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model, driven by the breach outflow boundary conditions, computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes, reflected in the form of the input hydrographs, and for the randomness of dike failures given by breach locations, times and widths. Scenario calculations with synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100, 200, 500 and 1000 years. Based on the modelling results, probabilistic dike hazard maps were generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM also allows the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. This uncertainty results from the natural variability of the flow hydrographs and the randomness of the dike breach processes; the same sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that dike breach stochasticity has an increasing impact on hydrograph uncertainty in the downstream direction: whereas in the upstream part of the reach the hydrograph uncertainty is mainly governed by the variability of the flood wave form, dike failures strongly shape the uncertainty bounds in the downstream part. Finally, scenarios of polder deployment for the extreme floods with T = 200, 500 and 1000 years were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel. However, the capping of the flow peaks resulted in a considerable reduction of overtopping failures downstream of the polder, with a simultaneous slight increase in the piping and slope micro-instability frequencies, explained by a longer-lasting average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has high practical value for decision support in flood management.
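    A minimal sketch of the fragility-function step described above, assuming a lognormal fragility curve for one failure mechanism and independent Bernoulli draws per Monte Carlo run (the distribution shape and all parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(42)

def breach_prob(load_m, median_m, beta):
    """Lognormal fragility curve: P(breach | peak hydraulic load in m).
    median_m and beta stand in for the geotechnically derived parameters
    of one mechanism (e.g. piping); IHAM conditions these on uncertain
    dike geometry and soil properties."""
    return lognorm.cdf(load_m, s=beta, scale=median_m)

def sample_breach_frequency(loads_m, median_m=4.0, beta=0.4, n_runs=1000):
    """Monte Carlo estimate of per-section breach frequency, given the
    simulated peak load at each discretised dike section."""
    p = breach_prob(np.asarray(loads_m), median_m, beta)
    breached = rng.random((n_runs, len(loads_m))) < p  # Bernoulli draws
    return breached.mean(axis=0)        # empirical failure probability
```

    In the full system each run would redraw the input hydrograph and re-route the breach outflow through the 2D storage cell model, so breach probabilities and inundation statistics emerge jointly.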

  17. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  18. Lateral spread hazard mapping of the northern Salt Lake Valley, Utah, for a M7.0 scenario earthquake

    USGS Publications Warehouse

    Olsen, M.J.; Bartlett, S.F.; Solomon, B.J.

    2007-01-01

    This paper describes the methodology used to develop a lateral spread-displacement hazard map for northern Salt Lake Valley, Utah, using a scenario M7.0 earthquake occurring on the Salt Lake City segment of the Wasatch fault. The mapping effort is supported by a substantial amount of geotechnical, geologic, and topographic data compiled for the Salt Lake Valley, Utah. ArcGIS routines created for the mapping project then input this information to perform site-specific lateral spread analyses using methods developed by Bartlett and Youd (1992) and Youd et al. (2002) at individual borehole locations. The distributions of predicted lateral spread displacements from the boreholes located spatially within a geologic unit were subsequently used to map the hazard for that particular unit. The mapped displacement zones consist of low hazard (0-0.1 m), moderate hazard (0.1-0.3 m), high hazard (0.3-1.0 m), and very high hazard (> 1.0 m). As expected, the produced map shows the highest hazard in the alluvial deposits at the center of the valley and in sandy deposits close to the fault. This mapping effort is currently being applied to the southern part of the Salt Lake Valley, Utah, and probabilistic maps are being developed for the entire valley. © 2007, Earthquake Engineering Research Institute.
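    The mapped displacement zones amount to a simple binning of the site-specific predictions; a sketch of that classification step (the zone boundaries are exactly those quoted in the abstract):

```python
def hazard_zone(displacement_m: float) -> str:
    """Bin a predicted lateral-spread displacement (metres) into the
    four zones used on the map."""
    if displacement_m <= 0.1:
        return "low (0-0.1 m)"
    if displacement_m <= 0.3:
        return "moderate (0.1-0.3 m)"
    if displacement_m <= 1.0:
        return "high (0.3-1.0 m)"
    return "very high (> 1.0 m)"
```

    Per the abstract, the per-unit zoning is driven by the distribution of borehole predictions within each geologic unit; how that distribution is summarized into a single zone is not specified here.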

  19. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and to provide a statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We proposed five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure recently developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
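    For reference, the Cornell-type hazard integral that the paper extends has the standard form (generic PSHA notation, not reproduced from the paper):

```latex
\lambda(Y > y) \;=\; \sum_{i=1}^{N_{\mathrm{src}}} \nu_i
\int_{m_{\min}}^{m_{\max}} \int_{r} P\left(Y > y \mid m, r\right)
f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}r\, \mathrm{d}m
```

    where ν_i is the activity rate of source i, f_M and f_R are the magnitude and distance densities, and P(Y > y | m, r) comes from the attenuation model. A characteristic-earthquake scenario effectively replaces f_M with a narrow distribution concentrated at the characteristic magnitude of the fault.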

  20. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans, where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In contrast, FEMA Flood Insurance Rate Maps (FIRMs) based on the FAN model predict uniformly high flood risk across the study areas without regard for small-scale topography and surficial geology. © 2005 Geological Society of America.
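    The two governing relations named above have standard forms; written in generic notation (not copied from the paper), the depth-averaged continuity equation and Manning's equation are:

```latex
\frac{\partial h}{\partial t}
+ \frac{\partial (uh)}{\partial x}
+ \frac{\partial (vh)}{\partial y} = 0,
\qquad
V = \frac{1}{n}\, R^{2/3} S^{1/2}
```

    where h is flow depth, (u, v) are depth-averaged velocity components, n is Manning's roughness coefficient, R the hydraulic radius, and S the friction slope. Solving both together on a high-resolution DEM is what lets the model resolve the many small distributary channels the abstract emphasizes.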

  1. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near-and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.

  2. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.
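    The inherent (static) component rests on frequency ratio analysis; a minimal sketch of the standard frequency ratio statistic, assuming precomputed pixel counts (the authors' exact weighting may differ):

```python
def frequency_ratio(ls_px_in_class, px_in_class, ls_px_total, px_total):
    """Frequency ratio for one factor class (e.g. a slope or geology band):
    the class's share of landslide cells divided by its share of all cells.
    Values > 1 flag classes over-represented in the landslide inventory."""
    return (ls_px_in_class / ls_px_total) / (px_in_class / px_total)

# A class holding 5% of the terrain but 15% of mapped landslide cells
# gets FR = 3, i.e. three times the background landslide density.
print(frequency_ratio(150, 500, 1000, 10000))  # -> 3.0
```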

  3. The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines

    NASA Astrophysics Data System (ADS)

    Wesseloo, Johan

    2018-06-01

    Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.

  4. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year, implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true both for the central and eastern United States as a whole and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, overstating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume relative to the previous year. These results imply that such hazard maps have the potential to be valuable tools for policy makers and regulators in managing the seismic risks associated with unconventional oil and gas production.

  5. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  6. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  7. Assessment of pre-crisis and syn-crisis seismic hazard at Campi Flegrei and Mt. Vesuvius volcanoes, Campania, southern Italy

    NASA Astrophysics Data System (ADS)

    Convertito, Vincenzo; Zollo, Aldo

    2011-08-01

    In this study, we address the issue of short-term to medium-term probabilistic seismic hazard analysis for two volcanic areas, Campi Flegrei caldera and Mt. Vesuvius in the Campania region of southern Italy. Two different phases of the volcanic activity are considered. The first, which we term the pre-crisis phase, concerns the present quiescent state of the volcanoes that is characterized by low-to-moderate seismicity. The second phase, syn-crisis, concerns the unrest phase that can potentially lead to eruption. For the Campi Flegrei case study, we analyzed the pattern of seismicity during the 1982-1984 ground uplift episode (bradyseism). For Mt. Vesuvius, two different time-evolutionary models for seismicity were adopted, corresponding to different ways in which the volcano might erupt. We performed a site-specific analysis, linked with the hazard map, to investigate the effects of input parameters, in terms of source geometry, mean activity rate, periods of data collection, and return periods, for the syn-crisis phase. The analysis in the present study of the pre-crisis phase allowed a comparison of the results of probabilistic seismic hazard analysis for the two study areas with those provided in the Italian national hazard map. For the Mt. Vesuvius area in particular, the results show that the hazard can be greater than that reported in the national hazard map when information at a local scale is used. For the syn-crisis phase, the main result is that the data recorded during the early months of the unrest phase are substantially representative of the seismic hazard during the whole duration of the crisis.

  8. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Sparks, R. S.

    2009-12-01

    A volcanic hazard is any phenomenon that threatens communities. These hazards include volcanic events like pyroclastic flows, explosions, ash fall and lavas, and secondary effects such as lahars and landslides. Volcanic hazards are described by the physical characteristics of the phenomena, by the assessment of the areas that they are likely to affect and by the magnitude-dependent return period of events. Volcanic hazard maps are generated by mapping past volcanic events and by modelling the hazardous processes. Both these methods have their strengths and limitations and a robust map should use both approaches in combination. Past records, studied through stratigraphy, the distribution of deposits and age dating, are typically incomplete and may be biased. Very significant volcanic hazards, such as surge clouds and volcanic blasts, for example, are not well preserved in the geological record. Models of volcanic processes are very useful to help identify hazardous areas that do not have any geological evidence. They are, however, limited by simplifications and incomplete understanding of the physics. Many practical volcanic hazard mapping tools are also very empirical. Hazard maps are typically abstracted into hazard zone maps, which are sometimes called threat or risk maps. Their aim is to identify areas at high levels of threat, and the boundaries between zones may take account of other factors such as roads, escape routes during evacuation, and infrastructure. These boundaries may change with time due to new knowledge on the hazards or changes in volcanic activity levels. Alternatively they may remain static, but the implications of the zones may change as volcanic activity changes. Zone maps are used for planning purposes and for the management of volcanic crises. Volcanic hazard maps are depictions of the likelihood of future volcanic phenomena affecting places and people. Volcanic phenomena are naturally variable, often complex and not fully understood. There are many sources of uncertainty in forecasting the areas that volcanic activity will affect and the severity of the effects. Uncertainties arise from: natural variability, inadequate data, biased data, incomplete data, lack of understanding of the processes, limitations to predictive models, ambiguity, and unknown unknowns. The description of volcanic hazards is thus necessarily probabilistic and requires assessment of the attendant uncertainties. Several issues arise from the probabilistic nature of volcanic hazards and the intrinsic uncertainties. Although zonation maps require well-defined boundaries for administrative pragmatism, such boundaries cannot divide areas that are completely safe from those that are unsafe. Levels of danger or safety need to be defined to decide on and justify boundaries through the concepts of vulnerability and risk. More data, better observations and improved models may reduce uncertainties, but they can also increase uncertainties and may lead to re-appraisal of zone boundaries. Probabilities inferred by statistical techniques are hard to communicate. Expert elicitation is an emerging methodology for risk assessment and uncertainty evaluation. The method has been applied at one major volcanic crisis (Soufrière Hills Volcano, Montserrat), and is being applied in planning for volcanic crises at Vesuvius.

  9. B-value and slip rate sensitivity analysis for PGA value in Lembang fault and Cimandiri fault area

    NASA Astrophysics Data System (ADS)

    Pratama, Cecep; Ito, Takeo; Meilano, Irwan; Nugraha, Andri Dian

    2017-07-01

    We examine the slip rate and b-value contributions to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi and Bandung using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate comes from crustal faults. A Monte Carlo approach has been developed to assess the sensitivity. The uncertainty and coefficient of variation (COV) of the slip rate and b-value in the Lembang and Cimandiri Fault areas have been calculated. We observe that seismic hazard estimates are sensitive to fault slip rate and b-value, with uncertainties of 0.25 g and 0.1-0.2 g, respectively. For specific sites, we found seismic hazard estimates of 0.49 ± 0.13 g with a COV of 27% for Sukabumi and 0.39 ± 0.05 g with a COV of 13% for Bandung.
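    The quoted COV values follow directly from the ratio of the standard deviation to the mean estimate:

```latex
\mathrm{COV} = \frac{\sigma}{\mu}:\qquad
\frac{0.13\ \mathrm{g}}{0.49\ \mathrm{g}} \approx 0.27\ (27\%),\qquad
\frac{0.05\ \mathrm{g}}{0.39\ \mathrm{g}} \approx 0.13\ (13\%)
```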

  10. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    NASA Astrophysics Data System (ADS)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-01

    In this paper a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran. From economic, political and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour line maps of Tehran on the basis of different attenuation relationships for different earthquake periods are plotted. Maps of iso-intensity points in the Tehran region are presented using proportional attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% in 50 years. Seismicity parameters, on the basis of historical and instrumental earthquakes for a time period that extends from the 4th century BC to the present, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. The effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.

  11. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran. From economic, political and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour line maps of Tehran on the basis of different attenuation relationships for different earthquake periods are plotted. Maps of iso-intensity points in the Tehran region are presented using proportional attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% in 50 years. Seismicity parameters, on the basis of historical and instrumental earthquakes for a time period that extends from the 4th century BC to the present, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. The effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.

  12. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2 and 10% POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as a low seismic hazard area in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.

  13. Development of Maximum Considered Earthquake Ground Motion Maps

    USGS Publications Warehouse

    Leyendecker, E.V.; Hunt, R.J.; Frankel, A.D.; Rukstales, K.S.

    2000-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings use a design procedure that is based on spectral response acceleration rather than the traditional peak ground acceleration, peak ground velocity, or zone factors. The spectral response accelerations are obtained from maps prepared following the recommendations of the Building Seismic Safety Council's (BSSC) Seismic Design Procedures Group (SDPG). The SDPG-recommended maps, the Maximum Considered Earthquake (MCE) Ground Motion Maps, are based on the U.S. Geological Survey (USGS) probabilistic hazard maps with additional modifications incorporating deterministic ground motions in selected areas and the application of engineering judgement. The MCE ground motion maps included with the 1997 NEHRP Provisions also serve as the basis for the ground motion maps used in the seismic design portions of the 2000 International Building Code and the 2000 International Residential Code. Additionally the design maps prepared for the 1997 NEHRP Provisions, combined with selected USGS probabilistic maps, are used with the 1997 NEHRP Guidelines for the Seismic Rehabilitation of Buildings.

  14. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones, which are input variables into the seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation, in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part-I, Bureau of Indian Standards, New Delhi, 2002]. Not only has a holistic treatment of the earthquake catalog and seismogenic zones been performed, but also a higher resolution in spatial distribution could be achieved. The COV maps have been provided with the strong ground-motion maps under various conditions to show the confidence in the results obtained. Results obtained in the present study would be helpful for risk assessment and other disaster mitigation-related studies.

  15. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor

    USGS Publications Warehouse

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.

    2004-01-01

    These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.

  16. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and mapping. The method considers the most current predictions for strong ground motions and seismic sources through use of the U.S.G.S. ...

  17. Up-to-date Probabilistic Earthquake Hazard Maps for Egypt

    NASA Astrophysics Data System (ADS)

    Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed

    2018-04-01

    An up-to-date earthquake hazard analysis has been performed for Egypt using a probabilistic seismic hazard approach. In the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a grid of 0.5° × 0.5° for rock site conditions, for peak ground acceleration (PGA) and spectral acceleration at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years), corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a return period of 475 years) and in southern Egypt around the city of Aswan (PGA up to 0.2 g for a return period of 475 years). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a return period of 475 years).
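    Under the Poisson assumption that underlies such maps, the quoted return periods follow from the exceedance probabilities by a standard conversion (not spelled out in the abstract):

```latex
T = \frac{-t}{\ln(1-p)}:\qquad
\frac{-50}{\ln(1-0.50)} \approx 72\ \mathrm{yr},\qquad
\frac{-50}{\ln(1-0.10)} \approx 475\ \mathrm{yr},\qquad
\frac{-50}{\ln(1-0.02)} \approx 2475\ \mathrm{yr}
```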

  18. Maps Showing Seismic Landslide Hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.; Michael, John A.

    2009-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  19. Maps showing seismic landslide hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.

    2014-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  20. Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)

    2015-01-01

    Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.

  1. Probabilistic Hazards Outlook

    Science.gov Websites

    Climate Prediction Center (CPC) Probabilistic Hazards Outlook page, with Categorical Outlook graphics for Days 3-7 and 8-14 and KML downloads. Synopsis (May 25, 2018): The summer season is expected to move in quickly for much of the contiguous United States.

  2. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.

  3. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
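    A minimal sketch of the verification experiment described above, with the simplifying assumption that sites exceed independently (in the paper's simulations sites share earthquakes, which widens the scatter; names and defaults are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def fractional_exceedance(n_sites, t, T, n_histories=10000):
    """Fraction of sites whose maximum shaking in t years exceeds the
    mapped value with return period T, under Poisson occurrence and
    independent sites. Returns one fraction per simulated history."""
    p = 1.0 - np.exp(-t / T)                       # per-site exceedance prob.
    hits = rng.random((n_histories, n_sites)) < p  # one row per history
    return hits.mean(axis=1)

f = fractional_exceedance(n_sites=500, t=50, T=475)
print(f.mean())  # ~0.10, matching p = 1 - exp(-50/475)
print(f.std())   # scatter across histories; larger when sites are correlated
```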

  4. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose, a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate the slope critical acceleration (Ac)x for which a prefixed probability exists that seismic shaking would result in a Dn value equal to a threshold x whose exceedance would cause landslide triggering. The obtained ac values represent the minimum slope resistance required to keep the probability of seismic landslide triggering within the prefixed value. In particular, we calculated the spatial distribution of (Ac)x for x thresholds of 10 and 2 cm in order to represent triggering conditions for coherent slides (e.g., slumps, block slides, slow earth flows) and disrupted slides (e.g., rock falls, rock slides, rock avalanches), respectively. Then we produced a probabilistic national map that shows the spatial distribution of (Ac)10 and (Ac)2 for a 10% probability of exceedance in 50 years, which is a significant level of hazard equal to that commonly used for building codes. The spatial distribution of the calculated (Ac)x values can be compared with the in situ actual ac values of specific slopes to estimate whether these slopes have a significant probability of failing under seismic action in the future. As an example of a possible application of this kind of time probabilistic map to hazard estimates, we compared the values obtained for the Manjil region with a GIS map providing the spatial distribution of estimated ac values in the same region. The spatial distribution of slopes characterized by ac < (Ac)10 was then compared with the spatial distribution of the major landslides of coherent type triggered by the Manjil earthquake.
This comparison provides indications of the potential, problems and limits of the tested approach for the study area. References: Cornell, C.A., 1968: Engineering seismic risk analysis, Bull. Seism. Soc. Am., 58, 1583-1606. Del Gaudio, V., Wasowski, J., and Pierri, P., 2003: An approach to time-probabilistic evaluation of seismically induced landslide hazard, Bull. Seism. Soc. Am., 93, 557-569. Jibson, R.W., Harp, E.L., and Michael, J.A., 1998: A method for producing digital probabilistic seismic landslide hazard maps: an example from the Los Angeles, California, area, U.S. Geological Survey Open-File Report 98-113, Golden, Colorado, 17 pp.
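    The Jibson et al. (1998) report cited above gives a regression of this form for the Los Angeles area (log Dn = 1.521 log Ia - 1.993 log ac - 1.546, with Dn in cm, Ia in m/s, ac in g); a sketch of how such a formula yields the (Ac)x thresholds, noting that the Iranian study refits region-specific coefficients not reproduced here:

```python
import numpy as np

def newmark_displacement_cm(Ia_ms, ac_g):
    """Predicted Newmark displacement (cm) from Arias intensity (m/s)
    and critical acceleration (g), using the Jibson et al. (1998)
    Los Angeles coefficients as a stand-in for the regional refits."""
    return 10 ** (1.521 * np.log10(Ia_ms) - 1.993 * np.log10(ac_g) - 1.546)

def critical_acceleration_g(Ia_ms, dn_threshold_cm):
    """Invert the regression for (Ac)_x: the critical acceleration at
    which the predicted displacement equals the threshold x (2 or 10 cm)."""
    log_ac = (1.521 * np.log10(Ia_ms) - 1.546
              - np.log10(dn_threshold_cm)) / 1.993
    return 10 ** log_ac
```

    Slopes whose actual ac falls below (Ac)x for the mapped level of Arias intensity then have at least the prefixed probability of exceeding the displacement threshold.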

  5. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
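    A toy version of the Monte Carlo PSHA loop described above: simulate many t-year catalogs from a truncated Gutenberg-Richter source and read the hazard level off the empirical distribution of per-catalog peak PGA. The ground-motion step is a generic placeholder, not one of the Greek, European or NGA models compared in the paper, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_hazard_pga(nu=0.5, b=1.0, m_min=4.5, m_max=7.5,
                  t_years=50, n_cats=5000):
    beta = b * np.log(10)
    span = 1.0 - np.exp(-beta * (m_max - m_min))
    peak = np.empty(n_cats)
    for k in range(n_cats):
        n = rng.poisson(nu * t_years)              # events in this catalog
        u = rng.random(n)
        m = m_min - np.log(1.0 - u * span) / beta  # truncated G-R magnitudes
        r = rng.uniform(10.0, 100.0, n)            # source-site distances (km)
        # placeholder attenuation with aleatory scatter (sigma in ln units)
        ln_pga = -1.0 + 1.2 * m - 1.1 * np.log(r) + rng.normal(0.0, 0.6, n)
        peak[k] = np.exp(ln_pga).max() if n else 0.0
    return np.quantile(peak, 0.90)   # level with ~10% chance in t_years
```

    Epistemic uncertainty enters by rerunning this loop over the alternative source and attenuation models and combining the resulting hazard estimates.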

  6. A Moore's cellular automaton model to get probabilistic seismic hazard maps for different magnitude releases: A case study for Greece

    NASA Astrophysics Data System (ADS)

    Jiménez, A.; Posadas, A. M.

    2006-09-01

    Cellular automata are simple mathematical idealizations of natural systems and they supply useful models for many investigations in natural science. Examples include sandpile models, forest fire models, and slider block models used in seismology. In the present paper, they have been used for establishing temporal relations between the energy releases of the seismic events that occurred in neighboring parts of the crust. The catalogue is divided into time intervals, and the region is divided into cells which are declared active or inactive by means of a threshold energy release criterion. Thus, a pattern of active and inactive cells which evolves over time is determined. A stochastic cellular automaton is constructed starting with these patterns, in order to simulate their spatio-temporal evolution, by supposing a Moore's neighborhood interaction between the cells. The best model is chosen by maximizing the mutual information between the past and the future states. Finally, a Probabilistic Seismic Hazard Map is given for the different energy releases considered. The method has been applied to the Greece catalogue from 1900 to 1999. The Probabilistic Seismic Hazard Maps for energies corresponding to m = 4 and m = 5 are close to the real seismicity observed in the area after the catalogue period, and they correspond to a background seismicity across the whole area. This background seismicity seems to cover the whole area in periods of around 25-50 years. The optimum cell size is in agreement with other studies; for m > 6 the optimum area increases according to the threshold of clear spatial resolution, and the active cells are not so clustered. The results are coherent with other hazard studies in the zone and with the seismicity recorded after the data set, and they provide an interaction model which points out the large-scale nature of earthquake occurrence.
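    The model-selection criterion above is the mutual information between successive activity patterns; a minimal sketch for binary (active/inactive) grids (names are hypothetical):

```python
import numpy as np

def mutual_information_bits(past, future):
    """Mutual information (bits) between binary cell-activity patterns
    at successive time steps; the paper selects the stochastic cellular
    automaton that maximizes this quantity."""
    past = np.asarray(past).ravel()
    future = np.asarray(future).ravel()
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((past == a) & (future == b))  # joint probability
            p_a = np.mean(past == a)                     # marginals
            p_b = np.mean(future == b)
            if p_ab > 0.0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi
```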

  7. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border. Deformation here is expressed as a belt of major, north-northeast-trending, left-lateral strike-slip faults and abundant seismicity. The seismicity intensifies farther to the northeast and includes a prominent zone of deep earthquakes associated with northward subduction of the Indian plate beneath Eurasia that extends beneath the Hindu Kush and Pamirs Mountains. Production of the seismic hazard maps is challenging because the geological and seismological data required to produce a seismic hazard model are limited. The data that are available for this project include historical seismicity and poorly constrained slip rates on only a few of the many active faults in the country. Much of the hazard is derived from a new catalog of historical earthquakes: from 1964 to the present, with magnitude equal to or greater than about 4.5, and with depth between 0 and 250 kilometers. We also include four specific faults in the model: the Chaman fault with an assigned slip rate of 10 mm/yr, the Central Badakhshan fault with an assigned slip rate of 12 mm/yr, the Darvaz fault with an assigned slip rate of 7 mm/yr, and the Hari Rud fault with an assigned slip rate of 2 mm/yr. For these faults and for shallow seismicity less than 50 km deep, we incorporate published ground-motion estimates from tectonically active regions of western North America, Europe, and the Middle East. Ground-motion estimates for deeper seismicity are derived from data in subduction environments. 
Specifically, for intermediate-depth seismicity between 50- and 250-km depth, we apply estimates derived for tectonic regions where subduction is the main tectonic process. Within the framework of these limitations, we have developed a preliminary probabilistic seismic-hazard assessment of Afghanistan, the type of analysis that underpins the seismic components of modern building codes in the United States. The assessment includes maps of estimated peak ground acceleration (PGA), 0.2-second spectral acceleration (SA), and 1.0-second SA.
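
The probabilistic machinery this record rests on (Gutenberg-Richter source rates, a ground-motion model with lognormal scatter, Poisson conversion to exceedance probability) can be shown compactly. Below is a minimal Cornell-type sketch in Python; the recurrence and GMPE coefficients are placeholder assumptions for illustration, not the USGS model for Afghanistan.

```python
# Minimal single-source, single-site PSHA sketch. All coefficients are
# illustrative placeholders, not a published model.
import numpy as np
from scipy.stats import norm

# Truncated Gutenberg-Richter recurrence: annual rate of M >= m
a_val, b_val = 3.5, 1.0
mags = np.arange(4.5, 8.0, 0.1)
rate_ge = 10 ** (a_val - b_val * mags)
rate_bin = rate_ge - np.append(rate_ge[1:], 0.0)   # rate per magnitude bin

# Hypothetical GMPE: mean ln(PGA in g) at distance r (km)
def ln_pga(m, r):
    return -6.5 + 1.2 * m - 1.3 * np.log(r)
sigma, dist_km = 0.6, 30.0

# Hazard curve: annual rate of exceeding each PGA level
pga_levels = np.logspace(-2, 0.3, 50)
lam = np.zeros_like(pga_levels)
for m, nu in zip(mags, rate_bin):
    lam += nu * (1.0 - norm.cdf(np.log(pga_levels), ln_pga(m, dist_km), sigma))

p50 = 1.0 - np.exp(-lam * 50.0)                    # Poisson, 50-yr window
print(pga_levels[np.searchsorted(-p50, -0.02)])    # ~2%-in-50-yr PGA
```

Logic trees enter by repeating this integration over alternative source and ground-motion branches and weighting the resulting curves.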

  8. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.

  9. MED SUV TASK 6.3 Capacity building and interaction with decision makers: Improving volcanic risk communication through volcanic hazard tools evaluation, Campi Flegrei Caldera case study (Italy)

    NASA Astrophysics Data System (ADS)

    Nave, Rosella; Isaia, Roberto; Sandri, Laura; Cristiani, Chiara

    2016-04-01

In the communication chain between scientists and decision makers (end users), scientific outputs such as maps are a fundamental source of information on hazard zoning and the definition of related at-risk areas. However, the relationship between volcanic phenomena, their probability and their potential impact can be complex, and the geospatial information is not easily decoded or understood by non-experts, even when they are decision makers. Focusing on volcanic hazard, the goal of MED SUV WP6 Task 3 is to improve the communication efficacy of scientific outputs and so help fill the gap between scientists and decision makers. Campi Flegrei caldera, in the Neapolitan area, has been chosen as the pilot research area in which to apply an evaluation/validation procedure that provides a robust evaluation of volcanic maps based on end-user responses. The sample involved comprises decision makers and officials from the Campania Region Civil Protection and from the municipalities included in the Campi Flegrei RED ZONE, the area exposed to hazard from pyroclastic density currents. Semi-structured interviews with a sample of decision makers and civil protection officials were conducted to acquire both quantitative and qualitative data. The maps tested were the official Campi Flegrei caldera RED ZONE map; three maps produced by overlaying the Red Zone limit on an orthophoto, a DTM and a contour map; and other maps, including a probabilistic one, showing the volcanological data used to delimit the Red Zone. Analysis of the outcomes assessed the respondents' level of understanding of the content as displayed, and their needs regarding the representation of the complex information embedded in volcanic hazard. The final output has been the development of a leaflet of "guidelines" that can support decision makers and officials in understanding volcanic hazard and risk maps, and also in using them as a communication tool in information programs for the population at risk. The same evaluation/validation process has also been applied to the scientific output of MED-SUV WP6, a tool for short-term probabilistic volcanic hazard assessment. For the Campi Flegrei volcanic system, this tool has been implemented to compute hazard curves, hazard maps and probability maps for tephra fallout on a target grid covering the Campania region, allowing the end user to visualize the hazard from tephra fallout and its uncertainty. The response of end users to such products will help determine to what extent they understand them, find them useful, and see their requirements matched. To involve the Etna area in WP6 Task 3 activities as well, a questionnaire developed in the VUELCO project (Volcanic Unrest in Europe and Latin America) has been proposed to Sicily Civil Protection officials with decision-making responsibility in case of volcanic unrest at Etna and Stromboli, to survey their opinions and requirements in case of volcanic unrest.

  10. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
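
The "empirical statistical representation" that the abstract contrasts with its robust Bayesian fit is, at heart, event counting over a long synthetic catalogue. A minimal sketch, with stand-in lognormal inundation depths in place of simulated model output (the Bayesian fitting step is not reproduced here):

```python
# Empirical tsunami hazard curve from a synthetic catalogue: the mean
# annual rate at which inundation depth exceeds each threshold.
import numpy as np

rng = np.random.default_rng(42)
T_years = 10_000                                        # catalogue length
depths = rng.lognormal(mean=-1.0, sigma=1.2, size=350)  # stand-in depths (m)

thresholds = np.linspace(0.1, 10.0, 100)
annual_rate = np.array([(depths >= h).sum() / T_years for h in thresholds])

# Depth with roughly a 500-year mean return period, read off the curve
print(thresholds[np.searchsorted(-annual_rate, -1.0 / 500.0)])
```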

  11. Probabilistic seismic hazard zonation for the Cuban building code update

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Llanes-Buron, C.

    2013-05-01

A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment was done according to the standard probabilistic approach (Cornell, 1968), importing the procedures adopted by other nations that have dealt with revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition were rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map of spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 7-05). In addition, three other design levels are proposed: a severe earthquake with an 808-year return period, an ordinary earthquake with a 475-year return period, and a minimum earthquake with a 225-year return period. The seismic zonation proposed here fulfils international standards (IBC-ICC) as well as world tendencies in this field.
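
For readers translating between the design levels quoted above and exceedance probabilities over a 50-year exposure, the usual Poisson assumption gives T = -t / ln(1 - p). A minimal sketch; the approximate 3%, 6%, 10% and 20% figures it prints are implied by the quoted return periods, not stated in the abstract:

```python
# Poisson link between return period T and probability of exceedance p
# over an exposure time t: T = -t / ln(1 - p).
import math

def return_period(p, t=50.0):
    return -t / math.log(1.0 - p)

def prob_in_t(T, t=50.0):
    return 1.0 - math.exp(-t / T)

print(round(return_period(0.10)))    # ~475 yr, the "ordinary" level
print(round(return_period(0.02)))    # ~2475 yr
for T in (1642, 808, 475, 225):      # design levels quoted above
    print(T, f"{prob_in_t(T):.1%} in 50 yr")   # ~3%, 6%, 10%, 20%
```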

  12. Integrating statistical and process-based models to produce probabilistic landslide hazard at regional scale

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.

    2017-12-01

We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model to map the probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding the FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically derived probability of landslide impact. This probability is combined with results from a physically based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. The vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to a higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA, and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
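
A minimal sketch of the frequency-ratio step described above, run on synthetic rasters; the grid size, class counts and landslide density are placeholders, and the study's seven-attribute stack is not reproduced:

```python
# Frequency ratio (FR) for one site attribute: (share of landslide cells
# in a class) / (share of all cells in that class). Susceptibility is
# the per-cell sum of FRs over all attributes.
import numpy as np

def frequency_ratio(attr_class, landslide_mask):
    """attr_class: integer class raster; landslide_mask: boolean raster."""
    n_classes = attr_class.max() + 1
    fr = np.zeros(n_classes)
    total_slides = max(landslide_mask.sum(), 1)
    for c in range(n_classes):
        in_class = attr_class == c
        share_slides = landslide_mask[in_class].sum() / total_slides
        share_cells = in_class.sum() / attr_class.size
        fr[c] = share_slides / share_cells if share_cells > 0 else 0.0
    return fr

rng = np.random.default_rng(0)
slope_class = rng.integers(0, 5, size=(200, 200))   # toy slope classes
slides = rng.random((200, 200)) < 0.01              # toy landslide cells
susceptibility = frequency_ratio(slope_class, slides)[slope_class]
# Real use sums such FR rasters over all seven site attributes.
```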

  13. New seismic hazard maps for Puerto Rico and the U.S. Virgin Islands

    USGS Publications Warehouse

    Mueller, C.; Frankel, A.; Petersen, M.; Leyendecker, E.

    2010-01-01

    The probabilistic methodology developed by the U.S. Geological Survey is applied to a new seismic hazard assessment for Puerto Rico and the U.S. Virgin Islands. Modeled seismic sources include gridded historical seismicity, subduction-interface and strike-slip faults with known slip rates, and two broad zones of crustal extension with seismicity rates constrained by GPS geodesy. We use attenuation relations from western North American and worldwide data, as well as a Caribbean-specific relation. Results are presented as maps of peak ground acceleration and 0.2- and 1.0-second spectral response acceleration for 2% and 10% probabilities of exceedance in 50 years (return periods of about 2,500 and 500 years, respectively). This paper describes the hazard model and maps that were balloted by the Building Seismic Safety Council and recommended for the 2003 NEHRP Provisions and the 2006 International Building Code. ?? 2010, Earthquake Engineering Research Institute.

  14. Seismic hazard in the Nation's breadbasket

    USGS Publications Warehouse

    Boyd, Oliver; Haller, Kathleen; Luco, Nicolas; Moschetti, Morgan P.; Mueller, Charles; Petersen, Mark D.; Rezaeian, Sanaz; Rubinstein, Justin L.

    2015-01-01

    The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with a faster attenuation with distance compared to the previous maps. Overall, hazard changes (2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps despite new ground motion models and their assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.

  15. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for evaluating seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map for Syria with the help of GIS. In the proposed approach, we have used ASTER satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for the evaluation of seismic hazard were identified, and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Using earthquake data for Syria, peak ground acceleration (PGA) data were introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a and b values) and the concepts of local probability and recurrence time. The application of the proposed technique to the Syrian region indicates that this method provides a good estimate of seismic hazard compared to maps developed with traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, we have used numerous parameters derived from remote sensing and GIS in the preparation of a seismic hazard map, which is found to be very realistic.
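
The abstract does not spell out its numerical rating scheme, so the sketch below shows one common GIS pattern such schemes follow: reclassify each thematic layer to an ordinal rating, weight, and sum. All breaks and weights here are invented for illustration, not those of the Syria study.

```python
# Weighted-overlay sketch: reclassify each thematic raster to an ordinal
# 1-4 rating, weight, and sum into a relative hazard index.
import numpy as np

def reclassify(raster, breaks):
    """Map continuous values to ordinal ratings 1..len(breaks)+1."""
    return np.digitize(raster, breaks) + 1

rng = np.random.default_rng(1)
slope_deg = rng.uniform(0, 45, (100, 100))          # toy thematic layers
fault_dist_km = rng.uniform(0, 50, (100, 100))
epicenter_density = rng.uniform(0, 10, (100, 100))

layers = {                                          # (rating raster, weight)
    "slope": (reclassify(slope_deg, [10, 20, 30]), 0.3),
    "fault_proximity": (5 - reclassify(fault_dist_km, [5, 15, 30]), 0.4),
    "seismicity": (reclassify(epicenter_density, [2, 5, 8]), 0.3),
}
hazard_index = sum(w * r for r, w in layers.values())

# Split into very low / low / moderate / high by quartiles
classes = np.digitize(hazard_index, np.quantile(hazard_index, [0.25, 0.5, 0.75]))
```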

  16. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important tourist destinations in Turkey. At the same time, the region was added to the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies, but there are few studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level was calculated for different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.

  17. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Because knowledge of the factors controlling seismic hazard is incomplete, uncertainties are associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. This paper describes a new methodology for the assessment of seismic hazard. The proposed approach provides a practical means of better capturing the spatial variation of seismological and tectonic characteristics, which allows better treatment of their uncertainties. GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling, for example seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advances in computer software and hardware, and is well structured for implementation with conventional GIS tools.

  18. Preliminary Seismic Probabilistic Tsunami Hazard Map for Italy

    NASA Astrophysics Data System (ADS)

    Lorito, Stefano; Selva, Jacopo; Basili, Roberto; Grezio, Anita; Molinari, Irene; Piatanesi, Alessio; Romano, Fabrizio; Tiberti, Mara Monica; Tonini, Roberto; Bonini, Lorenzo; Michelini, Alberto; Macias, Jorge; Castro, Manuel J.; González-Vida, José Manuel; de la Asunción, Marc

    2015-04-01

    We present a preliminary release of the first seismic probabilistic tsunami hazard map for Italy. The map aims to become an important tool for the Italian Department of Civil Protection (DPC), as well as a support tool for the NEAMTWS Tsunami Service Provider, the Centro Allerta Tsunami (CAT) at INGV, Rome. The map shows the offshore maximum tsunami elevation expected for several average return periods. Both crustal and subduction earthquakes are considered. The probability for each scenario (location, depth, mechanism, source size, magnitude and temporal rate) is defined on a uniform grid covering the entire Mediterranean for crustal earthquakes and on the plate interface for subduction earthquakes. Activity rates are assigned from seismic catalogues and basing on a tectonic regionalization of the Mediterranean area. The methodology explores the associated aleatory uncertainty through the innovative application of an Event Tree. Main sources of epistemic uncertainty are also addressed although in preliminary way. The whole procedure relies on a database of pre-calculated Gaussian-shaped Green's functions for the sea level elevation, to be used also as a real time hazard assessment tool by CAT. Tsunami simulations are performed using the non-linear shallow water multi-GPU code HySEA, over a 30 arcsec bathymetry (from the SRTM30+ dataset) and the maximum elevations are stored at the 50-meter isobath and then extrapolated through the Green's law at 1 meter depth. This work is partially funded by project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839, and by the Italian flagship project RITMARE.

  19. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

Greece and adjacent coastal areas are characterized by high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis. We performed probabilistic tsunami hazard assessment for selected locations along Greek coastlines, the forecasting points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog 10,000 years long for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, with the real events included in this catalog. For each event in the synthetic catalog, a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence of each event was determined from the Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. Hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In these forms our results can easily be compared with those obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.
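
A minimal sketch of the synthetic-catalog step, assuming a doubly truncated Gutenberg-Richter magnitude law sampled by inverse CDF; the b-value and annual rate are placeholders, not the study's Hellenic Arc values:

```python
# Draw a 10,000-year synthetic catalogue of M 6.0-8.5 events from a
# doubly truncated Gutenberg-Richter law by inverse-CDF sampling.
import numpy as np

rng = np.random.default_rng(7)
b, m_min, m_max = 1.0, 6.0, 8.5
annual_rate = 0.2                       # hypothetical rate of M >= 6 events
n_events = rng.poisson(annual_rate * 10_000)

beta = b * np.log(10.0)
c = 1.0 - np.exp(-beta * (m_max - m_min))     # truncation normalizer
u = rng.random(n_events)
mags = m_min - np.log(1.0 - u * c) / beta     # inverse CDF of truncated GR

years = rng.uniform(0.0, 10_000.0, n_events)  # Poissonian occurrence times
```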

  20. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

Besides periodic technical inspections and the monitoring and surveillance of dam-related structures and infrastructure, there are further seismic-specific requirements for dam safety. The most important is seismic risk assessment, which can be accomplished by rating dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site (values obtained using probabilistic hazard assessment approaches; Moldovan et al., 2008), the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam, down the Bistrita and then along the Siret River and their tributaries. The most vulnerable dams will be studied in detail and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and goods that may be destroyed; the topography included in geospatial servers is sufficient to produce them, and no further studies are necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams in the region) and the localities possibly affected. The studies realized in this paper have as their final goal to provide local emergency services with warnings of a potential dam failure and ensuing flood following a large earthquake, allowing further public training for evacuation. The work is supported from PNII/PCCA 2013 Project DARING 69/2014, financed by UEFISCDI, Romania. Bureau GJ (2003) "Dams and appurtenant facilities" Earthquake Engineering Handbook, CRC Press, WF Chen and C Scawthorn (eds.), Boca Raton, pp. 26.1-26.47. Bureau GJ and Ballentine GD (2002) "A comprehensive seismic vulnerability and loss assessment of the State of Carolina using HAZUS. Part IV: Dam inventory and vulnerability assessment methodology", 7th National Conference on Earthquake Engineering, July 21-25, Boston, Earthquake Engineering Research Institute, Oakland, CA. Moldovan IA, Popescu E, Constantin A (2008) "Probabilistic seismic hazard assessment in Romania: application for crustal seismic active zones", Romanian Journal of Physics, Vol. 53, Nos. 3-4.

  1. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad-impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity: doubling the resolved frequency halves the grid spacing in each of the three spatial dimensions and halves the time step, for a factor of 2^4. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  2. Evaluation of seismic hazard at the northwestern part of Egypt

    NASA Astrophysics Data System (ADS)

    Ezzelarab, M.; Shokry, M. M. F.; Mohamed, A. M. E.; Helal, A. M. A.; Mohamed, Abuoelela A.; El-Hadidy, M. S.

    2016-01-01

The objective of this study is to evaluate the seismic hazard of northwestern Egypt using the probabilistic seismic hazard assessment approach. The probabilistic analysis was carried out on a recent data set to take into account both historic seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model was presented. The doubly truncated exponential model was adopted for the calculation of the recurrence parameters. Ground-motion prediction equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the study area, were weighted and used for the assessment of seismic hazard within a logic-tree framework. Considering a grid of 0.2° × 0.2° covering the study area, seismic hazard curves were calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration, in addition to six spectral periods (0.1, 0.2, 0.3, 1.0, 2.0 and 3.0 s), for return periods of 72, 475 and 2475 years. Uniform hazard spectra are provided for two selected rock sites in the cities of Alexandria and Mersa Matruh. Finally, the hazard curves were de-aggregated to determine the sources that contribute most to the hazard at the 10%-in-50-years probability level for the selected sites.
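
A sketch of the logic-tree combination step referred to above, with stand-in hazard curves and illustrative weights; the study's actual GMPE branches and weights are not given in the abstract:

```python
# Logic-tree combination: weighted mean of hazard curves from
# alternative GMPE branches. Curves and weights are stand-ins.
import numpy as np

pga = np.logspace(-2, 0, 30)                 # g
branch_curves = {                            # annual exceedance rates
    "gmpe_A": 1.0e-3 * pga ** -1.5,
    "gmpe_B": 8.0e-4 * pga ** -1.7,
    "gmpe_C": 1.2e-3 * pga ** -1.4,
}
weights = {"gmpe_A": 0.5, "gmpe_B": 0.3, "gmpe_C": 0.2}

mean_curve = sum(w * branch_curves[k] for k, w in weights.items())
```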

  3. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty that roughness coefficient values introduce into hydraulic models used for flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM, minimising input data uncertainty and improving the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated for their accuracy in representing the estimated roughness values. Finally, Latin Hypercube Sampling is used to generate different sets of Manning roughness values, and flood inundation probability maps are created with Monte Carlo simulations. Historical flood extent data from an extreme historical flash flood event are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
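
A minimal sketch of the sampling step, assuming scipy's Latin Hypercube implementation and lognormal roughness distributions; the distribution families and parameters actually fitted in the study are not reproduced here:

```python
# Latin Hypercube draws of Manning's n for channel and floodplain, to
# drive repeated hydraulic-model runs. Distributions are illustrative.
import numpy as np
from scipy.stats import qmc, lognorm

N = 100
u = qmc.LatinHypercube(d=2, seed=3).random(N)       # stratified uniforms

n_channel = lognorm(s=0.3, scale=0.035).ppf(u[:, 0])
n_floodplain = lognorm(s=0.4, scale=0.08).ppf(u[:, 1])

wet_count = np.zeros((500, 500))                    # toy raster of the reach
for nc, nf in zip(n_channel, n_floodplain):
    pass  # run the 1-D hydraulic model with (nc, nf); add its wet mask
inundation_prob = wet_count / N                     # cell-wise probability
```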

  4. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
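
The verification logic lends itself to a toy simulation: if a site's true annual exceedance rate equals the mapped one, finite observation windows still scatter widely across histories. A sketch under a plain Poisson occurrence assumption (rates and window are illustrative):

```python
# Toy verification experiment: simulated exceedance counts at a site
# whose true annual rate equals the mapped 475-yr level.
import numpy as np

rng = np.random.default_rng(11)
lam, window, n_hist = 1.0 / 475.0, 50, 10_000
exceedances = rng.poisson(lam * window, n_hist)     # one count per history

print((exceedances > 0).mean())   # ~0.10, consistent with the map
print((exceedances >= 2).mean())  # a few unlucky histories see 2+ events
```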

  5. Research, methodology, and applications of probabilistic seismic-hazard mapping of the Central and Eastern United States; minutes of a workshop on June 13-14, 2000, at Saint Louis University

    USGS Publications Warehouse

    Wheeler, Russell L.; Perkins, David M.

    2000-01-01

The U.S. Geological Survey (USGS) is updating and revising its 1996 national seismic-hazard maps for release in 2001. Part of this process is the convening of four regional workshops with earth scientists and other users of the maps. The second of these workshops was sponsored by the USGS and the Mid-America Earthquake Center, and was hosted by Saint Louis University on June 13-14, 2000. The workshop concentrated on the central and eastern U.S. (CEUS) east of the Rocky Mountains. The tasks of the workshop were to (1) evaluate new research findings that are relevant to seismic hazard mapping, (2) discuss modifications in the inputs and methodology used in the national maps, (3) discuss concerns by engineers and other users about the scientific input to the maps and the use of the hazard maps in building codes, and (4) identify needed research in the CEUS that can improve the seismic hazard maps and reduce their uncertainties. These minutes summarize the workshop discussions. This is not a transcript; some individual remarks and short discussions of side issues and logistics were omitted. Named speakers were sent a draft of the minutes with a request for corrections of any errors in remarks attributed to them. Nine people returned corrections, amplifications, or approvals of their remarks as reported. The rest of this document consists of the meeting agenda, discussion summaries, and a list of the 60 attendees.

  6. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  8. Model uncertainties of the 2002 update of California seismic hazard maps

    USGS Publications Warehouse

    Cao, T.; Petersen, M.D.; Frankel, A.D.

    2005-01-01

    In this article we present and explore the source and ground-motion model uncertainty and parametric sensitivity for the 2002 update of the California probabilistic seismic hazard maps. Our approach is to implement a Monte Carlo simulation that allows for independent sampling from fault to fault in each simulation. The source-distance dependent characteristics of the uncertainty maps of seismic hazard are explained by the fundamental uncertainty patterns from four basic test cases, in which the uncertainties from one-fault and two-fault systems are studied in detail. The California coefficient of variation (COV, ratio of the standard deviation to the mean) map for peak ground acceleration (10% of exceedance in 50 years) shows lower values (0.1-0.15) along the San Andreas fault system and other class A faults than along class B faults (0.2-0.3). High COV values (0.4-0.6) are found around the Garlock, Anacapa-Dume, and Palos Verdes faults in southern California and around the Maacama fault and Cascadia subduction zone in northern California.

  9. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks, estimating the trajectories a detached block will follow during its fall until kinetic energy loss brings it to rest. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall, and large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables, on an airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution. Three failure mechanisms were analyzed: planar sliding, wedge sliding, and toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis with Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach; the ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as the source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation correlate better with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas; the approach also reduces the processing time of the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway, to be used in upcoming years for hazard mapping in areas where larger groups of population are exposed to mass movements from steep slopes.
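
A sketch of a stochastic kinematic test for one failure mechanism (planar sliding) at a single slope cell, using the textbook feasibility conditions; the orientation and friction distributions below are illustrative, not the Coltop3D-derived PDFs from the study:

```python
# Stochastic kinematic test for planar sliding at one slope cell: sample
# joint orientations and friction, count samples meeting the classic
# feasibility conditions.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
slope_dip, slope_aspect = 70.0, 135.0       # degrees, e.g. from the ALS-DEM

joint_dip = rng.normal(55.0, 8.0, n)        # fitted PDFs would go here
joint_dipdir = rng.normal(140.0, 15.0, n)
friction = rng.uniform(25.0, 35.0, n)

aligned = np.abs(((joint_dipdir - slope_aspect + 180.0) % 360.0) - 180.0) < 20.0
feasible = aligned & (joint_dip < slope_dip) & (joint_dip > friction)

print(feasible.mean())   # fraction of samples kinematically feasible
```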

  10. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    NASA Astrophysics Data System (ADS)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and Celebes Sea, are commonly identified as a tsunami-prone region on the basis of historical records of seismicity and tsunamis. However, quantification of tsunami hazard in the SCS region has remained intractable due to the highly complex tectonic setting and multiple seismic sources within and surrounding this area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the tapered Gutenberg-Richter relationship for each zone. Given the lack of consensus on the magnitude upper bound of each seismic source, and its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived using multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the largest historical event. The results show that tsunami hazard in the SCS and adjoining basins is subject to large variation when different corner magnitudes are adopted, with the upper bounds 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and Sulawesi Trench in the Celebes Sea, whereas the tsunami hazard on the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining empirical methods with numerical studies of historical tsunami events, the present PTHA results are tentatively validated; the correspondence lends confidence to our study. Considering the proximity of major sources to densely populated cities around the SCS region, tsunami hazard and risk should be further highlighted in the future.
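
A sketch of drawing event sizes from a tapered Gutenberg-Richter (tapered Pareto) law. Because its survival function factorizes into a Pareto term and an exponential taper, a draw is simply the minimum of one draw from each; the corner magnitude and beta below are illustrative assumptions, not the study's values:

```python
# Tapered Gutenberg-Richter (tapered Pareto) sampling of seismic moment.
import numpy as np

rng = np.random.default_rng(9)

def tapered_gr_moments(n, m0, beta, m_corner):
    pareto = m0 * rng.random(n) ** (-1.0 / beta)   # inverse-CDF Pareto
    taper = m0 + rng.exponential(m_corner, n)      # exponential tail
    return np.minimum(pareto, taper)               # min has the TGR survivor

m0 = 10 ** (9.1 + 1.5 * 6.0)          # moment of Mw 6.0, N*m
mc = 10 ** (9.1 + 1.5 * 8.0)          # corner moment of Mw 8.0 (assumed)
moments = tapered_gr_moments(100_000, m0, beta=0.66, m_corner=mc)
mw = (np.log10(moments) - 9.1) / 1.5  # back to moment magnitude
```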

  11. The effect of directivity in a PSHA framework

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Herrero, A.; Cultrera, G.

    2012-09-01

We propose a method to introduce a refined representation of the ground motion in the framework of Probabilistic Seismic Hazard Analysis (PSHA). This study is especially oriented to the incorporation of a priori information about source parameters, focusing on the directivity effect and its influence on seismic hazard maps. Two strategies have been followed. The first considers the seismic source as an extended source and is valid when the PSHA seismogenic sources are represented as fault segments. We show that incorporating variables related to the directivity effect can lead to variations of up to 20 per cent in the hazard level for dip-slip faults with a uniform distribution of hypocentre location, in terms of the 5-s spectral acceleration response with 10 per cent probability of exceedance in 50 yr. The second strategy concerns the more general problem of seismogenic areas, where each point is a seismogenic source with the same chance of nucleating a seismic event. In our proposition the point source is associated with rupture-related parameters defined using a statistical description. As an example, we consider a source point within an area characterized by strike-slip faulting. With the introduction of the directivity correction, the modulation of the hazard map reaches values up to 100 per cent (for strike-slip, unilateral faults). The introduction of directivity does not increase the hazard level uniformly; rather, it redistributes the estimate in a manner consistent with the fault orientation. A general increase appears only when no a priori information is available. However, good a priori knowledge now exists on the style of faulting, dip and orientation of the faults associated with the majority of the seismogenic zones in present seismic hazard maps. The percentage of variation obtained depends strongly on the model chosen to represent the directivity effect analytically. Our aim is therefore to emphasize the methodology, through which the collected information may be readily converted into a more comprehensive and meaningful probabilistic seismic hazard formulation.

  12. Reassessment of probabilistic seismic hazard in the Marmara region

    USGS Publications Warehouse

    Kalkan, Erol; Gulkan, Polat; Yilmaz, Nazan; Çelebi, Mehmet

    2009-01-01

In 1999, the eastern coastline of the Marmara region (Turkey) witnessed increased seismic activity on the North Anatolian fault (NAF) system with two damaging earthquakes (M 7.4 Kocaeli and M 7.2 Düzce) that occurred almost three months apart. These events have reduced stress on the western segment of the NAF where it continues under the Marmara Sea. The undersea fault segments have been recently explored using bathymetric and reflection surveys. These recent findings helped scientists to understand the seismotectonic environment of the Marmara basin, which has remained a perplexing tectonic domain. On the basis of collected new data, seismic hazard of the Marmara region is reassessed using a probabilistic approach. Two different earthquake source models: (1) the smoothed-gridded seismicity model and (2) fault model and alternate magnitude-frequency relations, Gutenberg-Richter and characteristic, were used with local and imported ground-motion-prediction equations. Regional exposure is computed and quantified on a set of hazard maps that provide peak horizontal ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 sec on uniform firm-rock site condition (760 m/sec average shear wave velocity in the upper 30 m). These acceleration levels were computed for ground motions having 2% and 10% probabilities of exceedance in 50 yr, corresponding to return periods of about 2475 and 475 yr, respectively. The maximum PGA computed (at rock site) is 1.5g along the fault segments of the NAF zone extending into the Marmara Sea. The new maps generally show 10% to 15% increase for PGA, 0.2 and 1.0 sec spectral acceleration values across much of Marmara compared to previous regional hazard maps. Hazard curves and smooth design spectra for three site conditions: rock, soil, and soft-soil are provided for the Istanbul metropolitan area as possible tools in future risk estimates.

  13. Probabilistic seismic hazard maps for Sinai Peninsula, Egypt

    NASA Astrophysics Data System (ADS)

    Deif, A.; Abou Elenean, K.; El Hadidy, M.; Tealeb, A.; Mohamed, A.

    2009-09-01

Sinai experienced the largest Egyptian earthquake, with moment magnitude (Mw) 7.2, in 1995 in the Gulf of Aqaba, 350 km from Cairo. The peninsula is characterized by the presence of many tourist projects in addition to various natural resources. The aim of the current study is to present, for the first time, probabilistic spectral hazard maps for Sinai. Revised earthquake catalogues for Sinai and its surroundings, from 112 BC to 2006 AD with magnitude equal to or greater than 3.0, are used to calculate seismic hazard in the region of interest between 27°N and 31.5°N and 32°E and 36°E. We declustered these catalogues to include only independent events, and tested them for the completeness of different magnitude ranges. 28 seismic source zones are used to define the seismicity. The recurrence rates and the maximum earthquakes across these zones were also determined from the modified catalogues. Strong ground motion relations for rock are used to produce 5% damped spectral acceleration values at four different periods (0.2, 0.5, 1.0 and 2.0 s) to define the uniform response spectra at each site (on a 0.2° × 0.2° grid over the whole area). Maps showing spectral acceleration values at the 0.2, 0.5, 1.0 and 2.0 s periods, as well as peak ground acceleration (PGA), for the return period of 475 years (equivalent to 90% probability of non-exceedance in 50 years) are presented. In addition, Uniform Hazard Spectra (UHS) at 25 different periods are graphed for the four main cities (Hurghada, Sharm El-Sheikh, Nuweibaa and Suez). The highest hazard is found in the Gulf of Aqaba, with a maximum spectral acceleration of 356 cm/s² at a period of 0.22 s for a return period of 475 years.

  14. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret for people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land-protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several Python-based libraries and modules. The pyPHaz tool can visualize the hazard curves (HC) calculated on a selected target area, together with different levels of uncertainty (mean and percentiles), on maps that can be interactively created and modified by the user through a dedicated Graphical User Interface (GUI). Moreover, the tool can compare the results of different PHA models and merge them into ensemble models. pyPHaz stores and accesses all data through a MySQL database and reads as input the XML-based standard file formats defined in the framework of GEM (Global Earthquake Model). This format model is easy to extend to other kinds of hazard, as shown in the applications used here as examples of pyPHaz's potential, which focus on Probabilistic Volcanic Hazard Assessment (PVHA) for tephra dispersal and fallout applied to the municipality of Naples.

  15. Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.

    2018-05-01

To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for applying flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However, a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes or at volcanoes classified as persistently active, it is not appropriate to make short-term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density currents from Soufrière Hills Volcano (SHV) on Montserrat, including their respective collapse directions and flow volumes, based on 1996-2008 flow datasets. The development of this approach allows short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminates in a larger collapse, after which the directionality of the flows changes. Such models enable short-term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing northwest toward that valley as it is for domes pointing east toward the Tar River Valley. As rich multi-parametric volcano monitoring datasets become increasingly available, eruption forecasting is becoming an increasingly viable and important research field. We demonstrate an approach to utilizing such data in order to appropriately 'tune' probabilistic hazard assessments for pyroclastic flows. Our broader objective with the development of this method is to help advance time-dependent volcanic hazard assessment, by bridging the

  16. Seismic hazard in the Istanbul metropolitan area: A preliminary re-evaluation

    USGS Publications Warehouse

    Kalkan, E.; Gulkan, Polat; Ozturk, N.Y.; Celebi, M.

    2008-01-01

    In 1999, two destructive earthquakes (M7.4 Kocaeli and M7.2 Duzce) occurred in the northwest of Turkey and resulted in major stress drops on the western segment of the North Anatolian Fault system where it continues under the Marmara Sea. These undersea fault segments were recently explored using bathymetric and reflection surveys, and the new findings helped to reshape the picture of the seismotectonic environment of the Marmara basin, which is a perplexing tectonic domain. Based on the collected new information, the seismic hazard of the Marmara region, particularly the Istanbul Metropolitan Area and its vicinity, was re-examined using a probabilistic approach. Two seismic source and alternative recurrence models, combined with various indigenous and foreign attenuation relationships, were adapted within a logic-tree formulation to quantify and project the regional exposure on a set of hazard maps. The hazard maps show the peak horizontal ground acceleration and the spectral acceleration at 1.0 s. These acceleration levels were computed for 2 and 10% probabilities of exceedance in 50 years.

  17. Volcanic hazard maps of the Nevado del Ruiz volcano, Colombia

    NASA Astrophysics Data System (ADS)

    Parra, Eduardo; Cepeda, Hector

    1990-07-01

    Although the potential hazards associated with an eruption of Nevado del Ruiz volcano were known to civil authorities before the catastrophic eruption there in November 1985, their low perception of risk and the long quiescent period since the last eruption (140 years) caused them to wait for stronger activity before developing an eruption alert system. Unfortunately, the eruption occurred suddenly after a period of relative quiet, and as a result more than 25,000 people were killed. Although it was accurate and reasonably comprehensive, the hazard map that existed before the eruption was poorly understood by the authorities and even less so by the general population, because the scientific terminology and the probabilistic approach to natural hazards were unfamiliar to many of them. This confusion was shared by the communication media, which at critical times placed undue emphasis on the possibility of lava flows rather than on the more imminent threat from mudflows, in keeping with the popular but often inaccurate perception of volcanic eruptions. This work presents an updated hazard map of Nevado del Ruiz that combines information on various hazardous phenomena with their relative probability of occurrence in order to depict numerical "hazard levels" that are easily comprehensible to nonspecialists and therefore less susceptible to misinterpretation. The scale of relative risk is arbitrary, ranging from five to one, and is intended to provide an intuitive indication of danger to people, property and crops. The map is meant to facilitate emergency preparedness and management by political and civil authorities, to educate the public concerning volcanic hazards and to assist in land-use planning decisions.

  18. Earthquake parametrics based protection for microfinance disaster management in Indonesia

    NASA Astrophysics Data System (ADS)

    Sedayo, M. H.; Damanik, R.

    2017-07-01

    Financial institutions, including microfinance institutions that lend money to people, also face risk when a catastrophic event hits their area of operation. Liquidity risk arises when withdrawal amounts and Non-Performing Loans (NPL) rise rapidly at the same time, which can hit their cash flow. There are products on the market that provide backup funds for this kind of situation, and microfinance institutions need a guideline to make a contingency plan in their disaster management program. We developed a probabilistic seismic hazard map, index, and zonation map as tools to help in designing financial disaster impact reduction programs for microfinance in Indonesia. A GMPE was used to estimate PGA at each kabupaten (regency) point, and PGA-to-MMI conversion was done by applying an empirical relationship. We used loan distribution data from the Financial Services Authority and Bank Indonesia as exposure in the indexing. The index level from this study can be used to rank urgency. The probabilistic hazard map was used to price two backup scenarios and to define a zonation. We propose three zones with annual average costs of 0.0684‰, 0.4236‰ and 1.4064‰ for the first scenario and 0.3588‰, 2.6112‰ and 6.0816‰ for the second scenario.
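
    The abstract does not name the empirical PGA-to-MMI relationship applied; as an example of that conversion step, the sketch below uses the well-known Wald et al. (1999) relation, which may differ from the one used in the study.

    ```python
    # Example PGA-to-MMI step using the Wald et al. (1999) relation; the
    # study's actual empirical relationship may differ.
    import numpy as np

    def pga_to_mmi(pga_gal):
        """Convert PGA (cm/s^2, i.e. gal) to Modified Mercalli Intensity."""
        pga = np.asarray(pga_gal, dtype=float)
        mmi = np.where(pga < 66.0,                    # approximate branch point
                       2.10 * np.log10(pga) + 1.00,   # lower-intensity branch
                       3.66 * np.log10(pga) - 1.66)   # MMI >= ~V branch
        return np.clip(mmi, 1.0, 10.0)

    print(pga_to_mmi([10.0, 100.0, 300.0]))   # -> about MMI 3.1, 5.7, 7.4
    ```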

  19. Landslide Hazard Probability Derived from Inherent and Dynamic Determinants

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan

    2016-04-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional-scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer-resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability, and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making under current and future landscape and climatic conditions. The approach is applied as a case study in the North Cascades National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2,757 km² (1,064 mi²) in northern Washington State, U.S.A.
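
    A minimal sketch of the Monte Carlo slope-stability step described above, using the standard infinite-slope factor-of-safety model; the parameter distributions and cell values below are invented placeholders, not the study's calibration.

    ```python
    # Infinite-slope factor of safety in a Monte Carlo loop for one 30-m
    # cell; all distributions are invented placeholders.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    g, rho_s, rho_w = 9.81, 1800.0, 1000.0        # m/s^2, kg/m^3

    slope = np.radians(35.0)                      # slope angle of the cell
    depth = 1.5                                   # soil depth (m)
    phi = np.radians(rng.normal(33.0, 3.0, n))    # friction angle
    coh = rng.uniform(0.0, 8000.0, n)             # cohesion (Pa)
    wet = rng.uniform(0.0, 1.0, n)                # relative wetness from recharge

    fs = ((coh + (rho_s - wet * rho_w) * g * depth
           * np.cos(slope) ** 2 * np.tan(phi))
          / (rho_s * g * depth * np.sin(slope) * np.cos(slope)))

    print("dynamic P(failure) =", np.mean(fs < 1.0))
    ```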

  20. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    ,

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, the U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  1. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

    Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  2. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very seismically active region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate the seismic hazard of the Chiapas state. The PSHA was based on a composite seismic catalogue homogenized to Mw, and a logic-tree procedure was used to consider different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps; the SSZs were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were reviewed, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1,000- and 2,500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and in the Strike-Slip Faults Province. The hazard decreases towards the northeast across the Reverse Faults Province and up to the Yucatan Platform, where the lowest values are reached. We also produced uniform hazard spectra (UHS) for the three main cities of Chiapas. Tapachula presents the highest spectral accelerations, while Tuxtla Gutierrez and San Cristobal de las Casas show similar values. We conclude that seismic hazard in Chiapas is chiefly controlled by the subduction of the Cocos plate beneath the North America and Caribbean plates, which makes the coastal areas the most hazardous. Additionally, the Motagua and Polochic Fault Zones are also important, increasing the hazard particularly in southeastern Chiapas.
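
    The logic-tree aggregation step can be pictured with the toy sketch below: the final hazard curve is the weighted mean over branches (here 2 source models x 4 GMPE combinations = 8 equally weighted branches, with invented exceedance-rate curves).

    ```python
    # Weighted-mean aggregation over logic-tree branches; 8 invented
    # branch curves (2 source models x 4 GMPE combinations), equal weights.
    import numpy as np

    pga = np.linspace(0.05, 1.0, 20)                   # g
    rng = np.random.default_rng(1)
    branch_curves = np.array([1e-2 * np.exp(-pga / s)  # annual exceedance rates
                              for s in rng.uniform(0.1, 0.3, 8)])
    branch_weights = np.full(8, 1.0 / 8.0)             # must sum to 1

    mean_curve = branch_weights @ branch_curves        # final hazard curve

    # PGA at the 500-year return period (annual rate 1/500), by interpolation
    pga_500 = np.interp(1.0 / 500.0, mean_curve[::-1], pga[::-1])
    print(f"500-year PGA ~ {pga_500:.2f} g")
    ```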

  3. The effect of the sea on hazard assessment for tephra fallout at Campi Flegrei: a preliminary approach through the use of pyPHaz, an open tool to analyze and visualize probabilistic hazards

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Costa, Antonio; Selva, Jacopo

    2014-05-01

    Campi Flegrei (CF) is a large volcanic field located west of the Gulf of Naples, characterized by a wide, almost circular caldera that is partially submerged beneath the Gulf of Pozzuoli. Magma-water interaction is known to be a key element in determining the character of submarine eruptions and their impact on the surrounding areas, but this phenomenon is still not well understood and is rarely considered in hazard assessment. The aim of the present work is a preliminary study of the effect of the sea on the tephra fall hazard from CF for the municipality of Naples, introducing variability in the probability of tephra production according to the eruptive scale (defined on the basis of the erupted volume) and the depth of the submerged vents. Four different Probabilistic Volcanic Hazard Assessment (PVHA) models have been defined through the application of the BET_VH model at CF, accounting for different modeling procedures and assumptions for the submerged part of the caldera. In particular, we treat: 1) the effect of the sea as null, i.e., as if the water were not present; 2) the effect of the sea as a cap that totally blocks the explosivity of eruptions and consequently the tephra production; 3) an ensemble model combining models 1) and 2); 4) a variable probability of tephra production depending on the depth of the submerged vent. The PVHA models are then input to pyPHaz, a tool developed and designed at INGV to visualize, analyze and merge into ensemble models the results of PVHA and, potentially, of any other kind of probabilistic hazard assessment, both natural and anthropogenic, in order to evaluate the importance of the variability between subaerial and submerged vents for the tephra fallout hazard from CF in Naples. The analysis is preliminary and does not claim to be exhaustive, but on the one hand it represents a starting point for future work, and on the other it is a good case study to show the potential of the pyPHaz tool which, thanks to a dedicated Graphical User Interface (GUI), allows users to interactively manage and visualize probabilistic hazard results (hazard curves together with probability and hazard maps for different levels of uncertainty) and to compare or merge different hazard models into ensemble models. This work has been developed in the framework of two Italian projects, "ByMuR (Bayesian Multi-Risk Assessment: a case study for natural risks in the city of Naples)", funded by the Italian Ministry of Education, Universities and Research (MIUR), and "V1: Probabilistic Volcanic Hazard Assessments", funded by the Italian Department of Civil Protection (DPC).
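
    A toy sketch of the event-tree logic described above (not BET_VH itself): the probability of tephra production at a submerged vent is obtained by multiplying node probabilities and damping them with vent depth; the exponential damping form and all numbers are invented for illustration.

    ```python
    # Toy event-tree chain (not BET_VH itself): node probabilities multiply,
    # and tephra production is damped with vent depth. All numbers invented.
    import numpy as np

    def p_tephra(p_eruption, p_vent, p_size, vent_depth_m, scale_depth_m=100.0):
        """Exponential depth damping is a placeholder, not the study's model."""
        damping = np.exp(-vent_depth_m / scale_depth_m)
        return p_eruption * p_vent * p_size * damping

    # One vent location under the four modeling assumptions in the text:
    p_no_sea   = p_tephra(0.01, 0.05, 0.3, vent_depth_m=0.0)   # 1) water ignored
    p_capped   = 0.0                                           # 2) sea blocks tephra
    p_ensemble = 0.5 * p_no_sea + 0.5 * p_capped               # 3) equal-weight mix
    p_depth    = p_tephra(0.01, 0.05, 0.3, vent_depth_m=60.0)  # 4) depth-dependent
    print(p_no_sea, p_ensemble, p_depth)
    ```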

  4. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old, and it has long been a center of the region's political and cultural life. Earthquake risk assessment for Cairo therefore has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach, with a logic-tree framework used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated for the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).

  5. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    Ischia is a large, complex, partly submerged, active volcanic field located about 20 km west of Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline alternating beaches and tuff/lava cliffs that is continuously reshaped by weathering and marine erosion. The volcano-tectonic process is a main factor in slope stability, as it produces seismic activity and has generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) of variable strength. In the Campi Flegrei and surrounding areas, the possible occurrence of a moderate to large seismic event represents a serious threat to the inhabitants, to infrastructure, and to the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5-km-long fault located below the island's north coast. Both sources, however, are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined, and the second, characterized by only a few large historical events, because it is difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the many infrastructures and the relevant archaeological sites, together with the area's natural and artistic value, make it a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, which makes it possible to test the adequacy and stability of the method. In the framework of the Italian project MON.I.C.A (infrastructural coastline monitoring), an innovative, dedicated probabilistic methodology has been applied to identify the areas with the highest susceptibility to landslides induced by seismic effects. The PSLHA combines probability-of-exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, in terms of critical acceleration and a dynamic stability factor. Such maps are generally evaluated for peak ground acceleration, velocity or intensity, which correlate well with damage to anthropic infrastructure (e.g., streets, buildings). Each ground motion parameter represents a different aspect of the hazard and correlates differently with the damage that may be generated; many studies have pointed out that other GM parameters, such as Arias and Housner intensities and absolute displacement, could be a better choice for analyzing, for example, cliff stability. The selection of the GM parameter is thus of crucial importance for obtaining the most useful hazard maps, and in the last decades ground motion prediction equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced, leading to the identification of the areas with the highest probability of earthquake-induced landslides. In a strategic site like Ischia, this new methodology represents an innovative and advanced tool for landslide hazard mitigation.

  6. Variabilities in probabilistic seismic hazard maps for natural and induced seismicity in the central and eastern United States

    USGS Publications Warehouse

    Mousavi, S. Mostafa; Beroza, Gregory C.; Hoover, Susan M.

    2018-01-01

    Probabilistic seismic hazard analysis (PSHA) characterizes ground-motion hazard from earthquakes. Typically, the time horizon of a PSHA forecast is long, but in response to induced seismicity related to hydrocarbon development, the USGS developed one-year PSHA models. In this paper, we present a display of the variability in USGS hazard curves due to epistemic uncertainty in its informed submodel using a simple bootstrapping approach. We find that variability is highest in low-seismicity areas. On the other hand, areas of high seismic hazard, such as the New Madrid seismic zone or Oklahoma, exhibit relatively lower variability simply because of more available data and a better understanding of the seismicity. Comparing areas of high hazard, New Madrid, which has a history of large naturally occurring earthquakes, has lower forecast variability than Oklahoma, where the hazard is driven mainly by suspected induced earthquakes since 2009. Overall, the mean hazard obtained from bootstrapping is close to the published model, and variability increased in the 2017 one-year model relative to the 2016 model. Comparing the relative variations caused by individual logic-tree branches, we find that the highest hazard variation (as measured by the 95% confidence interval of bootstrapping samples) in the final model is associated with different ground-motion models and maximum magnitudes used in the logic tree, while the variability due to the smoothing distance is minimal. It should be pointed out that this study is not looking at the uncertainty in the hazard in general, but only as it is represented in the USGS one-year models.
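
    A minimal sketch of the bootstrapping idea described above: resample the logic-tree branch curves with replacement and read the spread of the resulting mean hazard curves; the branch data below are invented.

    ```python
    # Bootstrap spread of the mean hazard curve: resample branch curves
    # with replacement. Branch curves are invented.
    import numpy as np

    rng = np.random.default_rng(7)
    n_branches, n_iml = 50, 30
    iml = np.linspace(0, 5, n_iml)                 # intensity measure levels
    branches = 1e-2 * np.exp(-iml * rng.uniform(0.5, 1.5, (n_branches, 1)))

    boot_means = np.array([
        branches[rng.integers(0, n_branches, n_branches)].mean(axis=0)
        for _ in range(1000)
    ])
    lo, hi = np.percentile(boot_means, [2.5, 97.5], axis=0)
    print("95% CI width at each level:", (hi - lo).round(5))
    ```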

  7. Evaluation of potential surface rupture and review of current seismic hazards program at the Los Alamos National Laboratory. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-09

    This report summarizes the authors' review and evaluation of the existing seismic hazards program at Los Alamos National Laboratory (LANL). The report recommends that the original program be augmented with a probabilistic analysis of seismic hazards involving the assignment of weighted probabilities of occurrence to all potential sources. This approach yields a more realistic evaluation of the likelihood of large earthquake occurrence, particularly in regions where seismic sources may have recurrence intervals of several thousand years or more. The report reviews the locations and geomorphic expressions of identified fault lines, along with the known displacements of these faults and the last known occurrence of seismic activity. Faults are mapped and categorized by their potential for actual movement. Based on geologic site characterization, recommendations are made for increased seismic monitoring; age-dating studies of faults and geomorphic features; increased use of remote sensing and aerial photography for surface mapping of faults; the development of a landslide susceptibility map; and the development of seismic design standards for all existing and proposed facilities at LANL.

  8. Probabilistic tsunami inundation map based on stochastic earthquake source model: A demonstration case in Macau, the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert

    2017-04-01

    Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include return-period information. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design and insurance. In this study, we present a method to develop probabilistic tsunami inundation maps using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city on the South China Sea, as an example. Two major advances of this method are that it incorporates the most up-to-date information on seismic tsunamigenic sources along the Manila megathrust, and that it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns is generated for a large number of synthetic earthquake events. Aggregating the large number of inundation simulation results, we analyze the uncertainties associated with the variability of earthquake rupture location and slip distribution. We also explore how the tsunami hazard in Macau evolves in the context of sea-level rise. Our results suggest that Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population and major infrastructure density. Macau consists of four districts: the Macau Peninsula, Taipa Island, Coloane Island and the Cotai Strip. Of these, the Macau Peninsula is the most vulnerable to tsunami because of its low elevation and its exposure to direct and refracted waves from the offshore region and to waves reflected from the mainland. Earthquakes with magnitudes larger than Mw 8.0 on the northern Manila trench would likely cause hazardous inundation in Macau. Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes of the same magnitude; this diversity is caused by both random rupture locations and heterogeneous slip distributions. Adding the sea-level-rise component, the inundation depth caused by 1 m of sea-level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.

  9. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 billion in 2010 and US$ 3 billion in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risk (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211% to 351% over the period 2000-2030 (5th and 95th percentiles). Driven mainly by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.
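
    The risk metric used above, annual expected damage, is the integral of damage over annual exceedance probability; a hedged sketch with invented damage values follows.

    ```python
    # Expected annual damage (EAD) as the integral of damage over annual
    # exceedance probability, trapezoidal rule; damage values invented.
    import numpy as np

    return_periods = np.array([2, 5, 10, 25, 50, 100, 250])    # years
    damage = np.array([0.0, 0.1, 0.4, 1.0, 1.8, 2.6, 3.5])     # billion US$

    aep = 1.0 / return_periods                 # annual exceedance probability
    order = np.argsort(aep)                    # integrate over increasing AEP
    d, p = damage[order], aep[order]
    ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
    print(f"EAD ~ US$ {ead:.2f} billion/year")
    ```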

  10. Advancing the citizen scientist's contributions to documenting and understanding natural hazards: a proof of concept for linking crowdsourced and remotely sensed data on landslide hazards in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Markert, K. N.

    2017-12-01

    Scientists, practitioners, policymakers, and citizen groups share a role in ensuring "that all sectors have access to, understand and can use scientific information for better informed decision-making" (Sendai Framework 2015-2030). When it comes to understanding hazards and exposure, inventories of disaster events are often limited. Thus, there are many opportunities for citizen scientists to engage in improving the collective understanding—and ultimately reduction—of disaster risk. Landslides are very difficult to forecast at spatial and temporal scales meaningful for early warning and evacuation. Heuristic hazard mapping methods are very common in regional hazard zonation and rely on expert knowledge of previous events and local conditions, but they often lack a temporal component. As new data analysis packages become more open and accessible, probabilistic approaches that consider high-resolution spatial and temporal dimensions are becoming more common, but this is only possible when rich inventories of landslide events exist. The work presented offers a proof of concept for incorporating crowd-sourced data to improve landslide hazard model performance. Starting with a national inventory of 90 catalogued landslides in El Salvador for a study period of 1998 to 2011, we simulate the addition of over 600 crowd-sourced landslide events that would have been identified through human interpretation of high-resolution imagery in the Google Earth time-slider feature. There is a noticeable improvement in performance statistics between static heuristic hazard models and probabilistic models that incorporate the events identified by the "crowd." Such dynamic incorporation of crowd-sourced data on hazard events is not far-fetched. Given the engagement of "local observers" in El Salvador who augment in situ hydro-meteorological measurements, the growing access of lay people to Earth observation data, and the immense interest in connecting citizen scientists to remote sensing data through hackathons such as the NASA Space Apps Challenges, we envision a much more dynamic, collective understanding of landslide hazards. Here we present a better scenario of what we could have known had data from the crowd been incorporated into probabilistic hazard models on a regular basis.

  11. Qualitative landslide susceptibility assessment by multicriteria analysis: A case study from San Antonio del Sur, Guantánamo, Cuba

    NASA Astrophysics Data System (ADS)

    Castellanos Abella, Enrique A.; Van Westen, Cees J.

    Geomorphological information can be combined with decision-support tools to assess landslide hazard and risk. A heuristic model was applied to a rural municipality in eastern Cuba. The study is based on a terrain mapping units (TMU) map, generated at 1:50,000 scale by interpretation of aerial photos, satellite images and field data. Information describing 603 terrain units was collected in a database. Landslide areas were mapped in detail to classify the different failure types and parts. Three major landslide regions are recognized in the study area: coastal hills with rockfalls, shallow debris flows and old rotational rockslides; denudational slopes in limestone, with very large deep-seated rockslides related to tectonic activity; and the Sierra de Caujerí scarp, with large rockslides. The Caujerí scarp presents the highest hazard, with recent landslides and various signs of active processes. The different landforms and the causative factors for landslides were analyzed and used to develop the heuristic model. The model is based on weights assigned by expert judgment and organized in a number of components such as slope angle, internal relief, slope shape, geological formation, active faults, distance to drainage, distance to springs, geomorphological subunits and existing landslide zones. From these variables a hierarchical heuristic model was applied in which three levels of weights were designed: for classes, for variables, and for criteria. The model combines all weights into a single hazard value for each pixel of the landslide hazard map. The hazard map was then presented at two scales, one with three classes for disaster managers and one with 10 detailed hazard classes for technical staff. The range of weight values and the number of existing landslides are recorded for each class. The resulting increase of landslide density with higher hazard classes indicates that the output map is reliable. The landslide hazard map was used in combination with existing information on buildings and infrastructure to prepare a qualitative risk map. The complete lack of historical landslide information and geotechnical data precludes the development of quantitative deterministic or probabilistic models.
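
    A toy sketch of the three-level weighting scheme described above, with invented weights: class weights per pixel are combined linearly with variable weights and a criterion weight into one hazard value per pixel, then reclassified.

    ```python
    # Toy three-level heuristic weighting: class weights per pixel are
    # combined with variable and criterion weights into one hazard value.
    # All weights are invented; a 2x2-pixel "map" stands in for the TMU map.
    import numpy as np

    slope_w     = np.array([[0.9, 0.6], [0.3, 0.1]])   # class weight per pixel
    geology_w   = np.array([[0.7, 0.7], [0.4, 0.4]])
    landslide_w = np.array([[1.0, 0.0], [0.0, 0.0]])   # existing landslide zone

    var_weights = {"slope": 0.5, "geology": 0.3, "landslides": 0.2}
    criterion_weight = 1.0

    hazard = criterion_weight * (var_weights["slope"] * slope_w
                                 + var_weights["geology"] * geology_w
                                 + var_weights["landslides"] * landslide_w)

    # Reclassify into the three-class map for disaster managers
    classes = np.digitize(hazard, bins=[0.3, 0.6])     # 0=low, 1=medium, 2=high
    print(hazard, classes, sep="\n")
    ```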

  12. Effect of Fault Parameter Uncertainties on PSHA explored by Monte Carlo Simulations: A case study for southern Apennines, Italy

    NASA Astrophysics Data System (ADS)

    Akinci, A.; Pace, B.

    2017-12-01

    In this study, we discuss the variability of seismic hazard in terms of peak ground acceleration (PGA) at the 475-year return period in the Southern Apennines of Italy. Uncertainty and parametric sensitivity are presented to quantify the impact of several fault parameters on ground motion predictions for the 10%-in-50-years exceedance hazard. A time-independent PSHA model is constructed based on the long-term recurrence behavior of seismogenic faults, adopting the characteristic earthquake model for those sources capable of rupturing the entire fault segment with a single maximum magnitude. The fault-based source model uses the dimensions and slip rates of mapped faults to develop magnitude-frequency estimates for characteristic earthquakes. The variability of each selected fault parameter is represented by a truncated normal distribution defined by a standard deviation about a mean value. A Monte Carlo approach, based on random balanced sampling of the logic tree, is used to capture the uncertainty in the seismic hazard calculations. For generating both uncertainty and sensitivity maps, we perform 200 simulations for each of the fault parameters. The results are synthesized both in the frequency-magnitude distributions of the modeled faults and in different maps: overall uncertainty maps provide a confidence interval for the PGA values, and parameter uncertainty maps determine the sensitivity of the hazard assessment to the variability of every logic-tree branch. The branches of the logic tree analyzed through the Monte Carlo approach are maximum magnitude, fault length, fault width, fault dip and slip rate. The overall variability of these parameters is determined by varying them simultaneously in the hazard calculations, while the sensitivity to each parameter is determined by varying that fault parameter while fixing the others. In this study we do not, however, investigate the sensitivity of the mean hazard results to the choice of different GMPEs. The distribution of possible seismic hazard results is illustrated by a 95% confidence factor map, which indicates the dispersion about the mean value, and a coefficient of variation map, which shows percent variability. The results of our study clearly illustrate the influence of active fault parameters on probabilistic seismic hazard maps.
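
    A hedged sketch of the parameter-sampling step: each fault parameter is drawn from a truncated normal about its mean, and a characteristic magnitude follows from rupture area via a Wells and Coppersmith (1994)-style scaling; the fault values below are illustrative, not the study's.

    ```python
    # Truncated-normal sampling of fault parameters; characteristic
    # magnitude from rupture area via a Wells & Coppersmith (1994)-style
    # scaling. Fault values are illustrative, not the study's.
    import numpy as np
    from scipy.stats import truncnorm

    def sample_trunc(mean, sd, n, k=2.0, rng=None):
        """Normal truncated at +/- k standard deviations."""
        return truncnorm.rvs(-k, k, loc=mean, scale=sd, size=n, random_state=rng)

    rng = np.random.default_rng(3)
    n = 200                                          # simulations per parameter
    length = sample_trunc(30.0, 3.0, n, rng=rng)     # fault length (km)
    width  = sample_trunc(15.0, 1.5, n, rng=rng)     # fault width (km)
    slip   = sample_trunc(0.8, 0.2, n, rng=rng)      # slip rate (mm/yr); would
                                                     # feed the recurrence model
    m_max = 4.07 + 0.98 * np.log10(length * width)   # M from rupture area
    print("Mmax mean/std:", m_max.mean().round(2), m_max.std().round(2))
    ```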

  13. Implementation of NGA-West2 ground motion models in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Harmsen, Stephen C.; Frankel, Arthur D.

    2014-01-01

    The U.S. National Seismic Hazard Maps (NSHMs) have been an important component of seismic design regulations in the United States for the past several decades. These maps present earthquake ground shaking intensities at specified probabilities of being exceeded over a 50-year time period. The previous version of the NSHMs was developed in 2008; during 2012 and 2013, scientists at the U.S. Geological Survey have been updating the maps based on their assessment of the “best available science,” resulting in the 2014 NSHMs. The update includes modifications to the seismic source models and the ground motion models (GMMs) for sites across the conterminous United States. This paper focuses on updates in the Western United States (WUS) due to the use of new GMMs for shallow crustal earthquakes in active tectonic regions developed by the Next Generation Attenuation (NGA-West2) project. Individual GMMs, their weighted combination, and their impact on the hazard maps relative to 2008 are discussed. In general, the combined effects of lower medians and increased standard deviations in the new GMMs have caused only small changes, within 5–20%, in the probabilistic ground motions for most sites across the WUS compared to the 2008 NSHMs.

  14. How well should probabilistic seismic hazard maps work?

    NASA Astrophysics Data System (ADS)

    Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.

    2016-12-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with an assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed the mapped value is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. This implies that shaking in large earthquakes is typically greater than shown on hazard maps, as has occurred in a number of cases. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
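
    The forecast p = 1 - exp(-t/T) is easy to check with a small simulation in the spirit of the one described above; under a Poisson occurrence model, each site exceeds its mapped level independently at rate 1/T.

    ```python
    # Simulated check of the exceedance fraction against 1 - exp(-t/T),
    # assuming Poissonian occurrence at each site.
    import numpy as np

    rng = np.random.default_rng(11)
    T = 475.0            # map return period (years)
    t = 50.0             # observation window (years)
    n_sites = 10_000

    # A site "exceeds the map" if at least one exceedance occurs in t years
    exceeded = rng.poisson(t / T, n_sites) > 0

    print("observed fraction:    ", exceeded.mean())
    print("predicted 1-exp(-t/T):", 1.0 - np.exp(-t / T))
    ```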

  15. Naples between two fires: eruptive scenarios for the next eruptions by an integrated volcanological-probabilistic approach.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.

    2009-04-01

    Probabilistic approaches based on the available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius are assembled into a comprehensive assessment of volcanic hazards in the Neapolitan area. This allows the volcanic hazards related to the different types of events to be compared, and can be used for evaluating the conditional probability of flow and fall hazards in case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations produced using field and laboratory data as input parameters, covering a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazards related to pyroclastic fallout and pyroclastic density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because of the dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills provide an effective barrier against propagation into the very central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, while even VEI>3 eruptions pose a moderate fallout hazard there.

  16. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    USGS Publications Warehouse

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  17. Issues in testing the new national seismic hazard model for Italy

    NASA Astrophysics Data System (ADS)

    Stein, S.; Peresan, A.; Kossobokov, V. G.; Brooks, E. M.; Spencer, B. D.

    2016-12-01

    It is important to bear in mind that we know little about how well earthquake hazard maps describe the shaking that will actually occur in the future, and that we have no agreed way of assessing how well a map performed in the past and, thus, whether one map performs better than another. Moreover, we should not forget that different maps can be useful for different end users, who may have different cost-benefit strategies. Thus, regardless of the specific tests we choose to use, the adopted testing approach should have several key features. We should assess map performance using all the available instrumental, paleoseismological, and historical intensity data; instrumental data alone span a period much too short to capture the largest earthquakes, and thus the strongest shaking, expected from most faults. We should investigate what causes systematic misfit, if any, between the longest record we have, the historical intensity data available for the Italian territory from 217 B.C. to 2002 A.D., and a given hazard map. We should examine how seismic hazard maps have developed over time: how do the most recent maps for Italy compare to earlier ones? It is important to understand local divergences that show how the models have evolved toward the most recent one; the temporal succession of maps matters because we have to learn from previous errors. We should use the many different tests that have been proposed. All are worth trying, because different metrics of performance show different aspects of how a hazard map performs and can be used. We should compare other maps to the ones we are testing. Maps can be made using a wide variety of assumptions, which will lead to different predicted shaking, and it is possible that maps derived by other approaches may perform better. Although current Italian codes are based on probabilistic maps, it is important from both a scientific and a societal perspective to look at all options, including deterministic scenario-based ones. Comparing what works better, according to different performance measures, will give valuable insight into key map parameters and assumptions. We may well find that different maps perform better in different applications.

  18. Application of Gumbel I and Monte Carlo methods to assess seismic hazard in and around Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2018-05-01

    A proper assessment of seismic hazard is of considerable importance in order to achieve suitable building construction criteria. This paper presents a probabilistic seismic hazard assessment for the region in and around Pakistan (23° N-39° N; 59° E-80° E) in terms of peak ground acceleration (PGA). Ground motion is calculated in terms of PGA for a return period of 475 years using a seismogenic zone-free method based on Gumbel's first asymptotic distribution of extreme values, and using Monte Carlo simulation. Appropriate attenuation relations of both universal and local types have been used in this study. The results show that, for many parts of Pakistan, the expected seismic hazard is relatively comparable with the level specified in the existing PGA maps.
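
    A hedged sketch of the Gumbel I step: fit the type-I extreme-value distribution to annual maximum PGA and read off the 475-year return level; the synthetic annual maxima below stand in for a real catalogue.

    ```python
    # Fit a Gumbel (type I extreme value) distribution to annual maximum
    # PGA and read the 475-year return level. Synthetic maxima stand in
    # for a real catalogue.
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(5)
    annual_max_pga = gumbel_r.rvs(loc=0.05, scale=0.03, size=60,
                                  random_state=rng)   # invented data (g)

    loc, scale = gumbel_r.fit(annual_max_pga)
    T = 475.0
    pga_475 = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"475-year PGA ~ {pga_475:.2f} g")
    ```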

  19. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum

    Landslide is one of the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) method; the study area is Penang Island, Malaysia. The frequency ratio approach is a variant of the probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross-section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with the fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four risk categories: highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The maps were assessed using the ROC (Receiver Operating Characteristic) curve, based on the area-under-the-curve (AUC) method. The result indicates an increase in accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.
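
    The Frequency Ratio statistic for one causative factor is the percentage of landslide cells in a class divided by the percentage of all cells in that class; a minimal sketch with toy rasters follows.

    ```python
    # Frequency Ratio for one causative-factor map against a landslide
    # inventory; toy rasters. FR > 1 marks classes associated with slides.
    import numpy as np

    rng = np.random.default_rng(9)
    classes = rng.integers(0, 4, (100, 100))      # factor map with 4 classes
    landslide = np.zeros((100, 100), dtype=bool)  # landslide inventory mask
    landslide[10:20, 30:40] = True

    for c in range(4):
        in_class = classes == c
        fr = (((landslide & in_class).sum() / landslide.sum())
              / (in_class.sum() / classes.size))
        print(f"class {c}: FR = {fr:.2f}")
    # Summing each pixel's FR values over all factor maps gives the
    # hazard index that is then cut into the four hazard categories.
    ```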

  20. Seismic Hazard Maps for the Maltese Archipelago: Preliminary Results

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Panzera, F.; Galea, P. M.

    2013-12-01

    The Maltese islands form an archipelago of three major islands lying in the Sicily Channel, about 140 km south of Sicily and 300 km north of Libya. So far, very few investigations have been carried out on seismicity around the Maltese islands, and no seismic hazard maps are available for the archipelago. Assessing the seismic hazard of the region is currently of prime interest for the near-future development of industrial and touristic facilities as well as for urban expansion. A culture of seismic risk awareness has never really developed in the country, and the public perception is that the islands are relatively safe and that any earthquake phenomena are mild and infrequent. However, the archipelago has been struck by several moderate to large events. Although recent constructions of a certain structural and strategic importance have been built according to high engineering standards, the same probably cannot be said for all residential buildings, many higher than three storeys, which have mushroomed rapidly in recent years. Such buildings are mostly of unreinforced masonry, with heavy concrete floor slabs, which are known to be highly vulnerable to even moderate ground shaking. In this context, planning and design should clearly be based on national hazard maps; unfortunately, such maps are not available for the Maltese islands. In this paper we compute a first, preliminary probabilistic seismic hazard assessment of the Maltese islands in terms of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at different periods. Seismic hazard has been computed using the Esteva-Cornell (1968) approach, the most widely utilized probabilistic method. It is a zone-dependent approach: seismotectonic and geological data are coupled with earthquake catalogues to identify seismogenic zones within which earthquakes occur at certain rates, so that the earthquake catalogues can be reduced to an activity rate, the b-value of the Gutenberg-Richter relationship, and an estimate of the maximum magnitude. In this article we also define a new seismogenic zone in the central Mediterranean that has not been considered before. In order to determine the ground motion parameters related to a specified probability of exceedance, the above statistical parameters are combined with ground motion prediction equations. Seismic hazard computations have been performed within the island boundaries. The preliminary maps of PGA distribution on rock sites obtained for a 10% probability of exceedance show values ranging between 0.09 and 0.18 g, whereas SA at 0.2, 0.4 and 1.0 s shows values of about 0.21-0.40 g, 0.14-0.24 g and 0.05-0.08 g, respectively.
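
    A hedged sketch of an Esteva-Cornell-type calculation for a single seismogenic zone: the Gutenberg-Richter magnitude density is integrated against a GMPE exceedance probability. The activity rate, the Cornell et al. (1979)-style GMPE, and the fixed source-site distance are illustrative only, not the study's values.

    ```python
    # Single-zone Esteva-Cornell hazard integral: GR magnitude density
    # integrated against a GMPE exceedance probability. Illustrative only.
    import numpy as np
    from scipy.stats import norm

    nu, b = 0.5, 1.0                   # activity rate (M >= 4 per yr), b-value
    m_min, m_max = 4.0, 7.0
    beta = b * np.log(10.0)

    m = np.linspace(m_min, m_max, 200)
    # Doubly truncated Gutenberg-Richter magnitude density
    f_m = (beta * np.exp(-beta * (m - m_min))
           / (1.0 - np.exp(-beta * (m_max - m_min))))

    def p_exceed(pga_g, m, r_km, sigma=0.57):
        """Cornell et al. (1979)-style GMPE with lognormal scatter:
        ln PGA(gal) = 6.74 + 0.859 m - 1.80 ln(r + 25)."""
        ln_med = 6.74 + 0.859 * m - 1.80 * np.log(r_km + 25.0)
        return norm.sf(np.log(pga_g * 981.0), loc=ln_med, scale=sigma)

    r = 30.0                           # representative source-site distance (km)
    for pga in (0.1, 0.2, 0.4):
        integrand = p_exceed(pga, m, r) * f_m
        rate = nu * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(m))
        print(f"annual rate of PGA > {pga:.1f} g: {rate:.5f}")
    ```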

  1. Documentation for Initial Seismic Hazard Maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2010-01-01

    In response to the urgent need for earthquake-hazard information after the tragic disaster caused by the moment magnitude (M) 7.0 January 12, 2010, earthquake, we have constructed initial probabilistic seismic hazard maps for Haiti. These maps are based on the current information we have on fault slip rates and historical and instrumental seismicity. These initial maps will be revised and improved as more data become available. In the short term, more extensive logic trees will be developed to better capture the uncertainty in key parameters. In the longer term, we will incorporate new information on fault parameters and previous large earthquakes obtained from geologic fieldwork. These seismic hazard maps are important for the management of the current crisis and the development of building codes and standards for the rebuilding effort. The boundary between the Caribbean and North American Plates in the Hispaniola region is a complex zone of deformation. The highly oblique ~20 mm/yr convergence between the two plates (DeMets and others, 2000) is partitioned between subduction zones off of the northern and southeastern coasts of Hispaniola and strike-slip faults that transect the northern and southern portions of the island. There are also thrust faults within the island that reflect the compressional component of motion caused by the geometry of the plate boundary. We follow the general methodology developed for the 1996 U.S. national seismic hazard maps and also as implemented in the 2002 and 2008 updates. This procedure consists of adding the seismic hazard calculated from crustal faults, subduction zones, and spatially smoothed seismicity for shallow earthquakes and Wadati-Benioff-zone earthquakes. Each one of these source classes will be described below. The lack of information on faults in Haiti requires many assumptions to be made. These assumptions will need to be revisited and reevaluated as more fieldwork and research are accomplished. We made two sets of maps using different assumptions about site conditions. One set of maps is for a firm-rock site condition (30-m averaged shear-wave velocity, Vs30, of 760 m/s). We also developed hazard maps that contain site amplification based on a grid of Vs30 values estimated from topographic slope. These maps take into account amplification from soils. We stress that these new maps are designed to quantify the hazard for Haiti; they do not consider all the sources of earthquake hazard that affect the Dominican Republic and therefore should not be considered as complete hazard maps for eastern Hispaniola. For example, we have not included hazard from earthquakes in the Mona Passage nor from large earthquakes on the subduction zone interface north of Puerto Rico. Furthermore, they do not capture all the earthquake hazards for eastern Cuba.

  2. Seismic hazard assessment for Guam and the Northern Mariana Islands

    USGS Publications Warehouse

    Mueller, Charles S.; Haller, Kathleen M.; Luco, Nicholas; Petersen, Mark D.; Frankel, Arthur D.

    2012-01-01

    We present the results of a new probabilistic seismic hazard assessment for Guam and the Northern Mariana Islands. The Mariana island arc has formed in response to northwestward subduction of the Pacific plate beneath the Philippine Sea plate, and this process controls seismic activity in the region. Historical seismicity, the Mariana megathrust, and two crustal faults on Guam were modeled as seismic sources, and ground motions were estimated by using published relations for a firm-rock site condition. Maps of peak ground acceleration, 0.2-second spectral acceleration for 5 percent critical damping, and 1.0-second spectral acceleration for 5 percent critical damping were computed for exceedance probabilities of 2 percent and 10 percent in 50 years. For 2 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.94 g at Guam and 0.57 g at Saipan (values are given as fractions of the gravitational acceleration, g), 0.2-second spectral acceleration is 2.86 g at Guam and 1.75 g at Saipan, and 1.0-second spectral acceleration is 0.61 g at Guam and 0.37 g at Saipan. For 10 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.49 g at Guam and 0.29 g at Saipan, 0.2-second spectral acceleration is 1.43 g at Guam and 0.83 g at Saipan, and 1.0-second spectral acceleration is 0.30 g at Guam and 0.18 g at Saipan. The dominant hazard source at the islands is upper Benioff-zone seismicity (depth 40–160 kilometers). The large probabilistic ground motions reflect the strong concentrations of this activity below the arc, especially near Guam.

  3. New Criterion and Tool for Caltrans Seismic Hazard Characterization

    NASA Astrophysics Data System (ADS)

    Shantz, T.; Merriam, M.; Turner, L.; Chiou, B.; Liu, X.

    2008-12-01

    Caltrans recently adopted new procedures for the development of response spectra for structure design. These procedures incorporate both deterministic and probabilistic criteria. The Next Generation Attenuation (NGA) models (2008) are used for deterministic assessment (using a revised late-Quaternary age fault database), and the USGS 2008 5%-in-50-year hazard maps are used for probabilistic assessment. A minimum deterministic spectrum based on an M6.5 earthquake at 12 km is also included. These spectra are enveloped and the largest values used. A new publicly available web-based design tool will be used to calculate the design spectrum. The tool is built on a Windows-Apache-MySQL-PHP (WAMP) platform and integrates Google Maps for increased flexibility in the tool's use. Links to Caltrans data such as pre-construction logs of test borings assist in the estimation of the Vs30 values used in the new procedures. Basin effects based on new models developed for the CFM, for the San Francisco Bay area by the USGS, and by Thurber (2008) are also incorporated. It is anticipated that additional layers such as CGS Seismic Hazard Zone maps will be added in the future. Application of the new criterion will result in expected higher levels of ground motion at many bridges west of the Coast Ranges. In eastern California, use of the NGA relationships for strike-slip faulting (the dominant sense of motion in California) will often result in slightly lower expected values for bridges. The expected result is a more realistic prediction of ground motions at bridges, in keeping with those motions developed for other large-scale and important structures. The tool is based on a simplified fault map of California, so it will not be used for more detailed evaluations such as surface rupture determination. Announcements regarding tool availability (expected to be in early 2009) are at http://www.dot.ca.gov/research/index.htm
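
    The enveloping step itself is simple: at each period, the design ordinate is the largest of the deterministic, probabilistic, and minimum spectra. A minimal sketch with invented ordinates (not Caltrans values):

      import numpy as np

      # Spectral ordinates (g) at a common set of periods; all values invented
      periods       = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
      deterministic = np.array([0.55, 0.70, 0.50, 0.30, 0.15])  # NGA, fault-specific
      probabilistic = np.array([0.60, 0.75, 0.45, 0.25, 0.12])  # 5%-in-50-yr USGS map
      minimum_m65   = np.array([0.40, 0.50, 0.35, 0.20, 0.10])  # M6.5 at 12 km floor

      # Envelope: the design spectrum takes the largest value at each period
      design = np.maximum.reduce([deterministic, probabilistic, minimum_m65])
      print(dict(zip(periods.tolist(), design.tolist())))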

  4. Application of decision tree model for the ground subsidence hazard mapping near abandoned underground coal mines.

    PubMed

    Lee, Saro; Park, Inhye

    2013-09-30

    Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data on topography, geology, and various ground-engineering properties for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model was also applied to the GSH mapping for comparison with the probabilistic model. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision tree models. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for the prediction of various spatial events.
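
    The workflow described here (50/50 split, tree induction, AUC validation) can be sketched in a few lines of Python. CHAID and QUEST are not available in scikit-learn, so the sketch substitutes a CART-style tree, and random numbers stand in for the GIS factor layers:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import roc_auc_score

      # Synthetic stand-in for the GIS layer stack: each row is one grid cell with
      # factors such as slope, geology class, depth to workings, groundwater depth
      rng = np.random.default_rng(0)
      X = rng.normal(size=(2000, 4))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=2000) > 1.0).astype(int)

      # 50/50 split of the mapped cells for training and validation
      X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

      # CART-style tree as a stand-in for CHAID/QUEST, which scikit-learn lacks
      tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
      hazard_index = tree.predict_proba(X_va)[:, 1]       # per-cell hazard score

      # Validate against the held-out subsidence inventory, as in the AUC step
      print(f"AUC = {roc_auc_score(y_va, hazard_index):.3f}")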

  5. Updating the USGS seismic hazard maps for Alaska

    USGS Publications Warehouse

    Mueller, Charles; Briggs, Richard; Wesson, Robert L.; Petersen, Mark D.

    2015-01-01

    The U.S. Geological Survey makes probabilistic seismic hazard maps and engineering design maps for building codes, emergency planning, risk management, and many other applications. The methodology considers all known earthquake sources with their associated magnitude and rate distributions. Specific faults can be modeled if slip-rate or recurrence information is available. Otherwise, areal sources are developed from earthquake catalogs or GPS data. Sources are combined with ground-motion estimates to compute the hazard. The current maps for Alaska were developed in 2007, and included modeled sources for the Alaska-Aleutian megathrust, a few crustal faults, and areal seismicity sources. The megathrust was modeled as a segmented dipping plane with segmentation largely derived from the slip patches of past earthquakes. Some megathrust deformation is aseismic, so recurrence was estimated from seismic history rather than plate rates. Crustal faults included the Fairweather-Queen Charlotte system, the Denali–Totschunda system, the Castle Mountain fault, two faults on Kodiak Island, and the Transition fault, with recurrence estimated from geologic data. Areal seismicity sources were developed for Benioff-zone earthquakes and for crustal earthquakes not associated with modeled faults. We review the current state of knowledge in Alaska from a seismic-hazard perspective, in anticipation of future updates of the maps. Updated source models will consider revised seismicity catalogs, new information on crustal faults, new GPS data, and new thinking on megathrust recurrence, segmentation, and geometry. Revised ground-motion models will provide up-to-date shaking estimates for crustal earthquakes and subduction earthquakes in Alaska.

  6. Probabilistic seismic hazard assessment for the two layer fault system of Antalya (SW Turkey) area

    NASA Astrophysics Data System (ADS)

    Dipova, Nihat; Cangir, Bülent

    2017-09-01

    Southwest Turkey, along the Mediterranean coast, is prone to large earthquakes resulting from subduction of the African plate under the Eurasian plate and from shallow crustal faults. The maximum observed magnitude of subduction earthquakes is Mw = 6.5, whereas that of crustal earthquakes is Mw = 6.6. Crustal earthquakes originate on faults associated with the Isparta Angle and Cyprus Arc tectonic structures. The primary goal of this study is to assess the seismic hazard for the Antalya area (SW Turkey) using a probabilistic approach. A new earthquake catalog for the Antalya area, with a unified moment magnitude scale, was prepared in the scope of the study. Seismicity of the area has been evaluated by the Gutenberg-Richter recurrence relationship. For hazard computation, the CRISIS2007 software was used following the standard Cornell-McGuire methodology. The attenuation model of Youngs et al. (1997) was used for deep subduction earthquakes and that of Chiou and Youngs (2008) for shallow crustal earthquakes. A seismic hazard map was developed for peak ground acceleration on rock at a hazard level of 10% probability of exceedance in 50 years. Results of the study show that peak ground acceleration values on bedrock range between 0.215 and 0.23 g in the center of Antalya.

  7. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75 g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude distribution similar to one including characteristic earthquakes. The island chain northwest of Hawaii Island is seismically and volcanically much less active. We model its seismic hazard with a combination of a linearly decaying ramp fit to the cataloged seismicity and spatially smoothed seismicity with a smoothing half-width of 10 km. We use a combination of up to four attenuation relations for each map because, for either PGA or SA, there is no single relation that represents ground motion for all distance and magnitude ranges. Great slumps and landslides visible on the ocean floor correspond to catastrophes with effective energy magnitudes ME above 8.0. A crude estimate of their frequency suggests that the probabilistic earthquake hazard is at least an order of magnitude higher for flank earthquakes than that from submarine slumps.
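
    The kinked magnitude distribution can be made concrete: below the kink the cumulative rate falls with one b slope, above it with a shallower one, so extrapolating the small-magnitude slope badly underestimates large-earthquake rates. A sketch with an illustrative a-value and the b slopes quoted above:

      import numpy as np

      # Kinked Gutenberg-Richter model for flank seismicity; the a-value is
      # illustrative, the b slopes (~1.0 below M5, ~0.6 above) are from the abstract
      def log10_rate(m, a=3.0, m_kink=5.0, b_lo=1.0, b_hi=0.6):
          """log10 of the annual rate of events with magnitude >= m."""
          m = np.asarray(m, dtype=float)
          below = a - b_lo * m
          above = (a - b_lo * m_kink) - b_hi * (m - m_kink)  # continuous at the kink
          return np.where(m <= m_kink, below, above)

      for m in (4.0, 6.0, 7.5):
          kinked = 10.0 ** log10_rate(m)
          straight = 10.0 ** (3.0 - 1.0 * m)  # naive extrapolation of small-M slope
          print(f"M>={m}: kinked {kinked:.4f}/yr vs extrapolated {straight:.4f}/yr")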

  8. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    DOT National Transportation Integrated Search

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  9. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but whose spirit is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors, as well as probabilistic tsunami hazard analysis for land-use planning; 2) develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources; 3) generate guidance and protocols for the production and use of new tsunami hazard analysis products; and 4) identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  10. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records extending back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude Mmax, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree that accounts for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2% and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
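
    As a building block for the recurrence parameters, the classical Aki-Utsu maximum-likelihood b-value estimator is easy to state; the study's algorithm additionally treats catalog incompleteness, which this simplified sketch (with synthetic data) does not attempt:

      import numpy as np

      def aki_utsu_b(magnitudes, m_c, dm=0.1):
          """Maximum-likelihood b-value (Aki 1965; Utsu 1966) for a catalog complete
          above m_c; dm is the magnitude binning width (Utsu's correction)."""
          m = np.asarray(magnitudes, dtype=float)
          m = m[m >= m_c]
          b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
          return b, b / np.sqrt(len(m))    # estimate and first-order uncertainty

      # Synthetic complete catalog with a true b of 1.0 above m_c = 4.0
      rng = np.random.default_rng(1)
      mags = 4.0 + rng.exponential(scale=np.log10(np.e), size=3000)  # scale = log10(e)/b
      b, db = aki_utsu_b(mags, m_c=4.0, dm=0.0)
      print(f"b = {b:.2f} +/- {db:.2f}")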

  11. Development of regional liquefaction-induced deformation hazard maps

    USGS Publications Warehouse

    Rosinski, A.; Knudsen, K.-L.; Wu, J.; Seed, R.B.; Real, C.R.; ,

    2004-01-01

    This paper describes part of a project to assess the feasibility of producing regional (1:24,000-scale) liquefaction hazard maps that are based on potential liquefaction-induced deformation. The study area is the central Santa Clara Valley, at the south end of San Francisco Bay in Central California. The information collected and used includes: a) detailed Quaternary geological mapping, b) over 650 geotechnical borings, c) probabilistic earthquake shaking information, and d) ground-water levels. Predictions of strain can be made using either empirical formulations or numerical simulations. In this project, lateral spread displacements are estimated and new empirical relations are used to estimate future volumetric and shear strain. Geotechnical boring data are used to: (a) develop isopach maps showing the thickness of sediment that is likely to liquefy and deform under earthquake shaking; and (b) assess the variability in engineering properties within and between geologic map units. Preliminary results reveal that late Holocene deposits are likely to experience the greatest liquefaction-induced strains, while Holocene and late Pleistocene deposits are likely to experience significantly less horizontal and vertical strain in future earthquakes. Development of maps based on these analyses is feasible.

  12. Re-evaluation and updating of the seismic hazard of Lebanon

    NASA Astrophysics Data System (ADS)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and for the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon were integrated, along with newly established characteristics, within an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.

  13. Seismic Hazard Assessment at the Esfarayen-Bojnurd Railway, Northeast Iran

    NASA Astrophysics Data System (ADS)

    Haerifard, S.; Jarahi, H.; Pourkermani, M.; Almasian, M.

    2018-01-01

    The objective of this study is to evaluate the seismic hazard at the Esfarayen-Bojnurd railway using the probabilistic seismic hazard assessment (PSHA) method. The assessment was carried out on a recent data set that takes into account both historic seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a seismic source model was proposed. Attenuation equations recently recommended by experts, developed from earthquake data recorded in tectonic environments similar to those in and around the study area, were weighted and used for the assessment of seismic hazard within a logic tree framework. Ground acceleration was calculated at every node of a 1.2 × 1.2 km grid covering the study area. Hazard maps at bedrock conditions were produced for peak ground acceleration at return periods of 74, 475, and 2475 years.
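
    Under the usual Poisson assumption, the three return periods translate directly into fixed-window exceedance probabilities via P = 1 - exp(-t/T): 475 years corresponds to roughly 10% in 50 years and 2475 years to roughly 2%. A one-function sketch:

      import numpy as np

      def exceedance_probability(return_period_yr, window_yr=50.0):
          """Poisson link between a mean return period and a fixed-window probability."""
          return 1.0 - np.exp(-window_yr / return_period_yr)

      for rp in (74, 475, 2475):
          print(f"{rp:>5}-yr return period -> "
                f"{100.0 * exceedance_probability(rp):.1f}% in 50 yr")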

  14. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation in addition to providing structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities would include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will be initially applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest regions governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than scenarios in the historical record, and should properly be based on the underlying seismicity of subduction zones. Therefore, Probabilistic Tsunami Hazard Analysis (PTHA) consistent with source seismicity must be performed in addition to consideration of historical event scenarios. A method of Probabilistic Tsunami Hazard Analysis has been established that is generally consistent with Probabilistic Seismic Hazard Analysis in the treatment of uncertainty. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. Structural member acceptability criteria will be based on performance objectives for a 2,500-year Maximum Considered Tsunami. The approach developed by the ASCE Tsunami Loads and Effects Subcommittee of the ASCE 7 Standard would result in the first national unification of tsunami hazard criteria for design codes, reflecting the modern approach of Performance-Based Engineering.

  15. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and have also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.

  16. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes, such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may make it possible to formulate cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case for PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure for associating the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms, estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) but does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g., a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods and permits a straightforward comparison between NDSHA and PSHA maps.

  17. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities of exceeding different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions in the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and Siracusa (Italy). The latter analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for Siracusa are particularly rich compared with the scarce and heterogeneous data sets usually available elsewhere; therefore, they represent a further valuable source of information to benchmark and strengthen the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, the TSUMAPS-NEAM (Grant agreement ECHO/SUB/2015/718568/PREV26) project and the INGV-DPC Agreement.
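
    Step (ii), the filtering procedure, can be illustrated generically: cluster similar source scenarios, run the expensive high-resolution model only for one representative per cluster, and let each representative carry its cluster's total rate. The sketch uses k-means on synthetic scenario features; the study's actual clustering variables and algorithm may differ:

      import numpy as np
      from sklearn.cluster import KMeans

      # Synthetic stand-in for a regional scenario set: each row is one seismic
      # scenario described by features such as magnitude, depth, strike, and the
      # coarse-grid wave height it produces offshore of the target site.
      rng = np.random.default_rng(2)
      scenarios = rng.normal(size=(5000, 6))
      rates = rng.exponential(scale=1e-4, size=5000)   # annual rate of each scenario

      # Cluster similar scenarios; one representative per cluster is then run
      # through the expensive high-resolution inundation model.
      k = 50
      km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(scenarios)
      representatives = []
      for c in range(k):
          members = np.flatnonzero(km.labels_ == c)
          d = np.linalg.norm(scenarios[members] - km.cluster_centers_[c], axis=1)
          representatives.append(members[np.argmin(d)])

      # Each representative inherits the summed rate of its cluster members
      rep_rates = np.array([rates[km.labels_ == c].sum() for c in range(k)])
      print(len(representatives), "high-resolution runs instead of", len(scenarios))
      print("total rate preserved:", np.isclose(rep_rates.sum(), rates.sum()))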

  18. Risk-targeted versus current seismic design maps for the conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal, without uncertainty, to the corresponding mapped value at the location of the structure, the probability of collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For the seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
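
    Behind the adjustment lies the risk integral: the annual collapse frequency is the fragility curve integrated against the slope of the hazard curve, and site-to-site differences in hazard-curve shape make the result non-uniform even when the 2%-in-50-year motion is matched. A sketch with an illustrative power-law hazard curve and a lognormal fragility anchored at a 10% conditional collapse probability (assumed numbers, not the NEHRP values):

      import numpy as np
      from scipy.stats import norm

      # Illustrative hazard curve: annual frequency of exceeding acceleration a (g)
      a = np.logspace(-2, 0.5, 500)
      hazard = 1e-4 * (a / 0.4) ** -2.5            # power-law approximation

      # Lognormal collapse fragility: assume 10% chance of collapse at the mapped
      # 2%-in-50-yr motion (0.4 g here) with log-standard deviation beta = 0.6
      beta = 0.6
      median = 0.4 * np.exp(-norm.ppf(0.10) * beta)
      fragility = norm.cdf(np.log(a / median) / beta)

      # Risk integral: annual collapse frequency = integral of fragility * |dH/da|
      integrand = fragility * np.abs(np.gradient(hazard, a))
      annual_cf = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a))
      print(f"P(collapse in 50 yr) = {1.0 - np.exp(-50.0 * annual_cf):.5f}")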

  19. Probabilistic tephra hazard maps for the Neapolitan area: Quantitative volcanological study of Campi Flegrei eruptions

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2008-07-01

    Tephra fall is a relevant hazard of the Campi Flegrei caldera (Southern Italy), due to the high vulnerability of the Naples metropolitan area to such an event. Here, tephra derives from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known past eruptions (Volcanic Explosivity Index (VEI), grain size parameters, velocity at the vent, column heights and erupted mass) and on the factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A Bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent) not directly inferable from volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis on a very large catalogue preserving the statistical properties of the past eruptive history. Using the simulation results, hazard maps have been computed for different scenarios: an upper limit scenario (worst-expected scenario), an eruption-range scenario, and a whole-eruption scenario. Results indicate that although high hazard characterizes the Campi Flegrei caldera, the territory to the east of the caldera center, including the whole district of Naples, is exposed to high hazard values due to the dominant westerly winds. Consistent with the stratigraphic evidence on the nature of past eruptions, our numerical simulations reveal that even in the case of a subplinian eruption (VEI = 3), Naples is exposed to tephra fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. two million inhabitants), the tephra fallout risk related to a plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.

  20. 3-D ballistic transport of ellipsoidal volcanic projectiles considering horizontal wind field and variable shape-dependent drag coefficients

    NASA Astrophysics Data System (ADS)

    Bertin, Daniel

    2017-02-01

    An innovative 3-D numerical model for the dynamics of volcanic ballistic projectiles is presented here. The model focuses on ellipsoidal particles and improves on previous approaches by considering a horizontal wind field, virtual mass forces, and drag forces subject to variable shape-dependent drag coefficients. Modeling suggests that the projectile's launch velocity and ejection angle are first-order parameters influencing ballistic trajectories. The projectile's density and minor radius are second-order factors, whereas both the intermediate and major radii of the projectile are of third order. Comparing output parameters under different input data highlights the importance of considering a horizontal wind field and variable shape-dependent drag coefficients in ballistic modeling, which suggests that they should be included in every ballistic model. On the other hand, virtual mass forces can be discarded, since they contribute almost nothing to ballistic trajectories. Simulation results were used to constrain some crucial input parameters (launch velocity, ejection angle, wind speed, and wind azimuth) of the block that formed the biggest and most distal ballistic impact crater during the 1984-1993 eruptive cycle of Lascar volcano, Northern Chile. Subsequently, up to 10^6 simulations were performed, with nine ejection parameters defined by a Latin hypercube sampling approach. Simulation results were summarized as a quantitative probabilistic hazard map for ballistic projectiles. Transects were also produced in order to depict aerial hazard zones based on the same probabilistic procedure. Both maps combined can be used as a hazard prevention tool for ground and aerial transit near restless volcanoes.
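
    The core of such a model is a point-mass equation of motion with gravity plus drag computed on the air-relative velocity. The sketch below integrates that system with a constant drag coefficient and a uniform horizontal wind; the paper's shape-dependent coefficients for ellipsoids and its virtual-mass terms are omitted, and all parameter values are illustrative:

      import numpy as np
      from scipy.integrate import solve_ivp

      g, rho_air = 9.81, 1.0                     # gravity (m/s^2), air density (kg/m^3)
      cd, radius, rho_rock = 1.0, 0.15, 2500.0   # constant drag coeff., radius, density
      area = np.pi * radius ** 2
      mass = rho_rock * (4.0 / 3.0) * np.pi * radius ** 3
      wind = np.array([10.0, 0.0, 0.0])          # uniform horizontal wind (m/s)

      def rhs(t, y):
          """State y = [x, y, z, vx, vy, vz]; drag acts on the air-relative velocity."""
          vel = y[3:]
          v_rel = vel - wind
          drag = -0.5 * rho_air * cd * area * np.linalg.norm(v_rel) * v_rel / mass
          return np.concatenate([vel, drag + np.array([0.0, 0.0, -g])])

      hit_ground = lambda t, y: y[2]             # z = 0 crossing ends the flight
      hit_ground.terminal, hit_ground.direction = True, -1

      v0 = 150.0 * np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4)])
      y0 = np.concatenate([np.array([0.0, 0.0, 1.0]), v0])  # launch 1 m above ground
      sol = solve_ivp(rhs, (0.0, 120.0), y0, events=hit_ground, max_step=0.05)
      print(f"range = {sol.y[0, -1]:.0f} m after {sol.t[-1]:.1f} s of flight")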

  1. Probabilistic and Scenario Seismic and Liquefaction Hazard Analysis of the Mississippi Embayment Incorporating Nonlinear Site Effects

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.; Dhar, M. S.

    2017-12-01

    The influence of the deep sediment deposits of the Mississippi Embayment (ME) on the propagation of seismic waves is poorly understood and remains a major source of uncertainty for site response analysis. Many researchers have studied the effects of these deposits on the seismic hazard of the area using the information available at the time. In this study, we have used updated and newly available resources for seismic and liquefaction hazard analyses of the ME. We have developed an improved 3D geological model. Additionally, we used surface geological maps from Cupples and Van Arsdale (2013) to prepare liquefaction hazard maps. Both equivalent linear and nonlinear site response codes were used to develop site amplification distributions for use in generating hazard maps. The site amplification distributions are created using the Monte Carlo approach of Cramer et al. (2004, 2006) on a 0.1-degree grid. The 2014 National Seismic Hazard model and attenuation relations (Petersen et al., 2014) are used to prepare seismic hazard maps. Liquefaction hazard maps are then generated using liquefaction probability curves from Holzer (2011) and Cramer et al. (2015). Equivalent linear response (with increased precision and nonlinear behavior restricted with depth) shows hazard for the ME similar to the nonlinear analysis results (without pore pressure). At short periods nonlinear deamplification dominates the hazard, but at long periods resonance amplification dominates. The liquefaction hazard tends to be high in Holocene and late Pleistocene lowland sediments, even with lowered ground water levels, and low in the Pleistocene loess of the uplands. Considering pore pressure effects in nonlinear site response analysis at a test site on the lowlands shows amplification of ground motion at short periods. PGA estimates from ME liquefaction and MMI observations are in the 0.25 to 0.4 g range. Our estimated M7.5 PGA hazard within 10 km of the fault can exceed this. Ground motion observations from liquefaction sites in New Zealand and Japan support PGAs below 0.4 g, except at sites within 20 km exhibiting pore-pressure-induced acceleration spikes due to cyclic mobility, where PGA ranges from 0.5 to 1.5 g. This study is being extended to more detailed seismic and liquefaction hazard studies in five western Tennessee counties under a five-year grant from HUD.
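
    The Monte Carlo site-amplification idea can be reduced to a toy form: sample amplification factors with lognormal scatter (and a median that shrinks as rock motion grows, mimicking nonlinearity) and propagate them onto a rock ground motion. Purely illustrative numbers; the study samples full soil profiles and uses site response codes:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 10_000
      rock_pga = 0.30                                 # rock-hazard ground motion (g)

      # Amplification factor whose median decreases as rock motion grows
      # (a crude stand-in for nonlinear soil behavior), with lognormal scatter
      median_amp = 1.8 * (rock_pga / 0.1) ** -0.25
      amp = median_amp * np.exp(rng.normal(0.0, 0.3, size=n))

      surface_pga = rock_pga * amp                    # distribution of surface motion
      print(f"median surface PGA {np.median(surface_pga):.2f} g, "
            f"84th percentile {np.percentile(surface_pga, 84):.2f} g")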

  2. Probabilistic seismic hazard assessment of the Eastern and Central groups of the Azores - Portugal

    NASA Astrophysics Data System (ADS)

    Fontiela, João; Bezzeghoud, Mourad; Rosset, Philippe; Borges, José; Rodrigues, Francisco; Caldeira, Bento

    2017-04-01

    The Azores islands of the Eastern and Central groups are located at the triple junction of the American, Eurasian and Nubian plates, which induces a large number of low-magnitude earthquakes. Since the settlement of the islands in the 15th century, 33 earthquakes with intensity ≥ VII have caused severe damage and a high death toll. The most severe ones occurred in 1522 at São Miguel Island, with a maximum MM intensity of X; in 1614 at Terceira Island (X); in 1757 at São Jorge Island (XI); in 1852 at São Miguel Island (VIII); in 1926 at Faial Island (Mb 5.3-5.9); in 1980 at Terceira Island (Mw 7.1); and in 1998 at Faial Island (Mw 6.2). The probabilistic seismic hazard assessment (PSHA) was carried out using the classical Cornell-McGuire approach with the seismogenic zones recently defined by Fontiela et al. (2014). We create a new earthquake catalogue, merging local and global datasets with a large time span (1522-2016), to calculate recurrence times and maximum magnitudes. In order to reduce the epistemic uncertainties, we test several ground motion prediction equations in agreement with the geological heterogeneities typical of young volcanic islands. Probabilistic seismic hazard maps are proposed for return periods of 475 and 975 years, as well as hazard curves and uniform hazard spectra for the main cities. REFERENCES: Fontiela, J. et al., 2014. Azores seismogenic zones. Comunicações Geológicas, 101(1), pp.351-354. ACKNOWLEDGMENTS: João Fontiela is supported by grant M3.1.2/F/060/2011 of the Regional Science Fund of the Regional Government of the Azores and this study is co-funded by the European Union through the European Fund for Regional Development, framed in COMPETE 2020 (Operational Competitiveness Programme and Internationalization) through the ICT project (UID/GEO/04683/2013) with the reference POCI-01-0145-FEDER-007690.

  3. Reply

    NASA Astrophysics Data System (ADS)

    Wang, Zhenming; Shi, Baoping; Kiefer, John D.; Woolery, Edward W.

    2004-06-01

    Musson's comments on our article, "Communicating with uncertainty: A critical issue with probabilistic seismic hazard analysis," are an example of myths and misunderstandings. We did not say that probabilistic seismic hazard analysis (PSHA) is a bad method, but we did say that it has some limitations that have significant implications. Our response to these comments follows. There is no consensus on exactly how to select seismological parameters and to assign weights in PSHA. This was one of the conclusions reached by a senior seismic hazard analysis committee [SSHAC, 1997] that included C. A. Cornell, founder of the PSHA methodology. The SSHAC report was reviewed by a panel of the National Research Council and was well accepted by seismologists and engineers. As an example of the lack of consensus, Toro and Silva [2001] produced seismic hazard maps for the central United States region that are quite different from those produced by Frankel et al. [2002], because they used different input seismological parameters and weights (see Table 1). We disagree with Musson's conclusion that "because a method may be applied badly on one occasion does not mean the method itself is bad." We do not say that the method is poor, but rather that those who use PSHA need to document their inputs and communicate them fully to the users. It seems that Musson is trying to create a myth by suggesting that his own methods should be used.

  4. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa, within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. It is therefore important to evaluate the seismic hazard in order to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for return periods of 100 and 475 years, for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. The uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.

  5. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The goal of earthquake hazard mitigation is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a national seismic hazard mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure; the result may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides the data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and which system components are most susceptible to failure, and allow evaluation of the combined effects of a severe earthquake on a whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and the resultant loss of income produce widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase the resilience of vulnerable systems and communities.
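
    The risk product described above (hazard × vulnerability × exposure) can be written as a short computation; every number below is invented for illustration:

      import numpy as np

      # Toy expected-loss calculation: risk = hazard x vulnerability x exposure,
      # summed over four hazard-intensity bins (e.g., PGA of 0.1/0.2/0.4/0.8 g)
      # and two asset classes.
      annual_prob = np.array([2e-2, 8e-3, 2e-3, 4e-4])   # annual prob. of each bin

      damage = {  # mean damage ratio (vulnerability) at each intensity level
          "wood_frame":           np.array([0.00, 0.02, 0.10, 0.35]),
          "unreinforced_masonry": np.array([0.05, 0.20, 0.55, 0.90]),
      }
      exposure = {"wood_frame": 4.0e9, "unreinforced_masonry": 1.5e9}  # replacement $

      annual_loss = sum(np.sum(annual_prob * damage[k]) * exposure[k] for k in damage)
      print(f"expected annual loss = ${annual_loss:,.0f}")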

  6. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    NASA Astrophysics Data System (ADS)

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and the seismic zoning of Italy started, according to the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock soil conditions with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). Subsequently, nine further probabilities of exceedance, the uniform hazard spectra up to 2 s, and the disaggregation of the PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy was then available, accessible through a webGIS application (http://esse1-gis.mi.ingv.it/en.php). The detailed information made it possible to change the approach for evaluating the proper seismic action for design: from a zone-dependent approach (in Italy there were 4 seismic zones, each one with a single design spectrum) to a site-dependent approach, in which the design spectrum is defined at each site of a grid of about 11000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map. The large number of recordings and the values of the experienced accelerations suggested comparisons between the recorded spectra and the spectra defined in the seismic codes. Even if such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach no single observation can validate or invalidate the hazard estimate, some comparisons between the observed ground motions and the hazard model used for the seismic code have been performed; they show that the assumptions and modeling choices made in the Italian hazard study are in line with the observations, considering different return periods, the soil conditions at the recording stations, and the uncertainties of the model. A further application of the Italian seismic hazard model is the identification of buildings and factories struck by the 2012 Emilia (Italy) earthquakes that must be investigated to determine whether they are still safe. The law states that no safety check is needed if the construction experienced a shaking greater than 70% of the design acceleration expected at the site without abandoning the elastic behavior. The ground motion values are evaluated from the available shakemaps (http://shakemap.rm.ingv.it) and the design accelerations are derived from the Building Code, which is based on the reference Italian seismic hazard model. Finally, the national seismic hazard model was one of the most debated elements during the trial in L'Aquila against the seismologists and experts of the Civil Protection Department, who were sentenced to six years in prison on charges of manslaughter because, according to the judge, they underestimated the risk in the region, giving a wrong message to the people before the strong 2009 L'Aquila earthquake.

  7. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes, as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed the resulting data through an Application Programming Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA census-tract-level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which, through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the Gulf of Mexico, and improved the accuracy and resolution of the Probabilistic Storm Surge model.
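
    The core raster operation behind such automated mapping is a comparison of the probabilistic surge surface against the topography. A toy sketch with synthetic arrays standing in for the parsed NOAA Probabilistic Surge output and the DEM:

      import numpy as np

      # Simplified core of an automated surge-inundation map: compare a surge
      # surface against high-resolution topography to flag flooded cells.
      rng = np.random.default_rng(5)
      dem = rng.uniform(0.0, 10.0, size=(500, 500))        # ground elevation (m)
      surge = 3.0 + rng.normal(0.0, 0.5, size=(500, 500))  # exceedance-level surge (m)

      depth = np.clip(surge - dem, 0.0, None)              # inundation depth grid
      flooded = depth > 0.0
      print(f"{flooded.mean():.1%} of cells flooded, max depth {depth.max():.1f} m")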

  8. Global forecasting of thermal health hazards: the skill of probabilistic predictions of the Universal Thermal Climate Index (UTCI).

    PubMed

    Pappenberger, F; Jendritzky, G; Staiger, H; Dutra, E; Di Giuseppe, F; Richardson, D S; Cloke, H L

    2015-03-01

    Although over a hundred thermal indices can be used for assessing thermal health hazards, many ignore the human heat budget, physiology and clothing. The Universal Thermal Climate Index (UTCI) addresses these shortcomings by using an advanced thermo-physiological model. This paper assesses the potential of using the UTCI for forecasting thermal health hazards. Traditionally, such hazard forecasting has had two further limitations: it has been narrowly focused on a particular region or nation and has relied on the use of single 'deterministic' forecasts. Here, the UTCI is computed on a global scale, which is essential for international health-hazard warnings and disaster preparedness, and it is provided as a probabilistic forecast. It is shown that probabilistic UTCI forecasts are superior in skill to deterministic forecasts and that, despite global variations, the UTCI forecast is skilful for lead times up to 10 days. The paper also demonstrates the utility of probabilistic UTCI forecasts using the example of the 2010 heat wave in Russia.

  10. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grivas, D.A.; Schultz, B.C.; O'Neil, G.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30-40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.

  11. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is exacerbated in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for a 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty given by percentile maps. The results are critically discussed and ways for their usage in flood risk management are outlined.
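
    For the pluvial component, a peak-over-threshold analysis of this kind can be sketched in a few lines. The example below fits a generalized Pareto distribution to synthetic daily rainfall excesses with scipy and derives a T-year rainfall; the 98th-percentile threshold and the data are illustrative assumptions, not the study's values.

```python
# Peak-over-threshold sketch: fit a generalized Pareto distribution (GPD) to
# rainfall excesses above a high threshold, then invert for the T-year depth.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
daily_rain = rng.exponential(8.0, size=365 * 30)         # 30 years, mm/day
u = np.quantile(daily_rain, 0.98)                        # POT threshold (assumed)
excess = daily_rain[daily_rain > u] - u

c, loc, scale = genpareto.fit(excess, floc=0.0)          # shape, loc, scale
rate = len(excess) / 30.0                                # threshold exceedances/yr

def return_level(T_years):
    # depth exceeded on average once every T years:
    # P(X > x | X > u) = 1 / (rate * T)
    return u + genpareto.ppf(1.0 - 1.0 / (rate * T_years), c, loc=0.0, scale=scale)

print("100-year daily rainfall (mm):", round(return_level(100.0), 1))
```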

  12. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  13. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    NASA Astrophysics Data System (ADS)

    Grunthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to the Eurocode 8 result from the reassessment of the seismic hazard supported by the German Institution for Civil Engineering. This 2016 hazard assessment for Germany was based on a comprehensive treatment of all accessible uncertainties in models and parameters, and on the provision of a rational framework for handling these uncertainties in a transparent way. The developed seismic hazard model represents significant improvements: it is based on updated and extended databases, comprehensive ranges of models, robust methods and a selection of a set of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs as suggested by the two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m/s for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years), in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps for spectral response accelerations at different spectral periods or for macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th-percentile load parameters were provided together with the full calculation model to clearly illustrate the uncertainties of such a probabilistic assessment for a region of a low-to-moderate level of seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimations) were analyzed and discussed.
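
    The mean-versus-quantile machinery of such a logic tree is easy to illustrate. The following minimal sketch aggregates synthetic branch hazard curves and weights into a weighted mean and weighted median/84th-percentile curves; the branch count, weights, and curve shapes are made up for illustration.

```python
# Logic-tree aggregation sketch: each end branch carries an annual exceedance
# curve and a weight; the mean is the weighted average, and quantiles come from
# the weighted empirical distribution at each acceleration level.
import numpy as np

rng = np.random.default_rng(3)
n_branches = 200
accels = np.logspace(-2, 0, 30)                          # g
weights = rng.dirichlet(np.ones(n_branches))             # branch weights, sum to 1
curves = np.exp(-accels[None, :] * rng.uniform(5, 15, (n_branches, 1))) * 1e-2

mean_curve = weights @ curves                            # weighted mean curve

def weighted_quantile(values, w, q):
    order = np.argsort(values)
    cw = np.cumsum(w[order])                             # weighted empirical CDF
    return values[order][np.searchsorted(cw, q)]

median_curve = np.array([weighted_quantile(curves[:, j], weights, 0.50)
                         for j in range(len(accels))])
p84_curve = np.array([weighted_quantile(curves[:, j], weights, 0.84)
                      for j in range(len(accels))])

j = len(accels) // 2
print(f"mean/median ratio at {accels[j]:.2f} g:", mean_curve[j] / median_curve[j])
```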

  14. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  15. Probabilistic safety analysis for urgent situations following the accidental release of a pollutant in the atmosphere

    NASA Astrophysics Data System (ADS)

    Armand, P.; Brocheton, F.; Poulet, D.; Vendel, F.; Dubourg, V.; Yalamas, T.

    2014-10-01

    This paper is an original contribution to uncertainty quantification in atmospheric transport & dispersion (AT&D) at the local scale (1-10 km). It is proposed to account for the imprecise knowledge of the meteorological and release conditions in the case of an accidental hazardous atmospheric emission. The aim is to produce probabilistic risk maps instead of a deterministic toxic load map in order to help the stakeholders make their decisions. Given the urgency of such situations, the proposed methodology is able to produce such maps in a limited amount of time. It resorts to a Lagrangian particle dispersion model (LPDM) using wind fields interpolated from a pre-established database that collects the results from a computational fluid dynamics (CFD) model. This enables a decoupling of the CFD simulations from the dispersion analysis, and thus a considerable saving of computational time. In order to make the Monte-Carlo-sampling-based estimation of the probability field even faster, it is also proposed to resort to a vector Gaussian process surrogate model together with high performance computing (HPC) resources. The Gaussian process (GP) surrogate modelling technique is coupled with a probabilistic principal component analysis (PCA) for reducing the number of GP predictors to fit, store and predict. The design of experiments (DOE) from which the surrogate model is built is run over a cluster of PCs to make the total production time as short as possible. The use of GP predictors is validated by comparing the results produced by this technique with those obtained by crude Monte Carlo sampling.
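
    A hedged sketch of the PCA-plus-GP surrogate idea is given below using scikit-learn: output fields from a (synthetic) design of experiments are compressed with PCA, one Gaussian process is fitted per retained component, and new full fields are reconstructed from predicted components. Dimensions, kernels and data are illustrative assumptions, and the probabilistic PCA of the paper is replaced by ordinary PCA for brevity.

```python
# Surrogate-model sketch: compress many-output dispersion fields with PCA, fit
# one Gaussian-process regressor per retained component, and predict full
# fields for new meteorological/release inputs. Everything here is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)
n_runs, n_inputs, n_cells = 120, 4, 2500     # DOE size, input dims, map cells
X = rng.uniform(0, 1, (n_runs, n_inputs))    # normalized input conditions
Y = np.outer(np.sin(X @ rng.uniform(1, 3, n_inputs)), rng.uniform(0, 1, n_cells))
Y += 0.01 * rng.normal(size=Y.shape)         # stand-in for LPDM output fields

pca = PCA(n_components=5).fit(Y)
Z = pca.transform(Y)                          # reduced outputs

gps = [GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, Z[:, k])
       for k in range(Z.shape[1])]            # one GP per principal component

def predict_field(x_new):
    z = np.array([gp.predict(x_new[None, :])[0] for gp in gps])
    return pca.inverse_transform(z[None, :])[0]   # full map in one shot

field = predict_field(rng.uniform(0, 1, n_inputs))
print(field.shape)
```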

  16. Integrating geological and geophysical data to improve probabilistic hazard forecasting of Arabian Shield volcanism

    NASA Astrophysics Data System (ADS)

    Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.

    2016-02-01

    During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimations of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values were between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on the volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.

  17. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part A, Prehistoric earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.

  18. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    For tsunami assessment, as in seismic design, it is meaningful to evaluate phenomena beyond the design basis: even once a design-basis tsunami height is set, the actual tsunami height may still exceed it due to uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are the reasonable input data for structures and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure of system analysis are still being developed.

  19. Seismic hazard analysis for Jayapura city, Papua

    NASA Astrophysics Data System (ADS)

    Robiana, R.; Cipta, A.

    2015-04-01

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source model are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models for the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  20. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part II: Inundation Modelling and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.

    2013-09-01

    Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the hazard from regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events was modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale were determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and the outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis as arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
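
    The tide-coupling step can be sketched compactly: rather than assuming arrival at high tide, each tidal stage is weighted by its probability density and combined with the modelled wave height. In the minimal example below, the tidal histogram, wave-height samples and water-level thresholds are all synthetic assumptions.

```python
# Tide-coupling sketch: exceedance probability of a total water level obtained
# by averaging over modelled tsunami events and a discrete tidal-stage PDF.
import numpy as np

rng = np.random.default_rng(5)
tide_levels = np.linspace(-1.5, 1.5, 61)                 # m relative to MSL
tide_pdf = np.exp(-0.5 * (tide_levels / 0.8) ** 2)
tide_pdf /= tide_pdf.sum()                               # discrete tidal PDF

wave_heights = rng.lognormal(mean=0.0, sigma=0.5, size=100)  # modelled events, m

def p_exceed(total_level_m):
    # mean over events of sum over tidal stages of P(tide) * 1{wave + tide > level}
    exceed = wave_heights[:, None] + tide_levels[None, :] > total_level_m
    return (exceed * tide_pdf[None, :]).mean(axis=0).sum()

for level in (1.0, 2.0, 3.0):
    print(f"P(total water level > {level} m) = {p_exceed(level):.3f}")
```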

  1. Probabilistic seismic hazard analyses for ground motions and fault displacement at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.

    2001-01-01

    Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analysis ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10^-3 and 10^-4. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10^-4 and 10^-5.

  2. Incorporating climate change and morphological uncertainty into coastal change hazard assessments

    USGS Publications Warehouse

    Baron, Heather M.; Ruggiero, Peter; Wood, Nathan J.; Harris, Erica L.; Allan, Jonathan; Komar, Paul D.; Corcoran, Patrick

    2015-01-01

    Documented and forecasted trends in rising sea levels and changes in storminess patterns have the potential to increase the frequency, magnitude, and spatial extent of coastal change hazards. To develop realistic adaptation strategies, coastal planners need information about coastal change hazards that recognizes the dynamic temporal and spatial scales of beach morphology, the climate controls on coastal change hazards, and the uncertainties surrounding the drivers and impacts of climate change. We present a probabilistic approach for quantifying and mapping coastal change hazards that incorporates the uncertainty associated with both climate change and morphological variability. To demonstrate the approach, coastal change hazard zones of arbitrary confidence levels are developed for the Tillamook County (State of Oregon, USA) coastline using a suite of simple models and a range of possible climate futures related to wave climate, sea-level rise projections, and the frequency of major El Niño events. Extreme total water levels are more influenced by wave height variability, whereas the magnitude of erosion is more influenced by sea-level rise scenarios. Morphological variability has a stronger influence on the width of coastal hazard zones than the uncertainty associated with the range of climate change scenarios.

  3. Tools used by the insurance industry to assess risk from hydroclimatic extremes

    NASA Astrophysics Data System (ADS)

    Higgs, Stephanie; McMullan, Caroline

    2016-04-01

    Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards, from individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up a financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: helping analysts understand how uncertainty is inherently built into probabilistic catastrophe models; understanding alternative stochastic catalogs for tropical cyclones based on climate conditioning; using stochastic extreme disaster events, such as those provided through AIR's catalogs or the Lloyd's of London marketplace (RDSs), as benchmarks for the loss exceedance probability and tail-at-risk metrics output by catastrophe models; and visualising 1000+ year event footprints and hazard intensity maps. Ultimately, the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.

  4. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    NASA Astrophysics Data System (ADS)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most of the other Malaysian states, is characterized by common seismological activity; generally an earthquake of moderate magnitude is experienced roughly every 20 years, originating mainly from two major sources, either a local source (e.g. Ranau and Lahad Datu) or a regional source (e.g. the Kalimantan and South Philippines subduction zones). The seismicity map of Sabah shows the presence of two zones of distinctive seismicity, near Ranau (near Kota Kinabalu) and Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, and the seismicity of the area is therefore modelled as line sources along these faults. Two main fault systems are believed to be the source of such activities, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and needed study for the extensive development projects in Sabah, especially in the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probabilities of various ground-motion levels being exceeded due to future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will serve as a baseline and standard for any future strategic plans in the area.

  5. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1 Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10 Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1 Hz deterministic simulation results with 10 Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10 Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high-resolution PSHA maps containing more than 1.6 million hazard curves.

  6. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    PubMed

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and real environments.
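
    The optimization step described above can be sketched as a Mahalanobis-weighted reprojection-error minimization. The example below, with synthetic intrinsics, map points and per-feature covariances, refines a perturbed initial pose with scipy's least_squares; it is an illustration of the general technique, not the authors' implementation.

```python
# Pose refinement sketch: minimize Mahalanobis-weighted reprojection errors of
# 3D map points against 2D query-image features, starting from a PnP-style
# initial pose. All data (intrinsics, points, covariances) are synthetic.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(6)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])                       # assumed camera intrinsics
pts3d = rng.uniform([-2, -2, 4], [2, 2, 8], (40, 3))  # synthetic map points

def project(pose, pts):
    """Pinhole projection of world points under pose = (rotvec, translation)."""
    cam = Rotation.from_rotvec(pose[:3]).apply(pts) + pose[3:]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

true_pose = np.array([0.05, -0.02, 0.01, 0.1, -0.1, 0.2])
obs = project(true_pose, pts3d) + rng.normal(0.0, 1.0, (len(pts3d), 2))
cov = np.stack([np.diag(rng.uniform(1.0, 4.0, 2)) for _ in pts3d])  # per-feature 2x2
L = np.linalg.cholesky(np.linalg.inv(cov))            # Sigma^-1 = L L^T

def residuals(pose):
    err = project(pose, pts3d) - obs                  # pixel reprojection errors
    return np.einsum('nij,ni->nj', L, err).ravel()    # whitened: L^T e per feature

init = true_pose + rng.normal(0.0, 0.05, 6)           # stand-in for PnP initial pose
sol = least_squares(residuals, init)
print("max pose-parameter error:", np.abs(sol.x - true_pose).max())
```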

  7. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    PubMed Central

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-01-01

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and real environments. PMID:26404284

  8. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term "hazard" can lead to some misunderstanding. In English, hazard has the generic meaning "potential source of danger," but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative way, that reads, "the probability of a certain hazardous event in a specific time-space window." However, many volcanologists still use "hazard" and "volcanic hazard" in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled "Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes" (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is "probabilistic volcanic hazard assessment" (PVHA).

  9. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
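
    The Monte Carlo core of such a PTHA can be illustrated in a few lines: sample magnitudes from a truncated Gutenberg-Richter distribution, map each event to a coastal amplitude with a scaling relation plus lognormal aleatory scatter, and accumulate annual exceedance rates. All parameters and the scaling relation below are placeholders, not the study's values.

```python
# Monte Carlo PTHA sketch: truncated Gutenberg-Richter magnitude sampling, a
# placeholder magnitude-to-amplitude relation with lognormal scatter, and
# annual exceedance probabilities from the sampled event set.
import numpy as np

rng = np.random.default_rng(7)
b, m_min, m_max = 1.0, 7.0, 9.0
annual_rate = 0.2                                 # events >= m_min per year (assumed)
n = 200_000                                       # sampled events

# inverse-CDF sampling of the doubly truncated G-R distribution
u = rng.uniform(size=n)
beta = b * np.log(10.0)
m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

log_amp = -9.0 + 1.2 * m + rng.normal(0.0, 0.6, n)   # placeholder ln amplitude
amp = np.exp(log_amp)                                 # coastal amplitude, m

for h in (0.5, 3.0):
    lam = annual_rate * np.mean(amp > h)              # annual exceedance rate
    print(f"P(>= {h} m in 1 yr) ~= {1 - np.exp(-lam):.4f}")
```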

  10. A method for producing digital probabilistic seismic landslide hazard maps

    USGS Publications Warehouse

    Jibson, R.W.; Harp, E.L.; Michael, J.A.

    2000-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include: (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24 000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10 m grid spacing using ARC/INFO GIS software on a UNIX computer. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure. © 2000 Elsevier Science B.V. All rights reserved.
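
    The Newmark rigid sliding-block calculation at the heart of this procedure is compact enough to sketch directly: the block accumulates relative velocity whenever ground acceleration exceeds the slope's critical acceleration, and displacement follows by integration. The accelerogram and critical acceleration below are synthetic illustrations.

```python
# Simplified one-directional Newmark rigid sliding-block integration.
import numpy as np

def newmark_displacement(acc_g, dt, a_crit_g, g=9.81):
    """Cumulative downslope displacement (m) of a rigid block."""
    vel, disp = 0.0, 0.0
    for a in acc_g:
        if vel > 0.0 or a > a_crit_g:
            vel += (a - a_crit_g) * g * dt   # block decelerates once a < a_crit
            vel = max(vel, 0.0)              # sliding stops at zero rel. velocity
            disp += vel * dt
    return disp

dt = 0.01
t = np.arange(0, 20, dt)
acc = 0.3 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)  # toy record, in g
print(f"Newmark displacement: {newmark_displacement(acc, dt, 0.1):.3f} m")
```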

  11. A method for producing digital probabilistic seismic landslide hazard maps; an example from the Los Angeles, California, area

    USGS Publications Warehouse

    Jibson, Randall W.; Harp, Edwin L.; Michael, John A.

    1998-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.

  12. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM, the global ANSS Comprehensive Catalogues, the Seismological Bureau of the Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes into the same scale. Our active fault database includes fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of the seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests by matching the felt intensities of historical earthquakes to the ground motions modelled with ground motion prediction equations (GMPEs). By incorporating the best-fitting GMPEs and site conditions, we included site effects and assessed the probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially cause an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. In addition, we find a notable hazard level in northern Vietnam and at the boundary between Myanmar, Thailand and Laos, due to a series of strike-slip faults that could potentially cause moderate to large earthquakes. Note that although much of the region has a low probability of damaging shaking, low-probability events have resulted in much destruction recently in SE Asia (e.g. the 2008 Wenchuan and 2015 Sabah earthquakes).
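
    Behind maps like these sits the classical Cornell-McGuire hazard integral. The sketch below evaluates, for one areal source, the annual rate of exceeding a PGA level by integrating a placeholder GMPE's exceedance probability over truncated Gutenberg-Richter magnitudes and a distance distribution; every coefficient is illustrative, not a fitted regional value.

```python
# Cornell-McGuire-style hazard integral for a single areal source:
# lambda(PGA > x) = nu * double-integral of f(m) f(r) P(PGA > x | m, r) dm dr
import numpy as np
from scipy.stats import norm

nu = 0.5                                   # annual rate of M >= m_min (assumed)
b, m_min, m_max = 1.0, 5.0, 8.0
beta = b * np.log(10.0)
m = np.linspace(m_min, m_max, 200)
norm_c = 1 - np.exp(-beta * (m_max - m_min))
f_m = beta * np.exp(-beta * (m - m_min)) / norm_c       # truncated G-R pdf

r = np.linspace(5.0, 100.0, 200)                        # source-site distance, km
f_r = 2 * r / (r[-1] ** 2 - r[0] ** 2)                  # uniform-in-area annulus pdf

def annual_exceedance(pga_g, sigma_ln=0.6):
    # placeholder GMPE: ln PGA = -3.5 + 1.0*m - 1.1*ln(r), lognormal scatter
    ln_med = -3.5 + 1.0 * m[:, None] - 1.1 * np.log(r[None, :])
    p_exc = norm.sf(np.log(pga_g), loc=ln_med, scale=sigma_ln)
    integrand = f_m[:, None] * f_r[None, :] * p_exc
    dm, dr = m[1] - m[0], r[1] - r[0]
    return nu * integrand.sum() * dm * dr               # simple Riemann sum

for x in (0.1, 0.2, 0.4):
    print(f"lambda(PGA > {x} g) = {annual_exceedance(x):.5f} /yr")
```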

  13. Modeling landslide recurrence in Seattle, Washington, USA

    USGS Publications Warehouse

    Salciarini, Diana; Godt, Jonathan W.; Savage, William Z.; Baum, Rex L.; Conversini, Pietro

    2008-01-01

    To manage the hazard associated with shallow landslides, decision makers need an understanding of where and when landslides may occur. A variety of approaches have been used to estimate the hazard from shallow, rainfall-triggered landslides, such as empirical rainfall threshold methods or probabilistic methods based on historical records. The wide availability of Geographic Information Systems (GIS) and digital topographic data has led to the development of analytic methods for landslide hazard estimation that couple steady-state hydrological models with slope stability calculations. Because these methods typically neglect the transient effects of infiltration on slope stability, results cannot be linked with historical or forecasted rainfall sequences. Estimates of the frequency of conditions likely to cause landslides are critical for quantitative risk and hazard assessments. We present results to demonstrate how a transient infiltration model coupled with an infinite slope stability calculation may be used to assess shallow landslide frequency in the City of Seattle, Washington, USA. A module called CRF (Critical RainFall) for estimating deterministic rainfall thresholds has been integrated into the TRIGRS (Transient Rainfall Infiltration and Grid-based Slope-Stability) model, which combines a transient, one-dimensional analytic solution for pore-pressure response to rainfall infiltration with an infinite slope stability calculation. Input data for the extended model include topographic slope, colluvial thickness, initial water-table depth, material properties, and rainfall durations. This approach is combined with a statistical treatment of rainfall using a GEV (generalized extreme value) probability distribution to produce maps showing the shallow landslide recurrence induced, on a spatially distributed basis, as a function of rainfall duration and hillslope characteristics.
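
    The GEV treatment of rainfall can be sketched as follows: fit a GEV to annual-maximum storm totals and read off the T-year rainfall that would then drive the infiltration and slope-stability calculation. The example uses scipy with synthetic maxima; note that scipy's genextreme shape parameter c equals minus the xi of common GEV notation.

```python
# GEV sketch: fit annual-maximum rainfall and invert for T-year return levels.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(9)
annual_max_mm = rng.gumbel(loc=60.0, scale=15.0, size=50)   # 50 yrs of maxima

c, loc, scale = genextreme.fit(annual_max_mm)               # shape, loc, scale

def t_year_rainfall(T):
    # value with annual exceedance probability 1/T
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

for T in (10, 25, 100):
    print(f"{T}-year rainfall ~ {t_year_rainfall(T):.0f} mm")
```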

  14. Phase transitions in coupled map lattices and in associated probabilistic cellular automata.

    PubMed

    Just, Wolfram

    2006-10-01

    Analytical tools are applied to investigate piecewise linear coupled map lattices in terms of probabilistic cellular automata. The so-called disorder condition of probabilistic cellular automata is closely related to attracting sets in coupled map lattices. The importance of this condition for the suppression of phase transitions is illustrated by spatially one-dimensional systems. Invariant densities and temporal correlations are calculated explicitly. Ising type phase transitions are found for one-dimensional coupled map lattices acting on repelling sets and for a spatially two-dimensional Miller-Huse-like system with stable long time dynamics. Critical exponents are calculated within a finite size scaling approach. The relevance of detailed balance of the resulting probabilistic cellular automaton for the critical behavior is pointed out.
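
    A toy version of such a system is easy to simulate. The sketch below iterates a diffusively coupled lattice of piecewise-linear tent maps on a ring; the map slope and coupling strength are arbitrary illustrative choices, intended only to show the update rule x_i <- (1-eps) f(x_i) + (eps/2)[f(x_{i-1}) + f(x_{i+1})].

```python
# Diffusively coupled map lattice with a piecewise-linear (tent) local map.
import numpy as np

def tent(x, a=1.9):
    return a * np.minimum(x, 1.0 - x)          # piecewise-linear local map

def step(x, eps=0.3):
    fx = tent(x)
    # nearest-neighbour diffusive coupling on a ring (periodic boundaries)
    return (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

rng = np.random.default_rng(10)
x = rng.uniform(0, 1, 256)
for _ in range(1000):                          # iterate the lattice
    x = step(x)
print("spatial mean and variance:", x.mean(), x.var())
```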

  15. Data fusion strategies for hazard detection and safe site selection for planetary and small body landings

    NASA Astrophysics Data System (ADS)

    Câmara, F.; Oliveira, J.; Hormigo, T.; Araújo, J.; Ribeiro, R.; Falcão, A.; Gomes, M.; Dubois-Matra, O.; Vijendran, S.

    2015-06-01

    This paper discusses the design and evaluation of data fusion strategies to perform tiered fusion of several heterogeneous sensors and a priori data. The aim is to increase the robustness and performance of hazard detection and avoidance systems, while enabling safe planetary and small body landings anytime, anywhere. The focus is on Mars and asteroid landing mission scenarios, and three distinct data fusion algorithms are introduced and compared. The first algorithm consists of a hybrid camera-LIDAR hazard detection and avoidance system, the H2DAS, in which data fusion is performed at sensor level (reconstruction of the point cloud obtained with a scanning LIDAR using the navigation motion states and correcting the image for motion compensation using IMU data), at feature level (concatenation of multiple digital elevation maps, obtained from consecutive LIDAR images, to achieve higher accuracy and resolution maps while enabling relative positioning), and at decision level (fusing hazard maps from multiple sensors onto a single image space, with a single grid orientation and spacing). The second method presented is a hybrid reasoning fusion, the HRF, in which innovative algorithms replace the decision-level functions of the previous method by combining three different reasoning engines—a fuzzy reasoning engine, a probabilistic reasoning engine and an evidential reasoning engine—to produce safety maps. Finally, the third method presented is called Intelligent Planetary Site Selection, IPSIS, an innovative multi-criteria, dynamic decision-level data fusion algorithm that takes into account historical information for the selection of landing sites and a piloting function with a non-exhaustive landing site search capability, i.e., capable of finding local optima by searching a reduced set of global maps. All the discussed data fusion strategies and algorithms have been integrated, verified and validated in a closed-loop simulation environment. Monte Carlo simulation campaigns were performed for algorithm performance assessment and benchmarking. The simulation results comprise the landing phases of Mars and Phobos landing mission scenarios.
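
    Of the three tiers, the decision-level step is the simplest to illustrate. The sketch below fuses two synthetic per-pixel hazard-probability maps (standing in for camera- and LIDAR-derived maps on a common grid) with a noisy-OR rule and picks the lowest-hazard cell; the fusion rule and threshold are assumptions for illustration, not the H2DAS design.

```python
# Decision-level fusion sketch: combine two per-pixel hazard-probability maps
# into one safety map, then select the lowest-hazard landing cell.
import numpy as np

rng = np.random.default_rng(11)
p_cam = rng.uniform(0, 1, (64, 64))            # camera-derived hazard probability
p_lidar = rng.uniform(0, 1, (64, 64))          # LIDAR-derived hazard probability

# noisy-OR: a cell is hazardous if either sensor says so (independence assumed)
p_fused = 1.0 - (1.0 - p_cam) * (1.0 - p_lidar)
safe = p_fused < 0.2                           # decision threshold (assumed)

iy, ix = np.unravel_index(np.argmin(p_fused), p_fused.shape)
print(f"safest cell: ({iy}, {ix}), fused hazard {p_fused[iy, ix]:.3f}, "
      f"{safe.mean():.1%} of cells safe")
```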

  16. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published results at that time. CDMG eventually published the second edition map in 1992, following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that includes a simplified three-dimensional geometry of faults and to facilitate efficient corrections and revisions of data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven to be quite reasonable for practical applications within engineering design, always applied with professional judgment. In the final analysis, DSHA is a reality-check for public safety and PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.

  17. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.

  18. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 1: Model components for sources parameterization

    NASA Astrophysics Data System (ADS)

    Azzaro, Raffaele; Barberi, Graziella; D'Amico, Salvatore; Pace, Bruno; Peruzza, Laura; Tuvè, Tiziana

    2017-11-01

    The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect laboratory for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various geophysical monitoring data and, particularly, the rapid geodynamics, which clearly expresses some seismotectonic processes. We present here the model components and the procedures adopted for defining the seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about 3 centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults using a historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented in a recently developed software tool - FiSH (Pace et al., 2016) - that we use to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area the traditional assumptions of uniform and Poissonian seismicity can be relaxed; a time-dependent, fault-based modeling, joined with 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be implemented in PSHA maps. These maps can be relevant for the retrofitting of the existing building stock and for driving risk reduction interventions. The analyses do not account for regional M > 6 seismogenic sources, which dominate the hazard over long return times (≥ 500 years).
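
    The recurrence-time logic behind fault-based tools of this kind can be shown with a back-of-envelope moment balance: the mean recurrence of the characteristic event is its seismic moment divided by the moment accumulation rate, Tmean = Mo(Mw) / (mu * A * s). The numbers below are generic placeholders, not Etna fault parameters.

```python
# Moment-balance recurrence sketch for a characteristic earthquake on a fault.
def moment_Nm(mw):
    return 10 ** (1.5 * mw + 9.05)             # Hanks & Kanamori (1979), N*m

mu = 3.0e10                                    # shear modulus, Pa
area = 15e3 * 5e3                              # fault area: 15 km x 5 km, m^2
slip_rate = 2.0e-3                             # 2 mm/yr expressed in m/yr
mw_char = 5.5                                  # characteristic magnitude (assumed)

t_mean = moment_Nm(mw_char) / (mu * area * slip_rate)
print(f"mean recurrence of Mw {mw_char}: {t_mean:.0f} yr")
```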

  19. Probabilistic seismic hazard at the archaeological site of Gol Gumbaz in Vijayapura, south India

    NASA Astrophysics Data System (ADS)

    Patil, Shivakumar G.; Menon, Arun; Dodagoudar, G. R.

    2018-03-01

    Probabilistic seismic hazard analysis (PSHA) is carried out for the archaeological site of Vijayapura in south India in order to obtain hazard-consistent seismic input ground motions for seismic risk assessment and for the design of seismic protection measures for monuments, where warranted. For this purpose the standard Cornell-McGuire approach, based on seismogenic zones with uniformly distributed seismicity, is employed. The main features of this study are the use of an updated and unified seismic catalogue based on moment magnitude, new seismogenic source models and recent ground motion prediction equations (GMPEs) in a logic tree framework. Seismic hazard at the site is evaluated for level-ground rock site conditions with 10% and 2% probabilities of exceedance in 50 years, and the corresponding peak ground accelerations (PGAs) are 0.074 and 0.142 g, respectively. In addition, the uniform hazard spectra (UHS) of the site are compared to the Indian code-defined spectrum. Comparisons are also made with results from the National Disaster Management Authority (NDMA 2010), in terms of PGA and pseudo-spectral accelerations (PSAs) at T = 0.2, 0.5, 1.0 and 1.25 s for 475- and 2475-yr return periods. Results of the present study are in good agreement with the PGA calculated from the isoseismal map of the Killari earthquake, Mw = 6.4 (1993). Disaggregation of the PSHA results for PGA and the spectral acceleration (Sa) at 0.5 s shows that the controlling scenario earthquake for the study region is of low to moderate magnitude, with the source at a short distance from the study site. Deterministic seismic hazard analysis (DSHA) is also carried out, taking into account three scenario earthquakes. The UHS corresponding to the 475-yr return period (RP) is used to define the target spectrum, and spectrum-compatible natural accelerograms are accordingly selected from a suite of recorded accelerograms.

  20. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.

    2016-12-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  1. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.

    2017-12-01

    The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  2. The Global Tsunami Model (GTM)

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn

    2017-04-01

    The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  3. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses caused by potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather, it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning or earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for differing opinions on how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of the USGS and three Next Generation Attenuation (NGA) ground-motion models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils, defined by a map of the average shear wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake-induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage to buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall combines rock strength, surface slope, and existing landslides with the ground motions caused by the scenario earthquake. Each scenario is illustrated with maps of seismic shaking potential and fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage. The extent of strong shaking is influenced by subsurface fault dip and near-surface materials. Strong shaking is more widespread in the hanging-wall region of a normal fault. Larger ground motions also occur where young alluvial sediments amplify the shaking. Both of these effects can lead to strong shaking that extends farther from the fault on the valley side than on the hill side. The effect of fault rupture displacements may be localized along the surface trace of the mapped earthquake fault if the fault geometry is simple and the fault traces are accurately located. However, surface displacement hazards can spread over a few hundred meters to a few kilometers if the earthquake fault has numerous splays or branches, such as the Hilton Creek Fault. Faulting displacements are estimated to be about 1 meter along normal faults in the study area and close to 2 meters along the White Mountains Fault Zone. All scenarios show the possibility of widespread ground failure. Liquefaction damage would likely occur in the areas of higher ground shaking near the faults where there are sandy/silty sediments and the depth to groundwater is 6.1 meters (20 feet) or less. Generally, this means damage is most common near lakes and streams in the areas of strongest shaking. Landslide potential exists throughout the study region. All steep slopes (>30 degrees) present a potential hazard at any level of shaking. Lesser slopes may have landslides within the areas of higher ground shaking. The landslide hazard zones also are likely sources for snow avalanches during winter months and for large boulders that can be shaken loose and roll hundreds of feet downhill, as happened during the 1980 Mammoth Lakes earthquakes. Whereas the methodologies used in estimating ground shaking, liquefaction, and landslides are well developed and have been applied in published hazard maps, methodologies for estimating surface fault displacement are still being developed. Therefore, this report provides a more in-depth and detailed discussion of the methodologies used for the deterministic and probabilistic fault displacement hazard analyses for this project.

  4. Study of Seismic Hazards in the Center of the State of Veracruz, MÉXICO.

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Mora González, I.; Castillo Aguilar, S.

    2015-12-01

    Preliminary results are presented from the project "Microzonation of geological and hydrometeorological hazards for the conurbations of Orizaba and Veracruz, and major sites located in the lower sub-basins of the Antigua and Jamapa rivers". This project was supported by the CONACyT-Veracruz state government Joint Funds. A probabilistic seismic hazard assessment (PSHA) was developed for the central area of Veracruz State, mainly in the region bounded by the watersheds of the Jamapa and Antigua rivers, with the aim of evaluating the geological and hydrometeorological hazards in this region. The project pays particular attention to extreme weather phenomena, floods and earthquakes, in order to calculate the risk they induce through landslides and rock falls. In addition, as part of the study, the PSHA considered site effects in the urban zones of the cities of Xalapa and Orizaba; the site effects were incorporated through a standard format proposed in microzonation studies and their application in computer systems, which allows the microzonation studies of a city to be optimized and condensed. The results obtained from the PSHA are presented as seismic hazard maps (hazard footprints), exceedance rate curves and uniform hazard spectra for different spectral ordinates, between 0.01 and 5.0 seconds, associated with selected return periods: 72, 225, 475 and 2475 years.
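
    A uniform hazard spectrum of the kind reported here is read off one hazard curve per spectral ordinate at a common exceedance rate. A schematic sketch, with invented hazard curves standing in for the study's results:

```python
import numpy as np

# Invented hazard curves: annual rate of exceeding each spectral
# acceleration level (g), one curve per spectral period (s).
sa_levels = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0])
curves = {
    0.01: np.array([1e-1, 3e-2, 1e-2, 2e-3, 2e-4, 2e-5]),
    0.2:  np.array([2e-1, 6e-2, 2e-2, 5e-3, 5e-4, 5e-5]),
    1.0:  np.array([8e-2, 2e-2, 6e-3, 1e-3, 8e-5, 6e-6]),
}

def uhs(return_period_yr: float) -> dict:
    """Sa at every period for a single annual exceedance rate 1/RP."""
    target = 1.0 / return_period_yr
    spectrum = {}
    for period, rates in curves.items():
        # rates decrease with Sa, so reverse both arrays for np.interp
        sa = np.exp(np.interp(np.log(target),
                              np.log(rates[::-1]), np.log(sa_levels[::-1])))
        spectrum[period] = round(float(sa), 4)
    return spectrum

for rp in (72, 225, 475, 2475):
    print(rp, uhs(rp))
```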

  5. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered, because we had not determined how to properly treat these earthquakes in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinion on induced seismicity characteristics. In this report, however, we do not weight these input models to arrive at a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard from natural, tectonic earthquakes, because the spatio-temporal patterns of induced earthquakes depend on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.

  6. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  7. Tephra Fallout Hazard Assessment for VEI5 Plinian Eruption at Kuju Volcano, Japan, Using TEPHRA2

    NASA Astrophysics Data System (ADS)

    Tsuji, Tomohiro; Ikeda, Michiharu; Kishimoto, Hiroshi; Fujita, Koji; Nishizaka, Naoki; Onishi, Kozo

    2017-06-01

    Tephra fallout has a potential impact on engineered structures and systems at nuclear power plants. We provide the first report estimating potential accumulations of tephra fallout from an eruption of Kuju Volcano as large as VEI 5, and we calculate hazard curves at the Ikata Power Plant using the TEPHRA2 computer program. We reconstructed the eruptive parameters of the Kj-P1 tephra fallout deposit based on a geological survey and a literature review. A series of parameter studies was carried out to determine the best values of empirical parameters, such as the diffusion coefficient and the fall-time threshold. Based on this reconstruction, we present probabilistic analyses that assess the variation in meteorological conditions, using wind profiles extracted from a 22-year wind dataset. The resulting hazard curves and probability maps of tephra fallout associated with a Plinian eruption are used to discuss the exceedance probability at the site and the implications of such a severe eruption scenario.
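
    The hazard-curve construction described here amounts to re-running the dispersal model over many sampled wind profiles and counting threshold exceedances. A toy stand-in for the dispersal step (TEPHRA2 itself is not called; the load function and all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_tephra_load(wind_speed: float, wind_toward_site: bool) -> float:
    """Stand-in for one dispersal-model run: ground load (kg/m^2) at a site.
    Purely illustrative; a real analysis runs TEPHRA2 per wind profile."""
    base = 40.0
    return base * (1.0 + 0.2 * wind_speed) * (1.5 if wind_toward_site else 0.3)

# Hypothetical sampled wind conditions (e.g. drawn from a long reanalysis).
speeds = rng.gamma(shape=2.0, scale=5.0, size=8000)   # m/s
toward = rng.random(8000) < 0.25                      # site downwind 25% of time
loads = np.array([toy_tephra_load(s, t) for s, t in zip(speeds, toward)])

# Hazard curve: probability of exceeding each load threshold, given eruption.
for thr in (1, 10, 50, 100, 300):
    print(f"P(load > {thr:>3} kg/m^2 | eruption) = {np.mean(loads > thr):.3f}")
```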

  8. Seismic hazard analysis for Jayapura city, Papua

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A.

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models for the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and seven background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D, and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  9. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of >0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
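
    The annual probabilities quoted above translate directly into probabilities over a planning horizon if the hazard is time-independent, as assumed in this PTHA. A two-line illustration:

```python
def prob_in_t_years(annual_prob: float, t: float) -> float:
    """P(at least one exceedance in t years), time-independent hazard."""
    return 1.0 - (1.0 - annual_prob) ** t

# e.g. a 1% annual chance of a >3.0 m tsunami over a 50-year horizon:
print(prob_in_t_years(0.01, 50))   # ~0.39
print(prob_in_t_years(0.10, 50))   # ~0.99
```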

  10. An operational procedure for rapid flood risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc

    2017-07-01

    The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
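
    The impact step described here is, at its core, an overlay of an event-based hazard map with exposure grids. A minimal sketch with made-up rasters and a made-up depth-damage curve:

```python
import numpy as np

# Hypothetical co-registered rasters for one forecasted flood event.
depth = np.array([[0.0, 0.4, 1.2],
                  [0.2, 0.8, 2.5],
                  [0.0, 0.0, 0.6]])             # inundation depth (m)
value = np.full(depth.shape, 1.0e6)             # asset value per cell (EUR)
population = np.array([[50, 120, 300],
                       [10,  80, 200],
                       [ 0,  40,  60]])         # people per cell

def damage_fraction(d):
    """Invented depth-damage curve: linear up to 3 m, then total loss."""
    return np.clip(d / 3.0, 0.0, 1.0)

wet = depth > 0.1
print(f"flooded cells:    {wet.sum()}")
print(f"economic damage:  {(damage_fraction(depth) * value).sum():,.0f} EUR")
print(f"affected people:  {population[wet].sum()}")
```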

  11. Seaside, Oregon, Tsunami Vulnerability Assessment Pilot Study

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Dominey-Howes, D.; Varner, J.

    2006-12-01

    The results of a pilot study to assess the risk from tsunamis for the Seaside-Gearhart, Oregon region will be presented. To determine the risk from tsunamis, it is first necessary to establish the hazard or probability that a tsunami of a particular magnitude will occur within a certain period of time. Tsunami inundation maps that provide 100-year and 500-year probabilistic tsunami wave height contours for the Seaside-Gearhart, Oregon, region were developed as part of an interagency Tsunami Pilot Study(1). These maps provided the probability of the tsunami hazard. The next step in determining risk is to determine the vulnerability or degree of loss resulting from the occurrence of tsunamis due to exposure and fragility. The tsunami vulnerability assessment methodology used in this study was developed by M. Papathoma and others(2). This model incorporates multiple factors (e.g. parameters related to the natural and built environments and socio-demographics) that contribute to tsunami vulnerability. Data provided with FEMA's HAZUS loss estimation software and Clatsop County, Oregon, tax assessment data were used as input to the model. The results, presented within a geographic information system, reveal the percentage of buildings in need of reinforcement and the population density in different inundation depth zones. These results can be used for tsunami mitigation, local planning, and for determining post-tsunami disaster response by emergency services. (1) Tsunami Pilot Study Working Group, Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps, Joint NOAA/USGS/FEMA Special Report, U.S. National Oceanic and Atmospheric Administration, U.S. Geological Survey, U.S. Federal Emergency Management Agency, 2006, Final Draft. (2) Papathoma, M., D. Dominey-Howes, Y. Zong, and D. Smith, Assessing Tsunami Vulnerability, an example from Herakleio, Crete, Natural Hazards and Earth System Sciences, Vol. 3, 2003, p. 377-389.
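
    The vulnerability model referenced here combines weighted factors into a relative index. A schematic weighted-sum sketch; the factor names, weights and scores below are invented, not the calibrated values of the Papathoma model:

```python
# Invented weights for four factor groups (normalized to sum to 1).
weights = {"structural": 0.35, "environment": 0.25,
           "social": 0.25, "economic": 0.15}

def vulnerability_index(scores: dict) -> float:
    """Weighted sum of factor scores, each normalized to the range 0-1."""
    return sum(weights[k] * scores[k] for k in weights)

building = {"structural": 0.8, "environment": 0.6,
            "social": 0.4, "economic": 0.7}
print(f"relative vulnerability: {vulnerability_index(building):.2f}")
```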

  12. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
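
    A schematic Monte Carlo reading of the HDMI idea: sample the point of departure and each adjustment factor from uncertainty distributions and report a confidence interval on the resulting target human dose. All distributions and values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical point of departure (dose giving an M = 5% effect) and
# lognormally uncertain adjustment factors.
pod          = rng.lognormal(np.log(10.0), 0.3, n)   # mg/kg-day
interspecies = rng.lognormal(np.log(3.0),  0.4, n)   # animal-to-human scaling
human_var    = rng.lognormal(np.log(5.0),  0.5, n)   # covers incidence I

hd_mi = pod / (interspecies * human_var)             # target human dose samples

lo, med, hi = np.percentile(hd_mi, [5, 50, 95])
print(f"HD_MI ~ {med:.3f} mg/kg-day "
      f"(90% CI {lo:.3f}-{hi:.3f}; {hi / lo:.0f}-fold range)")
```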

  13. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether because of the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multiple hazards. Carrying out this framework requires several steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk, in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that needs to be studied.
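
    Step (3), propagating the uncertainty of each risk component into the loss, can be sketched as a Monte Carlo product of uncertain hazard, vulnerability and value terms; every distribution below is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Hypothetical uncertain components for one element at risk, one year.
p_impact      = rng.beta(2.0, 98.0, n)                    # annual impact probability
vulnerability = rng.beta(4.0, 6.0, n)                     # damage fraction | impact
value         = rng.lognormal(np.log(250_000.0), 0.2, n)  # monetary value (EUR)

# Uncertainty from all three components propagates into the loss estimate.
expected_annual_loss = p_impact * vulnerability * value

lo, med, hi = np.percentile(expected_annual_loss, [5, 50, 95])
print(f"EAL median {med:,.0f} EUR (90% interval {lo:,.0f}-{hi:,.0f} EUR)")
```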

  14. Implementing the effect of the rupture directivity on PSHA maps: Application to the Marmara Region (Turkey)

    NASA Astrophysics Data System (ADS)

    Herrero, Andre; Spagnuolo, Elena; Akinci, Aybige; Pucci, Stefano

    2016-04-01

    In the present study we attempt to improve seismic hazard assessment by taking into account possible sources of epistemic uncertainty and the azimuthal variability of ground motion, which, at a particular site, is significantly influenced by the rupture mechanism and the rupture direction relative to the site. As a study area we selected the Marmara Region (Turkey), especially the city of Istanbul, which is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. The seismic hazard in the city is mainly associated with two active fault segments located about 20-30 km south of Istanbul. In this perspective, we first propose a methodology to incorporate new information, such as the nucleation point, into a probabilistic seismic hazard analysis (PSHA) framework. Second, we introduce information about those fault segments by focusing on the rupture characteristics that affect the azimuthal variation of the spatial distribution of ground motion, i.e. the source directivity effect, and its influence on PSHA. An analytical model developed by Spudich and Chiou (2008) is used as a corrective factor that modifies the Next Generation Attenuation (NGA; Power et al. 2008) ground motion prediction equations (GMPEs) by introducing rupture-related parameters that are generally lumped together under the term directivity effect. We used the GMPEs of Abrahamson and Silva (2008) and Boore and Atkinson (2008); our results are given in terms of 10% probability of exceedance in 50 years (at several periods from 0.5 s to 10 s) on rock site conditions. The directivity correction contributes significantly to the percentage ratio between the seismic hazard computed with the directivity model and that computed with standard practice. In particular, we drew on dynamic simulations from a previous study (Aochi & Ulrich, 2015), aimed at evaluating the seismic potential of the Marmara region, to derive a statistical distribution for the nucleation position. Our results suggest that accounting for rupture-related parameters in a PSHA using deterministic information from dynamic models is feasible, and in particular that the use of a non-uniform statistical distribution for the nucleation position has serious consequences for the hazard assessment. Since the directivity effect is conditional on the nucleation position, the hazard map changes with the assumptions made. A worst-case scenario (both faults rupturing towards the city of Istanbul) predicts up to a 25% change relative to the standard formulation at 2 s, and the change increases at longer periods. This result differs substantially if a deterministically based nucleation position is assumed.
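
    Mechanically, a directivity correction enters PSHA as a factor on the GMPE median that depends on where the rupture nucleates. The sketch below uses an invented placeholder GMPE and an invented correction shape (not the Spudich and Chiou (2008) coefficients), with a non-uniform nucleation distribution of the kind derived from dynamic simulations:

```python
import numpy as np

def gmpe_median_sa(magnitude, distance_km, period_s):
    """Placeholder GMPE median Sa (g); real studies use the NGA models."""
    return np.exp(-1.5 + 0.6 * magnitude
                  - 1.1 * np.log(distance_km + 10.0)
                  - 0.3 * np.log(period_s + 0.1))

def directivity_factor(fraction_toward_site, period_s):
    """Invented correction: amplification grows with the rupture fraction
    propagating toward the site and with spectral period."""
    return np.exp(0.3 * (2.0 * fraction_toward_site - 1.0)
                  * np.clip(np.log(period_s / 0.5), 0.0, None))

# Nucleation position along the fault from a non-uniform distribution
# (0 = end far from the site, 1 = end near the site).
rng = np.random.default_rng(7)
nucleation = rng.beta(2.0, 5.0, size=10_000)
sa = gmpe_median_sa(7.2, 25.0, 2.0) * directivity_factor(1.0 - nucleation, 2.0)

print(f"median Sa(2 s), with directivity: {np.median(sa):.3f} g")
print(f"reference, no directivity:        {gmpe_median_sa(7.2, 25.0, 2.0):.3f} g")
```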

  15. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  16. Development of a liquefaction hazard screening tool for caltrans bridge sites

    USGS Publications Warehouse

    Knudsen, K.-L.; Bott, J.D.J.; Woods, M.O.; McGuire, T.L.

    2009-01-01

    We have developed a liquefaction hazard screening tool for the California Department of Transportation (Caltrans) that is being used to evaluate the liquefaction hazard to approximately 13,000 bridge sites in California. Because of the large number of bridge sites to be evaluated, we developed a tool that makes use of parameters not typically considered in site-specific liquefaction investigations. We assessed geologic, topographic, seismic hazard, and subsurface conditions at about 100 sites of past liquefaction in California. Among the parameters we found common to many of these sites are: (a) low elevations, (b) proximity to a water body, and (c) presence of geologically youthful deposits or artificial fill materials. The nature of the study necessitated the use of readily available data, preferably datasets that are consistent across the state. The screening tool we provided to Caltrans makes use of the following parameters: (1) proximity to a water body, (2) whether the bridge crosses a water body, (3) the age of site geologic materials and the environment in which the materials were deposited, as discerned from available digital geologic maps, (4) probabilistic shaking estimates, (5) the site elevation, (6) information from available liquefaction hazard maps [covering the 9-county San Francisco Bay Area and Ventura County] and California Geological Survey (CGS) Zones of Required Investigation. For bridge sites at which subsurface boring data were available (from CGS' existing database), we calculated Displacement Potential Index values using a methodology developed by Allison Faris and Jiaer Wu. Caltrans' staff will use this hazard-screening tool, along with other tools focused on bridges and foundations, to prioritize site-specific investigations. © 2009 ASCE.
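
    The screening logic lends itself to a simple rule-based score over the listed parameters. A sketch with invented thresholds and weights (the actual Caltrans tool uses its own calibrated criteria and statewide datasets):

```python
def liquefaction_screen(site: dict) -> str:
    """Rule-based screening score; thresholds and weights are hypothetical."""
    score = 0
    if site["distance_to_water_m"] < 150:                        score += 2
    if site["crosses_water"]:                                    score += 1
    if site["deposit_age"] in ("Holocene", "artificial fill"):   score += 3
    if site["pga_475yr_g"] > 0.3:                                score += 2
    if site["elevation_m"] < 15:                                 score += 1
    if site["in_required_investigation_zone"]:                   score += 2
    return "high" if score >= 6 else "moderate" if score >= 3 else "low"

bridge = {"distance_to_water_m": 60, "crosses_water": True,
          "deposit_age": "Holocene", "pga_475yr_g": 0.45,
          "elevation_m": 4, "in_required_investigation_zone": True}
print(liquefaction_screen(bridge))   # -> "high"
```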

  17. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions because of their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessments for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey with in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the propagation of uncertainty on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimation was quantified. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows one i) to discriminate regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) to assess the reliability of the procedure used for hazard assessment.
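
    The Monte Carlo step over a physically based model can be illustrated with the classic infinite-slope factor of safety, FS = [c' + γ z cos²β (1 − m γw/γ) tanφ'] / (γ z sinβ cosβ), sampling the strength parameters from PDFs and taking the fraction of samples with FS < 1 as the cell's failure probability. All parameter distributions below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical PDFs for hillslope-deposit properties in one map cell.
c_kpa = rng.lognormal(np.log(4.0), 0.5, n)       # effective cohesion (kPa)
phi   = np.radians(rng.normal(32.0, 3.0, n))     # friction angle
z_m   = rng.uniform(0.5, 2.5, n)                 # deposit thickness (m)
m_sat = rng.uniform(0.0, 1.0, n)                 # saturated fraction of z
beta  = np.radians(35.0)                         # slope angle (from DEM)
gamma, gamma_w = 19.0, 9.81                      # unit weights (kN/m^3)

num = c_kpa + gamma * z_m * np.cos(beta)**2 \
      * (1.0 - m_sat * gamma_w / gamma) * np.tan(phi)
den = gamma * z_m * np.sin(beta) * np.cos(beta)
fs = num / den

print(f"P(FS < 1) for this cell: {np.mean(fs < 1.0):.3f}")
```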

  18. Revised seismic hazard map for the Kyrgyz Republic

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Ullah, Shahid; Parolai, Stefano; Walker, Richard; Pittore, Massimiliano; Free, Matthew; Fourniadis, Yannis; Villiani, Manuela; Sousa, Luis; Ormukov, Cholponbek; Moldobekov, Bolot; Takeuchi, Ko

    2017-04-01

    As part of a seismic risk study sponsored by the World Bank, a revised seismic hazard map for the Kyrgyz Republic has been produced using the OpenQuake-engine developed by the Global Earthquake Model Foundation (GEM). In this project, an earthquake catalogue spanning the period from 250 BCE to 2014 was compiled and processed through spatial and temporal declustering tools. The territory of the Kyrgyz Republic was divided into 31 area sources defined on the basis of local seismicity, including a buffer zone extending 200 km beyond the border. The results are presented in terms of Peak Ground Acceleration (PGA). In addition, macroseismic intensity estimates, making use of recent intensity prediction equations, were also provided, given that this measure is still widely used in Central Asia. In order to accommodate the associated epistemic uncertainty, three ground motion prediction equations were used in a logic tree structure. A set of representative earthquake scenarios was further identified based on historical data and the nature of the considered faults. The resulting hazard map, as expected, follows the country's seismicity, with the highest levels of hazard in the northeast, south and southwest of the country, and an elevated area around the centre. When considering PGA, the hazard is slightly greater for major urban centres than in previous works (e.g., Abdrakhmatov et al., 2003), although the macroseismic intensity estimates are lower than in previous studies, e.g., Ulomov (1999). For the scenario assessments, the examples that most affect the urban centres assessed are the Issyk Ata fault (in particular for Bishkek), the Chilik and Kemin faults (in particular Balykchy and Karakol), the Ferghana Valley fault system (in particular Osh, Jalal-Abad and Uzgen), the Oinik Djar fault (Naryn) and the central and western Talas-Fergana fault (Talas). Finally, while site effects (in particular, those dependent on the uppermost geological structure) have an obvious effect on the final hazard level, this is still not fully accounted for, even though a nationwide first-order Vs30 model (e.g., from the USGS) is available. Abdrakhmatov, K., Havenith, H.-B., Delvaux, D., Jongmans, D. and Trefois, P. (2003) Probabilistic PGA and Arias Intensity maps of Kyrgyzstan (Central Asia), Journal of Seismology, 7, 203-220. Ulomov, V.I., The GSHAP Region 7 Working Group (1999) Seismic hazard of Northern Eurasia, Annali di Geofisica, 42, 1012-1038.
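
    The logic-tree combination over the three GMPEs reduces, for a mean hazard map, to a weighted average of the branch hazard curves at each site. A sketch with invented curves and weights:

```python
import numpy as np

# Invented annual exceedance rates at one site for three GMPE branches.
pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
branch_rates = {
    "GMPE_A": np.array([3.0e-2, 1.0e-2, 2.5e-3, 4.0e-4, 3.0e-5]),
    "GMPE_B": np.array([2.0e-2, 7.0e-3, 1.5e-3, 2.0e-4, 1.0e-5]),
    "GMPE_C": np.array([4.0e-2, 1.5e-2, 4.0e-3, 8.0e-4, 8.0e-5]),
}
weights = {"GMPE_A": 0.4, "GMPE_B": 0.3, "GMPE_C": 0.3}  # assumed weights

mean_rate = sum(w * branch_rates[b] for b, w in weights.items())
for g, r in zip(pga, mean_rate):
    print(f"PGA {g:4.2f} g: weighted-mean annual exceedance rate {r:.2e}")
```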

  19. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
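
    The adapted risk integral can be pictured as summing a damage-shifted fragility curve against a time-decaying aftershock hazard. The sketch below uses an invented Omori-type rate and invented fragility medians, purely to show the coupling:

```python
import numpy as np
from scipy.stats import lognorm

pga = np.linspace(0.05, 1.5, 60)      # ground-motion levels at the site (g)

def aftershock_rate(pga_levels, days_elapsed, horizon_days=365.0, k=50.0, c=1.0):
    """Invented aftershock hazard: expected events over the horizon from a
    modified-Omori rate k/(t + c), spread over PGA bins by a power law."""
    n_events = k * np.log((days_elapsed + horizon_days + c) / (days_elapsed + c))
    weights = pga_levels ** -2.2
    return n_events * weights / weights.sum()   # events per PGA bin

def collapse_fragility(pga_levels, damaged: bool):
    """Lognormal collapse fragility; mainshock damage lowers the median
    (median values assumed for illustration)."""
    return lognorm.cdf(pga_levels, s=0.5, scale=0.6 if damaged else 0.9)

rate = aftershock_rate(pga, days_elapsed=3.0)
for damaged in (False, True):
    collapse = np.sum(collapse_fragility(pga, damaged) * rate)
    print(f"damaged={damaged}: expected collapses in next year ~ {collapse:.4f}")
```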

  20. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historically and instrumentally documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With the growth of seismological and tectonic data, updated seismic hazard assessment is a worthwhile task for emergency management programs and long-term development plans in urban and rural areas of this region. In the present study, armed with the up-to-date information required for seismic hazard assessment, including geological data and the active tectonic setting for thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue, probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations. The logic tree method is utilized to capture the epistemic uncertainty of the seismic hazard assessment in the delineation of the seismic sources and the selection of attenuation relations. The results are compared to a recent practice in code-prescribed seismic hazard for the region and are discussed in detail to explore their variation in each branch of the logic tree approach. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.

  1. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims to analyse the fluvial and the pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundary conditions for the 2-dimensional hydrodynamic inundation simulation of Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty shown by percentile maps. The results are critically discussed and their usage in flood risk management is outlined.
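
    The probability of the combined event, under the stated independence assumption, is the fluvial event probability multiplied by the chance that a rainstorm falls within the high-stage window. A minimal numeric sketch with hypothetical values:

```python
# Hypothetical inputs for the combination logic.
p_fluvial_annual = 1.0 / 50.0     # 50-yr fluvial flood
p_storm_per_day  = 0.02           # heavy convective rainstorm on a given day
high_stage_days  = 20.0           # days per year near the fluvial flood peak

# Chance of at least one rainstorm during the high-stage window.
p_coincide = 1.0 - (1.0 - p_storm_per_day) ** high_stage_days
p_combined = p_fluvial_annual * p_coincide

print(f"P(coincidence | fluvial event) = {p_coincide:.3f}")
print(f"annual P(combined event)       = {p_combined:.5f}"
      f"  (~{1.0 / p_combined:.0f}-yr return period)")
```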

  2. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background: When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives: We developed a unified framework for probabilistic dose–response assessment. Methods: We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results: Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions: Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation: Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID: 26006063

  3. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, taking into account the joint probability distribution of the PGA and magnitude of the earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred at Dunaharaszti, in Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed at some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
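
    In the performance-based scheme, the annual rate of liquefaction is the conditional probability of liquefaction summed against the disaggregated (PGA, M) hazard increments. The sketch below wires a toy conditional model to invented disaggregation rates; a real analysis would use Cetin et al. (2004) or the Boulanger-Idriss procedures:

```python
import numpy as np

# Invented joint hazard increments: annual rate of (PGA, M) bins at a site,
# of the kind obtained from PSHA disaggregation.
pga_bins = np.array([0.05, 0.10, 0.20, 0.40])
mag_bins = np.array([5.0, 6.0, 7.0])
d_rate = np.array([[4.0e-3, 1.5e-3, 4.0e-4],   # rows: PGA, cols: magnitude
                   [1.0e-3, 6.0e-4, 2.0e-4],
                   [2.0e-4, 1.5e-4, 8.0e-5],
                   [2.0e-5, 2.0e-5, 1.5e-5]])

def p_liq(pga: float, mag: float) -> float:
    """Toy conditional probability of liquefaction for one soil profile."""
    csr_proxy = pga * (mag / 7.5) ** 1.3        # demand grows with PGA and M
    return 1.0 / (1.0 + np.exp(-(np.log(csr_proxy) + 2.0) / 0.4))

rate_liq = sum(p_liq(p, m) * d_rate[i, j]
               for i, p in enumerate(pga_bins)
               for j, m in enumerate(mag_bins))
print(f"annual rate of liquefaction ~ {rate_liq:.2e}"
      f"  (return period ~ {1.0 / rate_liq:.0f} yr)")
```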

  4. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one-year (2016) probabilistic seismic-hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes, constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short-term seismic activity rates (primarily 2014-2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground-motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and to the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both alternative submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground-shaking measures as well as intensity and include maps showing a high-hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north-central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas-Fort Worth. The chance of experiencing ground motions corresponding to modified Mercalli intensity (MMI) VI or greater is 2%-12% per year in north-central Oklahoma, southern Kansas, and the New Madrid area, similar to the chance of damage caused by natural earthquakes at sites in high-hazard portions of California. Hazard is also significant in the Raton basin of Colorado/New Mexico; north-central Arkansas; Dallas-Fort Worth, Texas; and in a few other areas. Hazard probabilities are much lower (by about half or more) for exceeding MMI VII or VIII. Hazard is 3- to 10-fold higher near some areas of active induced earthquakes than in the 2014 USGS National Seismic Hazard Model (NSHM), which did not consider induced earthquakes. This study, in conjunction with the LandScan™ Database (2013), indicates that about 8 million people live in areas of active injection wells that have a greater than 1% chance of experiencing damaging ground shaking (MMI ≥ VI) in 2016. The final model has high uncertainty, and engineers, regulators, and industry should use these assessments cautiously to make informed decisions on mitigating the potential effects of induced and natural earthquakes.

  5. A multi-source probabilistic hazard assessment of tephra dispersal in the Neapolitan area

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Folch, Arnau; Macedonio, Giovanni; Tonini, Roberto

    2015-04-01

    In this study we present the results obtained from a long-term Probabilistic Hazard Assessment (PHA) of tephra dispersal in the Neapolitan area. Usual PHA for tephra dispersal needs the definition of eruptive scenarios (usually by grouping eruption sizes and possible vent positions into a limited number of classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. PHA then results from combining simulations considering different volcanological and meteorological conditions through weights associated with their specific probability of occurrence. However, volcanological parameters (i.e., erupted mass, eruption column height, eruption duration, bulk granulometry, fraction of aggregates) typically encompass a wide range of values. Because of such natural variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. In the present study, we use a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with a specific Probability Density Function, and meteorological and volcanological input values are chosen using a stratified sampling method. This procedure allows hazard to be quantified without relying on the definition of scenarios, thus avoiding potential biases introduced by selecting single representative scenarios. Embedding this procedure into the Bayesian Event Tree scheme enables the tephra fall PHA and its epistemic uncertainties to be quantified. We have applied this scheme to analyze long-term tephra fall PHA from Vesuvius and Campi Flegrei in a multi-source paradigm. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used for exploring different meteorological conditions. The results obtained show that PHA accounting for the whole natural variability is consistent with previous probability maps elaborated for Vesuvius and Campi Flegrei on the basis of single representative scenarios, but shows significant differences. In particular, the area characterized by a 300 kg/m2-load exceedance probability larger than 5%, accounting for the whole range of variability (that is, from small violent strombolian to plinian eruptions), is similar to that displayed in the maps based on the medium-magnitude reference eruption, but of smaller extent. This is due to the relatively higher weight of the small-magnitude eruptions considered in this study but neglected in the reference scenario maps. On the other hand, in our new maps the area characterized by a 300 kg/m2-load exceedance probability larger than 1% is much larger than that of the medium-magnitude reference eruption, due to the contribution of plinian eruptions at lower probabilities, again neglected in the reference scenario maps.
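
    Stratified sampling of the volcanological inputs can be sketched in a few lines: draw one sample per equal-probability stratum of each marginal, then shuffle strata between variables, Latin-hypercube style. The marginal distributions below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def stratified_uniform(n: int) -> np.ndarray:
    """One draw per equal-probability stratum of the unit interval."""
    return (np.arange(n) + rng.random(n)) / n

n = 1000
u1 = rng.permutation(stratified_uniform(n))   # independent permutation per
u2 = rng.permutation(stratified_uniform(n))   # variable (Latin hypercube)

# Map strata through hypothetical marginals of the volcanological inputs
# instead of fixing a single representative scenario.
erupted_mass  = 10.0 ** (10.0 + 3.0 * u1)     # kg, small to plinian sizes
column_height = 5.0 + 30.0 * u2 ** 1.5        # km

print(f"sampled mass range:   {erupted_mass.min():.2e} - {erupted_mass.max():.2e} kg")
print(f"median column height: {np.median(column_height):.1f} km")
```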

  6. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than the empirical attenuation relationships used in PSHA, to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites with few or no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
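
    Where the catalog supports it, the empirical side of PTHA reduces to counting exceedances and applying a Poisson model. A minimal sketch follows; the runup catalog and its length are entirely hypothetical stand-ins.

    ```python
    # Empirical tsunami hazard curve from a (fabricated) runup catalog.
    import numpy as np

    # Hypothetical runup observations (m) over an assumed 100-year window
    runups = np.array([0.3, 0.5, 0.8, 1.2, 1.5, 2.1, 3.4, 4.8])
    T_catalog = 100.0

    thresholds = np.array([0.5, 1.0, 2.0, 4.0])
    # Empirical annual rate of exceedance for each runup threshold
    rates = np.array([(runups >= r).sum() for r in thresholds]) / T_catalog
    # Poisson probability of at least one exceedance in a 50-year exposure time
    p50 = 1.0 - np.exp(-rates * 50.0)

    for r, lam, p in zip(thresholds, rates, p50):
        print(f"runup >= {r:.1f} m: rate = {lam:.3f}/yr, P(50 yr) = {p:.2f}")
    ```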

  7. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the software readily available to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).
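
    The core single-risk computation such a tool performs can be sketched by convolving a hazard curve with a fragility model. The rates and the lognormal fragility below are illustrative assumptions, not BYMUR's actual inputs.

    ```python
    # Sketch: annual damage frequency = sum over intensity bins of
    # (occurrence rate of intensity) x P(damage | intensity).
    import numpy as np
    from scipy import stats

    # Hazard: annual exceedance rates at a few PGA levels (illustrative values)
    pga = np.array([0.1, 0.2, 0.3, 0.4, 0.6])           # g
    exceed_rate = np.array([1e-2, 4e-3, 2e-3, 1e-3, 3e-4])

    # Occurrence rate per intensity bin = difference of exceedance rates
    occ_rate = -np.diff(np.append(exceed_rate, 0.0))

    # Fragility: lognormal probability of damage given PGA (assumed median/beta)
    fragility = stats.lognorm(s=0.5, scale=0.35)
    p_damage = fragility.cdf(pga)

    annual_damage_freq = np.sum(occ_rate * p_damage)
    print(f"annual damage frequency = {annual_damage_freq:.2e}")
    ```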

  8. High-Resolution Underwater Mapping Using Side-Scan Sonar

    PubMed Central

    2016-01-01

    The goal of this study is to generate high-resolution sea floor maps using a Side-Scan Sonar (SSS). This is achieved by explicitly taking the SSS operation into account, as follows. First, the raw sensor data is corrected by means of a physics-based SSS model. Second, the data is projected to the sea floor. The errors involved in this projection are thoroughly analysed. Third, a probabilistic SSS model is defined and used to estimate the probability of each sea-floor region being observed. This probabilistic information is then used to weight the contribution of each SSS measurement to the map. Thanks to these models, arbitrary map resolutions can be achieved, even beyond the sensor resolution. Finally, a geometric map building method is presented and combined with the probabilistic approach. The resulting map is composed of two layers. The echo intensity layer holds the most likely echo intensities at each point on the sea floor. The probabilistic layer contains information about how confident the user or higher control layers can be in the echo intensity layer data. Experiments have been conducted in a large subsea region. PMID:26821379
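
    The probability-weighted mapping step can be sketched as follows; the measurement arrays and observation probabilities are synthetic stand-ins, not the paper's sensor model.

    ```python
    # Sketch of a two-layer map: probability-weighted echo intensities plus a
    # per-cell confidence layer. All data are fabricated for illustration.
    import numpy as np

    rng = np.random.default_rng(9)

    # Hypothetical SSS measurements: target cell, echo intensity, and the
    # modelled probability that the cell was actually observed by that ping
    n_cells, n_meas = 100, 5000
    cell = rng.integers(0, n_cells, size=n_meas)
    intensity = rng.uniform(0, 1, size=n_meas)
    p_obs = rng.uniform(0.1, 1.0, size=n_meas)

    # Echo-intensity layer: probability-weighted mean intensity per cell
    w_sum = np.bincount(cell, weights=p_obs, minlength=n_cells)
    i_sum = np.bincount(cell, weights=p_obs * intensity, minlength=n_cells)
    echo_layer = np.divide(i_sum, w_sum, out=np.zeros(n_cells), where=w_sum > 0)

    # Probabilistic layer: confidence = 1 - product of per-ping miss probabilities
    conf_layer = np.ones(n_cells)
    np.multiply.at(conf_layer, cell, 1.0 - p_obs)
    conf_layer = 1.0 - conf_layer
    print(f"mean cell confidence = {conf_layer.mean():.2f}")
    ```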

  9. Climate services for adapting landslide hazard prevention measures in the Vrancea Seismic Region

    NASA Astrophysics Data System (ADS)

    Micu, Dana; Balteanu, Dan; Jurchescu, Marta; Sima, Mihaela; Micu, Mihai

    2014-05-01

    The Vrancea Seismic Region covers an area of about 8000 km2 in the Romanian Curvature Carpathians and Subcarpathians and is considered one of Europe's most intensely multi-hazard-affected areas. Due to its geomorphic traits (heterogeneous morphostructural units of flysch mountains and molasse hills and depressions), the area is strongly impacted by extreme hydro-meteorological events, which can aggravate the damage inflicted on a dense network of human settlements. A priori knowledge of future climate change is a useful climate service for local authorities developing regional adaptation strategies and adequate prevention/preparedness frameworks. This paper aims at integrating the results of the high-resolution climate projections over the 21st century (within the FP7 ECLISE project) into the regional landslide hazard assessment. The requirements of users (Civil Protection, land management, local authorities) for this area refer to reliable, high-resolution spatial data on landslide and flood hazard for short- and medium-term risk management strategies. An insight into the future behavior of climate variability in the Vrancea Seismic Region, based on future climate projections of three regional models under three RCPs (2.6, 4.5, 8.5), suggests a clear warming, both annually and seasonally, and a rather limited annual precipitation decrease, but with a strong change in seasonality. A landslide inventory of 2485 cases (shallow and medium-seated earth, debris and rock slides and earth and debris flows) was obtained based on large-scale geomorphological mapping with aerial photo support (GeoEye, DigitalGlobe; provided by GoogleEarth and BingMaps). The landslides are uniformly distributed across the area and are considered representative of the entire morphostructural environment. A landslide susceptibility map was obtained using multivariate statistical analysis (logistic regression), while a relative landslide hazard index was computed based on semi-quantitative spatial multi-criteria evaluation (SMCE). The generation of the landslide hazard maps relies on the heuristic approach, since a historical record of landslide occurrences, necessary to produce magnitude-frequency relations, is lacking. Based on the assumption of Sanderson et al. (1996) that slope morphology has adjusted, since the last glaciation, to the region's "normal" climatic conditions in all aspects including failures, it follows that precipitation of an extreme character would be highly likely to generate landslides. Therefore, in order to represent the landslide-triggering factor, raster maps for both time horizons (present and future), as simulated within each climate scenario, have been used to illustrate the probabilistic seasonal precipitation amounts expected within 30-year and 100-year return periods, respectively. These maps, considered reliable indicators for depicting the changes in landslide occurrence probability, were standardized according to their contribution to hazard before being included in the SMCE together with the susceptibility map. The resulting hazard index maps for the two time horizons were compared with the aim of detecting the future potential climate-induced changes in the spatial patterns of landslide occurrence. The outcomes mostly meet the requirements for updating the regional landslide risk management strategies.

  10. Probability hazard map for future vent opening at Etna volcano (Sicily, Italy).

    NASA Astrophysics Data System (ADS)

    Brancato, Alfonso; Tusa, Giuseppina; Coltelli, Mauro; Proietti, Cristina

    2014-05-01

    Mount Etna is a composite stratovolcano located along the Ionian coast of eastern Sicily. The frequent occurrence of flank eruptions (at an interval of years, mostly concentrated along the NE, S and W rift zones) leads to a high volcanic hazard that, combined with intense urbanization, poses a high volcanic risk. A long-term volcanic hazard assessment, mainly based on the past behaviour of Etna, is the basic tool for the evaluation of this risk. A reliable forecast of where the next eruption will occur is therefore needed. Computer-assisted analysis and probabilistic evaluation provide the relative map, thus allowing identification of the areas prone to the highest hazard. On these grounds, a code such as BET_EF (Bayesian Event Tree_Eruption Forecasting) can support such an analysis (Selva et al., 2012). In the analysis we are performing, a total of 6886 point-vents from the last 4.0 ka of Etna flank activity, spread over an area of 744 km2 (divided into N=2976 square cells with sides of 500 m), allowed us to estimate a pdf by applying a Gaussian kernel. The probability values represent a complete set of mutually exclusive outcomes, and their sum is normalized to one over the investigated area; thus, the basic assumptions of a Dirichlet distribution (the prior distribution set in the BET_EF code (Marzocchi et al., 2004, 2008)) still hold. One fundamental parameter is the equivalent number of data, which depicts our confidence in the best-guess probability. The BET_EF code also works with a likelihood function. This is modelled by a multinomial distribution, with parameters representing the number of vents in each cell and the total number of past data (i.e. the 6886 point-vents). Given the grid of N cells, the final posterior distribution is evaluated by multiplying the a priori Dirichlet probability distribution with the past data in each cell through the likelihood. The probability hazard map shows a tendency for probability to concentrate along the NE and S rifts, as well as the Valle del Bove, increasing the difference in probability between these areas and the rest of the volcanic edifice. It is worth noting that a higher significance is still evident along the W rift, even if not comparable with that of the above-mentioned areas. References Marzocchi W., Sandri L., Gasparini P., Newhall C. and Boschi E.; 2004: Quantifying probabilities of volcanic events: The example of volcanic hazard at Mount Vesuvius, J. Geophys. Res., 109, B11201, doi:10.1029/2004JB003155. Marzocchi W., Sandri, L. and Selva, J.; 2008: BET_EF: a probabilistic tool for long- and short-term eruption forecasting, Bull. Volcanol., 70, 623-632, doi:10.1007/s00445-007-0157-y. Selva J., Orsi G., Di Vito M.A., Marzocchi W. and Sandri L.; 2012: Probability hazard map for future vent opening at the Campi Flegrei caldera, Italy, Bull. Volcanol., 74, 497-510, doi:10.1007/s00445-011-0528-2.
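
    The kernel-density step can be sketched with scipy's Gaussian KDE. The synthetic vent coordinates below stand in for the 6886 mapped point-vents, and the grid geometry is illustrative.

    ```python
    # Sketch: Gaussian-kernel vent-opening probability map on a 500 m grid.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    # Hypothetical past vent locations (km), standing in for the real inventory
    vents = rng.normal(loc=[[0.0], [0.0]], scale=[[5.0], [3.0]], size=(2, 500))

    kde = gaussian_kde(vents)  # Gaussian kernel pdf over vent locations

    # Evaluate on a grid of 500 m cells and normalize so cell probabilities sum to one
    x = np.arange(-15, 15, 0.5)
    y = np.arange(-10, 10, 0.5)
    X, Y = np.meshgrid(x, y)
    pdf = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)
    cell_prob = pdf / pdf.sum()  # prior probability of vent opening per cell
    print(f"highest-probability cell: {cell_prob.max():.4f}")
    ```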

  11. SEISRISK II; a computer program for seismic hazard estimation

    USGS Publications Warehouse

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
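
    The Poisson exceedance calculation at the heart of this kind of program can be sketched for a single site and source zone. The Gutenberg-Richter parameters and the toy attenuation relation below are illustrative assumptions, not SEISRISK II's actual inputs.

    ```python
    # Sketch: annual exceedance rate at one site from one source zone, under
    # the Poisson assumption used by SEISRISK II. All numbers are illustrative.
    import numpy as np
    from scipy import stats

    # Gutenberg-Richter rates for discrete magnitude bins
    mags = np.arange(5.0, 7.6, 0.5)
    a, b = 3.0, 1.0  # assumed G-R parameters
    rate_ge = 10 ** (a - b * mags)                # annual rate of M >= m
    rate_bin = -np.diff(np.append(rate_ge, 0.0))  # rate per magnitude bin

    dist_km = 30.0  # site-to-source distance
    # Toy attenuation: median ln(PGA in g) as a function of M and R, lognormal sigma
    ln_median = -1.0 + 0.6 * mags - 1.2 * np.log(dist_km)
    sigma = 0.6

    target = 0.2  # g
    # P(PGA > target | event in each bin), integrating ground-motion variability
    p_exc = 1.0 - stats.norm.cdf((np.log(target) - ln_median) / sigma)

    annual_rate = np.sum(rate_bin * p_exc)
    p_50yr = 1.0 - np.exp(-annual_rate * 50.0)  # Poisson occurrence assumption
    print(f"P(PGA > {target} g in 50 yr) = {p_50yr:.3f}")
    ```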

  12. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computational power. The authors have used this approach for risk assessment towards identifying the effectiveness-profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
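
    As a rough illustration of scenario reduction (using a simple greedy forward selection in place of the paper's mixed-integer linear program), the sketch below selects a subset of synthetic scenarios whose summed contributions track the full hazard curve. All data are fabricated stand-ins.

    ```python
    # Greedy scenario reduction against a synthetic hazard curve; a simplified
    # stand-in for the MILP formulation described in the abstract.
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "full" catalog: exceedance contribution of each scenario to a
    # hazard curve sampled at a few intensity levels
    n_scen, n_levels = 2000, 8
    contrib = rng.exponential(1e-4, size=(n_scen, n_levels))
    full_curve = contrib.sum(axis=0)

    # Forward selection: repeatedly add the scenario best aligned with the residual
    selected: list[int] = []
    for _ in range(60):
        reduced = contrib[selected].sum(axis=0) if selected else np.zeros(n_levels)
        residual = full_curve - reduced
        scores = contrib @ residual
        scores[selected] = -np.inf      # never pick the same scenario twice
        selected.append(int(np.argmax(scores)))

    reduced_curve = contrib[selected].sum(axis=0)
    scale = full_curve.sum() / reduced_curve.sum()  # single weight to recover rates
    err = np.abs(full_curve - scale * reduced_curve) / full_curve
    print(f"{len(selected)} scenarios, max relative error = {err.max():.2%}")
    ```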

  13. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with environmental pollutants. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the input of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risk of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not represent a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.

  14. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites can then be combined to produce a seismic hazard map for the region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership-class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face in applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
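
    The final combination step, turning per-rupture simulation results and rupture rates into a hazard curve, can be sketched as follows. The rupture rates and simulated intensities are synthetic stand-ins for an earthquake rupture forecast and CyberShake seismograms.

    ```python
    # Sketch: site hazard curve from per-rupture rates and simulated intensities.
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical annual rate of each rupture and simulated intensity measures
    # (e.g., 3 s SA) per rupture at one site
    n_rup = 1000
    rup_rate = rng.exponential(1e-4, size=n_rup)
    sims = rng.lognormal(mean=np.log(0.1), sigma=0.8, size=(n_rup, 20))

    levels = np.array([0.05, 0.1, 0.2, 0.4])  # SA levels (g)
    # P(IM > level | rupture) estimated from the simulated seismograms
    p_exc = (sims[:, :, None] > levels).mean(axis=1)  # shape (n_rup, n_levels)

    # Total annual exceedance rate, then Poisson probability in 50 years
    annual_rate = (rup_rate[:, None] * p_exc).sum(axis=0)
    hazard = 1.0 - np.exp(-annual_rate * 50.0)
    for lv, hz in zip(levels, hazard):
        print(f"P(SA > {lv:.2f} g in 50 yr) = {hz:.3f}")
    ```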

  15. Automated segmentation of the prostate in 3D MR images using a probabilistic atlas and a spatially constrained deformable model.

    PubMed

    Martin, Sébastien; Troccaz, Jocelyne; Daanen, Vincent

    2010-04-01

    The authors present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. The approach requires the use of an anatomical atlas, which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to obtain a probabilistic map on the atlas. The segmentation is then realized through a two-stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves toward the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model and a statistical shape model. During the evolution of the surface, the probabilistic segmentation allows the introduction of a spatial constraint that prevents the deformable surface from leaking into an unlikely configuration. The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint is useful for increasing the robustness of the deformable model compared with a deformable surface driven only by an image appearance model.

  16. Seismic hazard assessment of the Kivu rift segment based on a new seismotectonic zonation model (western branch, East African Rift system)

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Mulumba, Jean-Luc; Sebagenzi, Mwene Ntabwoba Stanislas; Bondo, Silvanos Fiama; Kervyn, François; Havenith, Hans-Balder

    2017-10-01

    In the frame of the Belgian GeoRisCA multi-risk assessment project, focusing on the Kivu and northern Tanganyika rift region in Central Africa, a new probabilistic seismic hazard assessment has been performed for the Kivu rift segment in the central part of the western branch of the East African rift system. As the geological and tectonic setting of this region is incompletely known, especially the part lying in the Democratic Republic of the Congo, we compiled homogeneous cross-border tectonic and neotectonic maps. The assessment is based on a new earthquake catalogue built from the ISC reviewed earthquake catalogue and supplemented by other local catalogues and new macroseismic epicenter data spanning 126 years, with 1068 events. The magnitudes have been homogenized to Mw and aftershocks removed. The final catalogue used for the seismic hazard assessment spans 60 years, from 1955 to 2015, with 359 events and a magnitude of completeness of 4.4. The seismotectonic zonation into 7 seismic source areas was done on the basis of the regional geological structure, neotectonic fault systems, basin architecture and the distribution of thermal springs and earthquake epicenters. The Gutenberg-Richter seismic hazard parameters were determined by least-squares linear fitting and the maximum-likelihood method. Seismic hazard maps have been computed using existing attenuation laws with the CRISIS 2012 software. We obtained higher PGA values (475-year return period) for the Kivu rift region than previous estimates. They also vary laterally as a function of the tectonic setting, with the lowest values in the volcanically active Virunga-Rutshuru zone, the highest in the currently non-volcanic parts of Lake Kivu, the Rusizi valley and the North Tanganyika rift zone, and intermediate values in the regions flanking the axial rift zone.
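
    The Gutenberg-Richter parameter estimation can be sketched with both estimators mentioned above. The synthetic magnitudes below stand in for the 359-event catalogue, with Mc = 4.4 following the stated completeness magnitude.

    ```python
    # Sketch: b-value by maximum likelihood (Aki-Utsu) and by least squares.
    import numpy as np

    rng = np.random.default_rng(3)
    mc, beta_true = 4.4, np.log(10)  # simulate with b = 1
    mags = mc + rng.exponential(1 / beta_true, size=359)  # fabricated catalogue

    # Maximum-likelihood b-value (Aki-Utsu), with the usual half-bin correction
    dm = 0.1
    b_ml = np.log10(np.e) / (mags.mean() - (mc - dm / 2))

    # Least-squares fit to the cumulative Gutenberg-Richter relation
    bins = np.arange(mc, mags.max(), 0.1)
    log_n = np.log10([(mags >= m).sum() for m in bins])
    b_ls = -np.polyfit(bins, log_n, 1)[0]

    print(f"b (max. likelihood) = {b_ml:.2f}, b (least squares) = {b_ls:.2f}")
    ```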

  17. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake. These distributions indicated that the 90% confidence intervals of the hazard quotient were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 for aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) calculated using fuzzy set theory, and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method has its own set of advantages and limitations, being based on different theories; therefore, the appropriate method should be selected on a case-by-case basis to quantify the effects of uncertainties on the ecological risk assessment. The probabilistic approach was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.
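
    The MCS and LHS computations of a hazard quotient can be sketched as follows. Both distributions and their parameters are illustrative assumptions, not the exposure and toxicity data used in the study.

    ```python
    # Sketch: hazard quotient (exposure / toxicity threshold) by plain Monte
    # Carlo versus Latin hypercube sampling. All distributions are assumed.
    import numpy as np
    from scipy.stats import qmc, lognorm

    conc = lognorm(s=1.0, scale=0.05)   # exposure concentration (ug/L), assumed
    tox = lognorm(s=0.5, scale=0.5)     # toxicity threshold (ug/L), assumed

    n = 10_000
    rng = np.random.default_rng(5)

    # Monte Carlo: independent uniforms pushed through the inverse CDFs
    u_mc = rng.uniform(size=(n, 2))
    hq_mc = conc.ppf(u_mc[:, 0]) / tox.ppf(u_mc[:, 1])

    # Latin hypercube: stratified uniforms, same inverse-CDF transform
    u_lhs = qmc.LatinHypercube(d=2, seed=5).random(n)
    hq_lhs = conc.ppf(u_lhs[:, 0]) / tox.ppf(u_lhs[:, 1])

    for name, hq in [("MCS", hq_mc), ("LHS", hq_lhs)]:
        print(f"{name}: mean HQ = {hq.mean():.2f}, P(HQ > 1) = {(hq > 1).mean():.2%}")
    ```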

  18. USGS Online Short-term Hazard Maps: Experiences in the First Year of Implementation

    NASA Astrophysics Data System (ADS)

    Gerstenberger, M. C.; Jones, L. M.

    2005-12-01

    In May of 2005, following review by the California Earthquake Prediction Evaluation Council, the USGS launched a website that displays the probability of experiencing Modified Mercalli Intensity VI in the next 24 hours. With a forecast based on a relatively simple application of the Gutenberg-Richter relationship and the modified Omori law, the maps are primarily aimed at providing information related to aftershock hazard. Initial response to the system has been mostly positive but has required an effort toward public education. In particular, it has been difficult to communicate the important difference between a probabilistic forecast and a binary earthquake "prediction". Even though probabilities are familiar from weather maps and terms such as Modified Mercalli Intensity have seen recent use, these and other terms are often misunderstood by the media and the public. Additionally, the fact that our methodology is not targeted at large independent events has sometimes been difficult to convey to scientists as well as the public. Initial interest in the webpages has been high, with more than 700,000 individual visits between going live in late May 2005 and the end of June 2005. This accounts for more than 1/3 of the visits to the USGS-Pasadena webpages in that period. Visits declined through July and August, but daily visits still average around 3,000.
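
    A forecast of this kind can be sketched with a Reasenberg-Jones style combination of the Gutenberg-Richter relationship and the modified Omori law. The parameter values below are approximate generic California values, and the magnitude tied to MMI VI is an assumption.

    ```python
    # Sketch: probability of at least one damaging aftershock in a 24 h window.
    # lambda(t, M >= m) = 10**(a + b*(Mmain - m)) / (t + c)**p
    import numpy as np

    a, b, c, p = -1.67, 0.91, 0.05, 1.08  # approximate generic California values
    m_main, m_min = 6.0, 4.8              # mainshock; magnitude for ~MMI VI (assumed)

    t0, t1 = 2.0, 3.0  # forecast window: days 2-3 after the mainshock
    # Integrate the modified Omori rate over the window (valid for p != 1)
    n_expected = 10 ** (a + b * (m_main - m_min)) * (
        ((t1 + c) ** (1 - p) - (t0 + c) ** (1 - p)) / (1 - p)
    )
    prob = 1.0 - np.exp(-n_expected)  # Poisson probability of >= 1 such event
    print(f"P(M >= {m_min} aftershock in the next 24 h) = {prob:.1%}")
    ```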

  19. Probabilistic-numerical assessment of pyroclastic current hazard at Campi Flegrei and Naples city: Multi-VEI scenarios as a tool for "full-scale" risk management.

    PubMed

    Mastrolorenzo, Giuseppe; Palladino, Danilo M; Pappalardo, Lucia; Rossano, Sergio

    2017-01-01

    The Campi Flegrei volcanic field (Italy) poses very high risk to the highly urbanized Neapolitan area. Its eruptive history was dominated by explosive activity producing pyroclastic currents (hereafter PCs) ranging in scale from localized base surges to regional flows. Here we apply probabilistic numerical simulation approaches to produce PC hazard maps, based on a comprehensive spectrum of flow properties and vent locations. These maps are incorporated in a Geographic Information System (GIS) and provide all probable Volcanic Explosivity Index (VEI) scenarios from different source vents in the caldera, relevant for risk management planning. For each VEI scenario, we report the conditional probability for PCs (i.e., the probability for a given area to be affected by the passage of PCs in case of a PC-forming explosive event) and the related dynamic pressure. Model results indicate that PCs from VEI<4 events would be confined within the Campi Flegrei caldera, PC propagation being impeded by the northern and eastern caldera walls. Conversely, PCs from VEI 4-5 events could invade a wide area beyond the northern caldera rim, as well as part of the Naples metropolitan area to the east. A major controlling factor of PC dispersal is the location of the vent area. PCs from the potentially largest eruption scenarios (analogous to the ~15 ka, VEI 6 Neapolitan Yellow Tuff or even the ~39 ka, VEI 7 Campanian Ignimbrite extreme event) would affect a large part of the Campanian Plain to the north and the city of Naples to the east. Thus, in case of renewal of eruptive activity at Campi Flegrei, up to 3 million people would potentially be exposed to volcanic hazard, pointing out the urgency of an emergency plan. Considering the present level of uncertainty in forecasting the type, size and location of a future eruption (essentially based on statistical analysis of previous activity), we suggest that appropriate planning measures should address at least the VEI 5 reference scenario (at least 2 occurrences documented in the last 10 ka).

  1. Identification of elements at risk for a credible tsunami event for Istanbul

    NASA Astrophysics Data System (ADS)

    Hancilar, U.

    2012-01-01

    Physical and social elements at risk are identified for a credible tsunami event for Istanbul. For this purpose, inundation maps resulting from probabilistic tsunami hazard analysis for a 10% probability of exceedance in 50 yr are utilised in combination with the geo-coded inventories of building stock, lifeline systems and demographic data. The built environment on Istanbul's shorelines that is exposed to tsunami inundation comprises residential, commercial, industrial, public (governmental/municipal, schools, hospitals, sports and religious), infrastructure (car parks, garages, fuel stations, electricity transformer buildings) and military buildings, as well as piers and ports, gas tanks and stations and other urban elements (e.g., recreational facilities). Along the Marmara Sea shore, Tuzla shipyards and important port and petrochemical facilities at Ambarlı are expected to be exposed to tsunami hazard. Significant lifeline systems of the city of Istanbul such as natural gas, electricity, telecommunication and sanitary and waste-water transmission, are also under the threat of tsunamis. In terms of social risk, it is estimated that there are about 32 000 inhabitants exposed to tsunami hazard.

  2. Some aspects of risks and natural hazards in the rainfall variability space of Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Derron, Marc-Henri; Jaboyedoff, Michel; Penna, Ivanna; Kanevski, Mikhaïl

    2014-05-01

    Rwanda faces challenges related to its dense and dispersed population. Risk assessment is becoming important in order to reduce the extent and damage of natural disasters. Rwanda is a country with a diversity of landscapes. Its mountains and marshes have been treated by the population (currently around 11 million) as water, forest and grazing reserves. Due to geologic and climate conditions, the country is subject to different natural processes, in particular hydrological events (flooding and landslides), as well as earthquakes and volcanism, with which communities in the western part have to live. In recent years, population expansion onto new land by clearing forests and draining marshes seems to be acting as an aggravating factor. Therefore, a risk assessment for rainfall-related hazards requires a deep understanding of the precipitation patterns. Based on satellite image interpretation, historical event reports, rainfall variability mapping and probabilistic analyses of events, this case study aims to produce an overview and a preliminary assessment of the hazard scenario in Rwanda.

  3. Active Fault Near-Source Zones Within and Bordering the State of California for the 1997 Uniform Building Code

    USGS Publications Warehouse

    Petersen, M.D.; Toppozada, Tousson R.; Cao, T.; Cramer, C.H.; Reichle, M.S.; Bryant, W.A.

    2000-01-01

    The fault sources in the Project 97 probabilistic seismic hazard maps for the state of California were used to construct maps for defining near-source seismic coefficients, Na and Nv, incorporated in the 1997 Uniform Building Code (ICBO 1997). The near-source factors are based on the distance from a known active fault that is classified as either Type A or Type B. To determine the near-source factor, four pieces of geologic information are required: (1) recognizing a fault and determining whether or not the fault has been active during the Holocene, (2) identifying the location of the fault at or beneath the ground surface, (3) estimating the slip rate of the fault, and (4) estimating the maximum earthquake magnitude for each fault segment. This paper describes the information used to produce the fault classifications and distances.

  4. Tsunami Loss Assessment For Istanbul

    NASA Astrophysics Data System (ADS)

    Hancilar, Ufuk; Cakti, Eser; Zulfikar, Can; Demircioglu, Mine; Erdik, Mustafa

    2010-05-01

    Tsunami risk and loss assessment incorporating inundation mapping in Istanbul and the Marmara Sea region is presented in this study. The city of Istanbul is under the threat of earthquakes expected to originate from the Main Marmara branch of the North Anatolian Fault System. In the Marmara region the earthquake hazard has reached very high levels, with a 2% annual probability of occurrence of a magnitude 7+ earthquake on the Main Marmara Fault. Istanbul is the biggest city of the Marmara region, as well as of Turkey, with almost 12 million inhabitants. It is home to 40% of the industrial facilities in Turkey and operates as the financial and trade hub of the country. Past earthquakes have shown that the structural reliability of residential and industrial buildings, as well as that of lifelines including port and harbor structures in the country, is questionable. These facts make the management of earthquake risks imperative for the reduction of physical and socio-economic losses. The level of expected tsunami hazard in Istanbul is low compared with the earthquake hazard. Yet the assets at risk along the shores of the city make a thorough assessment of tsunami risk imperative. Important residential and industrial centres exist along the shores of the Marmara Sea. Particularly along the northern and eastern shores there is an uninterrupted settlement pattern with industries, businesses, commercial centres and ports and harbours in between. Following the inundation maps resulting from deterministic and probabilistic tsunami hazard analyses, vulnerability and risk analyses are presented and the socio-economic losses are estimated. This study is part of the EU-supported FP6 project 'TRANSFER'.

  5. Probabilistic storm surge inundation maps for Metro Manila based on Philippine public storm warning signals

    NASA Astrophysics Data System (ADS)

    Tablazon, J.; Caro, C. V.; Lagmay, A. M. F.; Briones, J. B. L.; Dasallas, L.; Lapidez, J. P.; Santiago, J.; Suarez, J. K.; Ladiero, C.; Gonzalo, L. A.; Mungcal, M. T. F.; Malano, V.

    2015-03-01

    A storm surge is the sudden rise of sea water above the astronomical tide, generated by an approaching storm. This event poses a major threat to Philippine coastal areas, as manifested by Typhoon Haiyan on 8 November 2013. This hydro-meteorological hazard was one of the main reasons for the high number of casualties due to the typhoon, with 6300 deaths. It became evident that developing storm surge inundation maps is of utmost importance. To develop these maps, the Nationwide Operational Assessment of Hazards under the Department of Science and Technology (DOST-Project NOAH) simulated historical tropical cyclones that entered the Philippine Area of Responsibility. The Japan Meteorological Agency storm surge model was used to simulate storm surge heights. The frequency distribution of the maximum storm surge heights was calculated using simulation results for tropical cyclones under a specific public storm warning signal (PSWS) that passed through a particular coastal area. This determines the storm surge height corresponding to a given probability of occurrence. The storm surge heights from the model were added to the maximum astronomical tide data from the WXTide software. The team then created inundation maps for a specific PSWS using the probability of exceedance derived from the frequency distribution. Buildings and other structures were assigned a probability of exceedance depending on their occupancy category, i.e., 1% probability of exceedance for critical facilities, 10% for special occupancy structures, and 25% for standard occupancy and miscellaneous structures. The maps produced show the storm-surge-vulnerable areas in Metro Manila, with flood depths of up to 4 m and an extent of up to 6.5 km from the coastline. This information can help local government units in developing early warning systems, disaster preparedness and mitigation plans, vulnerability assessments, risk-sensitive land use plans, shoreline defense efforts, and coastal protection measures. These maps can also determine the best areas to build critical structures, or at least the level of protection these structures need should they be built in hazard areas. Moreover, they will support the local government units' mandate to raise public awareness, disseminate information about storm surge hazards, and implement appropriate countermeasures for a given PSWS.
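
    The step from simulated surge heights to a probability-of-exceedance value can be sketched as follows; the Gumbel-distributed surge sample below is a synthetic stand-in for the model results under one PSWS.

    ```python
    # Sketch: empirical exceedance probabilities from simulated surge heights,
    # evaluated at the occupancy-category probabilities named in the abstract.
    import numpy as np

    rng = np.random.default_rng(11)

    # Hypothetical maximum surge heights (m) simulated for cyclones under one PSWS
    surge = rng.gumbel(loc=1.0, scale=0.5, size=200)

    # Empirical probability of exceedance from the frequency distribution
    heights = np.sort(surge)[::-1]
    p_exceed = np.arange(1, len(heights) + 1) / (len(heights) + 1)

    # Height at a target exceedance probability (max astronomical tide added after)
    for target in (0.01, 0.10, 0.25):   # critical / special / standard occupancy
        h = np.interp(target, p_exceed, heights)
        print(f"P = {target:.0%}: surge = {h:.2f} m (+ max astronomical tide)")
    ```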

  6. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    NASA Astrophysics Data System (ADS)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping are fundamental components of hazard assessment and are of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used to compute the spatial probability of landslide occurrence, using the pixel as the mapping unit. The methods are based on linear relationships between landslides and nine conditioning factors (altimetry, slope angle, aspect, curvature, distance to streams, wetness index, contributing area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9.5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions of Furnas Volcano. These materials, together with the steepness of the slopes (38.9% of the area has slope angles higher than 35°, reaching a maximum of 87.5°), make the area very prone to landslide activity. A total of 1,495 shallow landslides were mapped (at 1:5,000 scale) and included in a GIS database. The total affected area is 401,744 m2 (4.5% of the study area). Most slope movements are translational slides frequently evolving into debris flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1.5 m and the volume is usually smaller than 700 m3. For modelling purposes, the landslides were randomly divided into two sub-datasets: a modelling dataset with 748 events (2.2% of the study area) and a validation dataset with 747 events (2.3% of the study area). The susceptibility algorithms obtained with the different probabilistic techniques were rated individually using success-rate and prediction-rate curves. The best model performance was obtained with the logistic regression, although the results from the different methods do not show significant differences in either success or prediction rate curves. This evidence revealed that: (1) the modelling landslide dataset is representative of the characteristics of the entire landslide population; and (2) the increase in complexity and robustness of the probabilistic methodology did not produce a significant increase in success or prediction rates. Therefore, it was concluded that the resolution and quality of the input variables are much more important than the probabilistic model chosen to assess landslide susceptibility. This work was developed within the VOLCSOILRISK project (Volcanic Soils Geotechnical Characterization for Landslide Risk Mitigation), supported by Direcção Regional da Ciência e Tecnologia - Governo Regional dos Açores.
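
    The success-rate validation can be sketched as follows; the susceptibility scores and landslide labels below are synthetic stand-ins for the mapped datasets.

    ```python
    # Sketch: success-rate curve for a susceptibility model. Pixels are ranked
    # from most to least susceptible, tracking the cumulative fraction of
    # landslide pixels captured. All data are fabricated.
    import numpy as np

    rng = np.random.default_rng(2)

    n_pixels = 100_000
    score = rng.uniform(size=n_pixels)                               # susceptibility
    landslide = rng.uniform(size=n_pixels) < 0.022 * (0.5 + score)   # ~2.2% affected

    order = np.argsort(score)[::-1]
    captured = np.cumsum(landslide[order]) / landslide.sum()
    area_fraction = np.arange(1, n_pixels + 1) / n_pixels

    # Area under the success-rate curve (trapezoidal rule)
    auc = np.sum((captured[1:] + captured[:-1]) / 2 * np.diff(area_fraction))
    print(f"AUC = {auc:.3f}")
    for f in (0.1, 0.2, 0.3):
        print(f"top {f:.0%} of area captures {captured[int(f * n_pixels) - 1]:.1%} of landslides")
    ```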

  7. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  8. Long-term volcanic hazard assessment on El Hierro (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.

    2014-07-01

    Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Although the Canary Islands are a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted there. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and likely style, of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.

  9. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
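
    The logic-tree combination step can be sketched by weighting branch hazard curves. The four branch curves and weights below are illustrative, not the study's source or ground-motion models.

    ```python
    # Sketch: weighted mean hazard curve over epistemic logic-tree branches.
    import numpy as np

    # Hazard curves (annual exceedance rates at fixed PGA levels) from four
    # hypothetical branches, e.g. 2 source models x 2 ground-motion models
    pga = np.array([0.05, 0.1, 0.2, 0.4])
    branch_curves = np.array([
        [3e-3, 1.0e-3, 3e-4, 6e-5],
        [4e-3, 1.5e-3, 4e-4, 8e-5],
        [2e-3, 0.8e-3, 2e-4, 4e-5],
        [5e-3, 2.0e-3, 6e-4, 1e-4],
    ])
    branch_weights = np.array([0.3, 0.2, 0.3, 0.2])  # must sum to one

    mean_curve = branch_weights @ branch_curves
    for x, r in zip(pga, mean_curve):
        print(f"PGA {x:.2f} g: mean annual exceedance rate = {r:.1e}")
    ```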

  10. Risk assessment of CST-7 proposed waste treatment and storage facilities Volume I: Limited-scope probabilistic risk assessment (PRA) of proposed CST-7 waste treatment & storage facilities. Volume II: Preliminary hazards analysis of proposed CST-7 waste storage & treatment facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, K.

    1994-06-01

    In FY 1993, the Los Alamos National Laboratory Waste Management Group [CST-7 (formerly EM-7)] requested the Probabilistic Risk and Hazards Analysis Group [TSA-11 (formerly N-6)] to conduct a study of the hazards associated with several CST-7 facilities. Among these facilities are the Hazardous Waste Treatment Facility (HWTF), the HWTF Drum Storage Building (DSB), and the Mixed Waste Receiving and Storage Facility (MWRSF), which are proposed for construction beginning in 1996. These facilities are needed to upgrade the Laboratory's storage capability for hazardous and mixed wastes and to provide treatment capabilities for wastes in cases where offsite treatment is not available or desirable. These facilities will assist Los Alamos in complying with federal and state regulations.

  11. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types contribute to the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all possible tsunamigenic sources, from seismic events, to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  12. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull, Frechet, and three-parameter Weibull distributions. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of the occurrence of earthquakes for different elapsed times using these three distributions. We used the Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three for this region.
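
    The conditional-probability computation for one of the candidate distributions can be sketched with scipy. The inter-event times below are fabricated, and the two-parameter Weibull is fitted with its location fixed at zero.

    ```python
    # Sketch: fit a Weibull distribution to inter-event times and compute the
    # conditional probability of an earthquake in the next dt years given that
    # t years have already elapsed since the last event.
    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical inter-event times (years) of M >= 6.0 earthquakes
    times = np.array([8.0, 12.5, 15.0, 21.0, 9.5, 30.0, 18.0, 11.0])

    shape, _, scale = weibull_min.fit(times, floc=0)  # two-parameter fit
    dist = weibull_min(shape, scale=scale)

    t, dt = 20.0, 10.0
    p_cond = (dist.cdf(t + dt) - dist.cdf(t)) / dist.sf(t)
    print(f"Weibull shape = {shape:.2f}, scale = {scale:.1f} yr")
    print(f"P(event in next {dt:.0f} yr | {t:.0f} yr elapsed) = {p_cond:.2f}")
    ```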

  13. Speech processing using maximum likelihood continuity mapping

    DOEpatents

    Hogden, John E.

    2000-01-01

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  15. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters, after which large-earthquake occurrence gives way to quiescence and only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard results are estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard for the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
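
    A minimal sketch of a three-part, time-dependent hazard function follows: a decaying term from the last cluster, a rising term toward the next cluster, and a constant background from small-to-moderate events. The functional forms and parameters are illustrative assumptions, not the authors' model.

    ```python
    # Sketch of a three-part hazard rate; forms and parameters are assumed.
    import numpy as np

    def hazard(t, lam_bg=0.02, a=0.5, decay=0.2, rise=0.001):
        h_last = a * np.exp(-decay * t)   # decreasing term after the last cluster
        h_next = rise * t                 # increasing term toward the next cluster
        return h_last + h_next + lam_bg   # plus constant background hazard

    for t in (1.0, 10.0, 50.0, 100.0):
        print(f"t = {t:5.0f} yr: hazard rate = {hazard(t):.3f} /yr")
    ```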

  16. New Science Applications Within the U.S. National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.; Forson, C. K.; Horrillo, J. J.; Nicolsky, D.

    2017-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is a collaborative State and Federal program which supports consistent and cost-effective tsunami preparedness and mitigation activities at a community level. The NTHMP is developing a new five-year Strategic Plan based on the 2017 Tsunami Warning, Education, and Research Act as well as recommendations of the 2017 NTHMP External Review Panel. Many NTHMP activities are based on the best available scientific methods through the NTHMP Mapping and Modeling Subcommittee (MMS). The primary activities for the MMS member States are to characterize significant tsunami sources, numerically model those sources, and create tsunami inundation maps for evacuation planning. This work remains a focus for many unmapped coastlines. With the lessons learned from the 2004 Indian Ocean and 2011 Tohoku Japan tsunamis, where both immediate risks and long-term recovery issues were recognized, the NTHMP MMS is expanding efforts into other areas that address community resilience. Tsunami evacuation modeling based on both pedestrian and vehicular modes of transportation is being developed by NTHMP States. Products include tools for the public to create personal evacuation maps. New tsunami response planning tools are being developed for both maritime and coastal communities. Maritime planning includes tsunami current-hazard maps for in-harbor and offshore response activities. Multi-tiered tsunami evacuation plans are being developed in some states to address local- versus distant-source tsunamis, as well as real-time evacuation plans, or "playbooks," for distant-source tsunamis forecast to be less than the worst-case flood event. Products to assist community mitigation and recovery are being developed at a State level. Harbor Improvement Reports, which evaluate the impacts of currents, sediment, and debris on harbor infrastructure, include direct mitigation activities for Local Hazard Mitigation Plans. Building code updates in the five Pacific states will include new sections on tsunami load analysis of structures and require Tsunami Design Zones based on probabilistic analyses. Guidance for community recovery planning has also been initiated. These new projects are being piloted by some States and will help create guidance for other States in the future.

  17. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood above the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained using the 100-year Flood Insurance Rate Maps (FIRMs) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.
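
    The two-stage scheme can be sketched as below. This is a hedged reading of the abstract: the classifier choice, the feature set, the per-class height thresholds and the PTBC step are all illustrative assumptions, since the paper's exact formulation is not reproduced here.

```python
# Stage 1: supervised classification gives P(watershed in class k).
# Stage 2: a PTBC-style step converts class probabilities into a pixel's
# probability of lying in the 100-year floodplain.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))        # watershed features: topography, climate, ... (toy)
y = rng.integers(0, 3, size=200)     # class 0/1/2: low/medium/high 100-yr water height

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
class_probs = clf.predict_proba(X[:5])   # P(class k) for five watersheds

# Hypothetical per-class height-above-stream thresholds: a pixel floods if
# its height above the nearest stream falls below the class threshold.
thresholds = np.array([2.0, 5.0, 10.0])  # metres
pixel_hand = 4.0                         # height above nearest drainage for one pixel
p_flooded = class_probs @ (pixel_hand < thresholds).astype(float)
print(p_flooded)                         # flood probability of this pixel per watershed
```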

  18. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.

  19. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is based exclusively on active faults and geodynamic finite element input models, with seismic catalogues used only for a posterior comparison. We applied the model in the External Dinarides, a slowly deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters, together with estimates of its slip rate. By default, this model releases all deformation along the active faults. The FEM model is based on a numerical geodynamic model developed for the study region; in this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding fault activity rates, earthquake rates and the final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters, constructing corresponding branches of a seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree, taking the mean value at the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which input parameters influence the final hazard results, and to what degree. The comparison shows that the deformation model, with its internal variability, and the choice of the ground motion prediction equations (GMPEs) are the most influential parameters; both have a significant effect on the hazard results. Good knowledge of the existence of active faults and of their geometric and activity characteristics is therefore of key importance. We also show that PSHA models based exclusively on active faults and geodynamic inputs, which are thus not dependent on past earthquake occurrences, provide a valid method for seismic hazard calculation.
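
    The logic-tree aggregation step can be illustrated with a toy computation. The sketch below assumes equally weighted branches and synthetic exponential hazard curves; the branch count (648) and the 10%-in-50-years target come from the abstract, everything else is a placeholder.

```python
# Aggregate per-branch hazard curves (annual exceedance rate vs PGA) into a
# mean curve, read off the 10%-in-50-yr ground motion, and take percentiles.
import numpy as np

pga = np.linspace(0.05, 1.0, 20)                   # g
n_branches = 648
rng = np.random.default_rng(1)
slopes = rng.uniform(3.0, 6.0, n_branches)         # hypothetical branch-to-branch variability
curves = 1e-2 * np.exp(-np.outer(slopes, pga))     # toy annual exceedance-rate curves
weights = np.full(n_branches, 1.0 / n_branches)    # equal logic-tree weights (assumption)

mean_curve = weights @ curves
p05, p95 = np.percentile(curves, [5, 95], axis=0)  # model limits, as in the abstract

target_rate = -np.log(1 - 0.10) / 50.0             # Poisson: 10% in 50 years
pga_target = np.interp(target_rate, mean_curve[::-1], pga[::-1])
print(f"mean-hazard PGA at 10%-in-50yr: {pga_target:.2f} g")
```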

  20. H-SLAM: Rao-Blackwellized Particle Filter SLAM Using Hilbert Maps.

    PubMed

    Vallicrosa, Guillem; Ridao, Pere

    2018-05-01

    Occupancy Grid maps provide a probabilistic representation of space which is important for a variety of robotic applications like path planning and autonomous manipulation. In this paper, a SLAM (Simultaneous Localization and Mapping) framework capable of obtaining this representation online is presented. The H-SLAM (Hilbert Maps SLAM) is based on the Hilbert Map representation and uses a Particle Filter to represent the robot state. Hilbert Maps offer a continuous probabilistic representation with a small memory footprint. We present a series of experimental results carried out both in simulation and with real AUVs (Autonomous Underwater Vehicles). These results demonstrate that our approach represents the environment more consistently while remaining capable of running online.
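
    A Hilbert-map-style occupancy model can be sketched in a few lines: project points through random Fourier features and fit an online logistic regression, yielding a continuous occupancy probability field. This is an illustrative reimplementation under assumed parameters, not the authors' code or parameterization.

```python
# Continuous occupancy field from random Fourier features + SGD logistic regression.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(2)
# hypothetical range-sensor endpoints (occupied) and free-space samples
occ = rng.normal([5.0, 5.0], 0.3, size=(200, 2))
free = rng.uniform(0.0, 10.0, size=(200, 2))
X = np.vstack([occ, free])
y = np.r_[np.ones(200), np.zeros(200)]

D, gamma = 300, 2.0                                   # feature count, kernel width
W = rng.normal(scale=np.sqrt(2 * gamma), size=(2, D))
b = rng.uniform(0, 2 * np.pi, D)
phi = lambda P: np.sqrt(2.0 / D) * np.cos(P @ W + b)  # random Fourier features

model = SGDClassifier(loss="log_loss", max_iter=1000).fit(phi(X), y)
# occupancy probability at an occupied spot and a free spot
print(model.predict_proba(phi(np.array([[5.0, 5.0], [1.0, 1.0]])))[:, 1])
```

    The small memory footprint mentioned in the abstract comes from storing only the D feature weights rather than a dense grid.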

  1. A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

    Pluvial flood nowcasting is gaining ground, not least because of the advancements in rainfall forecasting schemes. Short-term applications have benefited from the availability of such forecasts at high resolution in space (~1 km) and time (~5 min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1 km/5 min) with 20 ensemble members and a lead time of up to 2 hours, using a 4 C-band radar composite as input. Forecast verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to simulate 2D urban surface inundations at high resolution. The surface components are categorized into three groups, each modelled using triangular meshes at different resolutions: streets (3.75-15 m2), high flood hazard areas (12.5-50 m2) and low flood hazard areas (75-300 m2). Functions describing urban flood damage and social consequences were empirically derived from questionnaires given to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages of Oostakker and Sint-Amandsberg, which are part of the larger city of Ghent, Belgium. After each of the different above-mentioned components was evaluated, they were combined and tested for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling, and socio-economic flood risk results each could be partly evaluated: the rainfall nowcasting results against radar data and rain gauges; the hydraulic sewer model results against water level and discharge data at pumping stations; the 2D inundation modelling results against limited data on some recent flood locations and inundation depths; and the results for the socio-economic flood consequences of the most extreme events against claims in the database of the national disaster agency. Different methods for visualization of the probabilistic inundation results are proposed and tested.
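
    The probabilistic step at the end of such a chain is commonly the per-cell exceedance fraction across ensemble members. The sketch below assumes that idea applies here; the depth fields are random placeholders standing in for the 20 hydraulic-model runs.

```python
# Per-cell flood probability = fraction of ensemble members whose simulated
# water depth exceeds a threshold.
import numpy as np

rng = np.random.default_rng(3)
n_members, ny, nx = 20, 50, 50
depths = rng.gamma(shape=1.5, scale=0.05, size=(n_members, ny, nx))  # metres (toy)

threshold = 0.10                              # flood depth threshold in metres (assumed)
p_flood = (depths > threshold).mean(axis=0)   # probabilistic inundation map
print(p_flood.shape, float(p_flood.max()))
```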

  2. Probabilistic hurricane-induced storm surge hazard assessment in Guadeloupe, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Krien, Y.; Dudon, B.; Roger, J.; Zahibo, N.

    2015-01-01

    Current storm surge hazard maps in the French West Indies are essentially based on simple statistical methods using limited historical data and early low-resolution models which do not take the effect of waves into account. In this paper, we infer new 100- and 1000-year surge levels in Guadeloupe from the numerical modelling of storm surges induced by a large set of synthetic events that are in statistical agreement with features of historical hurricanes in the North Atlantic Basin between 1980 and 2011. Computations are performed using the wave-current coupled model ADCIRC-SWAN with high grid resolutions (up to 40-60 m) in the coastal and wave dissipation areas. This model is validated against observations during past events such as hurricane HUGO (1989). Results are generally found to be in reasonable agreement with past studies in areas where surge is essentially wind-driven, but to differ significantly in coastal regions where the transfer of momentum from waves to the water column constitutes a non-negligible part of the total surge. The methodology, which can be applied to other islands in the Lesser Antilles, allows storm surge level maps to be obtained that can be of major interest for coastal planners and decision makers in terms of risk management.

  3. Probabilistic hurricane-induced storm surge hazard assessment in Guadeloupe, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Krien, Y.; Dudon, B.; Roger, J.; Zahibo, N.

    2015-08-01

    Current storm surge hazard maps in the French West Indies are essentially based on simple statistical methods using limited historical data and early low-resolution models which do not take the effect of waves into account. In this paper, we infer new 100-year and 1000-year surge levels in Guadeloupe from the numerical modelling of storm surges induced by a large set of synthetic events that are in statistical agreement with features of historical hurricanes in the North Atlantic Basin between 1980 and 2011. Computations are performed using the wave-current coupled model ADCIRC-SWAN with high grid resolutions (up to 40-60 m) in the coastal and wave dissipation areas. This model is validated against observations during past events such as hurricane HUGO (1989). Results are generally found to be in reasonable agreement with past studies in areas where surge is essentially wind-driven, but found to differ significantly in coastal regions where the transfer of momentum from waves to the water column constitutes a non-negligible part of the total surge. The methodology, which can be applied to other islands in the Lesser Antilles, allows storm surge level maps to be obtained that can be of major interest for coastal planners and decision makers in terms of risk management.
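
    Reading 100- and 1000-year levels off a synthetic catalog, as in the two records above, can be sketched as an empirical plotting-position calculation. The catalog length and the surge heights below are random placeholders, not the study's simulations.

```python
# Return levels from a synthetic catalog of annual-maximum surge heights.
import numpy as np

rng = np.random.default_rng(4)
n_years = 10000                                            # effective catalog length (assumed)
annual_max = rng.gumbel(loc=0.5, scale=0.3, size=n_years)  # surge heights in metres (toy)

sorted_h = np.sort(annual_max)[::-1]                 # descending heights
return_periods = n_years / (np.arange(n_years) + 1)  # rank-based return periods
h100 = np.interp(100.0, return_periods[::-1], sorted_h[::-1])
h1000 = np.interp(1000.0, return_periods[::-1], sorted_h[::-1])
print(f"100-yr: {h100:.2f} m, 1000-yr: {h1000:.2f} m")
```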

  4. Probabilistic assessment of landslide tsunami hazard for the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Pampell-Manis, A.; Horrillo, J.; Shigihara, Y.; Parambath, L.

    2016-01-01

    The devastating consequences of recent tsunamis affecting Indonesia and Japan have prompted a scientific response to better assess unexpected tsunami hazards. Although much uncertainty exists regarding the recurrence of large-scale tsunami events in the Gulf of Mexico (GoM), geological evidence indicates that a tsunami is possible and would most likely come from a submarine landslide triggered by an earthquake. This study customizes for the GoM a first-order probabilistic landslide tsunami hazard assessment. Monte Carlo Simulation (MCS) is employed to determine landslide configurations based on distributions obtained from observational submarine mass failure (SMF) data. Our MCS approach incorporates a Cholesky decomposition method for correlated landslide size parameters to capture correlations seen in the data as well as uncertainty inherent in these events. Slope stability analyses are performed using landslide and sediment properties and regional seismic loading to determine landslide configurations which fail and produce a tsunami. The probability of each tsunamigenic failure is calculated based on the joint probability of slope failure and probability of the triggering earthquake. We are thus able to estimate sizes and return periods for probabilistic maximum credible landslide scenarios. We find that the Cholesky decomposition approach generates landslide parameter distributions that retain the trends seen in observational data, improving the statistical validity and relevancy of the MCS technique in the context of landslide tsunami hazard assessment. Estimated return periods suggest that probabilistic maximum credible SMF events in the north and northwest GoM have a recurrence of 5000-8000 years, in agreement with age dates of observed deposits.
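
    The Cholesky step named in the abstract is a standard way to draw correlated parameters in a Monte Carlo loop, sketched below. The means and covariance are illustrative assumptions, not the observational SMF statistics used in the study.

```python
# Correlated Monte Carlo draws of landslide size parameters via Cholesky
# decomposition of an assumed covariance matrix.
import numpy as np

mu = np.array([2.0, 0.5])              # mean log10(area), log10(thickness) (toy)
cov = np.array([[0.30, 0.18],
                [0.18, 0.20]])         # hypothetical covariance with positive correlation

L = np.linalg.cholesky(cov)            # cov = L @ L.T
rng = np.random.default_rng(5)
z = rng.standard_normal((10000, 2))    # independent standard normals
samples = mu + z @ L.T                 # correlated parameter draws

# sample correlation recovers ~0.18/sqrt(0.30*0.20) ≈ 0.73
print(round(np.corrcoef(samples.T)[0, 1], 3))
```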

  5. QVAST: a new Quantum GIS plugin for estimating volcanic susceptibility

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Cappello, A.; Martí, J.; Del Negro, C.

    2013-08-01

    One of the most important tasks of modern volcanology is the construction of hazard maps simulating different eruptive scenarios that can be used in risk-based decision-making in land-use planning and emergency management. The first step in the quantitative assessment of volcanic hazards is the development of susceptibility maps, i.e. the spatial probability of a future vent opening given the past eruptive activity of a volcano. This challenging issue is generally tackled using probabilistic methods that use the calculation of a kernel function at each data location to estimate probability density functions (PDFs). The smoothness and the modeling ability of the kernel function are controlled by the smoothing parameter, also known as the bandwidth. Here we present a new tool, QVAST, part of the open-source Geographic Information System Quantum GIS, that is designed to create user-friendly quantitative assessments of volcanic susceptibility. QVAST allows the user to select an appropriate method for evaluating the bandwidth for the kernel function on the basis of the input parameters and the shapefile geometry, and can also evaluate the PDF with the Gaussian kernel. When different input datasets are available for the area, the total susceptibility map is obtained by assigning different weights to each of the PDFs, which are then combined via a weighted summation and modeled in a non-homogeneous Poisson process. The potential of QVAST, developed in a free and user-friendly environment, is here shown through its application in the volcanic fields of Lanzarote (Canary Islands) and La Garrotxa (NE Spain).
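
    The kernel-density-plus-weighted-sum computation can be illustrated with scipy's gaussian_kde standing in for QVAST's own bandwidth machinery. Vent coordinates, bandwidths and weights below are invented assumptions.

```python
# Susceptibility as a weighted sum of Gaussian kernel density estimates
# over past vent and fissure locations.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(6)
vents = rng.normal([0.0, 0.0], [3.0, 1.5], size=(40, 2)).T     # (2, n) past vents (toy)
fissures = rng.normal([2.0, 1.0], [1.0, 1.0], size=(25, 2)).T  # second dataset (toy)

kde_vents = gaussian_kde(vents, bw_method=0.4)   # bandwidth controls smoothness
kde_fiss = gaussian_kde(fissures, bw_method=0.4)

xx, yy = np.meshgrid(np.linspace(-8, 8, 100), np.linspace(-5, 5, 100))
grid = np.vstack([xx.ravel(), yy.ravel()])
susceptibility = 0.7 * kde_vents(grid) + 0.3 * kde_fiss(grid)  # weighted PDF sum
susceptibility /= susceptibility.sum()           # normalise to a spatial probability map
print(float(susceptibility.reshape(xx.shape).max()))
```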

  6. A framework for the probabilistic analysis of meteotsunamis

    USGS Publications Warehouse

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
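
    The Monte Carlo aggregation described above can be sketched with a toy amplitude model standing in for the hydrodynamic simulations. The parameter distributions and the Poisson rate below are illustrative assumptions, not the ASOS-derived values.

```python
# Synthetic squall-line catalog -> annualized exceedance-rate hazard curve.
import numpy as np

rng = np.random.default_rng(7)
nu = 5.0                                     # squall lines per year (assumed Poisson rate)
n_events = 50000
speed = rng.normal(25.0, 5.0, n_events)      # disturbance speed, m/s (toy distribution)
dp = rng.gamma(2.0, 1.0, n_events)           # pressure jump, hPa (toy distribution)
# toy stand-in for the modeled maximum coastal amplitude of each event
amplitude = 0.02 * dp * np.exp(-((speed - 30.0) / 10.0) ** 2)

levels = np.linspace(0.01, 0.3, 30)          # amplitude thresholds, metres
rate = nu * np.array([(amplitude > a).mean() for a in levels])  # hazard curve
print(list(zip(levels[:3].round(2), rate[:3].round(3))))
```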

  7. The Making of a Tsunami Hazard Map: Lessons Learned from the TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.

    2017-12-01

    Following the worldwide surge of awareness of tsunami hazard and risk in the last decade, Europe has promoted a better understanding of the tsunami phenomenon through research projects (e.g. TRANSFER, ASTARTE) and started programs for preventing tsunami impact along the coastlines of the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region (e.g. the Tsunami Early Warning and Mitigation System, NEAMTWS, coordinated by IOC/UNESCO). An indispensable tool for long-term coastal planning and for the effective design and subsequent use of TWS is the availability of a comprehensive Probabilistic Tsunami Hazard Assessment (PTHA). The TSUMAPS-NEAM project took on the task of producing the first region-wide, long-term homogeneous PTHA map from earthquake sources. The hazard assessment was built upon state-of-the-art procedures and standards, enriched by some rather innovative/experimental approaches such as: (1) the statistical treatment of potential seismic sources, combining all the available information (seismicity, moment tensors, tectonics), and considering earthquakes occurring on major crustal faults and subduction interfaces; (2) an intensive computational approach to tsunami generation and linear propagation across the sea up to a fixed offshore depth; (3) the use of approximations for shoaling and inundation, based on local bathymetry, and for tidal stages; and (4) the exploration of several alternatives for the basic input data and their parameters, which produces a number of models that are treated through an ensemble uncertainty quantification. This presentation will summarize the TSUMAPS-NEAM project goals, implementation, and achieved results, as well as the humps and bumps we ran into during its development. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  8. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.

  9. Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    Time-independent probabilistic seismic-hazard analysis treats each source as being temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time-independent cluster, each cluster being temporally and spatially independent of any other. The cluster has the recurrence time of its mainshock and, by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high-hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10%, and by as much as 20% if variations in aftershock productivity can be accounted for reliably.
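
    The union-of-events idea reduces to a one-line probability calculation if the events in a cluster are treated as independent at a given ground-motion level. The per-event probabilities below are invented for illustration.

```python
# Cluster exceedance probability as the union of mainshock and dependents.
import numpy as np

p_exceed = np.array([0.08, 0.03, 0.01])    # mainshock, aftershock 1, aftershock 2 (toy)
p_cluster = 1.0 - np.prod(1.0 - p_exceed)  # P(at least one event exceeds the level)
print(round(p_cluster, 4))                 # > 0.08: dependents add to the hazard
```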

  10. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, Andreas; Babeyko, Andrey Y.; Zamora, Natalia

    2016-06-01

    Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially maximum magnitude assumption.

  11. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    NASA Astrophysics Data System (ADS)

    Hoechner, A.; Babeyko, A. Y.; Zamora, N.

    2015-09-01

    Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially maximum magnitude assumption.
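
    Building a synthetic catalog with a magnitude-frequency relation, as both Makran records describe, commonly uses inverse-transform sampling from a doubly truncated Gutenberg-Richter law. The b-value and magnitude bounds below are illustrative assumptions.

```python
# Inverse-transform sampling of a doubly truncated Gutenberg-Richter law;
# the Mmax assumption controls the tail of the synthetic catalog.
import numpy as np

def sample_gr(n, b=1.0, m_min=6.5, m_max=8.5, rng=None):
    """Draw magnitudes from a Gutenberg-Richter law truncated at m_min, m_max."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))  # normalising constant
    return m_min - np.log(1.0 - u * c) / beta

catalog = sample_gr(100000, m_max=8.5, rng=np.random.default_rng(8))
print(round(catalog.max(), 2), round(float((catalog > 8.0).mean()), 5))
```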

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  13. Source mechanisms of volcanic tsunamis.

    PubMed

    Paris, Raphaël

    2015-10-28

    Volcanic tsunamis are generated by a variety of mechanisms, including volcano-tectonic earthquakes, slope instabilities, pyroclastic flows, underwater explosions, shock waves and caldera collapse. In this review, we focus on the lessons that can be learnt from past events and address the influence of parameters such as volume flux of mass flows, explosion energy or duration of caldera collapse on tsunami generation. The diversity of waves in terms of amplitude, period, form, dispersion, etc. poses difficulties for integration and harmonization of sources to be used for numerical models and probabilistic tsunami hazard maps. In many cases, monitoring and warning of volcanic tsunamis remain challenging (further technical and scientific developments being necessary) and must be coupled with policies of population preparedness. © 2015 The Author(s).

  14. Probabilistic Tsunami Hazard Assessment along Nankai Trough (2) a comprehensive assessment including a variety of earthquake source areas other than those that the Earthquake Research Committee, Japanese government (2013) showed

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2016-12-01

    For the forthcoming M8- to M9-class Nankai earthquake, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2013) showed 15 examples of earthquake source areas (ESAs) as possible combinations of 18 sub-regions (6 segments along the trough and 3 segments normal to the trough) and assessed the occurrence probability within the next 30 years (from Jan. 1, 2013) to be 60% to 70%. Hirata et al. (2015, AGU) presented a Probabilistic Tsunami Hazard Assessment (PTHA) along the Nankai Trough for the case in which the diversity of the next event's ESA is modeled by only these 15 ESAs. In this study, we newly set 70 ESAs in addition to the previous 15, so that a total of 85 ESAs is considered. By producing tens of fault models with various slip distribution patterns for each of the 85 ESAs, we obtain 2500 fault models in addition to the previous 1400, so that a total of 3900 fault models is used to model the diversity of the next Nankai earthquake rupture (Toyama et al., 2015, JpGU). For the PTHA, the occurrence probability of the next Nankai earthquake is distributed over the 3900 possible fault models according to their similarity to the 15 ESAs' extents (Abe et al., 2015, JpGU). The main concept of this occurrence probability distribution is: (i) earthquakes rupturing on any of the 15 ESAs that ERC (2013) showed occur most likely; (ii) earthquakes rupturing on an ESA whose along-trough extent is the same as one of the 15 ESAs but whose trough-normal extent differs occur second most likely; (iii) earthquakes rupturing on an ESA whose along-trough and trough-normal extents both differ from the 15 ESAs occur rarely. Procedures for the tsunami simulation and the probabilistic tsunami hazard synthesis are the same as in Hirata et al. (2015). A tsunami hazard map, synthesized under the assumption that Nankai earthquakes can be modeled as a renewal process based on a BPT distribution with a mean recurrence interval of 88.2 years (ERC, 2013) and an aperiodicity of 0.22 (the median of the range 0.20 to 0.24 that ERC (2013) recommended), suggests that several coastal segments along the southwest coast of Shikoku Island, the southeast coast of the Kii Peninsula, and the west coast of the Izu Peninsula show an exceedance probability of over 26% that the maximum water rise exceeds 10 meters at any coastal point within the next 30 years.
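
    The BPT (Brownian Passage Time) renewal calculation quoted above is the inverse Gaussian distribution, so a 30-year conditional probability follows directly from its CDF. The mean recurrence interval and aperiodicity come from the abstract; the elapsed time since the last event is an assumption made only for illustration.

```python
# 30-year conditional earthquake probability from a BPT renewal model.
from scipy.stats import invgauss

mean_ri, alpha = 88.2, 0.22   # mean recurrence interval (yr) and aperiodicity (from ERC 2013)
# scipy's invgauss(mu, scale=s): mean = mu*s, coefficient of variation = sqrt(mu),
# so mu = alpha**2 reproduces a BPT with the given aperiodicity.
mu, scale = alpha**2, mean_ri / alpha**2

elapsed, window = 70.0, 30.0  # years since the last event (assumed) and forecast window
F = lambda t: invgauss.cdf(t, mu, scale=scale)
p_cond = (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))
print(f"30-yr conditional probability: {p_cond:.2f}")
```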

  15. The influence of maximum magnitude on seismic-hazard estimates in the Central and Eastern United States

    USGS Publications Warehouse

    Mueller, C.S.

    2010-01-01

    I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.

  16. Satellite-map position estimation for the Mars rover

    NASA Technical Reports Server (NTRS)

    Hayashi, Akira; Dean, Thomas

    1989-01-01

    A method for locating the Mars rover using an elevation map generated from satellite data is described. In exploring its environment, the rover is assumed to generate a local rover-centered elevation map that can be used to extract information about the relative position and orientation of landmarks corresponding to local maxima. These landmarks are integrated into a stochastic map which is then matched with the satellite map to obtain an estimate of the robot's current location. The landmarks are not explicitly represented in the satellite map. The results of the matching algorithm correspond to a probabilistic assessment of whether or not the robot is located within a given region of the satellite map. By assigning a probabilistic interpretation to the information stored in the satellite map, researchers are able to provide a precise characterization of the results computed by the matching algorithm.

  17. Assessment of a Tsunami Hazard for Mediterranean Coast of Egypt

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Babeyko, Andrey; Yalciner, Ahmet; Pelinovsky, Efim

    2017-04-01

    An analysis of the tsunami hazard for Egypt, based on historical data and numerical modelling of historical and prognostic events, is given. The historical record contains 13 events over 4000 years, including one instrumental record (1956); the tsunami database includes 12 earthquake tsunamis and 1 event of volcanic origin (the Santorini eruption). The intensity of the events of 365, 881, 1303 and 1870 is estimated as I = 3, corresponding to tsunami wave heights of more than 6 m. Numerical simulations of possible tsunami scenarios of seismic and landslide origin are performed with the NAMI-DANCE software, which solves the shallow-water equations. The Probabilistic Tsunami Hazard Assessment (PTHA) method for the Mediterranean Sea developed by Sorensen et al. (Probabilistic tsunami hazard in the Mediterranean Sea, J. Geophysical Research, 2012, vol. 117, B01305) is used to evaluate the probability of tsunami occurrence on the Egyptian coast. The synthetic catalogue of prognostic tsunamis of seismic origin with magnitudes greater than 6.5 includes 84,920 events over 100,000 years. For wave heights above 1 m, the exceedance probability versus tsunami height curve can be approximated by a two-parameter Gumbel function, with the parameters determined for each of 24 coastal locations in Egypt. The most extreme prognostic events, with probabilities below 10^-4 (approximately 10 events), do not fit the Gumbel function and require special analysis. Acknowledgements: This work was supported by the EU FP7 ASTARTE Project [603839] and, for EP, by grant NS-6637.2016.5.
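
    The two-parameter Gumbel approximation can be sketched as below: fit a Gumbel law to maximum heights at a coastal point and read exceedance probabilities from its survival function. The heights are synthetic placeholders, not the catalogue values for any of the 24 locations.

```python
# Fit a two-parameter Gumbel law and evaluate exceedance probabilities.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(9)
heights = gumbel_r.rvs(loc=1.2, scale=0.6, size=500, random_state=rng)  # toy maxima (m)

loc, scale = gumbel_r.fit(heights)   # the two Gumbel parameters for this location
for h in (1.0, 2.0, 4.0):
    print(f"P(height > {h} m) = {gumbel_r.sf(h, loc, scale):.3f}")
```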

  18. Probabilistic flood extent estimates from social media flood observations

    NASA Astrophysics Data System (ADS)

    Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-05-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F(2) = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
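
    The F(2) statistic quoted above weights recall more heavily than precision when comparing a predicted flood extent against validation data cell by cell. A minimal sketch, with invented flattened maps:

```python
# F2 score between an observed and a predicted binary flood map.
import numpy as np
from sklearn.metrics import fbeta_score

observed = np.array([1, 1, 1, 0, 0, 1, 0, 1])    # validation flood map (flattened, toy)
predicted = np.array([1, 1, 0, 0, 1, 1, 0, 1])   # Twitter-derived flood map (toy)
print(f"F2 = {fbeta_score(observed, predicted, beta=2):.2f}")
```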

  19. Database of potential sources for earthquakes larger than magnitude 6 in Northern California

    USGS Publications Warehouse

    ,

    1996-01-01

    The Northern California Earthquake Potential (NCEP) working group, composed of many contributors and reviewers in industry, academia and government, has pooled its collective expertise and knowledge of regional tectonics to identify potential sources of large earthquakes in northern California. We have created a map and database of active faults, both surficial and buried, that forms the basis for the northern California portion of the national map of probabilistic seismic hazard. The database contains 62 potential sources, including fault segments and areally distributed zones. The working group has integrated constraints from broadly based plate tectonic and VLBI models with local geologic slip rates, geodetic strain rate, and microseismicity. Our earthquake source database derives from a scientific consensus that accounts for conflict in the diverse data. Our preliminary product, as described in this report, brings to light many gaps in the data, including a need for better information on the proportion of deformation in fault systems that is aseismic.

  20. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions (height, profile of particle location, volcanic vent parameters) are known only approximately at best, and other features of the governing system, such as the windfield, are stochastic. These uncertainties make forecasting plume motion difficult. As a result, ash advisories based on a deterministic approach tend to be conservative and many times over- or underestimate the extent of a plume. This paper presents an end-to-end framework for a probabilistic approach to ash plume forecasting. The framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14-16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.
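
    The surrogate idea behind the approach can be illustrated in miniature. The sketch below is not the CUT/polynomial-chaos machinery of the paper; it uses a plain polynomial surrogate fitted to a handful of "expensive" runs over one uncertain input, then samples it cheaply for a probabilistic forecast. The model, nodes and prior are all invented.

```python
# Cheap surrogate sampling in place of repeated expensive model runs.
import numpy as np

rng = np.random.default_rng(10)
expensive_model = lambda h: np.tanh((h - 8.0) / 2.0)  # toy ash-extent response to height (km)

nodes = np.array([5.0, 6.5, 8.0, 9.5, 11.0])          # small, quadrature-like ensemble
runs = expensive_model(nodes)                          # the "expensive" evaluations
coeffs = np.polyfit(nodes, runs, deg=3)                # polynomial surrogate

heights = rng.normal(8.0, 1.5, 100000)                 # sample the uncertain input cheaply
extent = np.polyval(coeffs, heights)
print(f"P(extent > 0.5) ~ {(extent > 0.5).mean():.3f}")
```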

  1. Insurance Applications of Active Fault Maps Showing Epistemic Uncertainty

    NASA Astrophysics Data System (ADS)

    Woo, G.

    2005-12-01

    Insurance loss modeling for earthquakes utilizes available maps of active faulting produced by geoscientists. All such maps are subject to uncertainty, arising from lack of knowledge of fault geometry and rupture history. Field work to undertake geological fault investigations drains human and monetary resources, and this inevitably limits the resolution of fault parameters. Some areas are more accessible than others; some may be of greater social or economic importance than others; some areas may be investigated more rapidly or diligently than others; or funding restrictions may have curtailed the extent of the fault mapping program. In contrast with the aleatory uncertainty associated with the inherent variability in the dynamics of earthquake fault rupture, uncertainty associated with lack of knowledge of fault geometry and rupture history is epistemic. The extent of this epistemic uncertainty may vary substantially from one regional or national fault map to another. However aware the local cartographer may be, this uncertainty is generally not conveyed in detail to the international map user. For example, an area may be left blank for a variety of reasons, ranging from lack of sufficient investigation of a fault to lack of convincing evidence of activity. Epistemic uncertainty in fault parameters is of concern in any probabilistic assessment of seismic hazard, not least in insurance earthquake risk applications. A logic-tree framework is appropriate for incorporating epistemic uncertainty. Some insurance contracts cover specific high-value properties or transport infrastructure, and therefore are extremely sensitive to the geometry of active faulting. Alternative Risk Transfer (ART) to the capital markets may also be considered. In order for such insurance or ART contracts to be properly priced, uncertainty should be taken into account. Accordingly, an estimate is needed for the likelihood of surface rupture capable of causing severe damage. Especially where a high deductible is in force, this requires estimation of the epistemic uncertainty on fault geometry and activity. Transport infrastructure insurance is of practical interest in seismic countries. On the North Anatolian Fault in Turkey, there is uncertainty over an unbroken segment between the eastern end of the Düzce Fault and Bolu. This may have ruptured during the 1944 earthquake. Existing hazard maps may simply use a question mark to flag uncertainty. However, a far more informative type of hazard map might express spatial variations in the confidence level associated with a fault map. Through such visual guidance, an insurance risk analyst would be better placed to price earthquake cover, allowing for epistemic uncertainty.

  2. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or to misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
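
    A minimal sketch of a Prospect-theory evaluation follows: outcomes are mapped through a value function that is concave for gains, convex for losses and steeper for losses (loss aversion), and each planning option is scored with the flood probability taken from a probabilistic map. The payoffs are invented, the parameters are the commonly cited Tversky-Kahneman estimates used here as assumptions, and the probability-weighting step of full Prospect theory is omitted for brevity.

```python
# Score spatial-planning options with a Prospect-theory value function.
import numpy as np

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of outcome x relative to a reference point."""
    x = np.asarray(x, dtype=float)
    v = np.empty_like(x)
    gains = x >= 0
    v[gains] = x[gains] ** alpha            # concave for gains
    v[~gains] = -lam * (-x[~gains]) ** beta # convex and steeper for losses
    return v

p_flood = 0.2                                   # from the probabilistic flood map (toy)
options = {"build": np.array([-100.0, 50.0]),   # payoffs if [flooded, not flooded]
           "do_not_build": np.array([0.0, 0.0])}
probs = np.array([p_flood, 1.0 - p_flood])

for name, payoffs in options.items():
    print(name, round(float(probs @ value(payoffs)), 2))
```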

  3. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, aleatory in nature due to point (i) above, and epistemic in nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method rests on the balance between its computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) validate the frequently used simulators (through comparison with PDC deposits and other simulators), 2) decrease simulator runtimes (whether by increasing knowledge of the physical processes or through more efficient programming, parallelization, ...) and 3) improve uncertainty quantification techniques.

  4. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  5. The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-12-01

    This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and determinist approaches to seismic hazard analysis.

  6. The European Drought Observatory (EDO): Current State and Future Directions

    NASA Astrophysics Data System (ADS)

    Vogt, J.; Singleton, A.; Sepulcre, G.; Micale, F.; Barbosa, P.

    2012-12-01

    Europe has repeatedly been affected by droughts, resulting in considerable ecological and economic damage, and climate change studies indicate a trend towards increasing climate variability, most likely resulting in more frequent drought occurrences in Europe as well. Against this background, the European Commission's Joint Research Centre (JRC) is developing methods and tools for assessing, monitoring and forecasting droughts in Europe, and is developing a European Drought Observatory (EDO) to complement and integrate national activities with a European view. At the core of EDO is a portal, including a map server, a metadata catalogue, a media monitor and analysis tools. The map server presents Europe-wide up-to-date information on the occurrence and severity of droughts, which is complemented by more detailed information provided by regional, national and local observatories through OGC-compliant web mapping and web coverage services. In addition, time series of historical maps as well as graphs of the temporal evolution of drought indices for individual grid cells and administrative regions in Europe can be retrieved and analysed. Current work is focusing on validating the available products, improving the functionalities, extending the linkage to additional national and regional drought information systems and improving medium- to long-range probabilistic drought forecasting products. Probabilistic forecasts are attractive in that they provide an estimate of the range of uncertainty in a particular forecast. Longer-term goals include the development of long-range drought forecasting products, the analysis of drought hazard and risk, the monitoring of drought impact and the integration of EDO in a global drought information system. The talk will provide an overview of the development and state of EDO, the different products, and the ways to include a wide range of stakeholders (i.e. European, national river basin, and local authorities) in the development of the system, as well as an outlook on future developments.

  7. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method, which accounts for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario combining data from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.

  8. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 years. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best available science.
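
    The exceedance probabilities quoted above map directly to return periods under the usual Poisson assumption. A small sketch of that conversion (standard PSHA arithmetic, not code from the study):

```python
# Converting "P% probability of exceedance in t years" to an annual rate and
# return period under the usual Poisson assumption.
import math

def return_period(p_exceed, t_years):
    rate = -math.log(1.0 - p_exceed) / t_years   # annual exceedance rate
    return 1.0 / rate

for p in (0.02, 0.10, 0.50):
    print(f"{p:.0%} in 50 yr -> ~{return_period(p, 50):,.0f} yr return period")
# 2% in 50 yr -> ~2,475 yr; 10% -> ~475 yr; 50% -> ~72 yr
```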

  9. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, Cheryl J.; Plant, Nathaniel G.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70–90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.
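
    To make the Bayesian mechanics concrete, the toy update below combines a prior over discretized retreat classes with conditional probabilities for two observed inputs. The numbers are invented placeholders, and the product form assumes the observations are conditionally independent given the class, which the paper's multi-parameter network does not require:

```python
# Toy discrete Bayesian update for a cliff-retreat class:
# P(retreat | evidence) ∝ P(retreat) * Π P(evidence_i | retreat),
# assuming conditional independence of the observations (naive Bayes).
import numpy as np

states = ["low", "moderate", "high"]           # retreat classes
prior = np.array([0.5, 0.3, 0.2])              # e.g. from long-term erosion rate

# Invented likelihoods P(observation | retreat class) for two discretized inputs.
p_weak_lithology = np.array([0.2, 0.4, 0.7])   # observed: weak cliff material
p_high_wave_hours = np.array([0.1, 0.3, 0.6])  # observed: many wave-impact hours

posterior = prior * p_weak_lithology * p_high_wave_hours
posterior /= posterior.sum()
for s, p in zip(states, posterior):
    print(f"P(retreat={s} | evidence) = {p:.2f}")
```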

  10. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it also has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
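
    The essence of the kernel approach is that each epicentre contributes a smooth bump to a gridded activity-rate map, in place of uniform polygonal zones. A minimal sketch with a fixed Gaussian kernel (operational models use data-driven or adaptive bandwidths):

```python
# Sketch of kernel-smoothed background seismicity: each epicentre contributes
# a Gaussian kernel to a gridded activity-rate map. A fixed 10 km bandwidth is
# an illustrative assumption.
import numpy as np

def smoothed_rate(epicentres, grid_x, grid_y, bandwidth_km=10.0):
    """epicentres: (n, 2) array of (x, y) positions in km; returns events/km^2."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx, dtype=float)
    norm = 1.0 / (2.0 * np.pi * bandwidth_km**2)
    for ex, ey in epicentres:
        d2 = (gx - ex) ** 2 + (gy - ey) ** 2
        rate += norm * np.exp(-d2 / (2.0 * bandwidth_km**2))
    return rate

events = np.array([[30.0, 40.0], [35.0, 45.0], [80.0, 70.0]])
grid = np.linspace(0.0, 110.0, 111)              # 1 km spacing
total = smoothed_rate(events, grid, grid).sum()  # ~3 events (minor edge loss)
print(f"integrated rate: {total:.2f}")
```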

  11. Probabilistic seismic hazard characterization and design parameters for the Pantex Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernreuter, D. L.; Foxall, W.; Savy, J. B.

    1998-10-19

    The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr., 500 yr., 1000 yr., 2000 yr., 10,000 yr., and 100,000 yr.). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.
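
    A design value at one of these return periods is read off the computed hazard curve by interpolation. The sketch below shows the standard log-log lookup; the curve values are invented placeholders, not Pantex results:

```python
# Reading a ground motion off a hazard curve: log-log interpolation of PGA
# against annual exceedance frequency at a target return period.
import numpy as np

pga = np.array([0.02, 0.05, 0.10, 0.20, 0.40, 0.80])        # g (placeholder)
annual_freq = np.array([1e-1, 3e-2, 1e-2, 2e-3, 2e-4, 1e-5])  # 1/yr (placeholder)

def pga_at_return_period(T_years):
    target = 1.0 / T_years
    # np.interp needs increasing x, so reverse the (decreasing) frequencies.
    return np.exp(np.interp(np.log(target),
                            np.log(annual_freq[::-1]), np.log(pga[::-1])))

for T in (100, 500, 1000, 2000, 10_000, 100_000):
    print(f"{T:>7,} yr: PGA ≈ {pga_at_return_period(T):.3f} g")
```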

  12. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.
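
    The aggregation step of a computational PTHA can be sketched compactly: each source contributes its annual rate times the probability that its tsunami exceeds a given height at the site, and the contributions are summed. The source rates and lognormal height distributions below are placeholder assumptions, standing in for earthquake and landslide sources alike:

```python
# Skeleton of the PTHA aggregation step with placeholder sources.
import numpy as np
from scipy import stats

sources = [  # (annual rate, median wave height at site [m], lognormal sigma)
    (1/500,   1.0, 0.6),   # nearby moderate earthquake source
    (1/2000,  4.0, 0.8),   # large subduction source
    (1/10000, 6.0, 1.2),   # submarine landslide: rare, very uncertain
]

heights = np.linspace(0.1, 15.0, 150)
exceed_rate = np.zeros_like(heights)
for rate, median, sigma in sources:
    exceed_rate += rate * stats.lognorm.sf(heights, s=sigma, scale=median)

# Probability of at least one exceedance in 50 years (Poisson in time):
p50 = 1.0 - np.exp(-exceed_rate * 50.0)
print(f"P(height > 2 m in 50 yr) ≈ {np.interp(2.0, heights, p50):.3f}")
```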

  13. Magnetic Tunnel Junction Mimics Stochastic Cortical Spiking Neurons

    NASA Astrophysics Data System (ADS)

    Sengupta, Abhronil; Panda, Priyadarshini; Wijesinghe, Parami; Kim, Yusung; Roy, Kaushik

    2016-07-01

    Brain-inspired computing architectures attempt to mimic the computations performed in the neurons and the synapses in the human brain in order to achieve its efficiency in learning and cognitive tasks. In this work, we demonstrate the mapping of the probabilistic spiking nature of pyramidal neurons in the cortex to the stochastic switching behavior of a Magnetic Tunnel Junction in the presence of thermal noise. We present results to illustrate the efficiency of neuromorphic systems based on such probabilistic neurons for pattern recognition tasks in the presence of lateral inhibition and homeostasis. Such stochastic MTJ neurons can also potentially provide a direct mapping to the probabilistic computing elements in Belief Networks for performing regenerative tasks.
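
    At a purely behavioural level, the device described above can be modelled by a switching probability that rises sigmoidally with input current; the functional form and parameters below are illustrative assumptions, not calibrated to any device:

```python
# Behavioural sketch of a stochastic MTJ "neuron": under thermal noise the
# switching probability grows with input current, modelled here as a sigmoid.
import numpy as np

rng = np.random.default_rng(7)

def mtj_spike(input_current, i_half=50e-6, steepness=2e5):
    """Return True (switch/spike) with a probability that saturates with current."""
    p_switch = 1.0 / (1.0 + np.exp(-steepness * (input_current - i_half)))
    return rng.random() < p_switch

for i in np.linspace(0, 100e-6, 5):
    spikes = sum(mtj_spike(i) for _ in range(1000))
    print(f"I = {i*1e6:5.1f} uA -> firing rate ≈ {spikes/1000:.2f}")
```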

  14. A PROBABILISTIC METHOD FOR ESTIMATING MONITORING POINT DENSITY FOR CONTAINMENT SYSTEM LEAK DETECTION

    EPA Science Inventory

    The use of physical and hydraulic containment systems for the isolation of contaminated ground water and aquifer materials associated with hazardous waste sites has increased during the last decade. The existing methodologies for monitoring and evaluating leakage from hazardous w...

  15. Report of the Workshop on Extreme Ground Motions at Yucca Mountain, August 23-25, 2004

    USGS Publications Warehouse

    Hanks, T.C.; Abrahamson, N.A.; Board, M.; Boore, D.M.; Brune, J.N.; Cornell, C.A.

    2006-01-01

    This Workshop has its origins in the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain, the designated site of the underground repository for the nation's high-level radioactive waste. In 1997 the Nuclear Regulatory Commission's Senior Seismic Hazard Analysis Committee (SSHAC) developed guidelines for PSHA which were published as NUREG/CR-6372, 'Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and the use of experts' (SSHAC, 1997). This Level-4 study was the most complex PSHA ever undertaken at the time. The procedures, methods, and results of this PSHA are described in Stepp et al. (2001), mostly in the context of a probability of exceedance (hazard) of 10⁻⁴/yr for ground motion at Site A, a hypothetical, reference rock outcrop site at the elevation of the proposed emplacement drifts within the mountain. Analysis and inclusion of both aleatory and epistemic uncertainty were significant and time-consuming aspects of the study, which took place over three years and involved several dozen scientists, engineers, and analysts.

  16. QVAST: a new Quantum GIS plugin for estimating volcanic susceptibility

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Cappello, A.; Martí, J.; Del Negro, C.

    2013-11-01

    One of the most important tasks of modern volcanology is the construction of hazard maps simulating different eruptive scenarios that can be used in risk-based decision making in land-use planning and emergency management. The first step in the quantitative assessment of volcanic hazards is the development of susceptibility maps (i.e., the spatial probability of a future vent opening given the past eruptive activity of a volcano). This challenging issue is generally tackled using probabilistic methods that calculate a kernel function at each data location to estimate probability density functions (PDFs). The smoothness and the modeling ability of the kernel function are controlled by the smoothing parameter, also known as the bandwidth. Here we present a new tool, QVAST, part of the open-source geographic information system Quantum GIS, which is designed to create user-friendly quantitative assessments of volcanic susceptibility. QVAST allows the selection of an appropriate method for evaluating the bandwidth for the kernel function on the basis of the input parameters and the shapefile geometry, and can also evaluate the PDF with the Gaussian kernel. When different input data sets are available for the area, the total susceptibility map is obtained by assigning different weights to each of the PDFs, which are then combined via a weighted summation and modeled as a non-homogeneous Poisson process. The potential of QVAST, developed in a free and user-friendly environment, is shown here through its application in the volcanic fields of Lanzarote (Canary Islands) and La Garrotxa (NE Spain).
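
    A minimal stand-in for this workflow, using SciPy's Gaussian KDE with Silverman's rule in place of QVAST's bandwidth estimators, and a weighted summation of two PDFs:

```python
# Sketch of a vent-opening susceptibility map: Gaussian kernel PDFs over past
# vent and fissure locations, combined by expert-assigned weights. Data and
# weights are synthetic placeholders.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
vents = rng.normal([0, 0], [5, 3], size=(40, 2)).T      # past vents (km), 2 x n
fissures = rng.normal([2, 1], [4, 4], size=(25, 2)).T   # eruptive fissures

kde_vents = gaussian_kde(vents, bw_method="silverman")
kde_fissures = gaussian_kde(fissures, bw_method="silverman")

xy = np.mgrid[-15:15:100j, -15:15:100j].reshape(2, -1)
susceptibility = 0.7 * kde_vents(xy) + 0.3 * kde_fissures(xy)
susceptibility /= susceptibility.sum()                  # spatial probability map
print(f"peak cell probability: {susceptibility.max():.2e}")
```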

  17. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
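
    For a generalized Pareto exceedance magnitude with scale σ and shape ξ, the survival and hazard functions have closed forms. The sketch below is the standard GP result (in the statistical, failure-rate sense of "hazard"), which makes the 2-parameter hazard model concrete; the nonstationary case lets the parameters vary with time:

```latex
% Standard generalized Pareto survival and hazard functions (a sketch; the
% paper's nonstationary model allows the parameters to vary with time).
S(x) = \Pr(X > x) = \left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi},
\qquad
h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x}, \quad x \ge 0.
```

    As ξ → 0 the distribution reduces to the exponential and the hazard becomes the constant 1/σ; for ξ > 0 the hazard decays with event magnitude.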

  18. Broadband Ground Motion Simulation Recipe for Scenario Hazard Assessment in Japan

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Fujiwara, H.; Irikura, K.

    2014-12-01

    The National Seismic Hazard Maps for Japan, which consist of probabilistic seismic hazard maps (PSHMs) and scenario earthquake shaking maps (SESMs), have been published every year since 2005 by the Earthquake Research Committee (ERC) in the Headquarters for Earthquake Research Promotion, which was established in the Japanese government after the 1995 Kobe earthquake. The publication was interrupted due to problems in the PSHMs revealed by the 2011 Tohoku earthquake, and the Subcommittee for Evaluations of Strong Ground Motions ('Subcommittee') has been examining the problems for two and a half years (ERC, 2013; Fujiwara, 2014). However, the SESMs and the broadband ground motion simulation recipe used in them are still valid at least for crustal earthquakes. Here, we outline this recipe and show the results of validation tests for it. Irikura and Miyake (2001) and Irikura (2004) developed a recipe for simulating strong ground motions from future crustal earthquakes based on a characterization of their source models (Irikura recipe). The result of the characterization is called a characterized source model, where a rectangular fault includes a few rectangular asperities. Each asperity and the background area surrounding the asperities have their own uniform stress drops. The Irikura recipe defines the parameters of the fault and asperities, and how to simulate broadband ground motions from the characterized source model. The recipe for the SESMs was constructed following the Irikura recipe (ERC, 2005). The National Research Institute for Earth Science and Disaster Prevention (NIED) then made simulation codes along this recipe to generate SESMs (Fujiwara et al., 2006; Morikawa et al., 2011). The Subcommittee in 2002 validated a preliminary version of the SESM recipe by comparing simulated and observed ground motions for the 2000 Tottori earthquake. In 2007 and 2008, the Subcommittee carried out detailed validations of the current version of the SESM recipe and the NIED codes using ground motions from the 2005 Fukuoka earthquake. Irikura and Miyake (2011) summarized the latter validations, concluding that the ground motions were successfully simulated as shown in the figure. This indicates that the recipe has enough potential to generate broadband ground motions for scenario hazard assessment in Japan.

  19. Probabilistic Seismic Hazard Analysis for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Sharia, T.; Arabidze, V.; Tibaldi, A.; Bonali, F. L. L.; Russo, E.; Pasquaré Mariotto, F.

    2017-12-01

    Nowadays, seismic hazard studies are developed in terms of the calculation of Peak Ground Acceleration (PGA), Spectral Acceleration (SA), Peak Ground Velocity (PGV) and other recorded parameters. In the frame of the EMME project, PSH were calculated for Georgia using GMPEs based on selection criteria. In the frame of Project N 216758 (supported by the Shota Rustaveli National Science Foundation (SRNF)), PSH maps were estimated using a hybrid-empirical ground motion prediction equation developed for Georgia. Due to the paucity of seismically recorded information, in this work we focused our research on a more robust dataset related to macroseismic data, and attempted to calculate the probabilistic seismic hazard directly in terms of macroseismic intensity. For this reason, we started calculating new intensity prediction equations (IPEs) for Georgia taking into account different sets, belonging to the same new database, as well as distances from the seismic source. With respect to the seismic source, in order to improve the quality of the results, we have also hypothesized the size of faults from empirical relations, and calculated new IPEs also by considering Joyner-Boore and rupture distances in addition to epicentral and hypocentral distances. Finally, site conditions have been included as variables for IPE calculation. Regarding the database, we used a brand new revised set of macroseismic data and instrumental records for the significant earthquakes that struck Georgia between 1900 and 2002. In particular, a large amount of research and documents related to macroseismic effects of individual earthquakes, stored in the archives of the Institute of Geophysics, were used as sources for the new macroseismic data. The latter are reported in the Medvedev-Sponheuer-Karnik macroseismic scale (MSK-64). For each earthquake the magnitude, the focal depth and the epicenter location are also reported. An online version of the database, with the related metadata, has been produced for the 69 revised earthquakes and is available at http://www.enguriproject.unimib.it/.
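
    The core of an IPE calibration is an attenuation-style regression of intensity on magnitude and distance. The least-squares sketch below uses a common functional form and synthetic data; neither the form nor the coefficients are those of the study:

```python
# Sketch of fitting an IPE of the common form I = a + b*M + c*log10(R) + d*R.
# Synthetic data stand in for the revised Georgian macroseismic database.
import numpy as np

rng = np.random.default_rng(3)
n = 400
M = rng.uniform(4.0, 7.0, n)                   # magnitude
R = rng.uniform(5.0, 200.0, n)                 # hypocentral distance (km)
I_obs = (1.5 + 1.4 * M - 3.2 * np.log10(R) - 0.002 * R
         + rng.normal(0, 0.5, n))              # synthetic MSK intensities

X = np.column_stack([np.ones(n), M, np.log10(R), R])
coeffs, *_ = np.linalg.lstsq(X, I_obs, rcond=None)
print("a, b, c, d =", np.round(coeffs, 3))
```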

  20. Building Better Volcanic Hazard Maps Through Scientific and Stakeholder Collaboration

    NASA Astrophysics Data System (ADS)

    Thompson, M. A.; Lindsay, J. M.; Calder, E.

    2015-12-01

    All across the world, information about natural hazards such as volcanic eruptions, earthquakes and tsunami is shared and communicated using maps that show which locations are potentially exposed to hazards of varying intensities. Unlike earthquakes and tsunami, which typically produce one dominant hazardous phenomenon (ground shaking and inundation, respectively), volcanic eruptions can produce a wide variety of phenomena that range from near-vent (e.g. pyroclastic flows, ground shaking) to distal (e.g. volcanic ash, inundation via tsunami), and that vary in intensity depending on the type and location of the volcano. This complexity poses challenges in depicting volcanic hazard on a map, and to date there has been no consistent approach, with a wide range of hazard maps produced and little evaluation of their relative efficacy. Moreover, in traditional hazard mapping practice, scientists analyse data about a hazard and then display the results on a map that is presented to stakeholders. This one-way, top-down approach to hazard communication does not necessarily translate into effective hazard education or, as tragically demonstrated by Nevado del Ruiz, Colombia, in 1985, into its use in risk mitigation by civil authorities. Furthermore, the messages taken away from a hazard map can be strongly influenced by its visual design. Thus, hazard maps are more likely to be useful, usable and used if relevant stakeholders are engaged during the hazard map process to ensure that a) the map is designed in a relevant way and b) the map takes into account how users interpret and read different map features and designs. The IAVCEI Commission on Volcanic Hazards and Risk has recently launched a Hazard Mapping Working Group to collate some of these experiences in graphically depicting volcanic hazard from around the world, including Latin America and the Caribbean, with the aim of preparing some Considerations for Producing Volcanic Hazard Maps that may help map makers in the future.

  1. Liquefaction potential index: Field assessment

    USGS Publications Warehouse

    Toprak, S.; Holzer, T.L.

    2003-01-01

    Cone penetration test (CPT) soundings at historic liquefaction sites in California were used to evaluate the predictive capability of the liquefaction potential index (LPI), which was defined by Iwasaki et al. in 1978. LPI combines depth, thickness, and factor of safety of liquefiable material inferred from a CPT sounding into a single parameter. LPI data from the Monterey Bay region indicate that the probability of surface manifestations of liquefaction is 58 and 93%, respectively, when LPI equals or exceeds 5 and 15. LPI values also generally correlate with surface effects of liquefaction, decreasing from a median of 12 for soundings in lateral spreads to 0 for soundings where no surface effects were reported. The index is particularly promising for probabilistic liquefaction hazard mapping, where it may be a useful parameter for characterizing the liquefaction potential of geologic units.
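
    The index itself is straightforward to compute from a factor-of-safety profile. The sketch below implements the Iwasaki et al. (1978) definition, LPI = ∫₀²⁰ F(z) w(z) dz with w(z) = 10 − 0.5z and F(z) = 1 − FS(z) where FS < 1 (else 0); the example profile is invented:

```python
# Liquefaction potential index after Iwasaki et al. (1978).
import numpy as np

def lpi(depths_m, fs):
    """depths_m, fs: factor-of-safety profile, e.g. from a CPT sounding."""
    depths_m = np.asarray(depths_m, float)
    fs = np.asarray(fs, float)
    mask = depths_m <= 20.0                     # only the top 20 m contributes
    F = np.where(fs < 1.0, 1.0 - fs, 0.0)       # severity of liquefaction
    w = 10.0 - 0.5 * depths_m                   # depth weighting function
    return np.trapz((F * w)[mask], depths_m[mask])

# Example profile: liquefiable layer (FS < 1) between 2 and 6 m depth.
z = np.linspace(0.0, 20.0, 201)
fs = np.where((z > 2.0) & (z < 6.0), 0.7, 1.5)
print(f"LPI = {lpi(z, fs):.1f}")  # > 5 implies likely surface manifestation
```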

  2. An atlas of ShakeMaps for selected global earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  3. Mapping visual cortex in monkeys and humans using surface-based atlases

    NASA Technical Reports Server (NTRS)

    Van Essen, D. C.; Lewis, J. W.; Drury, H. A.; Hadjikhani, N.; Tootell, R. B.; Bakircioglu, M.; Miller, M. I.

    2001-01-01

    We have used surface-based atlases of the cerebral cortex to analyze the functional organization of visual cortex in humans and macaque monkeys. The macaque atlas contains multiple partitioning schemes for visual cortex, including a probabilistic atlas of visual areas derived from a recent architectonic study, plus summary schemes that reflect a combination of physiological and anatomical evidence. The human atlas includes a probabilistic map of eight topographically organized visual areas recently mapped using functional MRI. To facilitate comparisons between species, we used surface-based warping to bring functional and geographic landmarks on the macaque map into register with corresponding landmarks on the human map. The results suggest that extrastriate visual cortex outside the known topographically organized areas is dramatically expanded in human compared to macaque cortex, particularly in the parietal lobe.

  4. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    NASA Astrophysics Data System (ADS)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression modeling does not resolve significant effects due to site class at frequencies greater than approximately 5 Hz. Disaggregation of general seismic hazard models using V_ea indicates that the modal magnitudes for the higher frequency oscillators tend to be larger, and vary less with oscillator frequency, than those derived using PSV. Insofar as the elastic input energy may be a better parameter for quantifying the damage potential of ground motion, its use in probabilistic seismic hazard analysis could provide an improved means for selecting earthquake scenarios and establishing design earthquakes for many types of engineering analyses.
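
    For readers unfamiliar with the notation, the absolute input energy E_a of a single-degree-of-freedom oscillator of mass m is conventionally expressed as an equivalent velocity (the standard definition, e.g. Uang and Bertero), so that it carries velocity units and can be compared directly with PSV:

```latex
% Equivalent-velocity form of the absolute input energy (standard definition).
V_{ea} = \sqrt{\frac{2\,E_a}{m}}
```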

  5. Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks.

    PubMed

    Thacker, Scott; Kelly, Scott; Pant, Raghav; Hall, Jim W

    2018-01-01

    Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the "do nothing" case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial. © 2017 Society for Risk Analysis.
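
    The risk arithmetic behind such an appraisal can be sketched in a few lines: expected annual damages (EAD) are the integral of damage over annual exceedance probability, and the benefit of adaptation is the discounted stream of EAD reductions. All numbers below are illustrative placeholders, not results from the study:

```python
# Expected annual damages and the discounted benefit of an adaptation measure.
import numpy as np

# Annual exceedance probabilities and damages with and without protection.
p = np.array([0.10, 0.02, 0.01, 0.002])
damage_base = np.array([0.0, 5.0, 20.0, 80.0])       # illustrative, in GBP millions
damage_adapted = np.array([0.0, 0.0, 2.0, 40.0])

def ead(damage):
    return np.trapz(damage[::-1], p[::-1])           # integrate over probability

risk_reduction = ead(damage_base) - ead(damage_adapted)

years, rate = 50, 0.035                              # appraisal horizon, discount rate
benefit = sum(risk_reduction / (1 + rate) ** t for t in range(1, years + 1))
print(f"EAD reduction: {risk_reduction:.2f} M/yr; "
      f"PV of benefits over {years} yr: {benefit:.1f} M")
```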

  6. Numerical modelling of glacial lake outburst floods using physically based dam-breach models

    NASA Astrophysics Data System (ADS)

    Westoby, M. J.; Brasington, J.; Glasser, N. F.; Hambrey, M. J.; Reynolds, J. M.; Hassan, M. A. A. M.; Lowe, A.

    2015-03-01

    The instability of moraine-dammed proglacial lakes creates the potential for catastrophic glacial lake outburst floods (GLOFs) in high-mountain regions. In this research, we use a unique combination of numerical dam-breach and two-dimensional hydrodynamic modelling, employed within a generalised likelihood uncertainty estimation (GLUE) framework, to quantify predictive uncertainty in model outputs associated with a reconstruction of the Dig Tsho failure in Nepal. Monte Carlo analysis was used to sample the model parameter space, and morphological descriptors of the moraine breach were used to evaluate model performance. Multiple breach scenarios were produced by differing parameter ensembles associated with a range of breach initiation mechanisms, including overtopping waves and mechanical failure of the dam face. The material roughness coefficient was found to exert a dominant influence over model performance. The downstream routing of scenario-specific breach hydrographs revealed significant differences in the timing and extent of inundation. A GLUE-based methodology for constructing probabilistic maps of inundation extent, flow depth, and hazard is presented and provides a useful tool for communicating uncertainty in GLOF hazard assessment.
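
    A schematic of the GLUE weighting used to turn an ensemble of dam-breach runs into a probabilistic inundation map (the stand-in model, scores and acceptance threshold below are placeholders, not the Dig Tsho setup):

```python
# Minimal GLUE-style weighting: Monte Carlo parameter sets are scored against
# observed breach morphology, behavioural runs are likelihood-weighted, and the
# probabilistic map is the weighted fraction of runs that flood each cell.
import numpy as np

rng = np.random.default_rng(11)
n_runs, n_cells = 500, 1000

roughness = rng.uniform(0.02, 0.10, n_runs)        # Manning's n samples
# Stand-in "model": performance score peaks near an assumed best roughness.
score = np.exp(-((roughness - 0.05) / 0.02) ** 2)
behavioural = score > 0.3                          # GLUE acceptance threshold
weights = score[behavioural] / score[behavioural].sum()

# Stand-in inundation outputs: each run floods a random subset of cells.
flooded = rng.random((behavioural.sum(), n_cells)) < roughness[behavioural, None] * 8
prob_map = weights @ flooded.astype(float)         # P(cell inundated)
print(f"{behavioural.sum()} behavioural runs; "
      f"cells with P > 0.5: {(prob_map > 0.5).sum()}")
```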

  7. Collaborative Cyber-infrastructures for the Management of the UNESCO-IGCP Research Project "Forecast of tephra fallout"

    NASA Astrophysics Data System (ADS)

    Folch, A.; Costa, A.; Cordoba, G.

    2009-04-01

    Tephra fallout following explosive volcanic eruptions produces several hazardous effects on inhabitants, infrastructure, and property, and represents a serious threat for communities located around active volcanoes. In order to mitigate the effects on the surrounding areas, scientists and civil decision-making authorities need reliable short-term forecasts during episodes of eruptive crisis and long-term probabilistic maps to plan territorial policies and land use. Modelling, together with field studies and volcano monitoring, constitutes an indispensable tool to achieve these objectives. The UNESCO-IGCP research project proposal "Forecast of tephra fallout" aims to produce a series of tools capable of delivering both short-term forecasts and long-term hazard assessments using cutting-edge models for tephra transport and sedimentation. A dedicated project website will be designed to supply a set of models, procedures and expertise to several Latin American institutes based in countries seriously threatened by this geo-hazard (Argentina, Chile, Colombia, Ecuador, Mexico, and Nicaragua). This will provide the end users with a tool to produce short-term forecasts of tephra deposition on the ground and to determine airborne ash concentrations (a quantity of special relevance for air navigation safety) during eruptions and emergencies. The project website will have a public section and a password-protected area to exchange information and data among participants and, eventually, to allow remote execution of high-resolution mesoscale meteorological forecasts at the BSC facilities. The public section will be updated periodically and will include pages describing the project objectives and achievements as well as the hazard maps for the investigated volcanoes, and will be linked to other relevant websites such as the IAVCEI, IGCP, IUGS and UNESCO homepages. A part of the public section will be devoted to disseminating the scientific results achieved, providing general advice, and displaying hazard maps to a larger public beyond the scientific community. The private section of the website will include a software and documentation download area as well as a gateway to run the WRF mesoscale meteorological model and the parallel version of the FALL3D model at the BSC facilities. This will be invaluable during an emergency if the affected institution does not yet have an agreement with its national weather service.

  8. Regional liquefaction hazard evaluation following the 2010-2011 Christchurch (New Zealand) earthquake sequence

    NASA Astrophysics Data System (ADS)

    Begg, John; Brackley, Hannah; Irwin, Marion; Grant, Helen; Berryman, Kelvin; Dellow, Grant; Scott, David; Jones, Katie; Barrell, David; Lee, Julie; Townsend, Dougal; Jacka, Mike; Harwood, Nick; McCahon, Ian; Christensen, Steve

    2013-04-01

    Following the damaging 4 Sept 2010 Mw7.1 Darfield Earthquake, the 22 Feb 2011 Christchurch Earthquake and subsequent damaging aftershocks, we completed a liquefaction hazard evaluation for c. 2700 km2 of the coastal Canterbury region. Its purpose was to distinguish at a regional scale areas of land that, in the event of strong ground shaking, may be susceptible to damaging liquefaction from areas where damaging liquefaction is unlikely. This information will be used by local government for defining liquefaction-related geotechnical investigation requirements for consent applications. Following a review of historic records of liquefaction and existing liquefaction assessment maps, we undertook comprehensive new work that included: a geologic context from existing geologic maps; geomorphic mapping using LiDAR and integrating existing soil map data; compilation of lithological data for the surficial 10 m from an extensive drillhole database; modelling of depth to unconfined groundwater from existing subsurface and surface water data. Integrating and honouring all these sources of information, we mapped areas underlain by materials susceptible to liquefaction (liquefaction-prone lithologies present, or likely, in the near-surface, with shallow unconfined groundwater) from areas unlikely to suffer widespread liquefaction damage. Comparison of this work with more detailed liquefaction susceptibility assessment based on closely spaced geotechnical probes in Christchurch City provides a level of confidence in these results. We tested our susceptibility map by assigning a matrix of liquefaction susceptibility rankings to lithologies recorded in drillhole logs and local groundwater depths, then applying peak ground accelerations for four earthquake scenarios from the regional probabilistic seismic hazard model (25 year return = 0.13g; 100 year return = 0.22g; 500 year return = 0.38g and 2500 year return = 0.6g). Our mapped boundary between liquefaction-prone areas and areas unlikely to sustain heavy damage proved sound. In addition, we compared mapped liquefaction extents (derived from post-earthquake aerial photographs) from the 4 Sept 2010 Mw7.1 and 22 Feb 2011 Mw6.2 earthquakes with our liquefaction susceptibility map. The overall area of liquefaction for these two earthquakes was similar, and statistics show that for the first (large regional) earthquake, c. 93% of mapped liquefaction fell within the liquefaction-prone area, and for the second (local, high peak ground acceleration) earthquake, almost 99% fell within the liquefaction-prone area. We conclude that basic geological and groundwater data when coupled with LiDAR data can usefully delineate areas susceptible to liquefaction from those unlikely to suffer damaging liquefaction. We believe that these techniques can be used successfully in many other cities around the world.

  9. Design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, R.P.; Short, S.A.; McDonald, J.R.

    1990-06-01

    The Department of Energy (DOE) and the DOE Natural Phenomena Hazards Panel have developed uniform design and evaluation guidelines for protection against natural phenomena hazards at DOE sites throughout the United States. The goal of the guidelines is to assure that DOE facilities can withstand the effects of natural phenomena such as earthquakes, extreme winds, tornadoes, and flooding. The guidelines apply to both new facilities (design) and existing facilities (evaluation, modification, and upgrading). The intended audience is primarily the civil/structural or mechanical engineers conducting the design or evaluation of DOE facilities. The likelihood of occurrence of natural phenomena hazards at each DOE site has been evaluated by the DOE Natural Phenomena Hazard Program. Probabilistic hazard models are available for earthquake, extreme wind/tornado, and flood. Alternatively, site organizations are encouraged to develop site-specific hazard models utilizing the most recent information and techniques available. In this document, performance goals and natural hazard levels are expressed in probabilistic terms, and design and evaluation procedures are presented in deterministic terms. Design/evaluation procedures conform closely to common standard practices so that the procedures will be easily understood by most engineers. Performance goals are expressed in terms of structure or equipment damage to the extent that: (1) the facility cannot function; (2) the facility would need to be replaced; or (3) personnel are endangered. 82 refs., 12 figs., 18 tabs.

  10. Heterogeneous Data Fusion Methods for Disaster Risk Assessment using Grid Infrastructure

    NASA Astrophysics Data System (ADS)

    Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii

    2014-05-01

    In recent years, a risk-oriented approach to manage disasters has been adopted. Risk is a function of two arguments: hazard probability and vulnerability [1]. In order to assess flood risk, for example, aggregation of heterogeneous data acquired from multiple sources is required. Outputs from hydrological and hydraulic models make it possible to predict floods; in situ observations such as river level and flows are used for early warning and models calibration. Remote sensing observations can be effectively used for rapid mapping in case of emergencies, and can be assimilated into models. One point that is mutual for all datasets is their geospatial nature. In order to enable operational assessment of disaster risk, appropriate technology is necessary. In this paper we discuss different strategies to heterogeneous data fusion and show their application in the domain of disaster monitoring and risk assessment. In particular, two case-studies are presented. The first one focuses on the use of time-series of satellite imagery to flood hazard mapping and flood risk assessment. Flooded areas are extracted from satellite images to generate a maximum flood extent image for each flood event. These maps are fused to determine relative frequency of inundation (RFI) [2]. The RFI values are compared to relative water depth generated from the LISFLOOD-FP model. The model is calibrated against the satellite-derived flood extent. The model with different combinations of Manning's parameters was run in the Grid environment at Space Research Institute NASU-SSAU [3], and the optimal set of parameters was found. It is shown that RFI and water depth exhibit the same probabilistic distribution which is confirmed by Kolmogorov-Smirnov test. Therefore, it justifies the use of RFI values for risk assessment. The second case-study deals with quantitative estimation of drought risk in Ukraine based on satellite data. Drought hazard mapping is performed based on the use of vegetation health index (VHI) derived from NOAA satellites, and the extreme value theory techniques. Drought vulnerability is assessed by estimating the crop areas and crop yield to quantify potential impact of a drought on crop production. Finally, drought hazard and vulnerability maps are fused to derive a drought risk map. [1] N.N. Kussul, B.V. Sokolov, Y.I. Zyelyk, V.A. Zelentsov, S.V. Skakun, and A.Yu. Shelestov, "Disaster Risk Assessment Based on Heterogeneous Geospatial Information," J. of Autom. and Inf. Sci., 42(12), pp. 32-45, 2010. [2] S. Skakun, N. Kussul, A. Shelestov, and O. Kussul, "Flood Hazard and Flood Risk Assessment Using a Time Series of Satellite Images: A Case Study in Namibia," Risk Analysis, 2013, doi: 10.1111/risa.12156. [3] L. Hluchy, N. Kussul, A. Shelestov, S. Skakun, O. Kravchenko, Y. Gripich, P. Kopp, E. Lupian, "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, vol. 29, no. 2, pp. 319-334, 2010.
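
    The RFI computation and the distribution comparison reported above reduce to a few operations on a stack of binary flood maps. The sketch below uses synthetic maps and SciPy's two-sample Kolmogorov-Smirnov test; with real data the case study found the two distributions statistically indistinguishable:

```python
# Relative frequency of inundation (RFI) from per-event flood extent maps,
# plus a two-sample Kolmogorov-Smirnov comparison against modelled water depth.
# Maps and depths here are synthetic placeholders.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(5)
n_events, shape = 12, (50, 50)
flood_maps = rng.random((n_events, *shape)) < 0.3   # True = flooded pixel

rfi = flood_maps.mean(axis=0)                       # fraction of events flooded

water_depth = rng.beta(2, 5, size=shape)            # synthetic relative depth
stat, p_value = ks_2samp(rfi.ravel(), water_depth.ravel())
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
```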

  11. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    NASA Astrophysics Data System (ADS)

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damages to infrastructures. As a part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard poorly studied until now. In that area, landslides are mainly occasional, low amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need to have a better understanding of landslide hazards and their spatial distribution at the catchment scale to prevent local population from disasters with multi-hazard origin. The aim of this study is to produce a landslide susceptibility mapping at 1:50 000 scale as a first step towards global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical methods (i.e. probabilistic models). Due to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer a greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered as the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method like others is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. In particular, a complete and accurate landslide inventory is required before the modelling. The methodology used in our study includes five main steps: (i) a landslide inventory was compiled through extraction of landslide occurrences in existing national databases (BDMvt, RTM), photointerpretation of aerial photographs and extensive field surveys; (ii) the main predisposing factors were identified and implemented as digital layers into a GIS together with the landslide inventory map, thus constituting the predictive variables to introduce into the model; (iii) a logistic regression model was applied to analyze the spatial and mathematical relationships between the response variable (i.e. absence/presence of landslides) and the set of predictive variables (i.e. predisposing factors), after a selection procedure based on statistical tests (χ2-test and Cramer's V coefficient); (iv) an evaluation of the model performance and quality results was conducted using a validation strategy based on ROC curve and AUC analyses; (v) a final susceptibility map in four classes was proposed using a discretization method based on success/prediction rate curves. The results of the susceptibility modelling were finally interpreted and discussed in the light of what was previously known about landslide occurrence and triggering in the study area. The major influence of the distance-to-streams variable on the model confirms the strong hillslope-channel coupling observed empirically during rainfall-induced landslide events.
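
    Steps (iii) and (iv) of this workflow can be sketched with a standard logistic regression and ROC/AUC validation; the two predisposing factors and all data below are synthetic stand-ins for the Guil catchment layers:

```python
# Logistic regression of landslide presence/absence on predisposing factors,
# validated with the area under the ROC curve (AUC). Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n = 2000
slope = rng.uniform(0, 45, n)                        # degrees
dist_stream = rng.uniform(0, 500, n)                 # metres
logit = -3.0 + 0.08 * slope - 0.006 * dist_stream    # synthetic "truth"
y = rng.random(n) < 1 / (1 + np.exp(-logit))         # landslide presence

X = np.column_stack([slope, dist_stream])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.2f}")
```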

  12. Mean and modal ϵ in the deaggregation of probabilistic ground motion

    USGS Publications Warehouse

    Harmsen, Stephen C.

    2001-01-01

    Mean and modal ϵ exhibit a wide variation geographically for any specified PE. Modal ϵ for the 2% in 50 yr PE exceeds 2 near the most active western California faults, is less than –1 near some less active faults of the western United States (principally in the Basin and Range), and may be less than 0 in areal fault zones of the central and eastern United States (CEUS). This geographic variation is useful for comparing probabilistic ground motions with ground motions from scenario earthquakes on dominating faults, often used in seismic-resistant provisions of building codes. An interactive seismic-hazard deaggregation menu item has been added to the USGS probabilistic seismic-hazard analysis Web site, http://geohazards.cr.usgs.gov/eq/, allowing visitors to compute mean and modal distance, magnitude, and ϵ corresponding to ground motions having mean return times from 250 to 5000 yr for any site in the United States.
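
    In deaggregation, ε measures how far a target ground motion lies above or below the median prediction, in units of the logarithmic standard deviation of the ground-motion model; this standard definition is restated below for clarity:

```latex
% Deaggregation epsilon: number of log-standard deviations separating the
% target ground motion y* from the median prediction for magnitude m, distance r.
\epsilon = \frac{\ln y^{*} - \mu_{\ln Y}(m, r)}{\sigma_{\ln Y}(m, r)}
```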

  13. The use of Near-surface Geophysics in Evaluating and Assessing Natural Hazards

    NASA Astrophysics Data System (ADS)

    Pellerin, L.

    2007-12-01

    The list of natural hazards that transform the physical environment is extensive: earthquakes, tsunamis, floods, volcanoes, lahars, landslides and debris flows, avalanches, karst/cavern collapse, heavy-metal contamination, permafrost, liquefaction, and magnetic storms. Because these events or conditions can have a significant negative impact on health and infrastructure, knowledge about and education on natural hazards are important. Near-surface geophysics can contribute in significant ways to both the knowledge base and wider understanding of these hazards. The discipline encompasses a wide range of methodologies, some of which are described below. A post-tsunami helicopter electromagnetic (EM) survey along the coasts of Aceh, northern Sumatra was used to discriminate between freshwater and saltwater aquifers. Saltwater intrusion occurred close to the coast as a result of the tsunami, and deep saltwater occurrences, particularly around 30 m depth, were mapped up to several kilometers inland. Based on the survey results, recommendations were made to locate shallow hand-dug wells and medium-depth (60 m) water wells. Utilizing airborne EM and magnetic measurements, a detailed assessment of the internal distribution of altered zones within an active volcano, Mount Rainier (NW USA), showed that alteration is much more restricted than had been inferred from surficial exposures alone. The study also suggested that the collapse of fresh, unaltered portions of the volcano is possible, and no flank of the volcano can be considered immune from lahars during eruption. Ground penetrating radar (GPR) has been used worldwide in a variety of applications, including geotechnical investigations related to geologic hazards. These include assessment of transportation infrastructure that may be damaged by a natural hazard, study of the movement of rock glaciers in the Swiss Alps, and search and recovery of avalanche victims. Permafrost is widespread in polar areas and cold mountain terrain. GPR, electrical resistivity and EM methods have been used successfully to map permafrost and massive ground ice. The stability of these materials has an impact on building and development within these regions. Mass movements in lowland permafrost terrain, which have implications for climate change, are being monitored with thermal borehole measurements. Whether in times of flood or drought, understanding our watersheds is an important use of near-surface geophysics. Satellite-based remote sensing methods are used to efficiently obtain soil moisture measurements over large regions. Ground-based conductivity meters are used to map soil types that play a fundamental role in the pattern of stream flow response. Synthetic Aperture Radar (SAR) is used to directly measure surface deformation, which can be related to subsurface hydrological conditions, aseismic deformation or landslides. Resolution as fine as 2 mm/year can be obtained from satellite-based measurements. This level of resolution aids in seismic risk assessment and allows the extent of landslides to be mapped and monitored efficiently. A series of national probabilistic seismic shaking hazard maps are being produced by the US Geological Survey using gravity, magnetic and seismic data in addition to other information. These maps can be used as input for policy decisions on building codes and land use, and to develop estimates of the probabilities of strong earthquakes, detailed maps of shaking amplification and susceptibility to liquefaction and landslides, and planning scenarios for large urban earthquakes.

  14. Systems and methods that generate height map models for efficient three dimensional reconstruction from depth information

    DOEpatents

    Frahm, Jan-Michael; Pollefeys, Marc Andre Leon; Gallup, David Robert

    2015-12-08

    Methods of generating a three dimensional representation of an object in a reference plane from a depth map including distances from a reference point to pixels in an image of the object taken from a reference point. Weights are assigned to respective voxels in a three dimensional grid along rays extending from the reference point through the pixels in the image based on the distances in the depth map from the reference point to the respective pixels, and a height map including an array of height values in the reference plane is formed based on the assigned weights. An n-layer height map may be constructed by generating a probabilistic occupancy grid for the voxels and forming an n-dimensional height map comprising an array of layer height values in the reference plane based on the probabilistic occupancy grid.
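
    A schematic of the described pipeline, under strong simplifying assumptions (orthographic rays, a hand-rolled weighting scheme, and a crude occupancy threshold rather than the patent's exact procedure):

```python
# Schematic voxel-weighting pass: rays from the reference point deposit
# negative weight in observed free space and positive weight near the measured
# depth; each vertical column is then reduced to a height value.
import numpy as np

def height_map_from_depth(depth, grid_res=32, z_levels=16, max_depth=10.0):
    """depth: (H, W) array of distances from the reference point."""
    weights = np.zeros((grid_res, grid_res, z_levels))
    H, W = depth.shape
    for v in range(H):
        for u in range(W):
            gx = int(u / W * grid_res)            # pixel -> grid column
            gy = int(v / H * grid_res)
            z_hit = int(np.clip(depth[v, u] / max_depth, 0.0, 0.999) * z_levels)
            weights[gy, gx, :z_hit] -= 1.0        # observed empty space
            weights[gy, gx, z_hit] += 2.0         # observed surface
    occupied = weights > 0.0
    nearest = np.argmax(occupied, axis=2)         # first occupied level per column
    has_surface = occupied.any(axis=2)
    return (z_levels - nearest) * has_surface     # height in voxel levels

depth = np.full((64, 64), 9.0)                    # flat floor 9 m away
depth[16:48, 16:48] = 4.0                         # a box closer to the sensor
print(np.unique(height_map_from_depth(depth)))    # two height levels expected
```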

  15. Impact from Magnitude-Rupture Length Uncertainty on Seismic Hazard and Risk

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Kane, D. L.

    2015-12-01

    In probabilistic seismic hazard and risk assessments seismic sources are typically divided into two groups: fault sources (to model known faults) and background sources (to model unknown faults). In areas like the Central and Eastern United States and Hawaii the hazard and risk is driven primarily by background sources. Background sources can be modeled as areas, points or pseudo-faults. When background sources are modeled as pseudo-faults, magnitude-length or magnitude-area scaling relationships are required to construct these pseudo-faults. However the uncertainty associated with these relationships is often ignored or discarded in hazard and risk models, particularly when faults sources are the dominant contributor. Conversely, in areas modeled only with background sources these uncertainties are much more significant. In this study we test the impact of using various relationships and the resulting epistemic uncertainties on the seismic hazard and risk in the Central and Eastern United States and Hawaii. It is common to use only one magnitude length relationship when calculating hazard. However, Stirling et al. (2013) showed that for a given suite of magnitude-rupture length relationships the variability can be quite large. The 2014 US National Seismic Hazard Maps (Petersen et al., 2014) used one magnitude-rupture length relationship (Somerville, et al., 2001) in the Central and Eastern United States, and did not consider variability in the seismogenic rupture plane width. Here we use a suite of metrics to compare the USGS approach with these variable uncertainty models to assess 1) the impact on hazard and risk and 2) the epistemic uncertainty associated with choice of relationship. In areas where the seismic hazard is dominated by larger crustal faults (e.g. New Madrid) the choice of magnitude-rupture length relationship has little impact on the hazard or risk. However away from these regions, the choice of relationship is more significant and may approach the size of the uncertainty associated with the ground motion prediction equation suite.
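
    The sketch below illustrates how such variability can be carried through rather than discarded: rupture lengths are sampled from a log10-linear scaling relation including its standard deviation, not just its median. The coefficients are of the Wells-and-Coppersmith form but should be read as illustrative placeholders, not the values used in the hazard maps:

```python
# Propagating magnitude-rupture length uncertainty by sampling from a
# log10-linear scaling relation with its standard deviation.
import numpy as np

rng = np.random.default_rng(14)

def rupture_length_km(mag, a=-2.44, b=0.59, sigma_log10=0.16, n=10_000):
    """Sample rupture lengths; a, b, sigma are illustrative placeholders."""
    return 10 ** (a + b * mag + rng.normal(0.0, sigma_log10, n))

for m in (6.0, 7.0):
    L = rupture_length_km(m)
    print(f"M{m}: median {np.median(L):5.1f} km, "
          f"5-95% range {np.percentile(L, 5):5.1f}-{np.percentile(L, 95):5.1f} km")
```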

  16. Probabilistic seismic hazard assessment for the effect of vertical ground motions on seismic response of highway bridges

    NASA Astrophysics Data System (ADS)

    Yilmaz, Zeynep

    Typically, the vertical component of the ground motion is not considered explicitly in the seismic design of bridges, but in some cases the vertical component can have a significant effect on the structural response. The key question of when the vertical component should be incorporated in design is addressed by a probabilistic seismic hazard assessment study incorporating probabilistic seismic demand models and ground motion models. Nonlinear simulation models with varying configurations of an existing bridge in California were considered in the analytical study. The simulation models were subjected to the set of selected ground motions in two stages: at first, only the horizontal components of the motion were applied, while in the second stage the structures were subjected to both horizontal and vertical components applied simultaneously, and the ground motions that produced the largest adverse effects on the bridge system were identified. Moment demand at the mid-span and at the support of the longitudinal girder and the axial force demand in the column are found to be significantly affected by the vertical excitations. These response parameters can be modeled using simple ground motion parameters such as horizontal spectral acceleration and vertical spectral acceleration within a 5% to 30% error margin, depending on the type of the parameter and the period of the structure. For a complete hazard assessment, both of these ground motion parameters explaining the structural behavior should be modeled. For the horizontal spectral acceleration, the Abrahamson and Silva (2008) model was used from among the many available standard models. A new NGA vertical ground motion model consistent with the horizontal model was constructed. These models are combined in a vector probabilistic seismic hazard analysis. Series of hazard curves were developed and presented for different locations in the Bay Area for soil site conditions, providing a roadmap for the prediction of these effects in future earthquakes. Findings from this study will contribute to the development of revised guidelines to address vertical ground motion effects, particularly in near-fault regions, in the seismic design of highway bridges.

  17. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
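
    For reference, the hazard function of the generalized Pareto magnitude model mentioned above follows directly from its density and survival functions; this is the standard textbook result, not the paper's full nonstationary derivation over the failure-time series T. In LaTeX form, with scale \sigma > 0 and shape \xi:

      S(x) = \Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi}, \qquad
      f(x) = \frac{1}{\sigma}\Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi - 1},
      \qquad
      h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x},

    which reduces to the constant exponential hazard 1/\sigma as \xi \to 0.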

  18. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  19. Attenuation Tomography Based on Strong Motion Data: Case Study of Central Honshu Region, Japan

    NASA Astrophysics Data System (ADS)

    Kumar, Parveen; Joshi, A.; Verma, O. P.

    2013-12-01

    The three-dimensional frequency-dependent S-wave quality factor (Qβ(f)) for the central Honshu region of Japan has been determined in this paper using an algorithm based on inversion of strong motion data. The inversion method for determining three-dimensional attenuation coefficients was proposed by Hashida and Shimazaki (J Phys Earth. 32, 299-316, 1984) and has been used and modified by Joshi (Curr Sci. 90, 581-585, 2006; Nat Hazards. 43, 129-146, 2007) and Joshi et al. (J. Seismol. 14, 247-272, 2010). Twenty-one earthquakes digitally recorded at strong motion stations of the KiK-net network have been used in this work. The magnitudes of these earthquakes range from 3.1 to 4.2, with depths ranging from 5 to 20 km. Borehole data with high signal-to-noise ratios and minimal site effects are used in the present work. The attenuation structure is determined by dividing the entire area into twenty-five three-dimensional blocks of uniform thickness, each having a different frequency-dependent shear wave quality factor. Shear wave quality factor values have been determined at frequencies of 2.5, 7.0 and 10 Hz from records in a rectangular grid defined by 35.4°N to 36.4°N and 137.2°E to 138.2°E. The obtained attenuation structure is compared with the available geological features in the region, and the comparison shows that the obtained structure is capable of resolving important tectonic features present in the area. The proposed attenuation structure is also compared with the probabilistic seismic hazard map of the region and bears remarkable similarity to the patterns seen in that map.
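
    A minimal sketch of the frequency-dependent attenuation term underlying such inversions is shown below; the Q0, n, and shear-wave velocity values are illustrative placeholders, not the study's inverted values.

      import numpy as np

      def spectral_decay(freq_hz, dist_km, q0=100.0, n=0.8, beta_km_s=3.5):
          """Anelastic amplitude decay exp(-pi * f * r / (Q(f) * beta)),
          with a power-law quality factor Q(f) = Q0 * f**n."""
          q_f = q0 * freq_hz ** n
          return np.exp(-np.pi * freq_hz * dist_km / (q_f * beta_km_s))

      # Relative amplitudes at the three analysis frequencies, 50 km from the source.
      for f in (2.5, 7.0, 10.0):
          print(f, spectral_decay(f, 50.0))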

  20. Global coastal flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Eilander, Dirk; Winsemius, Hessel; Ward, Philip; Diaz Loaiza, Andres; Haag, Arjen; Verlaan, Martin; Luo, Tianyi

    2017-04-01

    Over 10% of the world's population lives in low-lying coastal areas (up to 10 m elevation). Many of these areas are prone to flooding from tropical storm surges or extra-tropical high sea levels in combination with high tides. A 1-in-100-year extreme sea level is estimated to expose 270 million people and 13 trillion USD worth of assets to flooding. Coastal flood risk is expected to increase due to drivers such as ground subsidence, intensification of tropical and extra-tropical storms, sea level rise, and socio-economic development. For a better understanding of the hazard and the drivers of global coastal flood risk, a globally consistent analysis of coastal flooding is required. In this contribution we present a comprehensive global coastal flood hazard mapping study. Coastal flooding is estimated using a modular inundation routine, based on a vegetation-corrected SRTM elevation model and forced by extreme sea levels. Per tile, either a simple GIS inundation routine or a hydrodynamic model can be selected. The GIS inundation method projects extreme sea levels onto land, taking into account physical obstructions and attenuation of the surge level moving inland. For coastlines with steep slopes, or where local dynamics play a minor role in flood behavior, this fast GIS method can be applied. Extreme sea levels are derived from the Global Tide and Surge Reanalysis (GTSR) dataset. Future sea level projections are based on probabilistic sea level rise for the RCP 4.5 and RCP 8.5 scenarios. The approach is validated against observed flood extents from ground and satellite observations. The results will be made available through the online Aqueduct Global Flood Risk Analyzer of the World Resources Institute.
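
    The simple GIS projection step can be sketched as a "bathtub" calculation with landward damping, as below. The attenuation rate and the input grids are assumptions for illustration; the actual routine also accounts for physical obstructions and hydrological connectivity.

      import numpy as np

      def bathtub_depth(dem_m, dist_to_coast_m, extreme_sea_level_m,
                        attenuation_m_per_km=0.5):
          """Flood depth from projecting an extreme sea level onto land,
          with the surge level damped linearly moving inland."""
          local_level = extreme_sea_level_m - attenuation_m_per_km * dist_to_coast_m / 1000.0
          return np.maximum(local_level - dem_m, 0.0)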

  1. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes.

    PubMed

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-04-12

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use representative members to capture a range of eruptive sizes and intensities and thereby reflect a wider size class, a scenario approach neglects the intrinsic variability of volcanic eruptions and implicitly assumes that inter-class size variability (i.e., size differences between eruptive size classes) dominates over intra-class size variability (i.e., size differences within an eruptive size class), the latter being treated as negligible. So far, no quantitative study has been undertaken to verify this assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variability, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow us to determine whether a simplified scenario approach can be considered valid, and to quantify the bias that arises when the full variability is not accounted for.

  2. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    PubMed

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    The Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from a presently occasional, transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic for marine spatial planning around this emerging noise issue. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  3. Assessment of global flood exposures - developing an appropriate approach

    NASA Astrophysics Data System (ADS)

    Millinship, Ian; Booth, Naomi

    2015-04-01

    Increasingly complex probabilistic catastrophe models have become the standard for quantitative flood risk assessments by re/insurance companies. On the one hand, probabilistic modelling of this nature is extremely useful: a large range of risk metrics can be output. On the other hand, such models can be time-consuming and computationally expensive to develop and run. Levels of uncertainty are persistently high despite, or perhaps because of, attempts to increase resolution and complexity. A cycle of dependency between modelling companies and re/insurers has developed whereby available models are purchased, models run, and both portfolio and model data 'improved' every year. This can lead to potential exposures in perils and territories that are not currently modelled being largely overlooked by companies, who may then face substantial and unexpected losses when large events occur in these areas. We present here an approach to assessing global flood exposures which reduces the scale and complexity of the approach used and begins with the identification of hotspots where there is significant exposure to flood risk. The method comprises four stages: i) compiling consistent exposure information, ii) applying reinsurance terms and conditions to calculate values exposed, iii) assessing the potential hazard using a global set of flood hazard maps, and iv) identifying potential risk 'hotspots', which includes consideration of spatially and/or temporally clustered historical events and local flood defences. This global exposure assessment is designed as a scoping exercise, and reveals areas or cities where the potential for accumulated loss is of significant interest to a reinsurance company, and for which there is no existing catastrophe model. These regions are then candidates for the development of deterministic scenarios or probabilistic models. The key advantages of this approach will be discussed. These include simplicity and the ability of business leaders to understand results, as well as ease and speed of analysis and the advantages this can offer in terms of monitoring changing exposures over time. Significantly, in many areas of the world, this increase in exposure is likely to have more of an impact on increasing catastrophe losses than potential anthropogenically driven changes in weather extremes.

  4. Integrating volcanic hazard data in a systematic approach to develop volcanic hazard maps in the Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Lindsay, Jan M.; Robertson, Richard E. A.

    2018-04-01

    We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenarios of likely future activity, for which scenario-based hazard maps were generated. For the most likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps and the method by which hazard information was combined to create integrated hazard zonation maps, and we illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as the type of base map, the use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict the offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that the documentation of our experience might be useful for other map makers to take into account when creating new or updating existing maps.

  5. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and time-independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and no cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.
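
    The contrast between time-dependent and time-independent estimates can be illustrated with a single-fault conditional-probability calculation, as in the sketch below. The lognormal renewal model and all parameter values (mean recurrence of 200 yr, 150 yr elapsed, 30-yr window) are illustrative assumptions, not the CDMG model's inputs; note how shrinking the intrinsic sigma widens the gap between the two estimates, the trend reported in the abstract.

      import numpy as np
      from scipy.stats import lognorm

      def renewal_probability(elapsed_yr, window_yr, mean_recur_yr, sigma):
          """P(rupture in the next window | quiet for elapsed years) under a
          lognormal renewal model with intrinsic (aleatory) sigma."""
          # scipy parameterization: shape = sigma, scale = median.
          dist = lognorm(s=sigma, scale=mean_recur_yr * np.exp(-sigma ** 2 / 2.0))
          cdf_elapsed = dist.cdf(elapsed_yr)
          return (dist.cdf(elapsed_yr + window_yr) - cdf_elapsed) / (1.0 - cdf_elapsed)

      def poisson_probability(window_yr, mean_recur_yr):
          """Time-independent counterpart."""
          return 1.0 - np.exp(-window_yr / mean_recur_yr)

      for sigma in (0.3, 0.5, 0.8):
          print(sigma, renewal_probability(150.0, 30.0, 200.0, sigma),
                poisson_probability(30.0, 200.0))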

  6. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro; Alexander, Nicholas A.; Kongko, Widjo; Muhari, Abdul

    2017-12-01

    This study develops tsunami evacuation plans for Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect the asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans for Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis, with maximum tsunami inundation heights and depths of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan - including horizontal evacuation area maps, an assessment of temporary shelters considering the impact of ground shaking and tsunami, and integrated horizontal-vertical evacuation time maps - has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from stochastic tsunami simulation for future tsunamigenic events.

  7. New Intensity Attenuation in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N. S.; Varazanashvili, O.; Tibaldi, A.; Bonali, F.; Gogoladze, Z.; Kvavadze, N.; Kvedelidze, I.

    2016-12-01

    In seismic-prone zones, increasing urbanization and infrastructure in turn increase seismic risk, which is mainly related to the level of seismic hazard itself, the seismic resistance of dwellings, and many other factors. The main objective of the present work is to improve the regional seismic hazard maps of Georgia by implementing state-of-the-art probabilistic seismic hazard assessment techniques and outputs from recent national and international collaborations. Seismic zoning is the identification of zones of similar levels of earthquake hazard. With reference to seismic zoning by ground motion assessment, the shaking intensity essentially depends on i) regional seismicity, ii) attenuation of ground motion with distance, and iii) local site effects on ground motion. In the last decade, seismic hazard assessment has been presented in terms of Peak Ground Acceleration (PGA), Peak Ground Velocity (PGV), or other recorded parameters. However, strong motion datasets in Georgia are very limited. Furthermore, building vulnerability is still estimated in terms of intensity, and there is no information on the correlation between the distribution of recorded ground motion parameters and damage. Macroseismic intensity is therefore still a very important parameter for strong ground motion evaluation. In the present work, we calibrated intensity prediction equations (IPEs) for the Georgian dataset based on about 78 reviewed earthquakes. Metadata for intensity (MSK-64 scale) were constrained, and prediction equations for various types of distance (epicentral and hypocentral distance, Joyner-Boore distance, closest distance to the fault rupture plane) were calibrated. Relations between intensity and PGA values were derived. For this we used the hybrid-empirical ground motion equation derived for Georgia and ran scenario earthquakes for events with macroseismic data.
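
    The calibration step can be sketched as an ordinary least-squares fit of a typical IPE functional form, I = c0 + c1*M + c2*log10(R); both this functional form and the synthetic observations below are assumptions for illustration, not the study's actual regression.

      import numpy as np

      def fit_ipe(intensity, mag, dist_km):
          """Least-squares fit of I = c0 + c1*M + c2*log10(R)."""
          design = np.column_stack([np.ones_like(mag), mag, np.log10(dist_km)])
          coef, *_ = np.linalg.lstsq(design, intensity, rcond=None)
          return coef  # c0, c1, c2

      # Hypothetical observations (MSK-64 intensity, magnitude, hypocentral km).
      mag = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 5.8])
      dist_km = np.array([10.0, 20.0, 15.0, 40.0, 30.0, 25.0])
      intensity = np.array([6.1, 6.0, 7.2, 6.5, 7.6, 6.4])
      print(fit_ipe(intensity, mag, dist_km))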

  8. Evaluation of the Seismic Hazard in Venezuela with a revised seismic catalog that seeks for harmonization along the country borders

    NASA Astrophysics Data System (ADS)

    Rendon, H.; Alvarado, L.; Paolini, M.; Olbrich, F.; González, J.; Ascanio, W.

    2013-05-01

    Probabilistic Seismic Hazard Assessment is a complex endeavor that relies on the quality of information coming from different sources: the seismic catalog, active fault parameters, strain rates, etc. With this in mind, during the last several months the FUNVISIS seismic hazard group has been working on a review and update of the local database that forms the basis for a reliable PSHA calculation. In particular, the seismic catalog, which provides the information needed to evaluate the critical b-value controlling how seismic occurrence is distributed with magnitude, has received particular attention. The seismic catalog is the result of the effort of several generations of researchers over the years; therefore, the catalog necessarily suffers from a lack of consistency, homogeneity, and completeness across all magnitude ranges over any seismic study area. Merging the FUNVISIS instrumental catalog with those obtained from international agencies, we present the work we have been doing to produce a consistent seismic catalog that covers Venezuela entirely, with seismic events from 1910 to 2012, and we report the magnitude of completeness for the different periods. We also present preliminary results of the seismic hazard evaluation that takes into account this instrumental catalog, the historical catalog, updated known fault geometries and their corresponding parameters, and the new seismic sources that have been defined accordingly. Within the spirit of the Global Earthquake Model (GEM), all these efforts seek possible bridges with neighboring countries to establish consistent hazard maps across the borders.
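
    The b-value evaluation that this catalog work supports is commonly done with the Aki-Utsu maximum-likelihood estimator, sketched below under the assumptions of a known magnitude of completeness Mc and magnitudes binned at a fixed width.

      import numpy as np

      def b_value_aki_utsu(mags, mc, bin_width=0.1):
          """Maximum-likelihood b-value (Aki 1965; Utsu 1965), with the
          standard half-bin correction for binned magnitudes."""
          m = np.asarray(mags, dtype=float)
          m = m[m >= mc]
          return np.log10(np.e) / (m.mean() - (mc - bin_width / 2.0))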

  9. Effect of Variable Manning Coefficients on Tsunami Inundation

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Rees, D.

    2017-12-01

    Numerical simulations are commonly used to help estimate tsunami hazard, improve evacuation plans, issue or cancel tsunami warnings, and inform forecasting and hazard assessments, and they have therefore become an integral part of hazard mitigation in the tsunami community. Many numerical codes exist for simulating tsunamis, most of which have undergone extensive benchmarking and testing. Tsunami hazard or risk assessments employ these codes following a deterministic or probabilistic approach. Depending on their scope, these studies may or may not consider uncertainty in the numerical simulations, the effects of tides, or variable friction, or estimate financial losses; none of these considerations is necessarily trivial. Distributed Manning coefficients, the roughness coefficients used in hydraulic modeling, are commonly used in simulating both riverine and pluvial flood events; however, their use in tsunami hazard assessments has largely been confined to limited-scope studies and is, for the most part, not standard practice. For this work, we investigate variations in Manning coefficients and their effects on tsunami inundation extent, pattern, and financial loss. To assign Manning coefficients we use land use maps from the New Zealand Land Cover Database (LCDB) and more recent data from the Ministry for the Environment. More than 40 classes covering different types of land use are combined into major classes, such as cropland, grassland, and wetland, representing common types of land use in New Zealand, each of which is assigned a unique Manning coefficient. By utilizing different data sources for variable Manning coefficients, we examine the impact of data sources and classification methodology on the accuracy of model outputs.
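
    Assigning a unique Manning coefficient per aggregated land-use class amounts to a lookup over the classified grid, as sketched below; the class names follow the abstract, but the n values are generic literature-style placeholders, not those adopted in the study.

      import numpy as np

      # Illustrative Manning's n per aggregated land-use class (s m^(-1/3)).
      MANNING_N = {"water": 0.025, "grassland": 0.035, "cropland": 0.040,
                   "wetland": 0.050, "forest": 0.100, "urban": 0.120}

      def manning_grid(class_grid):
          """Map a 2-D array of land-use class names to Manning coefficients."""
          return np.vectorize(MANNING_N.get)(class_grid).astype(float)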

  10. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model for Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster it, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously, the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent occurrence of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates and present the results and their impact on Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and the drivers for these changes. Finally, we will examine the impact the model changes have on Alaska earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance: the Trans-Alaska pipeline, industrial facilities in Valdez, and typical residential wood buildings in Anchorage, Fairbanks, and Juneau.
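
    Of the risk metrics mentioned, average annual loss follows directly from the loss exceedance probability curve; the sketch below integrates a hypothetical curve with the trapezoidal rule.

      import numpy as np

      def average_annual_loss(loss, annual_exceedance_rate):
          """AAL = area under the annual loss exceedance-frequency curve."""
          loss = np.asarray(loss, dtype=float)
          rate = np.asarray(annual_exceedance_rate, dtype=float)
          order = np.argsort(loss)
          return np.trapz(rate[order], loss[order])

      # Hypothetical curve: annual rate of exceeding each loss level (USD).
      print(average_annual_loss([1e6, 1e7, 1e8, 1e9],
                                [1e-1, 1e-2, 1e-3, 1e-4]))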

  11. Earthquake Hazard and Risk in New Zealand

    NASA Astrophysics Data System (ADS)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand, we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), will update several key source parameters. These updates include: implementation of a new set of crustal faults, including multi-segment ruptures; updated subduction zone geometry and recurrence rates; and new background rates with a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources. This includes the addition of many offshore faults in the northern, east-central, and southwestern regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version. Changes between the two maps are discussed, as are the drivers for these changes. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the country's risk exposure (Auckland) lies in the region of lowest hazard, where little information about the location of faults is available and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus, small changes to the background rates can have a large impact on the risk profile for the area. Wellington, another area of high exposure, is particularly sensitive to how the Hikurangi subduction zone and the Wellington fault are modeled. Minor changes to these sources have substantial impacts on the risk profile of the city and the country at large.

  12. Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India

    USGS Publications Warehouse

    Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.

    2004-01-01

    We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and to rates for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of the three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region at the 2%-probability-of-exceedance-in-50-years hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs at the 10%-in-50-years hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models, which consider fault interevent times of 800 years, are considerably higher than those of other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves compared with the crustal interplate relations used in previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. © 2004 Elsevier B.V. All rights reserved.
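
    Under the usual Poisson assumption, the two hazard levels quoted above map to fixed return periods; the conversion is a one-liner:

      import math

      def return_period_yr(p_exceed, t_yr=50.0):
          """Return period for exceedance probability p_exceed in t_yr years,
          assuming Poisson occurrence: T = -t / ln(1 - p)."""
          return -t_yr / math.log(1.0 - p_exceed)

      print(return_period_yr(0.02))  # ~2475 yr
      print(return_period_yr(0.10))  # ~475 yr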

  13. Application-driven ground motion prediction equation for seismic hazard assessments in non-cratonic moderate-seismicity areas

    NASA Astrophysics Data System (ADS)

    Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.

    2017-09-01

    We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the same order as that affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.

  14. Fault-based PSHA of an active tectonic region characterized by low deformation rates: the case of the Lower Rhine Graben

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Vleminckx, Bart; Camelbeeck, Thierry

    2016-04-01

    The Lower Rhine Graben (LRG) is one of the few regions in intraplate NW Europe where seismic activity can be linked to active faults, yet probabilistic seismic hazard assessments of this region have hitherto been based on area-source models, in which the LRG is modeled as a single or a small number of seismotectonic zones with uniform seismicity. While fault-based PSHA has become common practice in more active regions of the world (e.g., California, Japan, New Zealand, Italy), knowledge of active faults has been lagging behind in other regions, due to incomplete tectonic inventories, a low level of seismicity, a lack of systematic fault parameterization, or a combination thereof. In the past few years, efforts have increasingly been directed to the inclusion of fault sources in PSHA in these regions as well, in order to predict hazard on a more physically sound basis. In Europe, the EC project SHARE ("Seismic Hazard Harmonization in Europe", http://www.share-eu.org/) represented an important step forward in this regard. In the frame of this project, we previously compiled the first parameterized fault model for the LRG that can be applied in PSHA. We defined 15 fault sources based on major stepovers, bifurcations, gaps, and important changes in strike, dip direction or slip rate. Based on the available data, we were able to place reasonable bounds on the parameters required for time-independent PSHA: length, width, strike, dip, rake, slip rate, and maximum magnitude. With long-term slip rates remaining below 0.1 mm/yr, the LRG can be classified as a low-deformation-rate structure. Information on recurrence intervals and the elapsed time since the last major earthquake is lacking for most faults, impeding time-dependent PSHA. We consider different models to construct the magnitude-frequency distribution (MFD) of each fault: a slip-rate-constrained form of the classical truncated Gutenberg-Richter MFD (Anderson & Luco, 1983) versus a characteristic MFD following Youngs & Coppersmith (1985). The summed Anderson & Luco fault MFDs show remarkably good agreement with the MFD obtained from the historical and instrumental catalog for the entire LRG, whereas the summed Youngs & Coppersmith MFD clearly underpredicts low to moderate magnitudes but yields higher occurrence rates for M > 6.3 than would be obtained by simple extrapolation of the catalog MFD. The moment rate implied by the Youngs & Coppersmith MFDs is about three times higher, but is still within the range allowed by current GPS uncertainties. Using the open-source hazard engine OpenQuake (http://openquake.org/), we compute hazard maps for return periods of 475, 2475, and 10,000 yr, and for spectral periods of 0 s (PGA) and 1 s. We explore the impact of various parameter choices, such as MFD model, GMPE distance metric, and the inclusion of a background zone to account for lower magnitudes, and we also compare the results with hazard maps based on area-source models. References: Anderson, J. G., and J. E. Luco (1983), Consequences of slip rate constraints on earthquake occurrence relations, Bull. Seismol. Soc. Am., 73(2), 471-496. Youngs, R. R., and K. J. Coppersmith (1985), Implications of fault slip rates and earthquake recurrence models to probabilistic seismic hazard estimates, Bull. Seismol. Soc. Am., 75(4), 939-964.
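
    The slip-rate-constrained MFD construction can be illustrated with a numerical moment-balance sketch: the annual rate of events above Mmin is set so that a truncated Gutenberg-Richter distribution releases the geologic moment rate mu * A * slip rate. This captures the general idea behind slip-rate-constrained MFDs rather than the exact Anderson & Luco (1983) closed forms, and all parameter values are illustrative.

      import numpy as np

      def gr_rate_from_slip(slip_rate_mm_yr, area_km2, b=0.9,
                            m_min=4.0, m_max=6.8, mu_pa=3.0e10):
          """Annual rate of M >= m_min balancing the geologic moment rate."""
          moment_rate = mu_pa * area_km2 * 1e6 * slip_rate_mm_yr * 1e-3  # N*m/yr
          m = np.linspace(m_min, m_max, 1000)
          beta = b * np.log(10.0)
          # Truncated exponential magnitude density on [m_min, m_max].
          pdf = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
          moment_per_event = 10.0 ** (1.5 * m + 9.05)  # Hanks & Kanamori, N*m
          return moment_rate / np.trapz(moment_per_event * pdf, m)

      # Hypothetical fault: 0.1 mm/yr slip rate on a 20 km x 15 km plane.
      print(gr_rate_from_slip(slip_rate_mm_yr=0.1, area_km2=20.0 * 15.0))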

  15. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable Seismic History is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date clearly demonstrates that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable or computer simulations and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD TO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, so that there is: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphologic, seismic, and tectonic evidence and data, combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, but not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This proves that Contemporary Science can do a better job in disclosing Natural Hazards, assessing Risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate shifting the minds of the community from pessimistic disbelief to the optimistic challenge of neo-deterministic Hazard Predictability.

  16. Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.

    2014-01-01

    The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high-resolution, remotely sensed, and spatially continuous data as proxies for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities. The percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and the results show that there is a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
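
    The core of such a model is the logistic link between the geospatial predictors and liquefaction probability; the sketch below uses placeholder coefficients and predictors for illustration, not the paper's fitted model.

      import math

      def liquefaction_probability(ln_pga, cti, ln_vs30,
                                   b0=24.1, b_pga=2.067, b_cti=0.355, b_vs30=-4.784):
          """Logistic model P = 1 / (1 + exp(-z)), with
          z = b0 + b_pga*ln(PGA) + b_cti*CTI + b_vs30*ln(Vs30).
          Coefficients here are illustrative placeholders."""
          z = b0 + b_pga * ln_pga + b_cti * cti + b_vs30 * ln_vs30
          return 1.0 / (1.0 + math.exp(-z))

      # Example: PGA = 0.3 g, compound topographic index 6, Vs30 = 250 m/s.
      print(liquefaction_probability(math.log(0.3), 6.0, math.log(250.0)))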

  17. Wind/tornado design criteria, development to achieve required probabilistic performance goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, D.S.

    1991-06-01

    This paper describes the strategy for developing new design criteria for a critical facility to withstand loading induced by the wind/tornado hazard. The proposed design requirements for resisting wind/tornado loads are based on probabilistic performance goals. The proposed design criteria were prepared by a Working Group consisting of six experts in wind/tornado engineering and meteorology. Utilizing their best technical knowledge and judgment in the wind/tornado field, they met and discussed the methodologies and reviewed available data. A review of the available wind/tornado hazard model for the site, structural response evaluation methods, and conservative acceptance criteria led to proposed design criteria that have a high probability of achieving the required performance goals.

  19. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficient of restitution values. In order to address this problem, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of the results to basic assumptions, e.g. the assessment of variograms and the choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were compared, and those showing the lowest errors were adopted. The datasets that were statistically analysed pertain to both kinetic energy and surveyed rock blocks in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of San Quirico Village has the highest level of hazard according to both the probabilistic and deterministic methods.

  20. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and differing accuracy. Regrettably, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".

  1. Conditional spectrum computation incorporating multiple causal earthquakes and ground-motion prediction models

    USGS Publications Warehouse

    Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas

    2013-01-01

    The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
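
    For the single-scenario, single-GMPM baseline that this paper generalizes, the conditional mean and standard deviation are the familiar expressions sketched below; the inter-period correlation rho would come from an empirical correlation model, and all inputs here are assumed given.

      import numpy as np

      def conditional_mean_ln_sa(mu_ln_sa, sigma_ln_sa, rho, epsilon_star):
          """Approximate CS mean at each period Ti, given that ln SA(T*) lies
          epsilon_star standard deviations above its predicted mean:
          mu_CS(Ti) = mu(Ti) + rho(Ti, T*) * epsilon* * sigma(Ti)."""
          return np.asarray(mu_ln_sa) + np.asarray(rho) * epsilon_star * np.asarray(sigma_ln_sa)

      def conditional_sigma_ln_sa(sigma_ln_sa, rho):
          """Approximate CS standard deviation: sigma(Ti) * sqrt(1 - rho^2)."""
          return np.asarray(sigma_ln_sa) * np.sqrt(1.0 - np.asarray(rho) ** 2)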

  2. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few pre-established scenarios. Although such an approach may bring important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term, near-real-time probabilistic volcanic hazard analysis formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess the short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of a Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.

  3. Probabilistic Seismic Hazard Analysis of Victoria, British Columbia, Canada: Considering an Active Leech River Fault

    NASA Astrophysics Data System (ADS)

    Kukovica, J.; Molnar, S.; Ghofrani, H.

    2017-12-01

    The Leech River fault is situated on Vancouver Island near the city of Victoria, British Columbia, Canada. The 60 km transpressional reverse fault zone runs east to west along the southern tip of Vancouver Island, dividing the lithologic units of the Jurassic-Cretaceous Leech River Complex schists to the north and the Eocene Metchosin Formation basalts to the south. This fault system poses a considerable hazard due to its proximity to Victoria and three major hydroelectric dams. The Canadian seismic hazard model for the 2015 National Building Code of Canada (NBCC) considered the fault system to be inactive. However, recent paleoseismic evidence suggests at least two surface-rupturing events exceeding moment magnitude (M) 6.5 within the last 15,000 years (Morell et al. 2017). We perform a Probabilistic Seismic Hazard Analysis (PSHA) for the city of Victoria with consideration of the Leech River fault as an active source. A PSHA for Victoria that replicates the 2015 NBCC estimates is first accomplished to calibrate our PSHA procedure; the same seismic source zones, magnitude recurrence parameters, and Ground Motion Prediction Equations (GMPEs) are used. We replicate the uniform hazard spectrum for a probability of exceedance of 2% in 50 years for a 500 km radial area around Victoria. An active Leech River fault source is then added, with its known length and dip. We determine magnitude recurrence parameters based on a Gutenberg-Richter relationship for the Leech River fault from various catalogues of the recorded seismicity (M 2-3) within the fault's vicinity and from the proposed paleoseismic events. We seek to understand whether the inclusion of an active Leech River fault source will significantly increase the probabilistic seismic hazard for Victoria. Morell et al. (2017), Quaternary rupture of a crustal fault beneath Victoria, British Columbia, Canada, GSA Today, 27, doi: 10.1130/GSATG291A.1.

  4. Using Multi-Scenario Tsunami Modelling Results combined with Probabilistic Analyses to provide Hazard Information for the South-West Coast of Indonesia

    NASA Astrophysics Data System (ADS)

    Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.

    2009-04-01

    Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006 it is expected that also in the near future tsunamis are likely to occur due to increased tectonic tensions leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat tsunami hazard maps are very important as base for evacuation planning and mitigation strategies. In terms of a tsunami impact the hazard assessment is mostly covered by numerical modelling because the model results normally offer the most precise database for a hazard analysis as they include spatially distributed data and their influence to the hydraulic dynamics. Generally a model result gives a probability for the intensity distribution of a tsunami at the coast (or run up) and the spatial distribution of the maximum inundation area depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst case approach. Hence the location and magnitude which are likely to occur and which are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as it is demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable because a lot of tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) at different probable tsunami sources and with different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at coast at this area and the estimated times of arrival (ETAs) of the waves, caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap each other along the coast and must be combined to get one comprehensive hazard assessment for all possible future tsunamis at the region under observation. The simplest way to derive the inundation probability along the coast using the multiscenario approach is to overlay all scenario inundation results and to determine how often a point on land will be significantly inundated from the various scenarios. But this does not take into account that the used tsunamigenic sources for the modeled scenarios have different likelihoods of causing a tsunami. Hence a statistical analysis of historical data and geophysical investigation results based on numerical modelling results is added to the hazard assessment, which clearly improves the significance of the hazard assessment. For this purpose the present method is developed and contains a complex logical combination of the diverse probabilities assessed like probability of occurrence for different earthquake magnitudes at different localities, probability of occurrence for a specific wave height at the coast and the probability for every point on land likely to get hit by a tsunami. 
The values are combined by a logic-tree technique and quantified by statistical analysis of the historical data and of the tsunami modelling results, as mentioned above. The result is a tsunami inundation probability map covering the south-west coast of Indonesia that nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
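
    To make the combination step concrete, here is a minimal sketch of the likelihood-weighted overlay idea, assuming each scenario supplies a boolean inundation grid and an annual occurrence probability for its source; all arrays below are synthetic stand-ins, not GITEWS data.

```python
import numpy as np

# Hypothetical inputs: one boolean inundation grid per modelled scenario
# (True = point significantly inundated) and an assumed annual occurrence
# probability for each scenario's tsunamigenic source.
rng = np.random.default_rng(0)
n_scenarios, ny, nx = 5, 50, 50
inundated = rng.random((n_scenarios, ny, nx)) < 0.3       # stand-in model results
p_source = np.array([0.002, 0.01, 0.005, 0.02, 0.001])    # assumed annual probabilities

# Naive overlay: fraction of scenarios that inundate each point,
# ignoring how likely each source actually is.
overlay = inundated.mean(axis=0)

# Likelihood-weighted combination: treating sources as independent, the
# annual probability that at least one scenario inundates the point.
p_not_flooded = 1.0 - p_source[:, None, None] * inundated
p_inundation = 1.0 - np.prod(p_not_flooded, axis=0)
print(overlay.max(), p_inundation.max())
```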

  5. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following an idea by Festger and Walter, 2002. These quasi steady-state flow fields are cast into a geostatistical Monte Carlo framework to admit and evaluate the influence of parameter uncertainty on the delineation process. Furthermore, this framework enables conditioning on observed data with any conditioning scheme, such as rejection sampling, Ensemble Kalman Filters, etc. To further reduce the computational load, we use the reverse formulation of advective-dispersive transport. We simulate the reverse transport by particle tracking random walk in order to avoid numerical dispersion to account for well arrival times.

  6. Hazard Maps in the Classroom.

    ERIC Educational Resources Information Center

    Cross, John A.

    1988-01-01

    Emphasizes the use of geophysical hazard maps and illustrates how they can be used in the classroom from kindergarten to college level. Depicts ways that hazard maps of floods, landslides, earthquakes, volcanoes, and multi-hazards can be integrated into classroom instruction. Tells how maps may be obtained. (SLM)

  7. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    Numerous technical methods exist for integrating various factors in flood hazard mapping. The purpose of this study is to propose a methodology for integrated flood hazard mapping using Multi-Criteria Decision Making (MCDM). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to flood risk assessment, the maximum flood depth, maximum velocity, and maximum travel time are taken as the criteria, and the spatial element units are treated as the alternatives. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) on the basis of several flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices used for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, since the simulation results vary with the flood scenario and topographic conditions. This ambiguity in the indices can propagate into the flood hazard map. To handle the ambiguity and uncertainty of the criteria, fuzzy logic, which can represent ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the resulting integrated flood hazard map and compared them with those indicated in the existing flood risk maps. We also expect that applying the proposed flood hazard mapping methodology to the production of the current flood risk maps will yield maps that consider the priorities of hazard areas and convey more varied and important information than before. Keywords: flood hazard map; levee-breach analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
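
    For illustration, here is a sketch of the classical (crisp) TOPSIS ranking over the three criteria named above; the paper's fuzzy extension would replace the crisp entries and weights with fuzzy numbers. The decision matrix, weights, and criterion orientations are assumed for the example.

```python
import numpy as np

# Decision matrix: one row per element unit (alternative); columns are the
# three criteria from the abstract: max depth (m), max velocity (m/s),
# max travel time (min). Values are illustrative.
X = np.array([
    [2.1, 1.5, 30.0],
    [0.5, 0.3, 90.0],
    [3.4, 2.2, 15.0],
    [1.2, 0.8, 60.0],
])
w = np.array([0.5, 0.3, 0.2])            # assumed criterion weights
# Orientation w.r.t. hazard: deeper/faster = more hazardous; a shorter
# travel time (earlier arrival) is also more hazardous, so it is inverted.
more_is_worse = np.array([True, True, False])

# 1. Vector-normalize each criterion column, then weight.
V = w * X / np.linalg.norm(X, axis=0)

# 2. Ideal (most hazardous) and anti-ideal points per criterion.
ideal = np.where(more_is_worse, V.max(axis=0), V.min(axis=0))
anti = np.where(more_is_worse, V.min(axis=0), V.max(axis=0))

# 3. Closeness coefficient: values near 1 mark the most hazardous units.
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print(closeness)   # rank element units by hazard grade
```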

  8. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
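
    A minimal sketch of the two computational steps described here, the posterior-weighted mixing of class attenuation coefficients and the voxel-wise relative-change metric, using illustrative coefficients and synthetic arrays rather than the paper's values.

```python
import numpy as np

# Illustrative 511 keV linear attenuation coefficients (1/cm) for the three
# tissue classes named in the abstract; not the paper's exact values.
mu_class = np.array([0.0, 0.0975, 0.151])   # air, soft tissue, bone

# posterior[k, ...]: probability that a voxel belongs to class k, obtained by
# combining the atlas prior with an MR intensity likelihood (Bayes' rule).
# A softmax over random logits stands in for that computation here.
rng = np.random.default_rng(2)
logits = rng.normal(size=(3, 16, 16, 16))
posterior = np.exp(logits) / np.exp(logits).sum(axis=0)

# Continuous-valued mu-map: posterior-probability-weighted mixture.
mu_map = np.tensordot(mu_class, posterior, axes=1)

# Accuracy metric from the abstract: voxel-wise absolute relative change (%)
# between MR-based and CT-based attenuation-corrected PET images.
pet_mr = rng.random((16, 16, 16)) + 1.0     # stand-in reconstructions
pet_ct = rng.random((16, 16, 16)) + 1.0
rc = 100.0 * np.abs(pet_mr - pet_ct) / pet_ct
print(mu_map.shape, rc.mean())
```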

  9. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  10. Hazard function theory for nonstationary natural hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Laura K.; Vogel, Richard M.

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
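
    As a generic sketch of the quantities HFA works with (the standard definitions, not the authors' specific derivation), let p_t be the year-t exceedance probability of a design event x_0 whose POT magnitudes follow a generalized Pareto distribution:

```latex
% Generalized Pareto exceedance model for POT magnitudes above threshold u:
\[
  P(X > x \mid X > u) = \left[ 1 + \xi \, \frac{x - u}{\sigma} \right]^{-1/\xi} .
\]
% With a (possibly time-varying) annual exceedance probability
% p_t = P(X_t > x_0), the reliability over an n-year horizon, the
% distribution of the failure time T (first year with an exceedance),
% and the average return period are
\[
  R(n) = \prod_{t=1}^{n} (1 - p_t), \qquad
  P(T = t) = p_t \prod_{s=1}^{t-1} (1 - p_s), \qquad
  \mathbb{E}[T] = \sum_{t=1}^{\infty} \prod_{s=1}^{t-1} (1 - p_s),
\]
% and the (discrete) hazard function is h(t) = P(T = t \mid T \ge t) = p_t;
% under stationarity (p_t = p) these collapse to the familiar E[T] = 1/p.
```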

  11. Changes and future trends in landslide risk mapping for mountain communities: application to the Vars catchment and Barcelonnette basin (French Alps)

    NASA Astrophysics Data System (ADS)

    Puissant, Anne; Wernert, Pauline; Débonnaire, Nicolas; Malet, Jean-Philippe; Bernardie, Séverine; Thomas, Loic

    2017-04-01

    Landslide risk assessment has become a major research subject within the last decades. In the context of the French-funded ANR project SAMCO, which aims at enhancing the overall resilience of societies to the impacts of mountain risks, we developed a procedure to quantify changes in landslide risk at the catchment scale. First, we investigate landslide susceptibility, the spatial component of the hazard, through a weight-of-evidence probabilistic model. The latter is based on the knowledge of past and current landslides and simulates their spatial locations in relation to environmental controlling factors. Second, we study the potential consequences using a semi-quantitative, region-scale, indicator-based method, the Potential Damage Index (PDI) method. It estimates the possible damage related to landslides by combining weighted indicators reflecting the exposure of the elements at risk for structural, functional and socio-economic stakes. Finally, we produce landslide risk maps by combining the susceptibility and potential consequence maps resulting from the two previous steps. The risk maps are produced for the present and for the future (e.g. 2050 and 2100), taking into account four scenarios of future land cover and land use development (based on the PRELUDE European project) that are consistent with the likely evolution of mountain communities. The results identify the geographical areas that are likely to be exposed to landslide risk in the future. They are integrated in a web-based demonstrator enabling the comparison of the various scenarios, and could thus be used as a decision-support tool for local stakeholders. The method and the demonstrator will be presented through the analysis of landslide risk in two catchments of the French Alps, the Vars catchment and the Barcelonnette basin, both characterized by a different exposure to landslide hazards.
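
    A minimal sketch of the weight-of-evidence step for a single boolean predictor layer, following the textbook formulation rather than the SAMCO implementation; the inventory and factor grids below are synthetic.

```python
import numpy as np

# Synthetic stand-ins: a boolean landslide inventory grid and one boolean
# predictor layer (e.g. slope above a threshold), flattened to 1-D.
rng = np.random.default_rng(3)
landslide = rng.random(10000) < 0.05
factor = rng.random(10000) < 0.3

def weights_of_evidence(factor, landslide):
    """Return (W+, W-): log likelihood ratios for factor present / absent."""
    p_f_given_ls = (factor & landslide).sum() / landslide.sum()
    p_f_given_no = (factor & ~landslide).sum() / (~landslide).sum()
    w_plus = np.log(p_f_given_ls / p_f_given_no)
    w_minus = np.log((1 - p_f_given_ls) / (1 - p_f_given_no))
    return w_plus, w_minus

w_plus, w_minus = weights_of_evidence(factor, landslide)

# Posterior logit per cell = prior logit + weight of this factor layer
# (with several layers, their weights are summed under the usual
# conditional-independence assumption).
prior_logit = np.log(landslide.mean() / (1 - landslide.mean()))
posterior_logit = prior_logit + np.where(factor, w_plus, w_minus)
susceptibility = 1.0 / (1.0 + np.exp(-posterior_logit))
print(susceptibility.min(), susceptibility.max())
```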

  12. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.
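
    The final combination step is standard PSHA bookkeeping; below is a sketch under simplified assumptions (one intensity measure per rupture variation, synthetic rates and ground motions, not CyberShake data).

```python
import numpy as np

# Synthetic stand-ins: annual rate of each rupture variation (from the
# earthquake rupture forecast) and the intensity measure, e.g. spectral
# acceleration in g, computed from its synthetic seismogram at one site.
rng = np.random.default_rng(4)
n_rups = 500
annual_rate = rng.uniform(1e-6, 1e-3, n_rups)
im = rng.lognormal(mean=-1.0, sigma=0.8, size=n_rups)

# Hazard curve: annual rate of exceeding each IM level is the sum of the
# rates of all rupture variations whose simulated motion exceeds it.
im_levels = np.logspace(-2, 0.5, 30)
exceed_rate = np.array([annual_rate[im > x].sum() for x in im_levels])

# Poisson conversion to a 50-year exceedance probability.
p50 = 1.0 - np.exp(-exceed_rate * 50.0)
print(p50[:5])
```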

  13. Schoolyard Volcanoes: A Unit in Volcanology and Hazards

    NASA Astrophysics Data System (ADS)

    Lechner, H. N.; Gochis, E. E.; Brill, K. A.

    2014-12-01

    How do you teach volcanology and volcanic hazards to students when there is no volcano nearby? You bring the volcano to them! At Michigan Technological University we have developed a four-lesson unit for middle and high school students that incorporates virtual, analogue and numerical models to increase students' interest in the geosciences while simultaneously expanding the community of earth-science-literate individuals necessary for a disaster-resilient society. The unit aims to build on students' prior geoscience knowledge by examining the physical properties that influence volcanic eruptions, and introduces them to the challenges and methods of communicating hazards and risk. Lesson one engages students in a series of hands-on investigations that explore the "3 Vs" of volcanology: Viscosity, Volatiles and Volume. The students learn about the relationship between magma composition and viscosity and its influence on the eruption style, behavior and morphology of different volcanoes. Lesson two uses an analogue model of a volcano to demonstrate the forces involved in an explosive eruption and the associated hazards. Students think critically about the factors that affect hazards and risk, as well as the variables (such as topography) that affect the eruption and the hazard. During lesson three, students use Google Earth for a virtual field trip to Pacaya volcano, Guatemala, examining changes in the landscape over time and other evidence of volcanic activity to make interpretations about the volcano. In the final lesson, students use numerical models and GIS to create hazard maps based on probabilistic lahar scenarios. Throughout the unit students are engaged in an inquiry-based exploration that covers several Next Generation Science Standards (NGSS) content areas and practices. The four-lesson unit has been field-tested in two school districts and during a summer engineering program. Results from student work and post-surveys show that this strategy raises interest in and knowledge of volcanic hazards.

  14. Probabilistic Seismic Hazard Assessment for Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997; an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However these results are: a) more than 15 years out of date; b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design; and c) given at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.

  15. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties for both. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. The background map is modelled using a thin-plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We demonstrate that the described probabilistic method improves the detection of faint, extended celestial sources compared with the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
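
    A sketch of a two-component mixture in its simplest per-pixel form, assuming Poisson counts with known background and source rates; the paper's model is considerably richer (e.g. a thin-plate-spline background and spatial correlation), so this only illustrates the Bayes-rule step.

```python
import numpy as np
from scipy.stats import poisson

# Assumed rates and prior: background-only pixels emit at rate b, source
# pixels at rate b + s; p_src is the prior probability of hosting a source.
rng = np.random.default_rng(5)
b, s, p_src = 3.0, 5.0, 0.1
truth = rng.random(2000) < p_src
counts = rng.poisson(np.where(truth, b + s, b))

# Posterior probability that each pixel contains a source (Bayes' rule
# over the two mixture components).
like_src = poisson.pmf(counts, b + s)
like_bkg = poisson.pmf(counts, b)
p_source = p_src * like_src / (p_src * like_src + (1 - p_src) * like_bkg)
print(p_source[truth].mean(), p_source[~truth].mean())
```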

  16. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (Gonzalez, et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair, et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger, et al. (2012); far-field sources in Japan, based on information updated since the 3 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger, et. al, 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D. L. George, R. J. LeVeque, and K. T. Mandli, The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34 (2011), pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C. A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E. L., and T. Parsons (2005): Probabilistic Analysis of Tsunami Hazards, Nat. Hazards, 37 (3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (Available at http://pubs.usgs.gov/pp/pp1661f/). González, F.I., E.L. Geist, B. Jaffe, U. Kânoglu, H. Mofjeld, C.E. Synolakis, V.V Titov, D. Arcas, D. Bellomo, D. Carlton, T. Horning, J. Johnson, J. Newman, T. Parsons, R. Peters, C. Peterson, G .Priest, A. Venturato, J. Weber, F. Wong, and A. Yalciner (2009): Probabilistic Tsunami Hazard Assessment at Seaside, Oregon, for Near- and Far-Field Seismic Sources, J. Geophys. Res., 114, C11023, doi:10.1029/2008JC005132. Japan Earthquake Research Committee, (2011): http://www.jishin.go.jp/main/p_hyoka02.htm Mai, P. M., and G. C. Beroza (2002): A spatial random field model to characterize complexity in earthquake slip, J. Geophys. Res., 107(B11), 2308, doi:10.1029/2001JB000588. SSHAC (Senior Seismic Hazard Analysis Committee) (1997): Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, Main Report Rep. NUREG/CR-6372 UCRL-ID-122160 Vol. 1, 256 pp, U.S. Nuclear Regulatory Commission.

  17. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes

    PubMed Central

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-01-01

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few, subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities in order to reflect a wider size class, a scenario approach neglects to account for the intrinsic variability of volcanic eruptions, and implicitly assumes that inter-class size variability (i.e. size difference between different eruptive size classes) dominates over intra-class size variability (i.e. size difference within an eruptive size class), the latter of which is treated as negligible. So far, no quantitative study has been undertaken to verify such an assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variabilities, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow for determining whether a simplified scenario approach can be considered valid, and for quantifying the bias which arises when full variability is not accounted for. PMID:27067389

  18. Forecasting probabilistic seismic shaking for greater Tokyo from 400 years of intensity observations

    USGS Publications Warehouse

    Bozkurt, S.B.; Stein, R.S.; Toda, S.

    2007-01-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past shaking. We calculate the time-averaged (Poisson) probability of severe shaking by using more than 10,000 intensity observations recorded since AD 1600 in a 350-km-wide box centered on Tokyo. Unlike other hazard-assessment methods, source and site effects are included without modeling, and we do not need to know the size or location of any earthquake nor the location and slip rate of any fault. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct here suggest that both assumptions are sound. The resulting 30-year probability of IJMA ≥ 6 shaking (≈ PGA ≥ 0.4 g or MMI ≥ IX) is 30%-40% in Tokyo, Kawasaki, and Yokohama, and 10%-15% in Chiba and Tsukuba. This result means that there is a 30% chance that 4 million people will be subjected to IJMA ≥ 6 shaking during an average 30-year period. We also produce exceedance maps of PGA for building-code regulations, and calculate short-term hazard associated with a hypothetical catastrophe bond. Our results resemble an independent assessment developed from conventional seismic hazard analysis for greater Tokyo. © 2007, Earthquake Engineering Research Institute.
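
    The time-averaged (Poisson) probability calculation reduces to a one-liner once a site's exceedance count is in hand; the numbers below are hypothetical, not the paper's data.

```python
import math

# Hypothetical record: k shakings at or above the target intensity
# observed in y years at one site.
k, y = 4, 400
rate = k / y                          # annual rate of IJMA >= 6 shaking
p30 = 1.0 - math.exp(-rate * 30.0)    # Poisson 30-year exceedance probability
print(f"30-year probability: {p30:.0%}")
```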

  19. Values of Flood Hazard Mapping for Disaster Risk Assessment and Communication

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2015-12-01

    Flood plains provide tremendous benefits for human settlements. Since ancient times people have lived with floods and attempted to control them where necessary. Modern engineering works such as embankments have enabled people to live even in flood-prone areas, and over time population and economic assets have concentrated there. In developing countries, too, rapid land-use change alters exposure and vulnerability to floods and consequently increases disaster risk. Flood hazard mapping is an essential step for any countermeasure. It serves various objectives, including raising residents' awareness, finding effective evacuation routes and estimating potential damages through flood risk mapping. Depending on the objectives and data availability, many approaches to hazard mapping are possible, including simulation-based, community-based and remote-sensing-based methods. In addition to traditional paper-based hazard maps, Information and Communication Technology (ICT) enables more interactive hazard mapping, such as movable hazard maps that demonstrate scenario simulations for risk communication, and real-time hazard mapping for effective disaster response and safe evacuation. This presentation first summarizes recent advances in flood hazard mapping, focusing on Japanese experiences and other examples from Asian countries. It then introduces a flood simulation tool suitable for hazard mapping at the river-basin scale even in data-limited regions. In the past few years, the tool has been used by local officers responsible for disaster management in Asian countries. Through these training activities in hazard mapping and risk assessment, we conduct a comparative analysis to identify the similarities and uniqueness of estimated economic damages depending on topographic and land use conditions.

  20. Probabilistic Structural Health Monitoring of the Orbiter Wing Leading Edge

    NASA Technical Reports Server (NTRS)

    Yap, Keng C.; Macias, Jesus; Kaouk, Mohamed; Gafka, Tammy L.; Kerr, Justin H.

    2011-01-01

    A structural health monitoring (SHM) system can contribute to the risk management of a structure operating under hazardous conditions. An example is the Wing Leading Edge Impact Detection System (WLEIDS) that monitors the debris hazards to the Space Shuttle Orbiter's Reinforced Carbon-Carbon (RCC) panels. Since Return-to-Flight (RTF) after the Columbia accident, WLEIDS was developed and subsequently deployed on board the Orbiter to detect ascent and on-orbit debris impacts, so as to support the assessment of wing leading edge structural integrity prior to Orbiter re-entry. As SHM is inherently an inverse problem, the analyses involved, including those performed for WLEIDS, tend to be associated with significant uncertainty. The use of probabilistic approaches to handle the uncertainty has resulted in the successful implementation of many development and application milestones.

  1. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and mitigation planning in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and commonly used because of the correlation between the instability factors and the locations of landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions. However, they depend on the qualitative and quantitative data, the scale, the types of movement and the characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a country in Central America where earthquake-triggered landslides are a common phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 13 January 2001 earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing the physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is taken as the dependent variable. The results of the landslide susceptibility analysis are checked against landslide location data. They show a high concordance between the landslide inventory and the estimated high-susceptibility zones, with an agreement of 95.1% for the ANN model and 89.4% for the LR model. In addition, we compare the two techniques using the Receiver Operating Characteristic (ROC) curve, a plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and we compute the Area Under the ROC curve (AUROC) for each model. Finally, the models are used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
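
    A compact sketch of the LR branch of this comparison with scikit-learn, using synthetic stand-ins for the terrain predictors and the inventory labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins: terrain attributes as predictors (e.g. slope,
# elevation, aspect, precipitation), landslide occurrence as the response.
rng = np.random.default_rng(6)
X = rng.normal(size=(5000, 4))
logit = 1.5 * X[:, 0] + 0.5 * X[:, 3] - 2.0
y = rng.random(5000) < 1.0 / (1.0 + np.exp(-logit))

# Fit the susceptibility model and score its discrimination with AUROC,
# the same summary statistic used to compare LR and ANN in the abstract.
model = LogisticRegression().fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]
print("AUROC:", roc_auc_score(y, susceptibility))
```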

  2. Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana

    2014-05-01

    In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. The target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both the TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short- and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The selection of relevant vulnerability indicators allowed the most critical areas and territorial nodes to be identified. At the national scale, the vulnerability of economic activities and the accessibility of critical infrastructure were assessed. At the continental scale, we assessed the vulnerability of the main airline routes and airports. The resulting impact and risk were finally assessed by combining the hazard and vulnerability analyses.

  3. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    NASA Astrophysics Data System (ADS)

    Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.

    2018-04-01

    Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards and by using concrete reference points to describe flooding scenarios rather than exceedance probabilities or frequencies.

  4. Seismic hazard assessment in the Catania and Siracusa urban areas (Italy) through different approaches

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria

    2010-05-01

    The seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard with the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted for the Esteva-Cornell procedure. Following current international conventions for PSHA (SSHAC, 1997), a logic tree approach was used to treat and reduce the epistemic uncertainties in both the seismotectonic and site methods. The SASHA code handles the intensity data taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input, a seismic catalogue tested for completeness, a seismogenic zonation and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining the source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above approaches provided useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. When the site intensity data are used, Catania shows higher hazard values than Siracusa for every return period considered. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed for Catania through the site approach can be interpreted in terms of the greater damage historically observed in this town and its smaller distance from the seismogenic structures. On the other hand, the higher hazard found for Siracusa through the Esteva-Cornell approach could be a consequence of the method's tendency to spread the intensities over a wide area. In SHA, however, the use of a combined approach is recommended for mutual validation of the results, and any choice between the two approaches is strictly linked to knowledge of the local seismotectonic features. References: Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html. Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, program for computing seismic hazard. Version 5.4, Mexico City: UNAM.
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.

  5. The 2008 U.S. Geological Survey national seismic hazard models and maps for the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Boyd, Oliver S.; Luco, Nicolas; Wheeler, Russell L.; Rukstales, Kenneth S.; Haller, Kathleen M.

    2012-01-01

    In this paper, we describe the scientific basis for the source and ground-motion models applied in the 2008 National Seismic Hazard Maps, the development of new products that are used for building design and risk analyses, relationships between the hazard maps and design maps used in building codes, and potential future improvements to the hazard maps.

  6. Probabilistic Risk Assessment Process for High-Power Laser Operations in Outdoor Environments

    DTIC Science & Technology

    2016-01-01

    avionics data bus. In the case of a UAS-mounted laser system, the control path will additionally include a radio or satellite communications link. ... The use of probabilistic techniques for hazard assessment purposes is not widespread within the laser safety community. The aim of this paper is to outline the basis of the probabilistic approach.

  7. Identification of failure type in corroded pipelines: a Bayesian probabilistic approach.

    PubMed

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines whose construction material is subject to environmental corrosion, as in the petroleum and gas industries. Predictive models can therefore provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design oriented toward reducing these hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corrosion conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered to establish the conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the vicinity of this boundary, failure discrimination is carried out with a probabilistic model in which the events are considered as random variables. The model parameters are estimated from available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies for the inspection and maintenance of large pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.
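
    As a sketch of the discrimination idea near the leak/rupture boundary (our reading of the abstract, not the authors' model), the failure type can be treated as a Bernoulli variable whose probability depends on how far the defect state lies from the mean boundary; the parameters below are illustrative.

```python
import numpy as np

def p_rupture(z, beta0=-0.2, beta1=3.0):
    """Logistic model for P(rupture) as a function of a defect-state
    variable z (e.g. a normalized burst-pressure margin: z < 0 leak-like,
    z > 0 rupture-like). beta0, beta1 would be fitted from burst tests."""
    return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * z)))

z = np.linspace(-1.0, 1.0, 5)
print(np.round(p_rupture(z), 3))   # sharp transition across the boundary
```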

  8. Landscape-scale fuel treatment and wildfire impacts on carbon stocks and fire hazard in California spotted owl habitat

    Treesearch

    Lindsay A. Chiono; Danny L. Fry; Brandon M. Collins; Andrea H. Chatfield; Scott L. Stephens

    2017-01-01

    Forest managers are challenged with meeting numerous demands that often include wildlife habitat and carbon (C) sequestration. We used a probabilistic framework of wildfire occurrence to (1) estimate the potential for fuel treatments to reduce fire risk and hazard across the landscape and within protected California spotted owl (Strix occidentalis...

  9. A probabilistic tornado wind hazard model for the continental United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, Q; Kimball, J; Mensing, R

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically, based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
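
    A stripped-down sketch of the areal strike-probability logic (the report's model additionally handles facility orientation, within-path wind variation, and classification error); all rates and damage areas below are assumed for illustration.

```python
import numpy as np

# Assumed regional touchdown rates per km^2 and mean damage areas
# (length x width) for intensity classes F0..F5.
rate_per_km2 = np.array([4e-4, 2e-4, 8e-5, 2e-5, 4e-6, 4e-7])
damage_area_km2 = np.array([0.05, 0.2, 0.8, 2.0, 5.0, 10.0])
facility_km2 = 0.01                       # footprint of the facility

# Annual rate at which the facility intersects a damage area of each class
# (point-strike approximation: rate times combined target area).
strike_rate = rate_per_km2 * (damage_area_km2 + facility_km2)

# Hazard curve: rate of experiencing winds of at least class i
# (reverse cumulative sum, F5 being the strongest), then Poisson conversion.
exceed_rate = strike_rate[::-1].cumsum()[::-1]
p_annual = 1.0 - np.exp(-exceed_rate)
print(p_annual)
```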

  10. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  11. Comparative risk assessments for the city of Pointe-à-Pitre (French West Indies): earthquakes and storm surge

    NASA Astrophysics Data System (ADS)

    Reveillere, A. R.; Bertil, D. B.; Douglas, J. D.; Grisanti, L. G.; Lecacheux, S. L.; Monfort, D. M.; Modaressi, H. M.; Müller, H. M.; Rohmer, J. R.; Sedan, O. S.

    2012-04-01

    In France, risk assessments for natural hazards are usually carried out separately, and decision makers lack comprehensive information. Moreover, since the cause of the hazard (e.g. meteorological, geological) and the physical phenomenon that causes damage (e.g. inundation, ground shaking) may be fundamentally different, the quantitative comparison of single-risk assessments that were not conducted in a compatible framework is not straightforward. Comprehensive comparative risk assessments exist in a few other countries. For instance, the Risk Map Germany project has developed and applied a methodology for quantitatively comparing the risk of relevant natural hazards at various scales (city, state) in Germany. The present ongoing work applies a similar methodology to the Pointe-à-Pitre urban area, which represents more than half of the population of Guadeloupe, a French overseas region in the West Indies. Relevant hazards as well as hazard intensity levels differ from continental Europe, which leads to different conclusions. The French West Indies are prone to a large number of hazards, among which hurricanes, volcanic eruptions and earthquakes dominate. Hurricanes cause damage through three phenomena: wind, heavy rainfall and storm surge, the latter having had a preeminent role during the largest historical event, in 1928. Seismic risk is characterized by many induced phenomena, among which earthquake shocks dominate. This study proposes a comparison of earthquake and cyclonic storm surge risks. Losses corresponding to hazard intensities having the same probability of occurrence are calculated. They are quantified in a common loss unit, chosen to be the direct economic losses; intangible or indirect losses are not considered. The methodology therefore relies on (i) a probabilistic hazard assessment, (ii) a loss ratio estimation for the exposed elements and (iii) an economic estimation of these assets. The storm surge hazard assessment is based on the selection of relevant historical cyclones and on the simulation of the associated wave and cyclonic surge. The combined local sea elevations, called "set-up", are then fitted with a statistical distribution in order to obtain their return-period characteristics. Several run-ups are then extracted, the inundation areas are calculated and the relative losses of the affected assets are deduced. The probabilistic seismic hazard assessment and the locations and seismic vulnerabilities of the exposed elements come from past public risk assessment studies. The loss estimates are computed for several return periods, expressed as the percentage of buildings in a given EMS-98 damage state per grid block, and then converted into loss ratios. In parallel, an asset estimation is conducted; it focuses mainly on private housing but considers some major public infrastructure as well. The final outcome of this work is a direct economic loss-frequency plot for earthquake and storm surge. The Probable Maximum Loss and the Average Annual Loss derive from this risk curve. In addition, different sources of uncertainty are identified through the loss estimation process. The full propagation of these uncertainties can provide a confidence interval to be assigned to the risk curve, and we show how such additional information can be useful for risk comparison.
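
    A sketch of how the two headline risk metrics follow from the loss-frequency curve, with purely illustrative numbers.

```python
import numpy as np

# Illustrative loss estimates at a handful of return periods (one hazard).
return_periods = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])   # years
losses = np.array([5.0, 40.0, 120.0, 600.0, 900.0])             # M EUR

# Work on the annual-exceedance-frequency axis, sorted ascending for numpy.
freq = 1.0 / return_periods
order = np.argsort(freq)

# Average Annual Loss: area under the loss-frequency curve (trapezoidal rule).
aal = np.trapz(losses[order], freq[order])

# Probable Maximum Loss read off at a chosen reference return period.
pml_500 = np.interp(1.0 / 500.0, freq[order], losses[order])
print(f"AAL ~ {aal:.1f} M EUR/yr, PML(500 yr) ~ {pml_500:.0f} M EUR")
```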

  12. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour represents one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality from forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared with more traditional methods of displaying ternary probabilistic forecasts. The technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with certainty, in order to effectively target resources to the areas most at risk in a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
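
    A minimal sketch of the barycentric colour mapping, with our own choice of vertex colours and the information gain measured as relative entropy against the uniform climatological reference (the paper does not specify these details).

```python
import numpy as np

# Vertex colours for the three forecast categories (illustrative choice).
vertex_rgb = np.array([[0.0, 0.0, 1.0],    # low  -> blue
                       [1.0, 1.0, 0.0],    # med  -> yellow
                       [1.0, 0.0, 0.0]])   # high -> red

def forecast_colour(p):
    """Map a ternary forecast (p_low, p_med, p_high) to an RGB triple."""
    p = np.asarray(p, dtype=float)
    base = p @ vertex_rgb                   # barycentric mix of vertex colours
    # Information gain vs. the reference (1/3, 1/3, 1/3): relative entropy,
    # rescaled by its maximum (log 3) to give a saturation weight in [0, 1].
    pos = p[p > 0]
    gain = np.sum(pos * np.log(3.0 * pos)) / np.log(3.0)
    # Fade towards white when the forecast adds little over climatology.
    return (1.0 - gain) * np.ones(3) + gain * base

print(forecast_colour([0.1, 0.2, 0.7]))
```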

  13. Volcanic Hazard Maps; the results and progress made by the IAVCEI Hazard Map working group

    NASA Astrophysics Data System (ADS)

    Calder, Eliza; Lindsay, Jan; Wright, Heather

    2017-04-01

    The IAVCEI Commission on Volcanic Hazards and Risk set up a working group on Hazard Maps in 2014. Since then, the group has led or co-organised three major workshops and organised two thematic conference sessions. In particular we have initiated a series of workshops named the "State of the Hazard Map", which we plan to continue: the first was held at COV8, the second at COV9, and the third will be held at the IAVCEI General Assembly in Portland. The broad aim of these activities is to work towards an IAVCEI-endorsed guidelines document for volcanic hazard map generation. The workshops have brought together people from around the world working on volcanic hazard maps, and have had four primary objectives: 1) to review (and collect further data on) the diverse variety of methods and rationales currently used to develop maps; 2) to openly discuss approaches and experiences regarding how hazard maps are interpreted and used by different groups; 3) to discuss and prepare the IAVCEI guidelines document; and 4) to discuss options for finalizing, publishing and disseminating the guidelines document (e.g. wiki, report, open-source publication). This presentation will provide an update on the results and outcomes of those initiatives, including brief outcomes of the reviews undertaken, a survey constructed to gather additional data, the planned structure of the guidelines document, and a summary of the key findings to date. The majority of the participants in these activities so far have come from volcano observatories or geological surveys, as these institutions commonly have primary responsibility for making operational hazard maps. It is important, however, that others in the scientific community who work on the quantification of volcanic hazard contribute to these guidelines. We therefore invite interested parties to become involved.

  14. A Bayesian modelling framework for tornado occurrences in North America

    NASA Astrophysics Data System (ADS)

    Cheng, Vincent Y. S.; Arhonditsis, George B.; Sills, David M. L.; Gough, William A.; Auld, Heather

    2015-03-01

    Tornadoes represent one of nature’s most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.
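
    The abstract does not give the model's code, but the core idea, relating monthly-averaged convective predictors to tornado occurrence, can be sketched with a simple Poisson regression; this is a rough frequentist stand-in for the paper's Bayesian framework, with synthetic data and illustrative coefficient values:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 240                               # e.g. 20 years x 12 months for one grid cell
        cape = rng.gamma(2.0, 400.0, n)       # J/kg, synthetic
        shear = rng.gamma(2.0, 5.0, n)        # m/s, synthetic
        sreh = rng.gamma(2.0, 50.0, n)        # m^2/s^2, synthetic
        lam = np.exp(-3.0 + 0.0008 * cape + 0.02 * shear + 0.002 * sreh)
        counts = rng.poisson(lam)             # synthetic monthly tornado counts

        X = sm.add_constant(np.column_stack([cape, shear, sreh]))
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        print(fit.params)                     # recovers the (illustrative) coefficients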

  15. A Bayesian modelling framework for tornado occurrences in North America.

    PubMed

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-03-25

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.

  16. Utah Flooding Hazard: Raising Public Awareness through the Creation of Multidisciplinary Web-Based Maps

    NASA Astrophysics Data System (ADS)

    Castleton, J.; Erickson, B.; Bowman, S. D.; Unger, C. D.

    2014-12-01

    The Utah Geological Survey's (UGS) Geologic Hazards Program has partnered with the U.S. Army Corps of Engineers to create geologically derived web-based flood hazard maps. Flooding in Utah communities has historically been one of the most damaging geologic hazards. The most serious floods in Utah have generally occurred in the Great Salt Lake basin, particularly in the Weber River drainage on the western slopes of the Wasatch Range, in areas of high population density. With a growing population of 2.9 million, the state of Utah is motivated to raise awareness about the potential for flooding. The process of increasing community resiliency to flooding begins with identification and characterization of flood hazards. Many small communities in areas experiencing rapid growth have not been completely mapped by Federal Emergency Management Agency (FEMA) Flood Insurance Rate Maps (FIRMs). Existing FIRM maps typically consider only drainage areas greater than one square mile when determining flood zones and do not incorporate geologic data, such as the presence of young, geologically active alluvial fans that indicate a high potential for debris flows and sheet flooding. Our new flood hazard mapping combines and expands on FEMA data by incorporating mapping derived from 1:24,000-scale UGS geologic maps, LiDAR data, digital elevation models, and historical aerial photography. Our flood hazard maps are intended to supplement the FIRM maps to provide local governments and the public with additional flood hazard information so they may make informed decisions, ultimately reducing the risk to life and property from flooding hazards. Flooding information must be widely available and easily accessed. One of the most effective ways to inform the public is through web-based maps. Web-based flood hazard maps will not only supply the public with the flood information they need but will also provide a platform for adding other geologic hazards in an easily accessible format.

  17. Probabilistic choice between symmetric disparities in motion stereo matching for a lateral navigation system

    NASA Astrophysics Data System (ADS)

    Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail

    2016-02-01

    Two consecutive frames of a lateral navigation camera video sequence can be considered as an appropriate approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame to the next and to the previous ones. The positive disparity of matching to the previous frame has its symmetric negative disparity to the next frame. The proposed algorithm performs a probabilistic choice for each matched pixel between the positive-disparity cost and its symmetric-disparity cost. A disparity map obtained by optimization over the cost volume composed of the proposed probabilistic choice is more accurate than the traditional left-to-right and right-to-left disparity map cross-check. Also, our algorithm requires half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and real video sequences with ground-truth values.
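
    A stripped-down sketch of the symmetric-disparity idea: for each candidate disparity d, the cost of matching the current frame to the previous frame at +d competes with the cost of matching to the next frame at -d, and the cheaper match enters the cost volume. Reducing the probabilistic choice to a minimum, and the function names, are simplifying assumptions:

        import numpy as np

        def matching_cost(ref, tgt, d):
            """Absolute-difference cost of matching ref against tgt shifted by d pixels."""
            return np.abs(ref.astype(float) - np.roll(tgt, d, axis=1))

        def symmetric_cost_volume(prev_f, cur_f, next_f, max_d):
            h, w = cur_f.shape
            volume = np.empty((max_d + 1, h, w))
            for d in range(max_d + 1):
                c_prev = matching_cost(cur_f, prev_f, +d)   # positive disparity to previous frame
                c_next = matching_cost(cur_f, next_f, -d)   # its symmetric negative counterpart
                # the probabilistic choice is reduced here to taking the cheaper
                # match, which suppresses occlusion artefacts near depth edges
                volume[d] = np.minimum(c_prev, c_next)
            return volume

        # disparity map: argmin over the cost volume (the paper optimizes globally instead)
        # disp = symmetric_cost_volume(prev_f, cur_f, next_f, 32).argmin(axis=0)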

  18. Towards a robust framework for Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami in New Zealand

    NASA Astrophysics Data System (ADS)

    Mueller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming

    2013-04-01

    Probabilistic Tsunami Hazard Assessment (PTHA) is conceptually closely related to Probabilistic Seismic Hazard Assessment (PSHA). The main difference is that PTHA needs to simulate propagation of tsunami waves through the ocean and cannot rely on attenuation relationships, which makes PTHA computationally more expensive. The wave propagation process can be assumed to be linear as long as water depth is much larger than the wave amplitude of the tsunami. Beyond this limit a non-linear scheme has to be employed with significantly higher algorithmic run times. PTHA considering far-field tsunami sources typically uses unit source simulations, and relies on the linearity of the process by later scaling and combining the wave fields of individual simulations to represent the intended earthquake magnitude and rupture area. Probabilistic assessments are typically made for locations offshore but close to the coast. Inundation is calculated only for significantly contributing events (de-aggregation). For local and regional tsunami it has been demonstrated that earthquake rupture complexity has a significant effect on the tsunami amplitude distribution offshore and also on inundation. In this case PTHA has to take variable slip distributions and non-linearity into account. A unit source approach cannot easily be applied. Rupture complexity is seen as an aleatory uncertainty and can be incorporated directly into the rate calculation. We have developed a framework that manages the large number of simulations required for local PTHA. As an initial case study the effect of rupture complexity on tsunami inundation and the statistics of the distribution of wave heights have been investigated for plate-interface earthquakes in the Hawke's Bay region in New Zealand. Assessing the probability that water levels will be in excess of a certain threshold requires the calculation of empirical cumulative distribution functions (ECDF). We compare our results with traditional estimates for tsunami inundation simulations that do not consider rupture complexity. De-aggregation based on moment magnitude alone might not be appropriate, because the hazard posed by any individual event can be underestimated locally if rupture complexity is ignored.
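
    The exceedance step described above is, at its core, an empirical CDF computation; a minimal sketch, with synthetic maximum water levels standing in for the ensemble of variable-slip simulations:

        import numpy as np

        def exceedance_probability(levels, threshold):
            """Empirical P(level > threshold) over an ensemble of simulations."""
            return float(np.mean(np.asarray(levels, dtype=float) > threshold))

        # synthetic stand-in for maximum water levels from variable-slip scenarios
        eta = np.random.default_rng(1).lognormal(mean=0.5, sigma=0.6, size=500)  # metres
        print(exceedance_probability(eta, 3.0))   # fraction of scenarios exceeding 3 m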

  19. An Application of the SSHAC Level 3 Process to the Probabilistic Seismic Hazard Analysis for Nuclear Facilities at the Hanford Site, Eastern Washington, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coppersmith, Kevin J.; Bommer, Julian J.; Bryce, Robert W.

    Under the sponsorship of the US Department of Energy (DOE) and the electric utility Energy Northwest, the Pacific Northwest National Laboratory (PNNL) is conducting a probabilistic seismic hazard analysis (PSHA) within the framework of a SSHAC Level 3 procedure (Senior Seismic Hazard Analysis Committee; Budnitz et al., 1997). Specifically, the project is being conducted following the guidelines and requirements specified in NUREG-2117 (USNRC, 2012b) and consistent with the approach given in the American Nuclear Standard ANSI/ANS-2.29-2008 Probabilistic Seismic Hazard Analysis. The collaboration between DOE and Energy Northwest arises from the needs of both organizations for an accepted PSHA with high levels of regulatory assurance that can be used for the design and safety evaluation of nuclear facilities. DOE committed to this study after performing a ten-year review of the existing PSHA, as required by DOE Order 420.1C. The study will also be used by Energy Northwest as a basis for fulfilling the NRC’s 10CFR50.54(f) requirement that the western US nuclear power plants conduct PSHAs in conformance with SSHAC Level 3 procedures. The study was planned and is being carried out in conjunction with a project Work Plan, which identifies the purpose of the study, the roles and responsibilities of all participants, tasks and their associated schedules, Quality Assurance (QA) requirements, and project deliverables. New data collection and analysis activities are being conducted as a means of reducing the uncertainties in key inputs to the PSHA. It is anticipated that the results of the study will provide inputs to the site response analyses at multiple nuclear facility sites within the Hanford Site and at the Columbia Generating Station.

  20. Prospects and pitfalls of occupational hazard mapping: 'between these lines there be dragons'.

    PubMed

    Koehler, Kirsten A; Volckens, John

    2011-10-01

    Hazard data mapping is a promising new technique that can enhance the process of occupational exposure assessment and risk communication. Hazard maps have the potential to improve worker health by providing key input for the design of hazard intervention and control strategies. Hazard maps are developed with aid from direct-reading instruments, which can collect highly spatially and temporally resolved data in a relatively short period of time. However, quantifying spatial-temporal variability in the occupational environment is not a straightforward process, and our lack of understanding of how to ascertain and model spatial and temporal variability is a limiting factor in the use and interpretation of workplace hazard maps. We provide an example of how sources of and exposures to workplace hazards may be mischaracterized in a hazard map due to a lack of completeness and representativeness of collected measurement data. Based on this example, we believe that a major priority for research in this emerging area should focus on the development of a statistical framework to quantify uncertainty in spatially and temporally varying data. In conjunction with this need is one for the development of guidelines and procedures for the proper sampling, generation, and evaluation of workplace hazard maps.
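
    To make the mapping step concrete, the sketch below grids sparse direct-reading measurements with inverse-distance weighting and flags cells far from any sample, a crude stand-in for the statistical uncertainty framework the authors call for; locations, readings, and the trust-radius rule are all illustrative assumptions:

        import numpy as np

        def idw_map(xy, z, grid_x, grid_y, power=2.0, trust_radius=5.0):
            """Inverse-distance-weighted hazard map plus a coverage flag per cell."""
            gx, gy = np.meshgrid(grid_x, grid_y)
            est = np.empty_like(gx)
            uncertain = np.empty_like(gx, dtype=bool)
            for i in range(gx.shape[0]):
                for j in range(gx.shape[1]):
                    d = np.hypot(xy[:, 0] - gx[i, j], xy[:, 1] - gy[i, j])
                    w = 1.0 / np.maximum(d, 1e-9) ** power
                    est[i, j] = np.sum(w * z) / np.sum(w)
                    uncertain[i, j] = d.min() > trust_radius  # no nearby data
            return est, uncertain

        xy = np.array([[2.0, 3.0], [8.0, 7.0], [5.0, 5.0]])   # sampling locations (m)
        z = np.array([0.4, 1.2, 0.7])                         # e.g. mg/m^3 readings
        conc, flag = idw_map(xy, z, np.linspace(0, 10, 21), np.linspace(0, 10, 21))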

  1. Seismic hazard of the Kivu rift (western branch, East African Rift system): new neotectonic map and seismotectonic zonation model

    NASA Astrophysics Data System (ADS)

    Delvaux, Damien; Mulumba, Jean-Luc; Sebagenzi Mwene Ntabwoba, Stanislas; Fiama Bondo, Silvanos; Kervyn, François; Havenith, Hans-Balder

    2017-04-01

    The first detailed probabilistic seismic hazard assessment has been performed for the Kivu and northern Tanganyika rift region in Central Africa. This region, which forms the central part of the Western Rift Branch, is one of the most seismically active parts of the East African rift system. It was already included in large-scale seismic hazard assessments, but here we define a finer zonation model with 7 different zones representing the lateral variation of the geological and geophysical setting across the region. In order to build the new zonation model, we compiled homogeneous cross-border geological, neotectonic and seismotectonic maps over the central part of East D.R. Congo, SW Uganda, Rwanda, Burundi and NW Tanzania and defined a new neotectonic scheme. The seismic hazard assessment is based on a new earthquake catalogue, compiled from various local and global earthquake catalogues. The use of macroseismic epicenters determined from felt earthquakes allowed us to extend the time range back to the beginning of the 20th century, spanning 126 years, with 1068 events. The magnitudes have been homogenized to Mw and aftershocks removed. From this initial catalogue, a catalogue of 359 events from 1956 to 2015 with M > 4.4 has been extracted for the seismic hazard assessment. The seismotectonic zonation includes 7 seismic source areas that have been defined on the basis of the regional geological structure, neotectonic fault systems, basin architecture and the distribution of thermal springs and earthquake epicenters. The Gutenberg-Richter seismic hazard parameters were determined using both a least-squares linear fit and the maximum likelihood method (Kijko & Smit program). Seismic hazard maps have been computed with the CRISIS 2012 software using 3 different attenuation laws. We obtained higher PGA values (475-year return period) for the Kivu rift region than the previous estimates (Delvaux et al., 2016). They vary laterally as a function of the tectonic setting, with the lowest values in the volcanically active Virunga - Rutshuru zone, the highest in the currently non-volcanic parts of Lake Kivu, the Rusizi valley and the North Tanganyika rift zone, and intermediate values in the regions flanking the axial rift zone. These are to be considered preliminary values, as there are a number of important uncertainties, such as the heterogeneity and relatively short duration of the instrumental seismic catalogue used (60 years), the absence of locally derived attenuation laws and thus the choice of the attenuation laws used, and the seismic zonation scheme. Delvaux, D. et al., 2016. Journal of African Earth Sciences, doi: 10.1016/j.jafrearsci.2016.10.004.
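
    The two fitting routes mentioned for the Gutenberg-Richter parameters can be sketched side by side on a synthetic catalogue; the Aki (1965) maximum likelihood estimator is standard, while the catalogue and completeness magnitude below are placeholders:

        import numpy as np

        rng = np.random.default_rng(2)
        mc = 4.4                                    # completeness magnitude (cf. M > 4.4)
        b_true = 1.0
        # synthetic catalogue: G-R magnitudes are exponential with rate b*ln(10)
        mags = mc + rng.exponential(1.0 / (b_true * np.log(10)), size=359)

        # route 1: least-squares fit to the log10 cumulative frequency-magnitude curve
        bins = np.arange(mc, mags.max(), 0.1)
        n_cum = np.array([(mags >= m).sum() for m in bins])
        slope, _ = np.polyfit(bins, np.log10(n_cum), 1)
        b_lsq = -slope

        # route 2: Aki (1965) maximum likelihood estimate (continuous magnitudes;
        # for binned catalogues subtract half the bin width from mc)
        b_mle = np.log10(np.e) / (mags.mean() - mc)

        print(f"b least-squares = {b_lsq:.2f}, b maximum-likelihood = {b_mle:.2f}")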

  2. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecasting in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve-month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources as the best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector-specific differences in drought risk across Europe. The majority of the best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods and a combination of information on land use and water resources. The added value of integrating regional vulnerability information into drought risk prediction was thus demonstrated. The study thereby contributes to the overall understanding of the drivers of drought impacts, to current practice in selecting drought indicators for specific applications, and to drought risk assessment.
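
    The statistical core of the "hybrid approach" is a multivariable logistic regression of annual impact occurrence on a hazard indicator plus vulnerability factors; a minimal sketch with synthetic data (predictor names and coefficients are illustrative):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        n = 200                                   # region-years
        spei12 = rng.normal(0.0, 1.0, n)          # standardized drought indicator
        land_use = rng.uniform(0.0, 1.0, n)       # e.g. share of irrigated agriculture
        water_res = rng.uniform(0.0, 1.0, n)      # e.g. per-capita water resources index
        logit = -0.5 - 2.0 * spei12 + 1.5 * land_use - 1.0 * water_res
        impact = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # impact reported? (0/1)

        X = np.column_stack([spei12, land_use, water_res])
        clf = LogisticRegression().fit(X, impact)
        print(clf.coef_, clf.intercept_)      # drier years (low SPEI-12) raise impact odds
        print(clf.predict_proba(X[:5])[:, 1]) # annual likelihood of impact occurrence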

  3. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main positive aspect of this approach is that it is presently the standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that can significantly influence the expected motion at a site, such as site effects and source characteristics like the duration of strong motion and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites that are liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion Euros, shows that the geological and geophysical investigations necessary for a reliable deterministic hazard evaluation are largely justified.

  4. Documentation for the 2008 Update of the United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Haller, Kathleen M.; Wheeler, Russell L.; Wesson, Robert L.; Zeng, Yuehua; Boyd, Oliver S.; Perkins, David M.; Luco, Nicolas; Field, Edward H.; Wills, Chris J.; Rukstales, Kenneth S.

    2008-01-01

    The 2008 U.S. Geological Survey (USGS) National Seismic Hazard Maps display earthquake ground motions for various probability levels across the United States and are applied in seismic provisions of building codes, insurance rate structures, risk assessments, and other public policy. This update of the maps incorporates new findings on earthquake ground shaking, faults, seismicity, and geodesy. The resulting maps are derived from seismic hazard curves calculated on a grid of sites across the United States that describe the frequency of exceeding a set of ground motions. The USGS National Seismic Hazard Mapping Project developed these maps by incorporating information on potential earthquakes and associated ground shaking obtained from interaction in science and engineering workshops involving hundreds of participants, review by several science organizations and State surveys, and advice from two expert panels. The National Seismic Hazard Maps represent our assessment of the 'best available science' in earthquake hazards estimation for the United States (maps of Alaska and Hawaii as well as further information on hazard across the United States are available on our Web site at http://earthquake.usgs.gov/research/hazmaps/).

  5. Preliminary volcanic hazards evaluation for Los Alamos National Laboratory Facilities and Operations : current state of knowledge and proposed path forward

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.

    2010-09-01

    The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.
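
    As a back-of-envelope reading of "unlikely within the lifetime of the facility but cannot be ruled out": under a Poisson recurrence model, the probability of at least one event in t years is 1 - exp(-λt). The rate below is purely illustrative, not the report's estimate:

        import math

        rate = 1.0e-5                      # hypothetical regional eruptions per year
        for t in (50, 100):
            p = 1.0 - math.exp(-rate * t)  # probability of at least one event in t years
            print(f"t = {t:>3} yr: P = {p:.1e}")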

  6. Issues of tsunami hazard maps revealed by the 2011 Tohoku tsunami

    NASA Astrophysics Data System (ADS)

    Sugimoto, M.

    2013-12-01

    Tsunami scientists have been given responsibility for the selection of people's tsunami evacuation places since the 2011 Tohoku Tsunami in Japan. Many adults died outside the hazard zones shown on tsunami hazard maps, while students in Kamaishi city achieved a near-miraculous escape by evacuating on their own judgment. The tsunami hazard maps were based on numerical models assuming earthquakes smaller than the actual magnitude 9. How can we bridge the gap between hazard maps and future disasters? We have to discuss how to use tsunami numerical models well enough for them to contribute to tsunami hazard maps. How should tsunami hazard maps be improved? They should be revised to include the possibility of coseismic uplift or subsidence after earthquakes, as well as social information. The ground sank 1.14 m below sea level in Ayukawa town, Tohoku. Research by the Ministry of Land, Infrastructure, Transport and Tourism shows that only around 10% of people in Japan know about tsunami hazard maps. People do, however, know their evacuation places (buildings) through drills experienced once a year, even if most did not know about the hazard map itself. We need a wider spread of tsunami hazard information that acknowledges the contingency of science (see the disaster handbook material's URL at the bottom). The California Emergency Management Agency (CEMA) team shows one good practice and a practical solution. I followed their field trip on Catalina Island, California, in September 2011. The team members are multidisciplinary specialists: a geologist, a GIS specialist, oceanographers from USC (tsunami numerical modelers) and a private company, a local policeman, a disaster manager, a local authority, and so on. They check the field based on their own specialties, conducting on-the-spot inspections of locations where the tsunami numerical model and present-day field conditions disagree, since such data always become outdated. They pay attention not only to topographical conditions but also to social conditions: vulnerable people, elementary schools, and so on. It takes a long time to check such field information; however, a tsunami hazard map based on a numerical model should go through this process. Tsunami numerical modelling includes accountability to society, so scientists need scientific ethics and humanitarian attention, and should not treat it as a business detached from human consequences. Should tsunami scientists alone bear responsibility for human life? A multidisciplinary approach, like CEMA's, is essential for mitigation. I teach a hazard map training course for disaster management officers from developing countries within a JICA training course. In this presentation I would like to discuss how to improve tsunami hazard maps after the experience of the 2011 Tohoku tsunami.

  7. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    USGS Publications Warehouse

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    The ground motion hazard for Sumatra and the Malaysian peninsula is calculated in a probabilistic framework, using procedures developed for the US National Seismic Hazard Maps. We constructed regional earthquake source models and used standard published and modified attenuation equations to calculate peak ground acceleration at 2% and 10% probability of exceedance in 50 years for rock site conditions. We developed or modified earthquake catalogs and declustered these catalogs to include only independent earthquakes. The resulting catalogs were used to define four source zones that characterize earthquakes in four tectonic environments: subduction zone interface earthquakes, subduction zone deep intraslab earthquakes, strike-slip transform earthquakes, and intraplate earthquakes. The recurrence rates and sizes of historical earthquakes on known faults and across zones were also determined from this modified catalog. In addition to the source zones, our seismic source model considers two major faults that are known historically to generate large earthquakes: the Sumatran subduction zone and the Sumatran transform fault. Several published studies were used to describe earthquakes along these faults during historical and pre-historical time, as well as to identify segmentation models of faults. Peak horizontal ground accelerations were calculated using ground motion prediction relations that were developed from seismic data obtained from the crustal interplate environment, crustal intraplate environment, along the subduction zone interface, and from deep intraslab earthquakes. Most of these relations, however, have not been developed for large distances that are needed for calculating the hazard across the Malaysian peninsula, and none were developed for earthquake ground motions generated in an interplate tectonic environment that are propagated into an intraplate tectonic environment. For the interplate and intraplate crustal earthquakes, we have applied ground-motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
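
    For readers unfamiliar with the framework, the hazard computed in studies like this one follows the classical hazard integral; in compact form, the annual rate of exceeding a ground-motion level a is

        \lambda(A > a) = \sum_{i} \nu_i \iint P[A > a \mid m, r]\, f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}m\, \mathrm{d}r,
        \qquad
        P[A > a \mid m, r] = 1 - \Phi\!\left(\frac{\ln a - \mu_{\ln A}(m, r)}{\sigma_{\ln A}}\right),

    where \nu_i is the activity rate of source i, f_{M_i} and f_{R_i} are its magnitude and distance densities, and \mu_{\ln A}, \sigma_{\ln A} come from the ground-motion prediction relation. A 2% probability of exceedance in 50 years corresponds to \lambda = -\ln(0.98)/50, i.e. a return period of about 2,475 years.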

  8. Volcanic Hazard Map as a Tool of City Planning: Experiences at Galeras Volcano and the county of Pasto, Colombia.

    NASA Astrophysics Data System (ADS)

    Calvache, M. L.

    2001-12-01

    Large populated areas located near active volcanoes emphasize the importance of taking effective actions towards risk reduction. A volcanic hazard map is considered the first step in informing government officials, private institutions and the community about the danger posed by a particular volcano. The hazard map is a tool that must be used to evaluate risk and to elaborate a risk map. The risk map, in turn, must be used by decision makers to take land-use measures consistent with the hazard present in the area and to prepare contingency plans. In 1998 and 1999 the Colombian government passed a law requiring every county in the country to have a plan of land use and development (POT) for the following 10 years. The POT must consider natural hazards and risks such as seismicity, landslides and volcanic activity; without the plan, the county will not receive any economic support from the central government. In the county of Pasto, the largest city in the influence zone of Galeras volcano, the hazard map has been used to promote educational plans in schools, to increase public awareness of Galeras and its hazards, and to advise and persuade decision makers to consider the Galeras hazard in city development plans. On the other hand, the hazard map has been mistaken for a risk map, and it has generated opposition due to the measures taken as a consequence of the map. This presentation deals with the experience gained in using the hazard map as a tool for information and planning, and with the confrontation that any decision implies with political, social and economic interests.

  9. Potential Impacts of Accelerated Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. R.; Vail, L. W.

    2016-05-31

    This research project is part of the U.S. Nuclear Regulatory Commission’s (NRC’s) Probabilistic Flood Hazard Assessment (PFHA) Research plan in support of developing a risk-informed licensing framework for flood hazards and design standards at proposed new facilities and significance determination tools for evaluating potential deficiencies related to flood protection at operating facilities. The PFHA plan aims to build upon recent advances in deterministic, probabilistic, and statistical modeling of extreme precipitation events to develop regulatory tools and guidance for NRC staff with regard to PFHA for nuclear facilities. The tools and guidance developed under the PFHA plan will support and enhance NRC’s capacity to perform thorough and efficient reviews of license applications and license amendment requests. They will also support risk-informed significance determination of inspection findings, unusual events, and other oversight activities.

  10. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). This framework considers the following areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  11. Comparison of Probabilistic Coastal Inundation Maps Based on Historical Storms and Statistically Modeled Storm Ensemble

    NASA Astrophysics Data System (ADS)

    Feng, X.; Sheng, Y.; Condon, A. J.; Paramygin, V. A.; Hall, T.

    2012-12-01

    A cost-effective method, JPM-OS (Joint Probability Method with Optimal Sampling), for determining storm response and inundation return frequencies was developed and applied to quantify the hazard of hurricane storm surges and inundation along the Southwest Florida, US coast (Condon and Sheng 2012). The JPM-OS uses piecewise multivariate regression splines coupled with dimension-adaptive sparse grids to enable the generation of a base flood elevation (BFE) map. Storms are characterized by their landfall characteristics (pressure deficit, radius to maximum winds, forward speed, heading, and landfall location), and a sparse grid algorithm determines the optimal set of storm parameter combinations so that the inundation from any other storm parameter combination can be determined. The end result is a sample of a few hundred (197 for SW FL) optimal storms which are simulated using the dynamically coupled storm surge/wave modeling system CH3D-SSMS (Sheng et al. 2010). The limited historical climatology (1940-2009) is explored to develop probabilistic characterizations of the five storm parameters. The probability distributions are discretized, and the inundation response of all parameter combinations is determined by interpolation in the five-dimensional space of the optimal storms. The surge response and the associated joint probability of each parameter combination are used to determine the flood elevation with a 1% annual probability of occurrence. The limited historical data constrain the accuracy of the PDFs of the hurricane characteristics, which in turn affects the accuracy of the calculated BFE maps. To offset the deficiency of the limited historical dataset, this study presents a different method for producing coastal inundation maps. Instead of using the historical storm data, we adopt 33,731 tracks that represent the storm climatology of the North Atlantic basin and the SW Florida coast. This large set of hurricane tracks is generated from a statistical model previously used for Western North Pacific (WNP) tropical cyclone (TC) genesis (Hall 2011) as well as North Atlantic tropical cyclone genesis (Hall and Jewson 2007). The introduction of these tracks compensates for the shortage of historical samples and allows for the more reliable PDFs required for implementation of JPM-OS. Using the 33,731 tracks and JPM-OS, an optimal storm ensemble is determined. This approach results in different storms/winds for storm surge and inundation modeling, and produces different Base Flood Elevation maps for coastal regions. Coastal inundation maps produced by the two different methods will be discussed in detail in the poster paper.
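
    The 1%-annual-chance elevation extraction can be sketched in a few lines: each optimal storm carries a surge response and an annual rate implied by the joint probability of its parameters, and the BFE is the level at which the aggregate exceedance probability falls to 1%. All numbers below are placeholders, not values from the study:

        import numpy as np

        rng = np.random.default_rng(4)
        n_storms = 197
        eta = rng.gamma(2.0, 1.0, n_storms)        # surge response of each optimal storm (m)
        rates = np.full(n_storms, 0.2 / n_storms)  # annual rates implied by the joint PDF

        def annual_rate_exceeding(level):
            return rates[eta > level].sum()

        levels = np.linspace(0.0, eta.max(), 500)
        aep = 1.0 - np.exp(-np.array([annual_rate_exceeding(z) for z in levels]))
        bfe = levels[np.searchsorted(-aep, -0.01)]  # first level with AEP <= 1%
        print(f"1%-annual-chance flood elevation ~ {bfe:.2f} m")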

  12. Selected Images of the Effects of the October 15, 2006, Kiholo Bay-Mahukona, Hawai'i, Earthquakes and Recovery Efforts

    USGS Publications Warehouse

    Takahashi, Taeko Jane; Ikeda, Nancy A.; Okubo, Paul G.; Sako, Maurice K.; Dow, David C.; Priester, Anna M.; Steiner, Nolan A.

    2011-01-01

    Although the vast majority of earthquakes in the State of Hawaii are closely related to the active volcanism associated with the southeastern part of the Island of Hawai‘i, the October 2006 Kīholo Bay and Māhukona earthquakes clearly suggest the devastating potential of deeper lithospheric earthquakes. Large earthquakes thought to be nearly M7 have struck near the islands of Lāna‘i (1871) and Maui (1938). It is thought that these, like the 2006 earthquakes, were deep lithospheric flexure earthquakes (Wyss and Koyanagi, 1992; Klein and others, 2001). Thus, it is important to recognize the potential seismic hazard posed by such earthquakes beneath the older Hawaiian Islands. The data and observations afforded by the 2006 earthquakes promise to improve probabilistic seismic hazards modeling in Hawai‘i. The effects of the October 15, 2006, Kīholo Bay-Māhukona earthquakes are shown in images taken from the coastal route along the northern half of the Island of Hawai‘i, where damage was the most concentrated. The direction of presentation is counter-clockwise, from Pa‘auilo on the eastern or windward (Hāmākua) side to Kealakekua Bay on the western or leeward (Kona) side. A list of sites, their locations, coordinates, and distance from the epicenter at Kīholo Bay are given in table 1. A Google Earth map (fig. 7) and a topographic map (fig. 8) pinpoint the 36 sites where damage was documented and digital images were compiled for this collection.

  13. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
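
    The generalisation step lends itself to a short Monte-Carlo sketch: simulate the year of first exceedance under an assumed upward drift in the annual exceedance probability, then fit a Weibull distribution to the simulated return periods. The drift rate and initial probability are illustrative assumptions:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        p0, trend = 0.01, 0.0005   # assumed initial AEP and upward drift per year

        def first_exceedance_year():
            t = 0
            while True:
                t += 1
                if rng.random() < min(p0 + trend * t, 1.0):
                    return t

        T = np.array([first_exceedance_year() for _ in range(10000)])
        shape, _, scale = stats.weibull_min.fit(T, floc=0)
        print(f"mean T = {T.mean():.1f} yr, fitted Weibull shape = {shape:.2f}")
        # shape > 1 reflects the increasing hazard rate; a stationary process gives shape ~ 1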

  14. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boissonnade, A; Hossain, Q; Kimball, J

    Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  15. Multi-atlas based segmentation using probabilistic label fusion with adaptive weighting of image similarity measures.

    PubMed

    Sjöberg, C; Ahnesjö, A

    2013-06-01

    Label fusion multi-atlas approaches for image segmentation can give better segmentation results than single-atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability of each atlas improving the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal to or better than both fusion with equal weights and the STAPLE algorithm. Results from the experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more strongly the more the individual atlas segmentation quality depends on the corresponding registered image similarity. The regions used for evaluating the image similarity measures were found to be more important than the choice of similarity measure.
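
    A toy version of the fusion rule helps fix ideas: each atlas contributes a signed distance map of its propagated contour, the maps are combined with similarity-derived weights, and the fused label is the negative region. The learning phase that derives the weights is omitted; masks and weights below are synthetic:

        import numpy as np
        from scipy import ndimage

        def signed_distance(mask):
            """Signed Euclidean distance map: negative inside the structure."""
            return ndimage.distance_transform_edt(~mask) - ndimage.distance_transform_edt(mask)

        def fuse(atlas_masks, weights):
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()
            sdm = sum(wi * signed_distance(m) for wi, m in zip(w, atlas_masks))
            return sdm < 0            # fused label: where the weighted map is "inside"

        a = np.zeros((32, 32), bool); a[8:20, 8:20] = True     # atlas 1, propagated label
        b = np.zeros((32, 32), bool); b[10:24, 10:24] = True   # atlas 2, propagated label
        fused = fuse([a, b], weights=[0.7, 0.3])  # weights would come from the learning phase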

  16. Long-period amplification in deep alluvial basins and consequences for site-specific probabilistic seismic-hazard: the case of Castelleone in the Po Plain (Northern Italy)

    NASA Astrophysics Data System (ADS)

    Barani, S.; Mascandola, C.; Massa, M.; Spallarossa, D.

    2017-12-01

    The recent Emilia seismic sequence (Northern Italy), which occurred at the end of the first half of 2012 with a main shock of Mw 6.1, highlighted the importance of studying site effects in the Po Plain, the largest and deepest sedimentary basin in Italy. As has long been known, long-period amplification related to deep sedimentary basins can significantly affect the characteristics of the ground motion induced by strong earthquakes. It follows that the effects of deep sedimentary deposits on ground shaking require special attention during the definition of the design seismic action. The work presented here analyzes the impact of deep-soil discontinuities on ground-motion amplification, with particular focus on long-period probabilistic seismic-hazard assessment. The study focuses on the site of Castelleone, where a seismic station of the Italian National Seismic Network has been recording since 2009. Our study includes both experimental and numerical site response analyses. Specifically, extensive active and passive geophysical measurements were carried out in order to define a detailed shear-wave velocity (VS) model to be used in the numerical analyses. The latter are needed to assess the site-specific ground-motion hazard. Besides classical seismic refraction profiles and multichannel analysis of surface waves, we analyzed ambient vibration measurements in both single-station and array configurations. The VS profile was determined via joint inversion of the experimental phase-velocity dispersion curve with the ellipticity curve derived from horizontal-to-vertical spectral ratios. The profile shows two main discontinuities, at depths of around 160 and 1350 m. The probabilistic site-specific hazard was assessed in terms of both spectral acceleration and displacement. A partially non-ergodic approach was adopted. We found that the spectral acceleration hazard is barely sensitive to long-period (up to 10 s) amplification related to the deeper discontinuity, whereas the displacement hazard is strongly affected. Our results show that neglecting the effects of the deeper discontinuity implies an underestimation of the hazard of up to about 49% for a mean return period (MRP) of 475 years and 57% for an MRP of 2475 years, with possible consequences for the design of very tall buildings and large bridges.
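
    A quick plausibility check on the long-period amplification uses the quarter-wavelength rule f0 = Vs/(4H): with the deeper discontinuity at 1350 m and an assumed (not reported here) average sediment shear-wave velocity, the fundamental resonance lands near a 9-10 s period:

        H = 1350.0       # m, depth of the deeper discontinuity (from the study)
        vs_avg = 600.0   # m/s, assumed average shear-wave velocity above it (illustrative)
        f0 = vs_avg / (4.0 * H)
        print(f"f0 ~ {f0:.3f} Hz, fundamental period ~ {1.0 / f0:.1f} s")   # ~0.11 Hz, ~9 s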

  17. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic design loads for engineering projects is probabilistic seismic hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious (such as dams and chemical plants), it is more usual to obtain the seismic design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes sources at great distance from the calculation) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in tabular format. Our study shows that, in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values; thus the chosen attenuation laws give us quite different PGA and SA values.
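
    For reference, the truncated exponential (doubly bounded Gutenberg-Richter) magnitude density mentioned above is

        f_M(m) = \frac{\beta\, e^{-\beta (m - m_0)}}{1 - e^{-\beta (m_{\max} - m_0)}}, \qquad m_0 \le m \le m_{\max}, \quad \beta = b \ln 10,

    which makes explicit why uncertainty in the assumed maximum magnitude m_max propagates into the hazard, particularly where local low-Mmax sources dominate.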

  18. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    NASA Astrophysics Data System (ADS)

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

    New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few mms of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice. Hazard maps were integral to science communication during the crisis, but there is limited international best practice information available on hazard maps as communication devices, as most volcanic hazard mapping literature is concerned with defining hazard zones. We propose that hazard maps are only as good as the communications framework and inter-agency relationships in which they are embedded, and we document in detail the crisis hazard map development process. We distinguish crisis hazard maps from background hazard maps and ashfall prediction maps, illustrating the complementary nature of these three distinct communication mechanisms. We highlight issues that arose and implications for the development of future maps.

  19. The European ASAMPSA_E project: towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    NASA Astrophysics Data System (ADS)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or Safety Authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10^-4 per year. Today, even lower probabilities, down to 10^-8, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect a NPP, and assigning them some meaningful probability of occurrence, seems to be difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan and aims at identifying meaningful practices to extend the scope and the quality of the existing probabilistic safety analyses developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and used to verify whether the robustness of Nuclear Power Plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project and some first lessons, and will introduce the type of guidance being developed. It will explain the need for expertise from the geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological hazards, …).

  20. Volcanic hazard management in dispersed volcanism areas

    NASA Astrophysics Data System (ADS)

    Marrero, Jose Manuel; Garcia, Alicia; Ortiz, Ramon

    2014-05-01

    Traditional volcanic hazard methodologies were developed mainly to deal with large stratovolcanoes. For such volcanoes, the hazard map is an important tool for decision-makers, not only during a volcanic crisis but also for territorial planning. Based on the past and recent eruptions of a volcano, all possible volcanic hazards are modelled and included in the hazard map. By combining the hazard map with the event tree, the impact area can be zoned and the likely eruptive scenarios defined for use during a real volcanic crisis. In areas of dispersed volcanism, however, it is very complex to apply the same volcanic hazard methodologies. The event tree does not take unknown vents into account, because the spatial concepts included in it relate only to the distance reached by volcanic hazards. Volcanic hazard simulation is also difficult, because the scatter of possible vents modifies the results. Volcanic susceptibility analysis tries to solve this problem by calculating the areas most likely to host an eruption, but the differences between the low and high values obtained are often very small. Under these conditions, the effectiveness of the traditional hazard map can be questioned, making a change in the concept of the hazard map necessary. Instead of delimiting the potential impact areas, the hazard map should show the expected behaviour of the volcanic activity and how differences in the landscape and internal geo-structures could condition that behaviour. This approach has been carried out in La Palma (Canary Islands), combining the concept of a long-term hazard map with short-term volcanic scenarios to show the expected behaviour of the volcanic activity. The objective is for decision-makers to understand how a volcanic crisis could unfold and what kinds of mitigation measures and strategies could be used.
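
    One common way to quantify the susceptibility mentioned above is kernel density estimation over past vent locations; a minimal sketch with synthetic vents (the bandwidth choice, here scipy's default Scott rule, is the crux in real applications):

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(6)
        vents = rng.normal([0.0, 0.0], [5.0, 2.0], size=(60, 2)).T  # past vent x, y (km)

        kde = gaussian_kde(vents)                    # bandwidth: scipy's default (Scott)
        gx, gy = np.mgrid[-15:15:100j, -10:10:100j]
        susceptibility = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
        susceptibility /= susceptibility.sum()       # relative likelihood of the next vent per cell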

  1. Development of Local Amplification Factors in the NEAM Region for Production of Regional Tsunami Hazard Maps

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Glimsdal, S.; Løvholt, F.; Orefice, S.; Romano, F.; Brizuela, B.; Lorito, S.; Hoechner, A.; Babeyko, A. Y.

    2016-12-01

    The standard way of estimating tsunami inundation is by applying numerical depth-averaged shallow-water run-up models. However, for a regional Probabilistic Tsunami Hazard Assessment (PTHA), applying such inundation models may be too time-consuming. A faster, yet less accurate procedure, is to relate the near-shore surface elevations at offshore points to maximum shoreline water levels by using a set of amplification factors based on the characteristics of the incident wave and the bathymetric slope. The surface elevation at the shoreline then acts as a rough approximation for the maximum inundation height or run-up height along the shoreline. An amplification-factor procedure based on a limited set of idealized broken shoreline segments has previously been applied to estimate the maximum inundation heights globally. Here, we present a study where this technique is developed further, by taking into account the local bathymetric profiles. We extract a large number of local bathymetric transects over a significant part of the North East Atlantic, the Mediterranean and connected seas (NEAM) region. For each bathymetric transect, we compute the wave amplification from an offshore control point to points close to the shoreline using a linear shallow-water model for waves of different period and polarity with a sinusoidal pulse wave as input. The amplification factors are then tabulated. We present maximum water levels from the amplification factor method, and compare these with results from conventional inundation models. Finally, we demonstrate how the amplification factor method can be convolved with PTHA results to provide regional tsunami hazard maps. This work has been supported by the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement 603839 (Project ASTARTE), and the TSUMAPS-NEAM Project (http://www.tsumapsneam.eu/), co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.
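
    The tabulated factors come from linear shallow-water runs over real transects; as a zeroth-order analytical anchor, Green's law gives the shoaling amplification of a linear long wave between two depths as (h1/h2)^(1/4). The depths below are illustrative:

        h_offshore, h_nearshore = 100.0, 5.0      # m: offshore control point vs near shore
        amp = (h_offshore / h_nearshore) ** 0.25  # Green's law for a linear long wave
        print(f"amplification ~ {amp:.2f}")       # ~2.1 from 100 m to 5 m depth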

  2. Development of Local Amplification Factors in the NEAM Region for Production of Regional Tsunami Hazard Maps

    NASA Astrophysics Data System (ADS)

    Glimsdal, Sylfest; Løvholt, Finn; Bonnevie Harbitz, Carl; Orefice, Simone; Romano, Fabrizio; Brizuela, Beatriz; Lorito, Stefano; Hoechner, Andreas; Babeyko, Andrey

    2017-04-01

    The standard way of estimating tsunami inundation is by applying numerical depth-averaged shallow-water run-up models. However, for a regional Probabilistic Tsunami Hazard Assessment (PTHA), applying such inundation models may be too time-consuming. A faster, yet less accurate, procedure is to relate the near-shore surface elevations at offshore points to maximum shoreline water levels by using a set of amplification factors based on the characteristics of the incident wave and the bathymetric slope. The surface elevation at the shoreline then acts as a rough approximation for the maximum inundation height or run-up height along the shoreline. An amplification-factor procedure based on a limited set of idealized broken shoreline segments has previously been applied to estimate the maximum inundation heights globally. Here, we present a study where this technique is developed further by taking into account the local bathymetric profiles. We extract a large number of local bathymetric transects over a significant part of the North East Atlantic, the Mediterranean and connected seas (NEAM region). For each bathymetric transect, we compute the wave amplification from an offshore control point to points close to the shoreline using a linear shallow-water model for waves of different period and polarity, with a sinusoidal pulse wave as input. The amplification factors are then tabulated. We present maximum water levels from the amplification factor method and compare these with results from conventional inundation models. Finally, we demonstrate how the amplification factor method can be convolved with PTHA results to provide regional tsunami hazard maps. This work has been supported by the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement 603839 (Project ASTARTE), and the TSUMAPS-NEAM Project (http://www.tsumapsneam.eu/), co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.
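    To make the tabulation step concrete, the sketch below shows one way precomputed amplification factors could be looked up per transect and applied to an offshore PTHA exceedance curve. The table values, function names, and the linear interpolation in period are illustrative assumptions, not the project's actual tables or code:

    ```python
    # Hedged sketch: combining tabulated amplification factors with offshore
    # PTHA exceedance curves to approximate shoreline hazard. All values and
    # names are illustrative assumptions, not the project's implementation.
    import numpy as np

    # Hypothetical amplification-factor table for one bathymetric transect:
    # indexed by incident wave period (s) and polarity.
    periods = np.array([120.0, 300.0, 600.0, 1200.0, 1800.0, 3600.0])
    amp_table = {
        "elevation":  np.array([3.1, 2.6, 2.2, 1.9, 1.7, 1.5]),
        "depression": np.array([3.4, 2.8, 2.3, 2.0, 1.8, 1.6]),
    }

    def amplification_factor(period_s: float, polarity: str) -> float:
        """Interpolate the tabulated factor for the local transect."""
        return float(np.interp(period_s, periods, amp_table[polarity]))

    def shoreline_hazard_curve(offshore_heights, annual_exceed_rates,
                               period_s, polarity):
        """Scale an offshore PTHA curve to approximate shoreline levels."""
        f = amplification_factor(period_s, polarity)
        return np.asarray(offshore_heights) * f, np.asarray(annual_exceed_rates)

    # Usage: a 600 s leading-elevation wave and a toy offshore hazard curve.
    h_off = [0.5, 1.0, 2.0, 4.0]          # offshore surface elevations (m)
    rates = [1e-2, 3e-3, 5e-4, 5e-5]      # annual exceedance rates
    h_shore, r = shoreline_hazard_curve(h_off, rates, 600.0, "elevation")
    print(dict(zip(np.round(h_shore, 2), r)))
    ```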

  3. Seismic Landslide Hazard for the City of Berkeley, California

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.

    2001-01-01

    This map describes the possible hazard from earthquake-induced landslides for the city of Berkeley, CA. The hazard depicted by this map was modeled for a scenario corresponding to an M=7.1 earthquake on the Hayward, CA fault. This scenario magnitude is associated with complete rupture of the northern and southern segments of the Hayward fault, an event that has an estimated return period of about 500 years. The modeled hazard also corresponds to completely saturated ground-water conditions resulting from an extreme storm event or series of storm events. This combination of earthquake and ground-water scenarios represents a particularly severe state of hazard for earthquake-induced landslides. For dry ground-water conditions, the overall hazard will be less, while the relative patterns of hazard are likely to change. Purpose: The map is intended as a tool for regional planning. Any site-specific planning or analysis should be undertaken with the assistance of a qualified geotechnical engineer. This hazard map should not be used as a substitute for the State of California Seismic Hazard Zones map for the same area (see California Department of Conservation, Division of Mines and Geology, 1999). As previously noted for maps of this type by Wieczorek and others (1985), this map should not be used as a basis to determine the absolute risk from seismically triggered landslides at any locality, as the sole justification for zoning or rezoning any parcel, for detailed design of any lifeline, for site-specific hazard-reduction planning, or for setting or modifying insurance rates.

  4. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows. (1) Characterized earthquake fault models (CEFMs) are constructed for each of the 15 hypothetical source areas (HSAs) that ERC (2013) defined. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis generated by the CEFMs by solving the nonlinear, finite-amplitude long-wave equations, with advection and bottom-friction terms, using a finite-difference method; run-up computation on land is included. (3) A time-predictable model puts the recurrence interval of the present seismic cycle at T=88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on a BPT distribution with this T and an aperiodicity alpha=0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability redistribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that this redistribution of probability is necessarily tentative, because present seismological knowledge is insufficient to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize tsunami hazard curves at every evaluation point on the coast by integrating the 30-year occurrence probabilities P30(i) for all earthquakes (CEFMs) with the calculated maximum coastal tsunami heights. In the synthesis, aleatory uncertainties relating to the incompleteness of the governing equations, CEFM modeling, and bathymetry and topography data are modeled assuming a log-normal probability distribution. Examples of tsunami hazard curves will be presented.
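    A minimal sketch of the hazard-curve synthesis in steps (4)-(5), assuming an equal probability split within a subgroup and independent event occurrence. The log-normal aleatory model follows the abstract, but all parameter values below are invented:

    ```python
    # Hedged sketch: 30-year occurrence probabilities per CEFM are combined
    # with computed coastal heights under a log-normal aleatory model.
    import numpy as np
    from scipy.stats import lognorm

    def hazard_curve(p30_per_cefm, computed_heights, sigma_ln, thresholds):
        """P(max coastal height exceeds each threshold within 30 years).

        p30_per_cefm    : 30-yr occurrence probability assigned to each CEFM
        computed_heights: simulated max coastal height per CEFM (m)
        sigma_ln        : log-normal aleatory std (model/bathymetry errors)
        """
        thresholds = np.asarray(thresholds, dtype=float)
        p_not_exceed = np.ones_like(thresholds)
        for p30, h in zip(p30_per_cefm, computed_heights):
            # P(height > threshold | event) under log-normal scatter about h
            p_exc = lognorm.sf(thresholds, s=sigma_ln, scale=h)
            p_not_exceed *= 1.0 - p30 * p_exc   # assumes independent events
        return 1.0 - p_not_exceed

    # Usage: 3 CEFMs in one source area sharing that area's probability.
    p30 = [0.67 / 3] * 3                 # equal split within the subgroup
    heights = [2.0, 3.5, 6.0]            # computed max heights at one site (m)
    curve = hazard_curve(p30, heights, sigma_ln=0.4,
                         thresholds=[1.0, 2.0, 5.0, 10.0])
    print(np.round(curve, 3))
    ```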

  5. Urban Seismic Hazard Mapping for Memphis, Shelby County, Tennessee

    USGS Publications Warehouse

    Gomberg, Joan

    2006-01-01

    Earthquakes cannot be predicted, but scientists can forecast how strongly the ground is likely to shake as a result of an earthquake. Seismic hazard maps provide one way of conveying such forecasts. The U.S. Geological Survey (USGS), which produces seismic hazard maps for the Nation, is now engaged in developing more detailed maps for vulnerable urban areas. The first set of these maps is now available for Memphis, Tennessee.

  6. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paces, James B.

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and the resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  7. Safety design approach for external events in Japan sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamano, H.; Kubo, S.; Tani, A.

    2012-07-01

    This paper describes a safety design approach for external events in the design study of the Japan sodium-cooled fast reactor. The emphasis is on the introduction of design extension external conditions (DEECs). In addition to seismic design, other external events such as tsunami, strong wind, and abnormal temperature were addressed in this study. From a wide variety of external events, consisting of natural hazards and human-induced ones, a screening method was developed in terms of siting, consequence, and frequency to select representative events. Design approaches for these events were categorized on a probabilistic, statistical, or deterministic basis. External hazard conditions were considered mainly for DEECs. In the probabilistic approach, the DEECs for earthquake, tsunami and strong wind were defined at 1/10 of the exceedance probability of the external design bases. The other representative DEECs were defined based on statistical or deterministic approaches. (authors)

  8. 230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis

    USGS Publications Warehouse

    Paces, James B.

    2014-01-01

    This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and the resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  9. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential seismic source contributing most to the site hazard is identified on the basis of the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and the historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum obtained by the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum reflects both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy for practitioners to accept and provides a basis for the seismic design of hydraulic engineering works.
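    For illustration, a toy version of the final step: evaluating a period-by-period attenuation relation for the scenario magnitude and distance. The functional form and coefficients below are generic placeholders, not the NGA relations used in the paper:

    ```python
    # Hedged sketch: evaluating a response spectrum for a scenario event
    # (magnitude M, epicentral distance R) with a generic attenuation form.
    import numpy as np

    # Illustrative per-period coefficients: ln(SA) = c0 + c1*M - c2*ln(R + c3)
    coeffs = {
        0.1: (-1.2, 0.90, 1.10, 10.0),
        0.2: (-1.5, 0.95, 1.05, 10.0),
        0.5: (-2.4, 1.05, 1.00, 10.0),
        1.0: (-3.3, 1.15, 0.95, 10.0),
        2.0: (-4.2, 1.20, 0.90, 10.0),
    }

    def scenario_spectrum(magnitude: float, distance_km: float) -> dict:
        """Spectral acceleration (g) at each period for the scenario event."""
        spec = {}
        for period, (c0, c1, c2, c3) in coeffs.items():
            ln_sa = c0 + c1 * magnitude - c2 * np.log(distance_km + c3)
            spec[period] = float(np.exp(ln_sa))
        return spec

    # Usage: a scenario from PSHA deaggregation, e.g. M 6.5 at 15 km.
    print({t: round(sa, 3) for t, sa in scenario_spectrum(6.5, 15.0).items()})
    ```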

  10. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program, undertaken to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and to provide the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
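    A minimal sketch of the survivability-versus-time idea: Monte Carlo draws of hazard-onset time versus egress time yield a survival curve. The distributions and all parameters below are invented, not those of the KSC models:

    ```python
    # Hedged sketch of probabilistic survivability versus time.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Time until the hazard scenario becomes lethal at the pad (s), lognormal.
    hazard_onset = rng.lognormal(mean=np.log(300.0), sigma=0.5, size=N)
    # Time for personnel to reach the safe location (s), lognormal.
    egress_time = rng.lognormal(mean=np.log(180.0), sigma=0.3, size=N)

    def survivability(delays):
        """P(egress completes before hazard onset) as a function of an
        added decision/notification delay before egress begins."""
        return np.array([(egress_time + d < hazard_onset).mean()
                         for d in delays])

    delays = np.arange(0.0, 301.0, 60.0)
    for d, p in zip(delays, survivability(delays)):
        print(f"delay {d:5.0f} s -> survival probability {p:.3f}")
    ```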

  11. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impact of safety measures on risk control. We have made an attempt to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent of risk reduction achieved by each successive safety measure. It also indicates, based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.

  12. Probabilistic performance-based design for high performance control systems

    NASA Astrophysics Data System (ADS)

    Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice

    2017-04-01

    High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for the mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life-cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life-cycle cost analysis within performance-based design (PBD) tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
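    A rough sketch of how such a life-cycle cost comparison could be set up, with invented costs and hazard rates; the paper's actual PBD/LCA formulation is more detailed:

    ```python
    # Hedged sketch: NPV of owning a mitigation system over the structure's
    # life, combining acquisition, operation, and expected hazard losses.
    def life_cycle_cost(initial, annual_om, hazard_rates, damage_costs,
                        years=50, discount=0.03):
        """hazard_rates : annual occurrence rate per wind-hazard level
           damage_costs : expected repair cost per level, given the system"""
        lcc = initial
        annual_loss = sum(r * c for r, c in zip(hazard_rates, damage_costs))
        for y in range(1, years + 1):
            lcc += (annual_om + annual_loss) / (1.0 + discount) ** y
        return lcc

    # Usage: all numbers invented (minor / moderate / severe wind levels).
    rates = [0.5, 0.1, 0.01]
    passive = life_cycle_cost(2e5, 1e3, rates, [5e3, 6e4, 9e5])
    hpcs    = life_cycle_cost(6e5, 2e4, rates, [1e3, 2e4, 3e5])
    print(f"passive: ${passive:,.0f}   HPCS: ${hpcs:,.0f}")
    ```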

  13. Ground mapping resolution accuracy of a scanning radiometer from a geostationary satellite.

    PubMed

    Stremler, F G; Khalil, M A; Parent, R J

    1977-06-01

    Measures of the spatial and spatial rate (frequency) mapping of scanned visual imagery from an earth reference system to a spin-scan geostationary satellite are examined. Mapping distortions and coordinate inversions to correct for these distortions are formulated in terms of geometric transformations between earth and satellite frames of reference. Probabilistic methods are used to develop relations for obtainable mapping resolution when coordinate inversions are employed.

  14. Landslide prediction using combined deterministic and probabilistic methods in hilly area of Mt. Medvednica in Zagreb City, Croatia

    NASA Astrophysics Data System (ADS)

    Wang, Chunxiang; Watanabe, Naoki; Marui, Hideaki

    2013-04-01

    The hilly slopes of Mt. Medvednica stretch across the northwestern part of Zagreb City, Croatia, and cover approximately 180 km². In this area, landslides, e.g. the Kostanjek and Črešnjevec landslides, have damaged many houses, roads, farmlands and grasslands. It is therefore necessary to predict potential landslides and to enhance the landslide inventory for hazard mitigation and the security management of local society in this area. We combined a deterministic method and a probabilistic method to assess potential landslides, including their locations, sizes and sliding surfaces. First, the study area is divided into several slope units that have similar topographic and geological characteristics, using the hydrology analysis tool in ArcGIS. Second, a GIS-based, modified three-dimensional Hovland's method for slope stability analysis is developed to identify the sliding surface and the corresponding three-dimensional safety factor for each slope unit. Each sliding surface is assumed to be the lower part of an ellipsoid. The direction of inclination of the ellipsoid is taken to be the same as the main dip direction of the slope unit. The center point of the ellipsoid is randomly set to the center point of a grid cell in the slope unit. The minimum three-dimensional safety factor and the corresponding critical sliding surface are also obtained for each slope unit. Third, since a single safety-factor value is insufficient to evaluate the stability of a slope unit, the ratio of the number of trial calculations in which the three-dimensional safety factor is less than 1.0 to the total number of trials is defined as the failure probability of the slope unit. If the failure probability is more than 80%, the slope unit is classed as 'unstable', and the landslide hazard can be mapped for the whole study area.
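    A toy illustration of that failure-probability criterion: many trial sliding surfaces, each yielding a safety factor, with the unit flagged when more than 80% of trials fall below 1.0. The infinite-slope safety factor used here is a simple stand-in for the modified 3-D Hovland computation:

    ```python
    # Hedged sketch of the failure-probability step for one slope unit.
    import numpy as np

    rng = np.random.default_rng(0)

    def trial_safety_factors(n_trials, slope_deg, cohesion_kpa, phi_deg,
                             unit_weight=19.0, depth_m=5.0):
        """Toy infinite-slope FS for randomly perturbed trial surfaces."""
        slope = np.radians(rng.normal(slope_deg, 2.0, n_trials))
        c = rng.normal(cohesion_kpa, 2.0, n_trials).clip(min=0.1)
        phi = np.radians(rng.normal(phi_deg, 1.5, n_trials))
        # Driving shear stress and resisting strength on the trial surface.
        tau = unit_weight * depth_m * np.sin(slope) * np.cos(slope)
        resist = c + unit_weight * depth_m * np.cos(slope) ** 2 * np.tan(phi)
        return resist / tau

    fs = trial_safety_factors(10_000, slope_deg=35.0, cohesion_kpa=8.0,
                              phi_deg=28.0)
    p_fail = (fs < 1.0).mean()
    print(f"failure probability = {p_fail:.2%}",
          "-> unstable" if p_fail > 0.80 else "-> not flagged")
    ```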

  15. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and that are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as mean annual frequencies of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.

  16. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and that are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as mean annual frequencies of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.

  17. Towards a Proactive Risk Mitigation Strategy at La Fossa Volcano, Vulcano Island

    NASA Astrophysics Data System (ADS)

    Biass, S.; Gregg, C. E.; Frischknecht, C.; Falcone, J. L.; Lestuzzi, P.; di Traglia, F.; Rosi, M.; Bonadonna, C.

    2014-12-01

    A comprehensive risk assessment framework was built to develop proactive risk reduction measures for Vulcano Island, Italy. This framework includes identification of eruption scenarios, probabilistic hazard assessment, quantification of hazard impacts on the built environment, accessibility assessment on the island, and a risk perception study. Vulcano, a 21 km² island whose two primary communities host 900 permanent residents and up to 10,000 visitors during summer, shows a strong dependency on the mainland for basic needs (water, energy) and relies on a roughly two-month tourism season for its economy. The recent stratigraphy reveals a dominance of vulcanian and subplinian eruptions, producing a range of hazards acting at different time scales. We developed new methods to probabilistically quantify the hazard related to ballistics, lahars and tephra for all eruption styles. We also elaborated field- and GIS-based methods to assess the physical vulnerability of the built environment and created dynamic models of accessibility. Results outline the difference in hazard between short- and long-lasting eruptions. A subplinian eruption has a 50% probability of impacting ~30% of the buildings within days after the eruption, but the year-long damage resulting from a long-lasting vulcanian eruption is similar if tephra is not removed from rooftops. Similarly, a subplinian eruption results in a volume of 7×10⁵ m³ of material potentially remobilized into lahars soon after the eruption. Similar volumes are expected from vulcanian activity over years, increasing the hazard from small lahars. Preferential lahar paths affect critical infrastructure lacking redundancy, such as the road network, communications systems, the island's only gas station, and access to the island's two evacuation ports. Such results on hazard and on physical and systemic vulnerability help establish proactive volcanic risk mitigation strategies and may be applicable in other island settings.

  18. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze a comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards with a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of inshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are preliminarily analyzed and combined here in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  19. Digital geomorphological landslide hazard mapping of the Alpago area, Italy

    NASA Astrophysics Data System (ADS)

    van Westen, Cees J.; Soeters, Rob; Sijmons, Koert

    Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting: the Borsoia catchment, located in the Alpago region, near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.

  20. Comparing Newmark

    NASA Astrophysics Data System (ADS)

    Rodríguez-Peces, M. J.; García-Mayordomo, J.; Azañón-Hernández, J. M.; Jabaloy-Sánchez, A.

    2009-04-01

    The Lorca Basin (Eastern Betic Cordillera, SE Spain) is one of the most seismically active regions of Spain. In this area there are well-known cases of earthquake-induced slope instabilities associated with specific earthquakes (e.g., Bullas 2002, La Paca 2005). Furthermore, this area is characterized by moderate-magnitude seismicity which mainly produces rock-falls and avalanches. In this work we present the results of our research at the regional and site scales. For the regional scale, we have used a geographic information system (GIS) to develop an implementation of Newmark's sliding rigid-block method. In particular, we have proposed a new variation of Newmark's method to consider soil and topographic amplification effects. Subsequently, we produced "Newmark displacement" maps for both probabilistic and deterministic seismic scenarios in the Lorca Basin. The probabilistic seismic scenarios consider three hazard maps in terms of peak ground acceleration (PGA) on rock, corresponding to the 475-, 975- and 2475-year return periods (exceedance probabilities of 10, 5 and 2% in 50 years, respectively) in the Murcia Region. The deterministic seismic scenarios consider the occurrence of the most probable earthquake for a 475-year return period (Mw=5.0) at every location, or a complete rupture of either the Lorca-Totana (Mw=6.7) or the Puerto Lumbreras-Lorca (Mw=6.8) segment of the Alhama de Murcia Fault. The Newmark displacement maps allowed us to identify areas with the highest potential seismic hazard and to locate areas for future site-specific studies. We have found that rock-falls produced during the last earthquakes in the Lorca Basin (e.g., Bullas 2002, La Paca 2005) match very well with areas where Newmark displacements are lower than 2 cm in all the seismic scenarios considered. Therefore, it seems that low values of Newmark displacement are very likely associated with rock-falls. To support this hypothesis we have applied the Newmark method at the site scale. To do this, we selected the La Paca rock-fall, which was generated during the 2005 La Paca earthquake (mbLg=4.7, IEMS=VI-VII). We used a terrestrial laser scanner to obtain a high-resolution digital elevation model of the La Paca rock-fall area. Moreover, we performed a back-analysis based on field data to estimate the static safety factor prior to the earthquake and the critical acceleration. Furthermore, we selected a representative strong ground motion record for the La Paca earthquake from international databases. The critical acceleration and the peak ground acceleration values obtained from the strong ground motion record allowed us to estimate the actual soil and topographic amplification effects. Finally, we calculated analytically the real Newmark displacement at the La Paca rock-fall and compared it with our GIS estimation in order to improve the calibration of Newmark's method at the regional scale.
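    The rigid-block calculation at the core of this approach can be sketched compactly. The integration scheme below is the standard Newmark formulation; the accelerogram and critical accelerations are synthetic placeholders:

    ```python
    # Hedged sketch of Newmark's rigid sliding-block method: integrate
    # ground acceleration above the critical acceleration a_c to obtain
    # the permanent displacement of the block.
    import numpy as np

    def newmark_displacement(acc, dt, a_crit):
        """Permanent displacement (m) of a rigid block; acc in m/s^2."""
        vel, disp = 0.0, 0.0
        for a in acc:
            # The block slides while ground acceleration exceeds a_c, or
            # while it still has relative velocity from earlier sliding.
            if a > a_crit or vel > 0.0:
                vel += (a - a_crit) * dt
                vel = max(vel, 0.0)          # sliding stops, never reverses
                disp += vel * dt
        return disp

    # Synthetic 10 s accelerogram: a modulated sine (illustrative only).
    dt = 0.01
    t = np.arange(0.0, 10.0, dt)
    acc = 2.5 * np.exp(-((t - 4.0) / 2.0) ** 2) * np.sin(2 * np.pi * 1.5 * t)

    for ac in (0.2, 0.5, 1.0):               # critical accelerations (m/s^2)
        d = newmark_displacement(acc, dt, ac)
        print(f"a_c = {ac:.1f} m/s^2 -> D = {100 * d:.1f} cm")
    ```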

  1. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River delta region in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with maximum sustained wind speeds of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with growing exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam include Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  2. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, with a minor percentage represented by all other, non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kinds of potential tsunamigenic sources that characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e. pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations hold, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method used to estimate the uncertainties is based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for this town in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  3. Ensemble of ground subsidence hazard maps using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Park, Inhye; Lee, Jiyeong; Saro, Lee

    2014-06-01

    Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR) and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
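    A small sketch of the fuzzy-ensemble step, assuming the three hazard indexes have been rescaled to [0, 1] as fuzzy membership values. The fuzzy gamma operator shown is one of the standard choices for such ensembles; the raster values are toy data:

    ```python
    # Hedged sketch: per-pixel fuzzy combination of three hazard indexes.
    import numpy as np

    def fuzzy_gamma(memberships, gamma=0.8):
        """Fuzzy gamma operator: blend of fuzzy algebraic sum and product."""
        m = np.asarray(memberships, dtype=float)
        alg_sum = 1.0 - np.prod(1.0 - m, axis=0)
        alg_prod = np.prod(m, axis=0)
        return alg_sum ** gamma * alg_prod ** (1.0 - gamma)

    # Three toy 2x3 hazard-index rasters already scaled to [0, 1].
    fr  = np.array([[0.2, 0.7, 0.9], [0.1, 0.5, 0.8]])
    lr  = np.array([[0.3, 0.6, 0.8], [0.2, 0.4, 0.9]])
    ann = np.array([[0.1, 0.8, 0.9], [0.2, 0.6, 0.7]])

    ensemble = fuzzy_gamma([fr, lr, ann], gamma=0.8)
    print(np.round(ensemble, 3))
    ```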

  4. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees

  5. Development Of New Databases For Tsunami Hazard Analysis In California

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Barberopoulou, A.; Borrero, J. C.; Bryant, W. A.; Dengler, L. A.; Goltz, J. D.; Legg, M.; McGuire, T.; Miller, K. M.; Real, C. R.; Synolakis, C.; Uslu, B.

    2009-12-01

    The California Geological Survey (CGS) has partnered with other tsunami specialists to produce two statewide databases to facilitate the evaluation of tsunami hazard products for both emergency response and land-use planning and development. A robust, State-run tsunami deposit database is being developed that complements and expands on existing databases from the National Geophysical Data Center (global) and the USGS (Cascadia). Whereas these existing databases focus on references or individual tsunami layers, the new State-maintained database concentrates on the location and contents of individual borings/trenches that sample tsunami deposits. These data provide an important observational benchmark for evaluating the results of tsunami inundation modeling. CGS is collaborating with other states and sharing the database entry form with them to encourage its continued development beyond California’s coastline, so that historic tsunami deposits can be evaluated on a regional basis. CGS is also developing an internet-based tsunami source scenario database and forum where tsunami source experts and hydrodynamic modelers can discuss the validity of tsunami sources and their contribution to hazard assessments for California and other coastal areas bordering the Pacific Ocean. The database includes all distant and local tsunami sources relevant to California, starting with the forty scenarios evaluated during the creation of the recently completed statewide series of tsunami inundation maps for emergency response planning. Factors germane to probabilistic tsunami hazard analyses (PTHA), such as event histories and recurrence intervals, are also addressed in the database and discussed in the forum. Discussions with other tsunami source experts will help CGS determine what additional scenarios should be considered in PTHA for assessing the feasibility of generating products of value to local land-use planning and development.

  6. Alteration, slope-classified alteration, and potential lahar inundation maps of volcanoes for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Volcano Archive

    USGS Publications Warehouse

    Mars, John C.; Hubbard, Bernard E.; Pieri, David; Linick, Justin

    2015-01-01

    This study was undertaken during 2012–2013 in cooperation with the National Aeronautics and Space Administration (NASA). Since completion of this study, a new lahar modeling program (LAHAR_pz) has been released, which may produce slightly different modeling results from the LAHARZ model used in this study. The maps and data from this study should not be used in place of existing volcano hazard maps published by local authorities. For volcanoes without hazard maps and (or) published lahar-related hazard studies, this work will provide a starting point from which more accurate hazard maps can be produced. This is the first dataset to provide digital maps of altered volcanoes and adjacent watersheds that can be used for assessing volcanic hazards, hydrothermal alteration, and other volcanic processes in future studies.

  7. The Influence of Environmental Hazard Maps on Risk Beliefs, Emotion, and Health-related Behavioral Intentions

    PubMed Central

    Severtson, Dolores

    2013-01-01

    To test a theoretical explanation of how attributes of mapped environmental health hazards influence health-related behavioral intentions and how beliefs and emotion mediate the influences of attributes, 24 maps were developed that varied by four attributes of a residential drinking water hazard: level, proximity, prevalence, and density. In a factorial design, student participants (N=446) answered questions for a subset of maps. Hazard level and proximity had the largest influences on intentions to test water and mitigate exposure. Belief in the problem’s seriousness mediated attributes’ influence on intention to test drinking water, and perceived susceptibility mediated the influence of attributes on intention to mitigate risk. Maps with carefully illustrated attributes of hazards may promote appropriate health-related risk beliefs, intentions, and behavior. PMID:23533022

  8. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate more uncertainty in the fault slip-rate parameters that control earthquake-activity rates than was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  9. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment does not remain simple when it involves multiple uncertain variables. Uncertainty in risk assessment results mainly from (1) lack of knowledge of the input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach is proposed for the simultaneous treatment of the random and non-random uncertainties associated with the input parameters of a health risk model. The LandSim 2.5 landfill simulator has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) for various time horizons. The LandSim-simulated concentrations of six heavy metals in groundwater were then used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy-metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and the uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for male and female populations has been quantified and found to be high (HI>1) for all the considered time horizons, which clearly shows the possibility of adverse health effects on the population residing near the Turbhe landfill.
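    A compact sketch of the hybrid propagation, assuming triangular fuzzy numbers for the fuzzy variables and illustrative distributions for the probabilistic ones; the hazard-quotient formula is the generic non-carcinogenic form, and none of the values are the study's actual inputs:

    ```python
    # Hedged sketch: alpha-cut intervals for fuzzy exposure variables plus
    # Monte Carlo sampling for probabilistic variables give an interval on
    # the probability that the hazard quotient exceeds 1.
    import numpy as np

    rng = np.random.default_rng(1)

    def alpha_cut(tri, alpha):
        """Interval of a triangular fuzzy number (low, mode, high) at alpha."""
        lo, m, hi = tri
        return lo + alpha * (m - lo), hi - alpha * (hi - m)

    alpha = 0.5
    intake_lo, intake_hi = alpha_cut((1.5, 2.0, 2.5), alpha)   # L/day, fuzzy
    ed_lo, ed_hi = alpha_cut((20.0, 30.0, 40.0), alpha)        # years, fuzzy

    N = 50_000
    conc = rng.lognormal(np.log(0.05), 0.6, N)   # metal concentration, mg/L
    bw = rng.normal(60.0, 8.0, N).clip(min=30)   # body weight, kg
    rfd, at_days = 0.02, 70 * 365                # reference dose, avg. time

    def hq(intake, ed):
        # HQ = chronic daily intake / reference dose (exposure 365 d/yr).
        return conc * intake * ed * 365 / (bw * rfd * at_days)

    hq_lo, hq_hi = hq(intake_lo, ed_lo), hq(intake_hi, ed_hi)
    print(f"P(HQ > 1): [{(hq_lo > 1).mean():.3f}, {(hq_hi > 1).mean():.3f}]")
    ```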

  10. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  11. Probabilistic Evaluation of Mammalian Pharmacology Data to Target Pharmaceuticals for Environmental Hazard Assessment

    EPA Science Inventory

    Active Pharmaceutical Ingredients (APIs) are being detected with increasing frequency in aquatic systems associated with municipal effluent. APIs are considered Contaminants of Emerging Concern (CECs); little, if any, regulation considers aquatic systems; effects on aquatic o...

  12. Vs30 mapping at selected sites within the Greater Accra Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Nortey, Grace; Armah, Thomas K.; Amponsah, Paulina

    2018-06-01

    A large part of Accra is underlain by a complex distribution of shallow soft soils. Within seismically active zones, these soils hold the most potential to significantly amplify seismic waves and cause severe damage, especially to structures sited on soils lacking sufficient stiffness. This paper presents a preliminary site classification for the Greater Accra Metropolitan Area (GAMA) of Ghana, using experimental data from the two-dimensional (2-D) Multichannel Analysis of Surface Waves (MASW) technique. The dispersive characteristics of fundamental-mode Rayleigh-type surface waves were utilized for imaging the shallow subsurface layers (approximately the upper 30 m) by estimating the 1D (depth) and 2D (depth and surface location) shear wave velocities at 5 selected sites. The average shear wave velocity over the top 30 m (Vs30), which is critical in evaluating the site response of the upper 30 m, was estimated and used for the preliminary site classification of the GAMA, as per NEHRP (National Earthquake Hazards Reduction Program). Based on the Vs30 values obtained in the study, two common site types, C and D, corresponding to shallow (>6 m and <30 m deep) weathered rock and deep (up to 30 m thick) stiff soils, respectively, have been identified within the study area. Lower velocity profiles are inferred for the residual soils (sandy to silty clays) derived from the Accraian Formation, which lies mainly within central Accra. Stiffer soil sites lie to the north of Accra and to the west near Nyanyano. The seismic response characteristics of the residual soils in the GAMA have become apparent using the MASW technique. An extensive site-effect map and a more robust probabilistic seismic hazard analysis can now be efficiently built for the metropolis by considering the site classes and design parameters obtained from this study.
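    The Vs30 computation underlying the classification is a travel-time (harmonic) average over the top 30 m, shown below together with the standard NEHRP class boundaries; the layered profile in the usage example is hypothetical:

    ```python
    # Hedged sketch: Vs30 from a layered MASW velocity profile, plus the
    # corresponding NEHRP site class.
    def vs30(thicknesses_m, velocities_ms):
        """Vs30 = 30 / sum(h_i / v_i) over the layers filling the top 30 m."""
        total, travel = 0.0, 0.0
        for h, v in zip(thicknesses_m, velocities_ms):
            h = min(h, 30.0 - total)       # clip the stack at 30 m depth
            travel += h / v
            total += h
            if total >= 30.0:
                break
        return 30.0 / travel

    def nehrp_class(v):
        for limit, label in [(180, "E"), (360, "D"), (760, "C"), (1500, "B")]:
            if v < limit:
                return label
        return "A"

    # Usage: a hypothetical profile (soft clay over stiffer weathered rock).
    v = vs30([5.0, 12.0, 20.0], [160.0, 320.0, 550.0])
    print(f"Vs30 = {v:.0f} m/s -> NEHRP site class {nehrp_class(v)}")
    ```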

  13. Active Faults and Earthquake Hazards in the FY 79 Verification Sites - Nevada-Utah Siting Region.

    DTIC Science & Technology

    1980-03-26

    structures, such as shelters and command/control facilities, away from rupture hazards. Again, the probability of rupture, the effect of damage and ...accommodate an MCE, and less critical structures (such as the shelters) designed for a probabilistically determined event, may have merit for the MX...B., and Eaton, G. P., eds., Cenozoic tectonics and regional geophysics of the western cordillera: Geol. Soc. Am. Mem. 152, p. 1-32. Stewart, J. H

  14. Probabilistic evaluation of the physical impact of future tephra fallout events for the Island of Vulcano, Italy

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Bonadonna, Costanza; di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

    2016-05-01

    A first probabilistic scenario-based hazard assessment for tephra fallout is presented for La Fossa volcano (Vulcano Island, Italy) and subsequently used to assess the impact on the built environment. Eruption scenarios are based upon the stratigraphy produced by the last 1000 years of activity at Vulcano and include long-lasting Vulcanian and sub-Plinian eruptions. A new method is proposed to quantify the evolution through time of the hazard associated with pulsatory Vulcanian eruptions lasting from weeks to years, and the increase in hazard related to typical rainfall events around Sicily is also accounted for. The impact assessment on the roofs is performed by combining a field characterization of the buildings with the composite European vulnerability curves for typical roofing stocks. Results show that a sub-Plinian eruption of VEI 2 is not likely to affect buildings, whereas a sub-Plinian eruption of VEI 3 results in 90 % of the building stock having a ≥12 % probability of collapse. The hazard related to long-lasting Vulcanian eruptions evolves through time, and our analysis shows that the town of Il Piano, located downwind of the preferential wind patterns, is likely to reach critical tephra accumulations for roof collapse 5-9 months after the onset of the eruption. If no cleaning measures are taken, half of the building stock has a probability >20 % of suffering roof collapse.
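    One plausible way to turn such results into a single collapse probability is to convolve a tephra-load hazard curve with a log-normal roof fragility, as sketched below. The fragility parameters and hazard values are invented and are not the European vulnerability curves used in the study:

    ```python
    # Hedged sketch: probability of roof collapse from a probabilistic
    # tephra-load hazard curve and a log-normal roof fragility curve.
    import numpy as np
    from scipy.stats import lognorm

    def p_collapse(loads_kpa, p_exceed_load, median_capacity_kpa, beta):
        """Convolve the load hazard curve with the fragility curve."""
        loads = np.asarray(loads_kpa)
        p_exc = np.asarray(p_exceed_load)
        # Probability mass of the load falling in each hazard-curve bin
        # (mass below the first threshold is neglected in this sketch).
        p_bin = np.append(-np.diff(p_exc), p_exc[-1])
        mids = np.append((loads[:-1] + loads[1:]) / 2.0, loads[-1])
        frag = lognorm.cdf(mids, s=beta, scale=median_capacity_kpa)
        return float(np.sum(p_bin * frag))

    loads = [1.0, 2.0, 3.0, 5.0, 8.0]          # roof-load thresholds (kPa)
    p_exc = [0.9, 0.6, 0.35, 0.12, 0.02]       # P(load >= threshold)
    print(f"P(roof collapse) = {p_collapse(loads, p_exc, 4.0, 0.45):.2f}")
    ```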

  15. Empirical Data Fusion for Convective Weather Hazard Nowcasting

    NASA Astrophysics Data System (ADS)

    Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.

    2009-09-01

    This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
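    A minimal sketch of the data-fusion idea with a random forest, trained on synthetic data; the feature choices mirror the predictor families named above but are otherwise invented, as is the synthetic truth model:

    ```python
    # Hedged sketch: a random forest fused from co-located predictor fields
    # outputs a probability of convective hazard for an advected pixel.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(7)
    n = 5000

    # Synthetic training table: one row per (pixel, valid time) sample.
    X = np.column_stack([
        rng.uniform(0, 4000, n),     # CAPE from NWP model (J/kg)
        rng.uniform(0, 70, n),       # current radar reflectivity (dBZ)
        rng.uniform(0, 1, n),        # satellite cold-cloud fraction
        rng.poisson(2, n),           # lightning flashes in the last 10 min
    ])
    # Synthetic truth: storms more likely with high CAPE and reflectivity.
    logit = 0.001 * X[:, 0] + 0.05 * X[:, 1] + 2 * X[:, 2] - 4.0
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    rf = RandomForestClassifier(n_estimators=200, min_samples_leaf=20,
                                random_state=0).fit(X, y)

    # Probabilistic nowcast for one advected pixel at the chosen lead time.
    pixel = [[1800.0, 45.0, 0.6, 3]]
    print(f"P(convective hazard) = {rf.predict_proba(pixel)[0, 1]:.2f}")
    ```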

  16. Seismic risk assessment of Navarre (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.

    2009-04-01

    The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment, and damage estimation for structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground-motion and source-zoning models. As the region under study is located on the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. The seismic hazard can be described as low to moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils, respectively, with the softest soils located along river valleys. Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps on rock conditions (for the same probability level). Again, the highest hazard is found in the northeastern part of the region; the lowest hazard is obtained along major river valleys. The vulnerability assessment of the Navarre building stock is accomplished using as a proxy a combination of building age, location, number of floors and the implementation of building codes. Field surveys help constrain the extent of traditional and technological construction types. The vulnerability characterization is carried out following three methods: the European Macroseismic Scale (EMS 98), the RISK UE vulnerability index, and the capacity spectrum method implemented in Hazus. Vulnerability distribution maps for each municipality of Navarre are provided, adapted to the EMS98 vulnerability classes. The vulnerability of Navarre is medium to high, except for recent urban, highly populated developments. For each vulnerability class and expected ground motion, the damage distribution is estimated by means of damage probability matrices. Several damage indices, covering relative and absolute damage estimates, are used. The expected average damage is low. Whereas the largest numbers of damaged structures are found in big cities, the highest percentages are obtained in some municipalities of northeastern Navarre. Additionally, the expected percentages and numbers of people affected by earthquake damage are calculated for each municipality. The expected numbers of affected people are low, reflecting the low expected damage degree.
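    A small sketch of the damage-estimation step using damage probability matrices (DPMs); the matrix entries and building counts below are invented placeholders, not RISNA's calibrated values:

    ```python
    # Hedged sketch: expected damage-grade distribution per municipality
    # from a damage probability matrix (grades D0..D5, EMS-98 style).
    import numpy as np

    # One column vector of grade probabilities per (class, intensity).
    dpm = {
        ("A", "VII"): [0.05, 0.15, 0.30, 0.30, 0.15, 0.05],
        ("B", "VII"): [0.15, 0.30, 0.30, 0.15, 0.08, 0.02],
        ("C", "VII"): [0.40, 0.35, 0.15, 0.07, 0.02, 0.01],
    }

    def expected_damage(building_counts, intensity="VII"):
        """Expected number of buildings in each damage grade."""
        totals = np.zeros(6)
        for vuln_class, n in building_counts.items():
            totals += n * np.asarray(dpm[(vuln_class, intensity)])
        return totals

    counts = {"A": 120, "B": 450, "C": 300}     # buildings per class
    for grade, n in zip(["D0", "D1", "D2", "D3", "D4", "D5"],
                        expected_damage(counts)):
        print(f"{grade}: {n:6.1f} buildings")
    ```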

  17. The influence of environmental hazard maps on risk beliefs, emotion, and health-related behavioral intentions.

    PubMed

    Severtson, Dolores J

    2013-08-01

    To test a theoretical explanation of how attributes of mapped environmental health hazards influence health-related behavioral intentions and how beliefs and emotion mediate the influences of attributes, 24 maps were developed that varied by four attributes of a residential drinking water hazard: level, proximity, prevalence, and density. In a factorial design, student participants (N = 446) answered questions about a subset of maps. Hazard level and proximity had the largest influences on intentions to test water and mitigate exposure. Belief in the problem's seriousness mediated attributes' influence on intention to test drinking water, and perceived susceptibility mediated the influence of attributes on intention to mitigate risk. Maps with carefully illustrated attributes of hazards may promote appropriate health-related risk beliefs, intentions, and behavior. Copyright © 2013 Wiley Periodicals, Inc.

  18. Coupling of Bayesian Networks with GIS for wildfire risk assessment on natural and agricultural areas of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Scherb, Anke; Papakosta, Panagiota; Straub, Daniel

    2014-05-01

    Wildfires cause severe damage to ecosystems, socio-economic assets, and human lives in the Mediterranean. To facilitate coping with wildfire risks, an understanding of the factors influencing wildfire occurrence and behavior (e.g. human activity, weather conditions, topography, fuel loads) and their interactions is important, as is the implementation of this knowledge in improved wildfire hazard and risk prediction systems. In this project, a probabilistic wildfire risk prediction model is developed, integrating fire occurrence probability, fire propagation probability, and potential impact prediction on natural and cultivated areas. Bayesian Networks (BNs) are used to facilitate the probabilistic modeling. The final BN model is a spatial-temporal prediction system at the meso scale (1 km² spatial and 1 day temporal resolution). The modeled consequences account for potential restoration costs and production losses for forests, agriculture, and (semi-)natural areas. BNs and a geographic information system (GIS) are coupled within this project to support semi-automated learning of the BN model parameters and the spatial-temporal risk prediction. The coupling also enables the visualization of prediction results by means of daily maps. The BN parameters are learnt for Cyprus with data from 2006-2009; data from 2010 are used as the validation set. A special focus is put on the performance evaluation of the BN for fire occurrence, which is modeled as a binary classifier and can thus be validated by means of Receiver Operating Characteristic (ROC) curves. The best final models achieved validation AUC values above 0.70, which indicates the potential for reliable prediction performance via BNs. Maps of selected days in 2010 are shown to illustrate final prediction results. The resulting system can be easily expanded to predict additional expected damages at the meso scale (e.g. building and infrastructure damage). The system can support the planning of preventive measures (e.g. allocation of state resources for wildfire prevention and preparedness) and assist recuperation plans for damaged areas.
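
    The validation step described above treats fire occurrence as a binary classification problem scored with ROC curves and AUC. A minimal sketch of that evaluation, using synthetic stand-in predictions rather than the Cyprus BN outputs:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# Synthetic stand-ins: observed daily fire occurrence per cell (0/1) and the
# BN-predicted occurrence probability for the same cell-days.
y_true = rng.integers(0, 2, size=5000)
p_fire = np.clip(0.5 * y_true + rng.normal(0.3, 0.2, size=5000), 0.0, 1.0)

fpr, tpr, thresholds = roc_curve(y_true, p_fire)
auc = roc_auc_score(y_true, p_fire)
print(f"AUC = {auc:.3f}")   # values above ~0.7 suggest useful discrimination
```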

  19. Using the Logarithm of Odds to Define a Vector Space on Probabilistic Atlases

    PubMed Central

    Pohl, Kilian M.; Fisher, John; Bouix, Sylvain; Shenton, Martha; McCarley, Robert W.; Grimson, W. Eric L.; Kikinis, Ron; Wells, William M.

    2007-01-01

    The Logarithm of the Odds ratio (LogOdds) is frequently used in areas such as artificial neural networks, economics, and biology, as an alternative representation of probabilities. Here, we use LogOdds to place probabilistic atlases in a linear vector space. This representation has several useful properties for medical imaging. For example, it not only encodes the shape of multiple anatomical structures but also captures some information concerning uncertainty. We demonstrate that the resulting vector space operations of addition and scalar multiplication have natural probabilistic interpretations. We discuss several examples for placing label maps into the space of LogOdds. First, we relate signed distance maps, a widely used implicit shape representation, to LogOdds and compare it to an alternative that is based on smoothing by spatial Gaussians. We find that the LogOdds approach better preserves shapes in a complex multiple object setting. In the second example, we capture the uncertainty of boundary locations by mapping multiple label maps of the same object into the LogOdds space. Third, we define a framework for non-convex interpolations among atlases that capture different time points in the aging process of a population. We evaluate the accuracy of our representation by generating a deformable shape atlas that captures the variations of anatomical shapes across a population. The deformable atlas is the result of a principal component analysis within the LogOdds space. This atlas is integrated into an existing segmentation approach for MR images. We compare the performance of the resulting implementation in segmenting 20 test cases to a similar approach that uses a more standard shape model that is based on signed distance maps. On this data set, the Bayesian classification model with our new representation outperformed the other approaches in segmenting subcortical structures. PMID:17698403
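
    A minimal sketch of the core idea: the logit sends probability maps into the LogOdds vector space, where addition and scalar multiplication have probabilistic interpretations, and the logistic function maps results back. The arrays below are toy values, not atlas data.

```python
import numpy as np

def logodds(p, eps=1e-6):
    """Map probabilities in (0, 1) to the LogOdds (logit) space."""
    p = np.clip(p, eps, 1.0 - eps)
    return np.log(p / (1.0 - p))

def inv_logodds(t):
    """Map LogOdds values back to probabilities (logistic function)."""
    return 1.0 / (1.0 + np.exp(-t))

# Two probabilistic label maps of the same structure (e.g., from two raters).
pA = np.array([0.9, 0.6, 0.2])
pB = np.array([0.8, 0.4, 0.3])

# Vector-space operations acquire probabilistic meaning: averaging in LogOdds
# space fuses the maps, scalar multiplication sharpens or softens them.
fused     = inv_logodds(0.5 * (logodds(pA) + logodds(pB)))
sharpened = inv_logodds(2.0 * logodds(pA))
print(fused, sharpened)
```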

  20. Assessing carcinogenic risks associated with ingesting arsenic in farmed smeltfish (Ayu, Plecoglossus altivelis) in an arseniasis-endemic area of Taiwan.

    PubMed

    Lee, Jin-Jing; Jang, Cheng-Shin; Liang, Ching-Ping; Liu, Chen-Wuing

    2008-09-15

    This study spatially analyzed potential carcinogenic risks associated with ingesting arsenic (As) in aquacultural smeltfish (Plecoglossus altivelis) from the Lanyang Plain of northeastern Taiwan. Sequential indicator simulation (SIS) was adopted to reproduce As exposure distributions in groundwater based on their three-dimensional variability. A target cancer risk (TR) associated with ingesting As in aquacultural smeltfish was employed to evaluate the potential risk to human health. Probabilistic risk assessment, determined by Monte Carlo simulation and SIS, is used to properly propagate parameter uncertainty. Safe and hazardous aquacultural regions were mapped to elucidate the safety of groundwater use. The TRs determined from the risks at the 95th percentile exceed one in a million (10⁻⁶), indicating that ingesting smeltfish farmed in the highly As-affected regions represents a potential cancer threat to human health. The 95th percentile of TRs is considered in formulating a strategy for the aquacultural use of groundwater in the preliminary stage.
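
    A rough sketch of the Monte Carlo step: propagating parameter uncertainty through a generic, EPA-style target-cancer-risk calculation for fish ingestion and reading off the 95th percentile. The lifetime-average-daily-dose form is standard, but every distribution and parameter value below is an illustrative assumption, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative input distributions (NOT the study's values):
C   = rng.lognormal(np.log(0.5), 0.6, n)        # As in fish tissue, mg/kg
IR  = rng.normal(30.0, 10.0, n).clip(min=1.0)   # fish intake rate, g/day
EF, ED = 365, 30                                # exposure freq (d/yr), duration (yr)
BW  = rng.normal(60.0, 10.0, n).clip(min=30.0)  # body weight, kg
AT  = 70 * 365                                  # averaging time, days
CSF = 1.5                                       # oral slope factor, (mg/kg/day)^-1

# Lifetime average daily dose times slope factor gives the target cancer risk.
LADD = C * (IR / 1000.0) * EF * ED / (BW * AT)  # mg/kg/day
TR = LADD * CSF

print(f"95th percentile TR: {np.percentile(TR, 95):.2e}")
print(f"fraction exceeding 1e-6: {(TR > 1e-6).mean():.2%}")
```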

  1. Dating previously balanced rocks in seismically active parts of California and Nevada

    USGS Publications Warehouse

    Bell, J.W.; Brune, J.N.; Liu, T.; Zreda, M.; Yount, J.C.

    1998-01-01

    Precariously balanced boulders that could be knocked down by strong earthquake ground motion are found in some seismically active areas of southern California and Nevada. In this study we used two independent surface-exposure dating techniques - rock-varnish microlamination and cosmogenic 36Cl dating methodologies - to estimate minimum- and maximum-limiting ages, respectively, of the precarious boulders and by inference the elapsed time since the sites were shaken down. The results of the exposure dating indicate that all of the precarious rocks are >10.5 ka and that some may be significantly older. At Victorville and Jacumba, California, these results show that the precarious rocks have not been knocked down for at least 10.5 k.y., a conclusion in apparent conflict with some commonly used probabilistic seismic hazard maps. At Yucca Mountain, Nevada, the ages of the precarious rocks are >10.5 to >27.0 ka, providing an independent measure of the minimum time elapsed since faulting occurred on the Solitario Canyon fault.

  2. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
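
    The last step, combining rupture probabilities with the intensity measures extracted from the synthetics, reduces to straightforward bookkeeping. A sketch with synthetic stand-ins (the rates, intensity samples, and Poisson conversion are illustrative, not CyberShake outputs):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins: a few ruptures, each with an annual rate and peak intensity
# measures (here PGV, m/s) extracted from its rupture variations.
rupture_rates = np.array([1e-3, 5e-4, 2e-4])
ims_per_rupture = [rng.lognormal(np.log(0.1 * (i + 1)), 0.5, 200)
                   for i in range(3)]

im_levels = np.logspace(-2, 0.5, 50)
# Annual exceedance rate: sum over ruptures of rate * P(IM > level | rupture),
# the conditional probability being estimated from the rupture variations.
rate = sum(r * (ims[None, :] > im_levels[:, None]).mean(axis=1)
           for r, ims in zip(rupture_rates, ims_per_rupture))

poe_50yr = 1.0 - np.exp(-rate * 50)   # Poissonian 50-year exceedance probability
print(poe_50yr[:5])
```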

  3. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process. © 2010 Springer Basel AG.

  4. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive, since they require many tsunami numerical simulations, which limits their versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the response variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wooden building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c, with a = 0.2615, b = 3.1763 and c = -1.1802. We assumed appropriate probability distributions for earthquake generation, inundation height, and vulnerability, and based on these we conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wooden building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting multiple tsunami numerical simulations.
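
    Because the abstract states the response surface and the simulation recipe explicitly, the Monte Carlo step can be sketched directly. Only the coefficients a, b, c below come from the abstract; the occurrence rate, the explanatory-variable distributions, and the fragility parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)
years = 1_000_000
rate = 0.005                                     # assumed annual event rate

# Response surface from the abstract: y = a*x1 + b*x2 + c,
# with x1 = fault depth and x2 = fault slip.
a, b, c = 0.2615, 3.1763, -1.1802

n_events = rng.poisson(rate * years)             # Poisson occurrence over the run
x1 = rng.uniform(5.0, 25.0, n_events)            # fault depth, assumed range
x2 = rng.lognormal(np.log(2.0), 0.5, n_events)   # slip (m), assumed distribution
depth = np.clip(a * x1 + b * x2 + c, 0.0, None)  # inundation depth (m)

# Log-normal fragility of the wooden building (median/beta are illustrative).
p_damage = lognorm(s=0.5, scale=2.0).cdf(depth)
print(f"expected damage probability given an event: {p_damage.mean():.1%}")
```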

  5. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. Beyond design itself, for many applications it is of interest to assess the effectiveness of the design measures taken, which requires a method that supports a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information that is necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA consist in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to the evaluation of the risk of production-interruption losses of a nuclear power plant during its residual lifetime.
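
    A minimal sketch of the bookkeeping implied by steps (1)-(4): per-class initiating-event frequencies multiplied by conditional exceedance probabilities, summed into an annual rate and converted to a probability over the structure's residual lifetime. All numbers are illustrative.

```python
import numpy as np

# Magnitude classes with annual initiating-event frequencies (step 2) and the
# conditional probability that the class's bounding scenario exceeds a
# critical design parameter (step 4). Values are illustrative.
classes = {
    "M5.0-5.9": {"freq": 1e-2, "p_exceed": 0.001},
    "M6.0-6.9": {"freq": 2e-3, "p_exceed": 0.05},
    "M7.0+":    {"freq": 3e-4, "p_exceed": 0.40},
}

annual_rate = sum(c["freq"] * c["p_exceed"] for c in classes.values())
lifetime = 20   # residual lifetime in years, used as the risk-goal horizon
p_lifetime = 1.0 - np.exp(-annual_rate * lifetime)
print(f"annual exceedance rate: {annual_rate:.2e}")
print(f"P(exceedance within {lifetime} yr): {p_lifetime:.2%}")
```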

  6. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current practice is based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in safety factors, which are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better-informed risk model that is suitable for risk-informed decision making.

  7. Mapping of hazard from rainfall-triggered landslides in developing countries: Examples from Honduras and Micronesia

    USGS Publications Warehouse

    Harp, E.L.; Reid, M.E.; McKenna, J.P.; Michael, J.A.

    2009-01-01

    Loss of life and property caused by landslides triggered by extreme rainfall events demonstrates the need for landslide-hazard assessment in developing countries where recovery from such events often exceeds the country's resources. Mapping landslide hazards in developing countries where the need for landslide-hazard mitigation is great but the resources are few is a challenging, but not intractable problem. The minimum requirements for constructing a physically based landslide-hazard map from a landslide-triggering storm, using the simple methods we discuss, are: (1) an accurate mapped landslide inventory, (2) a slope map derived from a digital elevation model (DEM) or topographic map, and (3) material strength properties of the slopes involved. Provided that the landslide distribution from a triggering event can be documented and mapped, it is often possible to glean enough topographic and geologic information from existing databases to produce a reliable map that depicts landslide hazards from an extreme event. Most areas of the world have enough topographic information to provide digital elevation models from which to construct slope maps. In the likely event that engineering properties of slope materials are not available, reasonable estimates can be made with detailed field examination by engineering geologists or geotechnical engineers. Resulting landslide hazard maps can be used as tools to guide relocation and redevelopment, or, more likely, temporary relocation efforts during severe storm events such as hurricanes/typhoons to minimize loss of life and property. We illustrate these methods in two case studies of lethal landslides in developing countries: Tegucigalpa, Honduras (during Hurricane Mitch in 1998) and the Chuuk Islands, Micronesia (during Typhoon Chata'an in 2002).
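
    A common physically based model that uses exactly the inputs listed above (a slope map and material strength estimates, calibrated against the landslide inventory) is the infinite-slope factor of safety. A hedged sketch with typical textbook parameter values, not values from either case study:

```python
import numpy as np

def factor_of_safety(slope_deg, c=5e3, phi_deg=30.0, gamma=19e3,
                     z=2.0, m=0.5, gamma_w=9.81e3):
    """Infinite-slope factor of safety:
    FS = [c + (gamma - m*gamma_w) * z * cos^2(theta) * tan(phi)]
         / [gamma * z * sin(theta) * cos(theta)]
    c: cohesion (Pa), phi: friction angle, gamma: unit weight (N/m^3),
    z: failure-plane depth (m), m: saturated fraction of z."""
    t, p = np.radians(slope_deg), np.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * np.cos(t) ** 2 * np.tan(p)
    driving = gamma * z * np.sin(t) * np.cos(t)
    return resisting / driving

slopes = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # from a DEM slope map
print(np.round(factor_of_safety(slopes), 2))       # FS < 1 flags instability
```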

  8. The U.S. Geological Survey's Earthquake Summary Posters: A GIS-based Education and Communication Product for Presenting Consolidated Post-Earthquake Information

    NASA Astrophysics Data System (ADS)

    Tarr, A.; Benz, H.; Earle, P.; Wald, D. J.

    2003-12-01

    Earthquake Summary Posters (ESPs), a new product of the U.S. Geological Survey's Earthquake Program, are produced at the National Earthquake Information Center (NEIC) in Golden. The posters consist of rapidly generated, GIS-based maps made following significant earthquakes worldwide (typically M>7.0, or events of significant media/public interest). ESPs consolidate, in an attractive map format, a large-scale epicentral map, several auxiliary regional overviews (showing tectonic and geographical setting, seismic history, seismic hazard, and earthquake effects), depth sections (as appropriate), a table of regional earthquakes, and a summary of the regional seismic history and tectonics. The immediate availability of the latter text summaries has been facilitated by the availability of Rapid, Accurate Tectonic Summaries (RATS) produced at NEIC and posted on the web following significant events. The rapid production of ESPs has been facilitated by generating, during the past two years, regional templates for tectonic areas around the world by organizing the necessary spatially referenced data for the map base and the thematic layers that overlay the base. These GIS databases enable scripted Arc Macro Language (AML) production of routine elements of the maps (for example, background seismicity, tectonic features, and probabilistic hazard maps). However, other elements of the maps are earthquake-specific and are produced manually to reflect new data, earthquake effects, and special characteristics. By the end of this year, approximately 85% of the Earth's seismic zones will be covered for generating future ESPs. During the past year, 13 posters were completed, comparable to the yearly average expected for significant earthquakes. Each year, all ESPs will be published on a CD in PDF format as an Open-File Report. In addition, each is linked to the special event earthquake pages on the USGS Earthquake Program web site (http://earthquake.usgs.gov). Although three formats are generated, the poster-size format is the most popular for display, outreach, and use as a working map for project scientists (JPEG format for web; PDF for download, editing, and printing), whereas the other (smaller) formats are suitable for briefing packages. We will soon make both GIS and PDF files of individual elements of the posters available online. ESPs provide an unprecedented opportunity for college earth-science faculty to take advantage of current events for timely lessons in global tectonics. They are also invaluable for communicating with the media and with government officials. ESPs will be used as a vehicle to present other products now under development under the auspices of NEIC and the ANSS, including rapid finite-fault models, global predictive ShakeMaps, "Did You Feel It?", and Rapid Assessments of Global Earthquakes (RAGE; Earle and others, this meeting).

  9. Chapter 8: US geological survey Circum-Arctic Resource Appraisal (CARA): Introduction and summary of organization and methods

    USGS Publications Warehouse

    Charpentier, R.R.; Gautier, D.L.

    2011-01-01

    The USGS has assessed undiscovered petroleum resources in the Arctic through geological mapping, basin analysis and quantitative assessment. The new map compilation provided the base from which geologists subdivided the Arctic for burial history modelling and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. The assessment relied heavily on analogue modelling, with numerical input as lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated taking geological dependencies into account. Fourteen papers in this Geological Society volume present summaries of various aspects of the CARA. © 2011 The Geological Society of London.
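
    A rough sketch of the kind of probabilistic aggregation described above: sample a lognormal number of undiscovered accumulations and lognormal sizes, then report fractiles of the total. The parameters are illustrative, and the sketch ignores the geological dependencies that the CARA aggregation handled explicitly.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 20_000

# One assessment unit: lognormal number and lognormal sizes of undiscovered
# accumulations (parameters are illustrative placeholders).
n_acc = rng.lognormal(np.log(10.0), 0.5, n_trials).round().astype(int)
totals = np.array([rng.lognormal(np.log(50.0), 1.0, k).sum() for k in n_acc])

# Resource-assessment fractiles: F95 (high confidence) down to F5 (upside).
for f in (95, 50, 5):
    print(f"F{f}: {np.percentile(totals, 100 - f):,.0f} (size units)")
```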

  10. Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos

    NASA Astrophysics Data System (ADS)

    Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo

    2002-12-01

    We consider nonequilibrium probabilistic dynamics in logistic-like maps x_{t+1} = 1 - a|x_t|^z (z > 1) at their chaos threshold: We first introduce many initial conditions within one among W >> 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 - Σ_{i=1}^{W} p_i^q)/(q - 1) increases linearly with time. We then verify that S_{q_sen}(t) - S_{q_sen}(∞) vanishes like t^{-1/[q_rel(W)-1]}, with q_rel(W) > 1. We finally exhibit a new finite-size scaling, q_rel(∞) - q_rel(W) ~ W^{-|q_sen|}. This establishes quantitatively, for the first time, a long-pursued relation between sensitivity to initial conditions and relaxation, concepts which play central roles in nonextensive statistical mechanics.
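
    The experiment is straightforward to reproduce numerically: spread many initial conditions inside one interval of the partition, iterate the map at its chaos threshold, and track the growth of S_q. The sketch below assumes z = 2, the standard threshold value a_c ≈ 1.4011552, and the literature value q_sen ≈ 0.2445; treat both constants as assumptions of this illustration.

```python
import numpy as np

a_c, z = 1.40115518909, 2.0   # chaos threshold of x_{t+1} = 1 - a*|x|^z for z = 2
W = 10_000                    # intervals partitioning the phase space [-1, 1]
q = 0.2445                    # q_sen reported in the literature for z = 2

def S_q(p, q):
    """Nonextensive entropic form S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Many initial conditions inside a single interval around x = 0.
x = np.random.default_rng(3).uniform(-1.0 / W, 1.0 / W, 100_000)

for t in range(31):
    counts, _ = np.histogram(x, bins=W, range=(-1.0, 1.0))
    if t % 5 == 0:
        print(f"t = {t:2d}   S_q = {S_q(counts / counts.sum(), q):10.2f}")
    x = 1.0 - a_c * np.abs(x) ** z   # one iteration of the map
```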

  11. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered by arguing that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform, and why, would be valuable in making more effective policy.
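
    Both metrics discussed above are simple to compute once observed maxima and mapped predictions are on a common set of sites. A sketch with synthetic shaking maxima (the 510-year Japanese record is not reproduced here):

```python
import numpy as np

def fraction_exceeded(observed, predicted):
    """Metric implicit in the hazard maps: the fraction of sites whose maximum
    observed shaking exceeded the mapped value (target: the design fraction,
    e.g. 0.10 for a 10%-in-T-years map)."""
    return np.mean(observed > predicted)

def squared_misfit(observed, predicted):
    """Alternative metric: mean squared misfit between maximum observed
    shaking and the mapped prediction."""
    return np.mean((observed - predicted) ** 2)

rng = np.random.default_rng(11)
obs = rng.lognormal(np.log(0.2), 0.8, 500)   # synthetic observed maxima (g)
uniform_map = np.full(500, 0.5)              # a 'uniform map' for comparison
print(fraction_exceeded(obs, uniform_map), squared_misfit(obs, uniform_map))
```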

  12. Uncertainties in evaluation of hazard and seismic risk

    NASA Astrophysics Data System (ADS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela

    2015-04-01

    Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists and engineers. What is wrong with traditional PSHA or DSHA? PSHA as commonly used in engineering rests on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) variability of ground motion at a site that is independent; and (4) Poisson (or "memoryless") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). The DSHA method was used for the original design of Fukushima Daiichi, but the Japanese authorities moved to probabilistic assessment methods, and the probability of exceeding the design basis acceleration was expected to be 10⁻⁴-10⁻⁶. It was exceeded, and this was a violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J.-U., EGU, 2014). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models: point source and Poisson distribution; invalid mathematics; misinterpretation of annual probability of exceedance or return period, etc.), and it has become a pure numerical "creation" (Wang, PAGEOPH, 168 (2011), 11-25). A key component of seismic hazard assessment, for both PSHA and DSHA, is the ground motion attenuation relationship, or so-called ground motion prediction equation (GMPE), which describes a relationship between a ground motion parameter (i.e., PGA, MMI, etc.), earthquake magnitude M, source-to-site distance R, and an uncertainty term. So far, no one takes into consideration the strong nonlinear behavior of soils during strong earthquakes. How many cities, villages and metropolitan areas in seismic regions are constructed on rock? Most of them are located on soil deposits, of basic type sand or gravel (termed coarse soils) or silt or clay (termed fine soils). The effect of nonlinearity is very large. For example, if we maintain the same spectral amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 3, 1990 (MW = 6.4), then at the Bacău seismic station the peak acceleration for the Vrancea earthquake of May 30, 1990 (MW = 6.9) should have been a*max = 0.154 g, while the actual recorded value was only amax = 0.135 g (-14.16%). Likewise, for the Vrancea earthquake of August 30, 1986 (MW = 7.1), the peak acceleration should have been a*max = 0.107 g instead of the recorded value of 0.0736 g (-45.57%). There are data of this kind for more than 60 seismic stations, and there is a strong nonlinear dependence of the SAF on earthquake magnitude at each site. The authors propose an alternative approach called "real spectral amplification factors" instead of GMPEs for the whole extra-Carpathian area, where all cities and villages are located on soil deposits. Key words: Probabilistic Seismic Hazard; Uncertainties; Nonlinear seismology; Spectral amplification factors (SAF).

  13. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
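
    At the core of the evidential-reasoning approach is Dempster's rule for combining mass functions derived from different evidence layers. A self-contained sketch over a two-element frame {hazard, safe}, with illustrative masses for two of the driving factors named above:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions keyed by frozenset
    focal elements. Mass on the whole frame expresses ignorance, which is
    what lets Dempster-Shafer theory handle incomplete evidence."""
    combined, conflict = {}, 0.0
    for A, w1 in m1.items():
        for B, w2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    return {A: w / (1.0 - conflict) for A, w in combined.items()}

H, S, HS = frozenset("H"), frozenset("S"), frozenset("HS")
precip  = {H: 0.6, S: 0.1, HS: 0.3}   # evidence from the precipitation layer
terrain = {H: 0.4, S: 0.3, HS: 0.3}   # evidence from the terrain layer
print(dempster_combine(precip, terrain))
```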

  14. Modelling Multi Hazard Mapping in Semarang City Using GIS-Fuzzy Method

    NASA Astrophysics Data System (ADS)

    Nugraha, A. L.; Awaluddin, M.; Sasmito, B.

    2018-02-01

    One important aspect of disaster mitigation planning is hazard mapping. Hazard mapping can provide spatial information on the distribution of locations that are threatened by disaster. Semarang City, the capital of Central Java Province, is one of the cities with high natural disaster intensity. Natural disasters that frequently affect Semarang City are tidal floods, riverine floods, landslides, and droughts. Therefore, Semarang City needs spatial information, produced by multi-hazard mapping, to support disaster mitigation planning. Multi-hazard maps can be modelled from parameters such as slope, rainfall, land use, and soil type. This modelling is done using a GIS method with scoring and overlay techniques. However, the accuracy of the modelling is better if the GIS method is combined with fuzzy logic techniques, which provide a good classification in determining disaster threats. The GIS-Fuzzy method builds a multi-hazard map of Semarang City that delivers results with good accuracy and an appropriate spread of threat classes, providing disaster information for the mitigation planning of the city. The multi-hazard modelling using GIS-Fuzzy shows that the membership function with the best accuracy is the Gaussian type, with the smallest RMSE (0.404) and the largest VAF value (72.909%) among the membership functions tested.
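
    A minimal sketch of the GIS-Fuzzy idea: score each factor layer with a Gaussian membership function and combine the layers by weighted overlay. The centres, spreads, and weights are hypothetical, not the study's calibrated values.

```python
import numpy as np

def gauss_membership(x, c, sigma):
    """Gaussian membership: degree to which x belongs to the 'hazardous'
    fuzzy set centred at c with spread sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

# Illustrative per-cell factor values (hypothetical):
slope    = np.array([5.0, 15.0, 30.0])      # degrees
rainfall = np.array([80.0, 160.0, 240.0])   # mm/month

mu_slope = gauss_membership(slope, c=35.0, sigma=12.0)
mu_rain  = gauss_membership(rainfall, c=250.0, sigma=80.0)

# Weighted fuzzy overlay of the factor layers into one hazard score per cell.
weights = {"slope": 0.6, "rain": 0.4}
hazard = weights["slope"] * mu_slope + weights["rain"] * mu_rain
print(np.round(hazard, 3))   # higher score = greater multi-hazard threat
```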

  15. A Model for Generating Multi-hazard Scenarios

    NASA Astrophysics Data System (ADS)

    Lo Jacomo, A.; Han, D.; Champneys, A.

    2017-12-01

    Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own rate of onset, duration, and return period, and multiple hazards tend to complicate the combined risk through their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in Sichuan province, China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool that takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing disaster risk within an uncertain hazard context.

  16. Publications - DDS 10 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    Interactive map products from the Alaska Division of Geological & Geophysical Surveys, including Alaska tsunami inundation maps, coastal hazards information, and a guide to geologic hazards in Alaska.

  17. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all-hazards/all-modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome of this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D and conclusions about the PRISM design.

  18. A wavelet-based estimator of the degrees of freedom in denoised fMRI time series for probabilistic testing of functional connectivity and brain graphs.

    PubMed

    Patel, Ameera X; Bullmore, Edward T

    2016-11-15

    Connectome mapping using techniques such as functional magnetic resonance imaging (fMRI) has become a focus of systems neuroscience. There remain many statistical challenges in analysis of functional connectivity and network architecture from BOLD fMRI multivariate time series. One key statistic for any time series is its (effective) degrees of freedom, df, which will generally be less than the number of time points (or nominal degrees of freedom, N). If we know the df, then probabilistic inference on other fMRI statistics, such as the correlation between two voxel or regional time series, is feasible. However, we currently lack good estimators of df in fMRI time series, especially after the degrees of freedom of the "raw" data have been modified substantially by denoising algorithms for head movement. Here, we used a wavelet-based method both to denoise fMRI data and to estimate the (effective) df of the denoised process. We show that seed voxel correlations corrected for locally variable df could be tested for false positive connectivity with better control over Type I error and greater specificity of anatomical mapping than probabilistic connectivity maps using the nominal degrees of freedom. We also show that wavelet despiked statistics can be used to estimate all pairwise correlations between a set of regional nodes, assign a P value to each edge, and then iteratively add edges to the graph in order of increasing P. These probabilistically thresholded graphs are likely more robust to regional variation in head movement effects than comparable graphs constructed by thresholding correlations. Finally, we show that time-windowed estimates of df can be used for probabilistic connectivity testing or dynamic network analysis so that apparent changes in the functional connectome are appropriately corrected for the effects of transient noise bursts. Wavelet despiking is both an algorithm for fMRI time series denoising and an estimator of the (effective) df of denoised fMRI time series. Accurate estimation of df offers many potential advantages for probabilistically thresholding functional connectivity and network statistics tested in the context of spatially variant and non-stationary noise. Code for wavelet despiking, seed correlational testing and probabilistic graph construction is freely available to download as part of the BrainWavelet Toolbox at www.brainwavelet.org. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
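
    The practical payoff of an effective-df estimate is that correlation tests can use it in place of the nominal N. A sketch of that correction using the standard t-statistic for a Pearson correlation; the effective df of 80 stands in for a wavelet-based estimate and is purely illustrative.

```python
import numpy as np
from scipy import stats

def corr_pvalue(x, y, df_eff):
    """Two-sided p-value for a Pearson correlation, using an effective
    degrees of freedom in place of the nominal N - 2."""
    r = np.corrcoef(x, y)[0, 1]
    t = r * np.sqrt(df_eff - 2) / np.sqrt(1.0 - r ** 2)
    return r, 2.0 * stats.t.sf(abs(t), df_eff - 2)

rng = np.random.default_rng(5)
N = 200                                          # time points (nominal df)
x, y = rng.normal(size=N), rng.normal(size=N)    # two regional time series

r, p_nom = corr_pvalue(x, y, df_eff=N)    # nominal df: too liberal after denoising
r, p_eff = corr_pvalue(x, y, df_eff=80)   # effective df (illustrative estimate)
print(f"r = {r:.3f}   p(nominal) = {p_nom:.3f}   p(effective) = {p_eff:.3f}")
```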

  19. The Psychology of Hazard Risk Perception

    NASA Astrophysics Data System (ADS)

    Thompson, K. F.

    2012-12-01

    A critical step in preparing for natural hazards is understanding the risk: what is the hazard, its likelihood and range of impacts, and what are the vulnerabilities of the community? Any hazard forecast naturally includes a degree of uncertainty, and often these uncertainties are expressed in terms of probabilities. There is often a strong understanding of probability among the physical scientists and emergency managers who create hazard forecasts and issue watches, warnings, and evacuation orders, and often such experts expect similar levels of risk fluency among the general public—indeed, the Working Group on California Earthquake Probabilities (WGCEP) states in the introduction to its earthquake rupture forecast maps that "In daily living, people are used to making decisions based on probabilities—from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [1] However, cognitive psychologists have shown in numerous studies [see, e.g., 2-5] that the WGCEP's expectation of probability literacy is inaccurate. People neglect, distort, misjudge, or misuse probability information, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [6]. Even the most ubiquitous of probabilistic information—weather forecasts—are systematically misinterpreted [7]. So while disaster risk analysis and assessment is undoubtedly a critical step in public preparedness and hazard mitigation plans, it is equally important that scientists and practitioners understand the common psychological barriers to accurate probability perception before they attempt to communicate hazard risks to the public. This paper discusses several common, systematic distortions in probability perception and use, including: the influence of personal experience on use of statistical information; temporal discounting and construal level theory; the effect of instrumentality on risk perception; and the impact of "false alarms" or "near misses." We conclude with practical recommendations for ways that risk communications may best be presented to avoid (or, in some cases, to capitalize on) these typical psychological hurdles to the understanding of risk.
    References:
    1. http://www.scec.org/ucerf/
    2. Kahneman, D. & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, XLVII: 263-291.
    3. Hau, R., Pleskac, T. J., Kiefer, J., & Hertwig, R. (2008). The Description/Experience Gap in Risky Choice: The Role of Sample Size and Experienced Probabilities. Journal of Behavioral Decision Making, 21: 493-518.
    4. Lichtenstein, S., Slovic, P., Fischhoff, B., Layman, M., & Combs, B. (1978). Judged frequency of lethal events. JEP: Human Learning and Memory, 4: 551-578.
    5. Hertwig, R., Barron, G., Weber, E. U., & Erev, I. (2006). The role of information sampling in risky choice. In K. Fiedler & P. Juslin (Eds.), Information sampling and adaptive cognition (pp. 75-91). New York: Cambridge University Press.
    6. Budescu, D. V., Weinberg, S., & Wallsten, T. S. (1987). Decisions based on numerically and verbally expressed uncertainties. JEP: Human Perception and Performance, 14(2): 281-294.
    7. Gigerenzer, G., Hertwig, R., Van Den Broek, E., Fasolo, B., & Katsikopoulos, K. V. (2005). "A 30% chance of rain tomorrow": How does the public understand probabilistic weather forecasts? Risk Analysis, 25(3): 623-629.

  20. Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria

    NASA Astrophysics Data System (ADS)

    Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang

    2016-04-01

    Target-orientated prevention and effective crisis management can reduce or avoid damage and save lives in the case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. This presentation gives more information on the project and its status.

  1. Clinical predictors of conversion to bipolar disorder in a prospective longitudinal familial high-risk sample: focus on depressive features.

    PubMed

    Frankland, Andrew; Roberts, Gloria; Holmes-Preston, Ellen; Perich, Tania; Levy, Florence; Lenroot, Rhoshel; Hadzi-Pavlovic, Dusan; Breakspear, Michael; Mitchell, Philip B

    2017-11-07

    Identifying clinical features that predict conversion to bipolar disorder (BD) in those at high familial risk (HR) would assist in identifying a more focused population for early intervention. In total 287 participants aged 12-30 (163 HR with a first-degree relative with BD and 124 controls (CONs)) were followed annually for a median of 5 years. We used the baseline presence of DSM-IV depressive, anxiety, behavioural and substance use disorders, as well as a constellation of specific depressive symptoms (as identified by the Probabilistic Approach to Bipolar Depression) to predict the subsequent development of hypo/manic episodes. At baseline, HR participants were significantly more likely to report ⩾4 Probabilistic features (40.4%) when depressed than CONs (6.7%; p < .05). Nineteen HR subjects later developed either threshold (n = 8; 4.9%) or subthreshold (n = 11; 6.7%) hypo/mania. The presence of ⩾4 Probabilistic features was associated with a seven-fold increase in the risk of 'conversion' to threshold BD (hazard ratio = 6.9, p < .05) above and beyond the fourteen-fold increase in risk related to major depressive episodes (MDEs) per se (hazard ratio = 13.9, p < .05). Individual depressive features predicting conversion were psychomotor retardation and ⩾5 MDEs. Behavioural disorders only predicted conversion to subthreshold BD (hazard ratio = 5.23, p < .01), while anxiety and substance disorders did not predict either threshold or subthreshold hypo/mania. This study suggests that specific depressive characteristics substantially increase the risk of young people at familial risk of BD going on to develop future hypo/manic episodes and may identify a more targeted HR population for the development of early intervention programs.

  2. Physical limits on ground motion at Yucca Mountain

    USGS Publications Warehouse

    Andrews, D.J.; Hanks, T.C.; Whitney, J.W.

    2007-01-01

    Physical limits on possible maximum ground motion at Yucca Mountain, Nevada, the designated site of a high-level radioactive waste repository, are set by the shear stress available in the seismogenic depth of the crust and by limits on stress change that can propagate through the medium. We find in dynamic deterministic 2D calculations that maximum possible horizontal peak ground velocity (PGV) at the underground repository site is 3.6 m/sec, which is smaller than the mean PGV predicted by the probabilistic seismic hazard analysis (PSHA) at annual exceedance probabilities less than 10⁻⁶ per year. The physical limit on vertical PGV, 5.7 m/sec, arises from supershear rupture and is larger than that from the PSHA down to 10⁻⁸ per year. In addition to these physical limits, we also calculate the maximum ground motion subject to the constraint of known fault slip at the surface, as inferred from paleoseismic studies. Using a published probabilistic fault displacement hazard curve, these calculations provide a probabilistic hazard curve for horizontal PGV that is lower than that from the PSHA. In all cases the maximum ground motion at the repository site is found by maximizing constructive interference of signals from the rupture front, for physically realizable rupture velocity, from all parts of the fault. Vertical PGV is maximized for ruptures propagating near the P-wave speed, and horizontal PGV is maximized for ruptures propagating near the Rayleigh-wave speed. Yielding in shear with a Mohr-Coulomb yield condition reduces ground motion only a modest amount in events with supershear rupture velocity, because ground motion consists primarily of P waves in that case. The possibility of compaction of the porous unsaturated tuffs at the higher ground-motion levels is another attenuating mechanism that needs to be investigated.

  3. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmitri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal, based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all steps of the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is presented here through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252
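
    BET_VHst itself is not reproduced here, but the Bayesian-inference flavour of short-term PVHA can be sketched with a conjugate Beta-Binomial update of a daily eruption probability from monitoring counts. All hyperparameters and counts below are illustrative assumptions.

```python
from scipy import stats

# Prior on the daily eruption probability given unrest, as a Beta distribution
# (hyperparameters are illustrative, not BET_VHst values): prior mean 0.01.
a, b = 1.0, 99.0

# Monitoring-based evidence: days of unrest observed, of which some ended in
# eruption. Conjugacy makes the posterior another Beta distribution.
unrest_days, eruptions = 30, 1
posterior = stats.beta(a + eruptions, b + unrest_days - eruptions)

print(f"posterior mean daily eruption probability: {posterior.mean():.4f}")
print(f"90% credible interval: {posterior.interval(0.90)}")
```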

  4. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before and during eruptions: BET_VHst for Mt. Etna

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Scollo, Simona; Costa, Antonio; Brancato, Alfonso; Prestifilippo, Michele

    2015-04-01

    Tephra dispersal, even in small amounts, may heavily affect public health and critical infrastructures, such as airports, train and road networks, and electric power supply systems. Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at managing and mitigating the risk posed by volcanic activity during crises and eruptions. Short-term PVHA (over time intervals on the order of hours to a few days) must account for rapidly changing information coming from the monitoring system, as well as updated wind forecasts, and it must be accomplished in near-real-time. In addition, while during unrest the primary goal is to forecast potential eruptions, during eruptions it is also fundamental to correctly account for the real-time status of the eruption and of tephra dispersal, as well as its potential evolution in the short term. Here, we present a preliminary application of the BET_VHst model (Selva et al. 2014) for Mt. Etna. The model has its roots in present-day deterministic procedures, and it deals with the large uncertainty that such procedures typically ignore, such as uncertainty in the potential position of the vent and the eruptive size, in the possible evolution of volcanological input during ongoing eruptions, and in the wind field. Uncertainty is treated by making use of Bayesian inference, alternative modeling procedures for tephra dispersal, and statistical mixing of long- and short-term analyses. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  5. Seismic hazard maps of Mexico, the Caribbean, and Central and South America

    USGS Publications Warehouse

    Tanner, J.G.; Shedlock, K.M.

    2004-01-01

    The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures due to an insufficient knowledge of existing seismic hazard and/or economic constraints. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. We have produced a suite of seismic hazard estimates for Mexico, the Caribbean, and Central and South America. One of the preliminary maps in this suite served as the basis for the Caribbean and Central and South America portion of the Global Seismic Hazard Map (GSHM) published in 1999, which depicted peak ground acceleration (pga) with a 10% chance of exceedance in 50 years for rock sites. Herein we present maps depicting pga and 0.2 and 1.0 s spectral accelerations (SA) with 50%, 10%, and 2% chances of exceedance in 50 years for rock sites. The seismicity catalog used in the generation of these maps adds 3 more years of data to those used to calculate the GSH Map. Different attenuation functions (consistent with those used to calculate the U.S. and Canadian maps) were used as well. These nine maps are designed to assist in global risk mitigation by providing a general seismic hazard framework and serving as a resource for any national or regional agency to help focus further detailed studies required for regional/local needs. The largest seismic hazard values in Mexico, the Caribbean, and Central and South America generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes. High hazard values occur in areas where shallow-to-intermediate seismicity occurs frequently. © 2004 Elsevier B.V. All rights reserved.
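
    The map definitions used above ("p% chance of exceedance in 50 years") convert to annual exceedance rates and return periods under the usual Poisson assumption, as this short sketch shows:

```python
import numpy as np

def annual_rate(p_exceed, t_years):
    """Annual exceedance rate implied by 'p chance of exceedance in T years'
    under a Poisson model: lambda = -ln(1 - p) / T."""
    return -np.log(1.0 - p_exceed) / t_years

for p in (0.50, 0.10, 0.02):
    lam = annual_rate(p, 50)
    print(f"{p:.0%} in 50 yr -> rate {lam:.5f}/yr, return period {1/lam:,.0f} yr")
```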

  6. Solving graph data issues using a layered architecture approach with applications to web spam detection.

    PubMed

    Scarselli, Franco; Tsoi, Ah Chung; Hagenbuchner, Markus; Noi, Lucia Di

    2013-12-01

    This paper proposes the combination of two state-of-the-art algorithms for processing graph input data, viz., the probabilistic mapping graph self organizing map, an unsupervised learning approach, and the graph neural network, a supervised learning approach. We organize these two algorithms in a cascade architecture containing a probabilistic mapping graph self organizing map, and a graph neural network. We show that this combined approach helps us to limit the long-term dependency problem that exists when training the graph neural network resulting in an overall improvement in performance. This is demonstrated in an application to a benchmark problem requiring the detection of spam in a relatively large set of web sites. It is found that the proposed method produces results which reach the state of the art when compared with some of the best results obtained by others using quite different approaches. A particular strength of our method is its applicability towards any input domain which can be represented as a graph. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Probabilistic self-localisation on a qualitative map based on occlusions

    NASA Astrophysics Data System (ADS)

    Santos, Paulo E.; Martins, Murilo F.; Fenelon, Valquiria; Cozman, Fabio G.; Dee, Hannah M.

    2016-09-01

    Spatial knowledge plays an essential role in human reasoning, permitting tasks such as locating objects in the world (including oneself), reasoning about everyday actions and describing perceptual information. This is also the case in the field of mobile robotics, where one of the most basic (and essential) tasks is the autonomous determination of the pose of a robot with respect to a map, given its perception of the environment. This is the problem of robot self-localisation (or simply the localisation problem). This paper presents a probabilistic algorithm for robot self-localisation that is based on a topological map constructed from the observation of spatial occlusion. Distinct locations on the map are defined by means of a classical formalism for qualitative spatial reasoning, whose base definitions are closer to the human categorisation of space than traditional, numerical, localisation procedures. The approach proposed herein was systematically evaluated through experiments using a mobile robot equipped with an RGB-D sensor. The results obtained show that the localisation algorithm is successful in locating the robot in qualitatively distinct regions.

  8. Learning Probabilistic Features for Robotic Navigation Using Laser Sensors

    PubMed Central

    Aznar, Fidel; Pujol, Francisco A.; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

    SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log N) to O(N²), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used. PMID:25415377

  9. Learning probabilistic features for robotic navigation using laser sensors.

    PubMed

    Aznar, Fidel; Pujol, Francisco A; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

    SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log N) to O(N²), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used.
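
    A toy illustration of why per-feature Bayesian fusion costs O(N) is sketched below; this is a generic binary Bayes filter on hypothetical laser returns, not the paper's sensor model:

      # Sketch: independent Bayesian update of each map feature, O(N) overall.
      # Generic binary Bayes filter; not the paper's sensor-fusion model.
      def bayes_update(prior, p_hit_occ, p_hit_free, hit):
          like_occ = p_hit_occ if hit else 1.0 - p_hit_occ
          like_free = p_hit_free if hit else 1.0 - p_hit_free
          num = like_occ * prior
          return num / (num + like_free * (1.0 - prior))

      beliefs = [0.5] * 1000                     # N features, uninformative prior
      hits = [i % 3 == 0 for i in range(1000)]   # hypothetical laser returns
      beliefs = [bayes_update(b, 0.7, 0.2, h) for b, h in zip(beliefs, hits)]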

  10. How to Display Hazards and other Scientific Data Using Google Maps

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Fee, J. M.

    2007-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) is launching a map-based interface to display hazards information using the Google® Maps API (Application Program Interface). Map-based interfaces provide a synoptic view of data, making patterns easier to detect and allowing users to quickly ascertain where hazards are in relation to major population and infrastructure centers. Several map-based interfaces are now simple to run on a web server, providing ideal platforms for sharing information with colleagues, emergency managers, and the public. There are three main steps to making data accessible on a map-based interface: formatting the input data, plotting the data on the map, and customizing the user interface. The presentation, "Creating Geospatial RSS and ATOM feeds for Map-based Interfaces" (Fee and Venezky, this session), reviews key features for map input data. Join us for this presentation on how to plot data in a geographic context and then format the display with images, custom markers, and links to external data. Examples will show how the VHP Volcano Status Map was created and how to plot a field trip with driving directions.

  11. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  12. Estimate Tsunami Flow Conditions and Large-Debris Tracks for the Design of Coastal Infrastructures along Coastlines of the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.

    2017-12-01

    The increasing potential tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR, using high-resolution grids, and computes tsunami inundation depths and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the CSZ. Three Cascadia scenarios, two deterministic (XXL1 and L1) and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard, are simulated using a combination of a depth-averaged shallow-water model for offshore propagation and a Boussinesq-type model for onshore inundation. We describe the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of the model results, particularly the inundation depth and flow speed for a new building at Newport, Oregon, which will also be designated as a tsunami vertical evacuation shelter. We show that the ASCE 7-16 consistent hazards lie between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we utilize the inundation model results to numerically compute tracks of large vessels in the vicinity of the building site and estimate whether these vessels would impact the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios. A two-step study is carried out, first tracking massless particles and then large vessels with assigned mass, considering drag force, inertial force, ship grounding, and mooring. The simulation results show that none of the large vessels impact the building site in any of the tested scenarios.

  13. Application of an adaptive neuro-fuzzy inference system to ground subsidence hazard mapping

    NASA Astrophysics Data System (ADS)

    Park, Inhye; Choi, Jaewon; Jin Lee, Moung; Lee, Saro

    2012-11-01

    We constructed hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok City, Korea, using an adaptive neuro-fuzzy inference system (ANFIS) and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, and ground subsidence maps. An attribute database was also constructed from field investigations and reports on existing ground subsidence areas at the study site. Five major factors causing ground subsidence were extracted: (1) depth of drift; (2) distance from drift; (3) slope gradient; (4) geology; and (5) land use. The adaptive ANFIS model with different types of membership functions (MFs) was then applied for ground subsidence hazard mapping in the study area. Two ground subsidence hazard maps were prepared using the different MFs. Finally, the resulting ground subsidence hazard maps were validated using the ground subsidence test data which were not used for training the ANFIS. The validation results showed 95.12% accuracy using the generalized bell-shaped MF model and 94.94% accuracy using the Sigmoidal2 MF model. These accuracy results show that an ANFIS can be an effective tool in ground subsidence hazard mapping. Analysis of ground subsidence with the ANFIS model suggests that quantitative analysis of ground subsidence near AUCMs is possible.
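
    As a concrete illustration of the membership functions mentioned above, the generalized bell-shaped MF used in ANFIS can be written in a few lines (the parameter values here are hypothetical, not those of the trained model):

      # Generalized bell-shaped membership function, as used in ANFIS.
      # Parameters: a (width), b (slope), c (center); values are hypothetical.
      def gbell_mf(x, a, b, c):
          return 1.0 / (1.0 + abs((x - c) / a) ** (2.0 * b))

      # Example: membership of a 35-degree slope gradient in a "steep" fuzzy
      # set centered at 45 degrees.
      print(gbell_mf(35.0, a=10.0, b=2.0, c=45.0))  # -> 0.5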

  14. Seismic hazard map of North and Central America and the Caribbean

    USGS Publications Warehouse

    Shedlock, K.M.

    1999-01-01

    Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local government, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of North and Central America and the Caribbean is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful regional seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of North and Central America and the Caribbean depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one-to-two story buildings). The highest seismic hazard values in the region generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes.

  15. Global link between deformation and volcanic eruption quantified by satellite imagery

    PubMed Central

    Biggs, J.; Ebmeier, S. K.; Aspinall, W. P.; Lu, Z.; Pritchard, M. E.; Sparks, R. S. J.; Mather, T. A.

    2014-01-01

    A key challenge for volcanological science and hazard management is that few of the world’s volcanoes are effectively monitored. Satellite imagery covers volcanoes globally throughout their eruptive cycles, independent of ground-based monitoring, providing a multidecadal archive suitable for probabilistic analysis linking deformation with eruption. Here we show that, of the 198 volcanoes systematically observed for the past 18 years, 54 deformed, of which 25 also erupted. For assessing eruption potential, this high proportion of deforming volcanoes that also erupted (46%), together with the proportion of non-deforming volcanoes that did not erupt (94%), jointly represent indicators with ‘strong’ evidential worth. Using a larger catalogue of 540 volcanoes observed for 3 years, we demonstrate how this eruption–deformation relationship is influenced by tectonic, petrological and volcanic factors. Satellite technology is rapidly evolving and routine monitoring of the deformation status of all volcanoes from space is anticipated, meaning probabilistic approaches will increasingly inform hazard decisions and strategic development. PMID:24699342

  16. Global link between deformation and volcanic eruption quantified by satellite imagery.

    PubMed

    Biggs, J; Ebmeier, S K; Aspinall, W P; Lu, Z; Pritchard, M E; Sparks, R S J; Mather, T A

    2014-04-03

    A key challenge for volcanological science and hazard management is that few of the world's volcanoes are effectively monitored. Satellite imagery covers volcanoes globally throughout their eruptive cycles, independent of ground-based monitoring, providing a multidecadal archive suitable for probabilistic analysis linking deformation with eruption. Here we show that, of the 198 volcanoes systematically observed for the past 18 years, 54 deformed, of which 25 also erupted. For assessing eruption potential, this high proportion of deforming volcanoes that also erupted (46%), together with the proportion of non-deforming volcanoes that did not erupt (94%), jointly represent indicators with 'strong' evidential worth. Using a larger catalogue of 540 volcanoes observed for 3 years, we demonstrate how this eruption-deformation relationship is influenced by tectonic, petrological and volcanic factors. Satellite technology is rapidly evolving and routine monitoring of the deformation status of all volcanoes from space is anticipated, meaning probabilistic approaches will increasingly inform hazard decisions and strategic development.
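
    To make the evidential worth of these proportions concrete, the conditional eruption rates implied by the 198-volcano figures can be recomputed directly (the ~9 eruptions among non-deforming volcanoes are inferred here from the stated 94%, so the exact counts are approximate):

      # Conditional eruption probabilities implied by the reported counts:
      # 198 volcanoes observed, 54 deformed, 25 of the deformers erupted,
      # and 94% of the 144 non-deformers did not erupt (~9 erupted, inferred).
      deformed, erupted_deformed = 54, 25
      non_deformed = 198 - deformed
      erupted_non_deformed = round(non_deformed * (1 - 0.94))       # ~9

      p_erupt_given_deform = erupted_deformed / deformed            # ~0.46
      p_erupt_given_quiet = erupted_non_deformed / non_deformed     # ~0.06
      print(p_erupt_given_deform / p_erupt_given_quiet)             # ~7.4x risk ratio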

  17. Method for the Preparation of Hazard Map in Urban Area Using Soil Depth and Groundwater Level

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Wook; Choi, Eun-Kyeong; Cho, Jin Woo; Lee, Ju-Hyoung

    2017-04-01

    Hazard maps for predicting collapse on natural slopes consist of a combination of topographic, hydrological, and geological factors. Topographic factors are extracted from a DEM and include aspect, slope, curvature, and topographic index. Hydrological factors, such as distance to drainage, drainage density, stream-power index, and wetness index, are among the most important factors for slope instability. However, most urban areas are located on plains, and it is difficult to apply a hazard map based on topographic and hydrological factors there. In order to evaluate the collapse risk of flat and low-slope areas, soil depth and groundwater level data were collected and used as interpretation factors. In addition, the reliability of the hazard map was checked against the disaster history of the study area (Gangnam-gu and Yeouido district). In the disaster map of the disaster prevention agency, the urban area was mostly classified as stable, which did not reflect the collapse history. Soil depth, drainage conditions, and groundwater levels obtained from boreholes were added as input data to the hazard map, and the mapped disaster vulnerability increased at the locations where collapses actually occurred. In the study area where damage occurred, the moderate and low vulnerability grades of the previous hazard map covered 12% and 88% of the area, respectively, while the improved map showed 2% high grade, 29% moderate, 66% low, and 2% very low, results that agree better with the actual damage. Keywords: hazard map, urban area, soil depth, groundwater level. Acknowledgement: This research was supported by a Grant from a Strategic Research Project (Horizontal Drilling and Stabilization Technologies for Urban Search and Rescue (US&R) Operation) funded by the Korea Institute of Civil Engineering and Building Technology.

  18. A Method to Assess Flux Hazards at CSP Plants to Reduce Avian Mortality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford K.; Wendelin, Timothy; Horstman, Luke

    A method to evaluate avian flux hazards at concentrating solar power (CSP) plants has been developed. A heat-transfer model has been coupled to simulations of the irradiance in the airspace above a CSP plant to determine the feather temperature along prescribed bird flight paths. Probabilistic modeling results show that the irradiance and assumed feather properties (thickness, absorptance, heat capacity) have the most significant impact on the simulated feather temperature, which can increase rapidly (hundreds of degrees Celsius in seconds) depending on the parameter values. The avian flux hazard model is being combined with a plant performance model to identify alternative heliostat standby aiming strategies that minimize both avian flux hazards and negative impacts on plant performance.

  19. A method to assess flux hazards at CSP plants to reduce avian mortality

    NASA Astrophysics Data System (ADS)

    Ho, Clifford K.; Wendelin, Timothy; Horstman, Luke; Yellowhair, Julius

    2017-06-01

    A method to evaluate avian flux hazards at concentrating solar power (CSP) plants has been developed. A heat-transfer model has been coupled to simulations of the irradiance in the airspace above a CSP plant to determine the feather temperature along prescribed bird flight paths. Probabilistic modeling results show that the irradiance and assumed feather properties (thickness, absorptance, heat capacity) have the most significant impact on the simulated feather temperature, which can increase rapidly (hundreds of degrees Celsius in seconds) depending on the parameter values. The avian flux hazard model is being combined with a plant performance model to identify alternative heliostat standby aiming strategies that minimize both avian flux hazards and negative impacts on plant performance.
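
    A bare-bones, lumped-capacitance version of this kind of feather-heating calculation is sketched below; the property values and flux level are hypothetical placeholders (not the authors' inputs), and convective and radiative losses are ignored for brevity:

      # Lumped-capacitance sketch of feather heating under concentrated flux:
      # dT/dt = absorptance * irradiance / (density * heat_capacity * thickness).
      # Property values are hypothetical; convective/radiative losses ignored.
      def feather_temperature(q_wm2, absorptance, rho, cp, thickness_m, dt_s, steps):
          temp = 25.0  # ambient, deg C
          for _ in range(steps):
              temp += absorptance * q_wm2 / (rho * cp * thickness_m) * dt_s
          return temp

      # 50 kW/m2 absorbed for 2 s by a 0.5 mm feather layer:
      print(feather_temperature(5e4, 0.6, 800.0, 1500.0, 5e-4, 0.01, 200))  # ~125 C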

  20. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.

    PubMed

    Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin

    2007-07-01

    Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been successfully used in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principal approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending and supplementing these methods, we then propose a complementary approach for quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the area-specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.
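
    The distribution-based idea reduces to ranking the probability observed at the activation site within the area's whole-brain distribution; a schematic reading of that step, on random stand-in data, is:

      # Schematic of distribution-based labelling: rank the cytoarchitectonic
      # probability at the activation peak within the area's whole-brain
      # distribution. Random stand-in data; not the published implementation.
      import numpy as np

      rng = np.random.default_rng(0)
      prob_map = rng.random(100_000)      # area's probability at each voxel
      p_at_activation = 0.62              # observed probability at the peak

      nonzero = prob_map[prob_map > 0]
      percentile = (nonzero < p_at_activation).mean() * 100.0
      print(f"activation lies at the {percentile:.1f}th percentile of the map")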

  1. Probabilistic estimation of long-term volcanic hazard under evolving tectonic conditions in a 1 Ma timeframe

    NASA Astrophysics Data System (ADS)

    Jaquet, O.; Lantuéjoul, C.; Goto, J.

    2017-10-01

    Risk assessments in relation to the siting of potential deep geological repositories for radioactive wastes demand the estimation of long-term tectonic hazards such as volcanicity and rock deformation. Owing to their tectonic situation, such evaluations concern many industrial regions around the world. For sites near volcanically active regions, a prevailing source of uncertainty is related to volcanic hazard. For specific situations, in particular in relation to geological repository siting, the requirements for the assessment of volcanic and tectonic hazards have to be expanded to 1 million years. At such time scales, tectonic changes are likely to influence volcanic hazard and therefore a particular stochastic model needs to be developed for the estimation of volcanic hazard. The concepts and theoretical basis of the proposed model are given and a methodological illustration is provided using data from the Tohoku region of Japan.

  2. Seismic hazard in the Intermountain West

    USGS Publications Warehouse

    Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua

    2015-01-01

    The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.

  3. Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin

    NASA Astrophysics Data System (ADS)

    Neupane, N.

    2010-12-01

    This paper presents an illustration of typical water-induced hazard mapping for the East Rapti River Basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes the preparation of a base map of water-induced disasters in the basin. Landslide hazard maps were prepared with the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared using two approaches: HEC-RAS and satellite imagery interpretation. The composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The average monsoon rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in 1993 at the station was 1220 cu m/sec, which nearly corresponds to the discharge of a 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148,516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the watershed, the debris flow hazard zone constitutes 6.58%, and the high flood hazard zone occupies 4.28%. Mitigation measures are recommended according to an Integrated Watershed Management Approach, under which non-structural and structural measures are proposed. The non-structural measures include disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agricultural management practices, protection of water sources, slope protection, and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures, and river training at some locations are recommended. The major factors that have contributed to the high incidence of various types of mass movements and inundation in the basin are rock and soil properties, prolonged and high-intensity rainfall, steep topography, and various anthropogenic factors.

  4. Geologic Maps as the Foundation of Mineral-Hazards Maps in California

    NASA Astrophysics Data System (ADS)

    Higgins, C. T.; Churchill, R. K.; Downey, C. I.; Clinkenbeard, J. P.; Fonseca, M. C.

    2010-12-01

    The basic geologic map is essential to the development of products that help planners, engineers, government officials, and the general public make decisions concerning natural hazards. Such maps are the primary foundation that the California Geological Survey (CGS) uses to prepare maps that show potential for mineral-hazards. Examples of clients that request these maps are the California Department of Transportation (Caltrans) and California Department of Public Health (CDPH). Largely because of their non-catastrophic nature, mineral hazards have received much less public attention compared to earthquakes, landslides, volcanic eruptions, and floods. Nonetheless, mineral hazards can be a major concern locally when considering human health and safety and potential contamination of the environment by human activities such as disposal of earth materials. To address some of these concerns, the CGS has focused its mineral-hazards maps on naturally occurring asbestos (NOA), radon, and various potentially toxic metals as well as certain artificial features such as mines and oil and gas wells. The maps range in scope from statewide to counties and Caltrans districts to segments of selected highways. To develop the hazard maps, the CGS begins with traditional paper and digital versions of basic geologic maps, which are obtained from many sources such as its own files, the USGS, USDA Forest Service, California Department of Water Resources, and counties. For each study area, these maps present many challenges of compilation related to vintage, scale, definition of units, and edge-matching across map boundaries. The result of each CGS compilation is a digital geologic layer that is subsequently reinterpreted and transformed into new digital layers (e.g., lithologic) that focus on the geochemical and mineralogical properties of the area’s earth materials and structures. These intermediate layers are then integrated with other technical data to derive final digital layers that show potential for mineral hazards. Depending on the type of mineral hazard investigated, qualitative and/or quantitative methods are used in this process. The final information is given to CGS clients in various formats that range from traditional paper maps to attributed digital layers, which can be viewed on background digital imagery in 2D or 3D with image viewers or GIS software. This variety of formats assures that users with different levels of computer experience or available computer resources can access the information. Besides the applications presented here, mineral-hazards mapping can also be used in many other settings and situations as a tool to evaluate potential effects on human health and the environment. Examples include fighting forest fires, harvesting of timber, post-fire debris flows during storms, disposal or import of earth materials for non-highway construction projects, and rural areas used for recreation (hiking, motorcycling, etc.). In the future, the CGS expects to investigate and possibly employ more-sophisticated digital algorithms to rate and display the potential for specific mineral hazards on its maps. The geologist’s knowledge and experience will still be needed, however, to review these digital results to decide if they are reasonable.

  5. Probabilistic Multi-Hazard Assessment of Dry Cask Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gencturk, Bora; Padgett, Jamie; Uddin, Rizwan

    systems the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made events (blast incidents). Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. Just as evidenced by deteriorating concrete bridges, there are reported visible degradation mechanisms of dry storage systems, especially when highly corrosive environments are considered in maritime locations. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage need to include the effect of chloride penetration, alkali-aggregate reaction, as well as corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.

  6. 78 FR 45941 - Changes in Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ... (hereinafter referred to as flood hazard determinations) as shown on the indicated Letter of Map Revision (LOMR... Insurance Rate Maps (FIRMs), and in some cases the Flood Insurance Study (FIS) reports, currently in effect... respective Community Map Repository address listed in the table below and online through the FEMA Map Service...

  7. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelagos case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of the different hazards and risks was accounted for in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information, common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and the mathematical model used to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100-year extreme monthly rainfall) and earthquake (475-year return period) was used; topography, lithology, soil moisture and LU/LC were also accounted for. Soil erosion risk was assessed through the empirical model RUSLE (Renard et al. 1991b), with rainfall erosivity, topography and vegetation cover as the main parameters used for predicting proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated using the Okada code (Okada, 1985), and tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification). To estimate the wave height at the forecast points, the Green's Law function was used (JRC Tsunami Analysis Tool). Storm track historical data indicate a return period of 17/41 years for H1/H2 hurricane categories, respectively. The NOAA WAVEWATCH III model hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms; the associated storm-surge risk assessment also accounted for the coastline morphology. Seven empirical (independent) indicators were used to express the erosion susceptibility of the coasts, each evaluated according to a semi-quantitative score that represents low, medium and high levels of erosion risk or impact. The estimate of the coastal erosion hazard was derived by aggregating the indicators on a grid scale.
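
    Green's Law, used above to estimate wave heights at the forecast points, scales tsunami amplitude with the fourth root of the depth ratio; a minimal sketch with illustrative values (not the JRC tool itself):

      # Green's Law: shoaling amplitude scales as (h_offshore/h_nearshore)**0.25.
      # Illustrative values only; not the JRC Tsunami Analysis Tool.
      def greens_law(wave_height_offshore, depth_offshore, depth_nearshore):
          return wave_height_offshore * (depth_offshore / depth_nearshore) ** 0.25

      # A 0.5 m wave in 2000 m of water reaching the 10 m isobath:
      print(greens_law(0.5, 2000.0, 10.0))  # ~1.88 m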

  8. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    NASA Astrophysics Data System (ADS)

    Takarada, S.

    2012-12-01

    The first Workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, containing compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which represent only a subset of possible future scenarios: distributions different from those of previous deposits are to be expected, owing to differences in vent position, volume, eruption rate, wind direction, and topography. Therefore, numerical simulations with controlled parameters are needed for more precise volcanic eruption predictions. The next-generation system should enable the visualization of past volcanic eruption datasets, such as distributions, eruption volumes, and eruption rates, on maps and diagrams using timeline and GIS technology. Similar volcanic eruption scenarios should be easily searchable from the eruption database. Using the volcano hazard assessment system, prediction of the time and area that would be affected by volcanic eruptions at any location near the volcano should be possible using numerical simulations. The system should estimate volcanic hazard risks by overlaying the distributions of volcanic deposits on major roads, houses, and evacuation areas using a GIS-enabled system. Probabilistic volcanic hazard maps at active volcano sites should be made based on numerous numerical simulations. The next-generation real-time hazard assessment system would be implemented with a user-friendly interface, making the risk assessment system easily usable and accessible online.

  9. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-05-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has a length of approximately 320 km, with 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damages and hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand, the characterization of the threat over the entire coast of El Salvador, and on the other, the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near-shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite-difference/finite-volume numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. On the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas most exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained with the high-resolution numerical modelling, being a good and fast approximation for obtaining preliminary tsunami hazard estimations. In Acajutla and La Libertad, both important tourism centres being actively developed, flooding depths between 2 and 4 m are frequent, accompanied by high and very high person-instability hazard. Inside the Gulf of Fonseca the impact of the waves is almost negligible.

  10. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damages and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that at the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The more exposed areas to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained with the high-resolution numerical modelling, being a good and fast approximation to obtain preliminary tsunami hazard estimations. In Acajutla and La Libertad, both important tourism centres being actively developed, flooding depths between 2 and 4 m are frequent, accompanied with high and very high person instability hazard. Inside the Gulf of Fonseca the impact of the waves is almost negligible.

  11. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
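
    A bare-bones version of the integrated probabilistic idea, separating variability (inner loop over individuals) from uncertainty (outer, bootstrap-style loop over parameter estimates), might look like the following sketch; all distributions and values are invented placeholders, not the study's data:

      # Two-dimensional Monte Carlo sketch: the outer loop samples parameter
      # uncertainty, the inner loop samples inter-individual variability.
      # All distributions and values are invented placeholders.
      import numpy as np

      rng = np.random.default_rng(42)
      n_unc, n_var = 200, 1000

      frac_above = []
      for _ in range(n_unc):
          mu_exp = rng.normal(loc=0.0, scale=0.2)      # uncertain mean log-exposure
          threshold = rng.normal(loc=2.0, scale=0.3)   # uncertain hazard threshold
          exposures = rng.lognormal(mean=mu_exp, sigma=0.5, size=n_var)
          frac_above.append((exposures > threshold).mean())

      lo, hi = np.percentile(frac_above, [5, 95])
      print(f"fraction of population above threshold: {lo:.3f}-{hi:.3f} (90% interval)")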

  12. Numerical and Probabilistic Analysis of Asteroid and Comet Impact Hazard Mitigation

    DTIC Science & Technology

    2010-09-01

    object on Jupiter are reminders and warning signals that we should take seriously. The extinction of the dinosaurs has been attributed to the impact of a... experimentally determined absorption patterns. These energy deposition processes are independent, so a piecemeal approach is physically reasonable. We

  13. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.

    2014-03-01

    In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure and, for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population-based studies. Standardized maps are important tools for characterizing anatomic variability across subjects, with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing the preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects, which are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the locations of the inferior ostia have higher variability than those of the superior ostia, and that the variability of the left atrial appendage is similar to that of the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.

  14. A probabilistic seismic model for the European Arctic

    NASA Astrophysics Data System (ADS)

    Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes

    2011-01-01

    The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.
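
    At its core, the sample-and-test loop described above is a Metropolis-style MCMC; a toy one-parameter version is sketched below, with synthetic data and a Gaussian misfit standing in for the paper's multi-dataset likelihood:

      # Toy Metropolis sampler: propose models, accept or reject against the
      # data misfit, and keep the chain as a posterior ensemble. One parameter
      # (say, Moho depth at a point); synthetic data, not the paper's datasets.
      import math, random

      random.seed(1)
      observed, sigma = 32.0, 2.0        # synthetic observation and uncertainty

      def log_likelihood(m):
          return -0.5 * ((m - observed) / sigma) ** 2

      current, samples = 40.0, []
      for _ in range(20000):
          proposal = current + random.gauss(0.0, 1.0)  # wide, flat prior assumed
          if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(current):
              current = proposal
          samples.append(current)

      posterior = samples[5000:]          # discard burn-in
      print(f"posterior mean ~ {sum(posterior) / len(posterior):.1f} km")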

  15. A test of various site-effect parameterizations in probabilistic seismic hazard analyses of southern California

    USGS Publications Warehouse

    Field, E.H.; Petersen, M.D.

    2000-01-01

    We evaluate the implications of several attenuation relationships, including three customized for southern California, in terms of accounting for site effects in probabilistic seismic hazard studies. The analysis is carried out at 43 sites along a profile spanning the Los Angeles basin with respect to peak acceleration, and 0.3-, 1.0-, and 3.0-sec response spectral acceleration values that have a 10% chance of being exceeded in 50 years. The variability among currently viable attenuation relationships (epistemic uncertainty) is an approximate factor of 2. Biases between several commonly used attenuation relationships and southern California strong-motion data imply hazard differences that exceed 10%. However, correcting each relationship for the southern California bias does not necessarily bring hazard estimates into better agreement. A detailed subclassification of site types (beyond rock versus soil) is found both to be justified by the data and to make important distinctions in terms of hazard levels. A basin-depth effect is also shown to be important, implying a difference of up to a factor of 2 in ground motion between the deepest and shallowest parts of the Los Angeles basin. In fact, for peak acceleration, the basin-depth effect is even more influential than the surface site condition. Questions remain, however, whether basin depth is a proxy for some other site attribute such as distance from the basin edge. The reduction in prediction error (sigma) produced by applying detailed site and/or basin-depth corrections does not have an important influence on the hazard. In fact, the sigma reduction is less than the epistemic uncertainties on sigma itself. Due to data limitations, it is impossible to determine which attenuation relationship is best. However, our results do indicate which site conditions seem most influential. This information should prove useful to those developing or updating attenuation relationships and to those attempting to make more refined estimates of hazard in the near future.

  16. New Evaluation of Seismic Hazard in Central America and La Hispaniola

    NASA Astrophysics Data System (ADS)

    Benito, B.; Camacho, E. I.; Rojas, W.; Climent, A.; Alvarado-Induni, G.; Marroquin, G.; Molina, E.; Talavera, E.; Belizaire, D.; Pierristal, G.; Torres, Y.; Huerfano, V.; Polanco, E.; García, R.; Zevallos, F.

    2013-05-01

    The results from seismic hazard studies carried out for two seismic scenarios, the Central America (CA) region and La Hispaniola Island, are presented here. Both cases follow the Probabilistic Seismic Hazard Assessment (PSHA) methodology and are developed in terms of PGA and SA(T), for T of 0.1, 0.2, 0.5, 1 and 2 s. In both analyses, hybrid zonation models are considered, integrating seismogenic zones and faults where data on slip rate and recurrence time are available. First, we present a new evaluation of seismic hazard in CA, starting with the results of a previous study by Benito et al. (2011). Some improvements are now included, such as: an updated catalogue through 2011; corrections to the zoning model, in particular for the subduction regime, taking into account the variation of the dip in Costa Rica and Panama; and modeling of some faults as independent units for the hazard estimation. The results allow us to carry out a sensitivity analysis comparing the estimates obtained with and without faults. In the second part we present the results of the PSHA for La Hispaniola, carried out as part of the cooperative project SISMO-HAITI, supported by UPM and developed in cooperation with ONEV. It started a few months after the 2010 event, in answer to a request for help from the Haitian government to UPM. The study was aimed at obtaining results suitable for seismic design purposes and started with the elaboration of a seismic catalogue for Hispaniola, requiring an exhaustive revision of data reported by around 30 seismic agencies, apart from those of the Puerto Rico and Dominican Republic seismic networks. Seismotectonic models for the region were reviewed and a new regional zonation was proposed, taking into account different geophysical data. Attenuation models for subduction and crustal zones were also reviewed, and the most suitable were calibrated with data recorded inside the Caribbean plate. As a result of the PSHA, different maps were generated for the quoted parameters, together with the UHS for the main cities in the country. The values obtained for PGA and a return period of 475 years are comparable to those of the Dominican Republic Building Code, with maximum PGA around 400 cm/s2 (at rock sites). However, the morphology of the map is quite similar to a previous one by Frankel et al. (2011), although ours presents lower PGA values. The results are available as a basis for the first Haitian building code.

  17. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of an equivalent limit state.
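
    The dependence effect that the abstract contrasts with fault-tree analysis can be seen in a two-node example: a seismically induced fire is far more likely given an earthquake than a naive independence assumption would suggest. The probabilities below are illustrative assumptions, not values from the dissertation.

```python
# Two-node Bayesian-network sketch: earthquake -> fire. Probabilities are
# illustrative assumptions chosen to show the effect of dependence.
p_eq = 0.01                 # annual probability of a damaging earthquake
p_fire_given_eq = 0.30      # fire is likely if the earthquake occurs
p_fire_given_no_eq = 0.002  # background fire probability otherwise

# Marginal fire probability via the law of total probability
p_fire = p_eq * p_fire_given_eq + (1.0 - p_eq) * p_fire_given_no_eq

# Joint probability of both hazards: network vs naive independence
p_joint_network = p_eq * p_fire_given_eq
p_joint_independent = p_eq * p_fire

print(f"P(fire)                          = {p_fire:.5f}")
print(f"P(eq and fire), with dependence  = {p_joint_network:.5f}")
print(f"P(eq and fire), assuming indep.  = {p_joint_independent:.7f}")
```

    Here the joint probability under dependence is roughly sixty times the independence estimate, which is the kind of error a fault tree with independent basic events would make for correlated hazards.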

  18. Subcortical structure segmentation using probabilistic atlas priors

    NASA Astrophysics Data System (ADS)

    Gouttard, Sylvain; Styner, Martin; Joshi, Sarang; Smith, Rachel G.; Cody Hazlett, Heather; Gerig, Guido

    2007-03-01

    The segmentation of the subcortical structures of the brain is required for many forms of quantitative neuroanatomic analysis. The volumetric and shape parameters of structures such as the lateral ventricles, putamen, caudate, hippocampus, pallidus, and amygdala are employed to characterize a disease or its evolution. This paper presents a fully automatic segmentation of these structures via non-rigid registration of a probabilistic atlas prior, alongside a comprehensive validation. Our approach is based on an unbiased diffeomorphic atlas with probabilistic spatial priors built from a training set of MR images with corresponding manual segmentations. The atlas building computes an average image along with transformation fields mapping each training case to the average image. These transformation fields are applied to the manually segmented structures of each case in order to obtain a probabilistic map on the atlas. When applying the atlas for automatic structural segmentation, an MR image is first intensity-inhomogeneity corrected, skull stripped, and intensity calibrated to the atlas. Then the atlas image is registered to the subject image using an affine registration followed by a deformable registration matching gray-level intensity. Finally, the registration transformation is applied to the probabilistic map of each structure, which is then thresholded at 0.5 probability. Using manual segmentations for comparison, measures of volumetric differences show high correlation with our results. Furthermore, the Dice coefficient, which quantifies the volumetric overlap, is higher than 62% for all structures and is close to 80% for the basal ganglia. The intraclass correlation coefficient computed on these same datasets shows a good inter-method correlation of the volumetric measurements. Using a dataset of a single patient scanned 10 times on 5 different scanners, reliability is shown with a coefficient of variation of less than 2 percent over the whole dataset. Overall, these validation and reliability studies show that our method accurately and reliably segments almost all structures. Only the hippocampus and amygdala segmentations exhibit relatively low correlation with the manual segmentation in at least one of the validation studies, though they still show appropriate Dice overlap coefficients.
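
    The final two steps described above, thresholding the warped probability map at 0.5 and scoring the result against a manual segmentation with the Dice coefficient, can be sketched as follows; the arrays here are synthetic stand-ins, not real MR data.

```python
# Sketch of threshold-and-score: binarize a warped atlas probability map at
# 0.5, then compute Dice overlap against a reference segmentation. The data
# below are synthetic stand-ins for real MR-space probability maps.
import numpy as np

def dice(seg_a, seg_b):
    """Dice overlap: 2|A intersect B| / (|A| + |B|)."""
    a, b = seg_a.astype(bool), seg_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(0)
prob_map = rng.random((64, 64, 64))   # stand-in for a warped probability map
auto_seg = prob_map > 0.5             # automatic segmentation: 0.5 threshold
manual_seg = prob_map > 0.45          # stand-in for a manual segmentation

print(f"Dice coefficient: {dice(auto_seg, manual_seg):.3f}")
```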

  19. Topography- and nightlight-based national flood risk assessment in Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich

    2017-04-01

    In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
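
    The layer-combination logic described above can be sketched as follows: a hazard layer derived from the two topographic parameters (EAND and DFND), a nightlight-based exposure layer, and a five-class risk map obtained from their product. The grids, weighting, and quantile class breaks below are illustrative assumptions, not the study's actual parameters.

```python
# Sketch of the hazard x exposure -> five-class risk workflow. All grids,
# weights, and class breaks are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
eand = rng.random((100, 100)) * 30.0     # elevation above nearest drainage (m)
dfnd = rng.random((100, 100)) * 5000.0   # distance from nearest drainage (m)
nightlight = rng.random((100, 100))      # normalized luminosity, 0-1

# Lower EAND/DFND means higher hazard; normalize each to 0-1 and average
hazard = 1.0 - 0.5 * (eand / eand.max() + dfnd / dfnd.max())
exposure = nightlight                    # proxy for economic exposure
risk = hazard * exposure

# Classify into five classes, very low (1) to severe (5), by quantile breaks
edges = np.quantile(risk, [0.2, 0.4, 0.6, 0.8])
risk_class = np.digitize(risk, edges) + 1
print("Class counts (1..5):", np.bincount(risk_class.ravel())[1:])
```

    Overlaying the economic risk layer with a population-density grid, as the abstract describes for the socioeconomic map, would amount to one more element-wise product before the classification step.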

  20. Evaluation of Seismic Risk of Siberia Territory

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    This paper presents the outcomes of recent geophysical research by the Geophysical Survey SB RAS aimed at studying the geodynamic situation in large industrial and civil centers across Siberia, with the purpose of evaluating the seismic risk of these territories and predicting the onset of natural and man-made emergencies. The work primarily concerns the testing and updating of a geoinformation system developed by the Russian Emergency Ministry for calculations regarding seismic hazard and response to destructive earthquakes. The GIS database contains catalogues of earthquakes and faults, seismic zonation maps, vectorized city maps, information on industrial and housing stock, and data on building types and population in inhabited places. On the basis of probabilistic approaches, the geoinformation system can solve the following problems: estimating earthquake impact and the forces, facilities, and supplies required for life support of the injured population; determining the consequences of failures at chemical and explosion-prone facilities; and optimizing the organization of rescue operations. Using this software, earthquake risk maps have been constructed for several seismically dangerous regions of Siberia. These maps display the probable number of injured people and the relative economic damage from earthquakes that could occur at various sites according to the seismic zonation map. The resulting maps have made it possible to determine where detailed seismological observations should be arranged. In parallel, wide-ranging investigations are under way across Siberia: new methods for evaluating the physical state of industrial and civil structures (buildings, hydroelectric power stations, bridges, dams, etc.), high-performance detailed electromagnetic surveys of ground conditions beneath city territories, roads, and runways, and studies of seismic conditions in large industrial and civil centers.
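
    The kind of per-cell loss estimate such a GIS produces can be sketched as below: a macroseismic intensity field combined with population counts and toy vulnerability curves yields injured population and relative economic damage. All curves and inputs are invented for illustration and are not from the system described.

```python
# Sketch of a GIS-style loss calculation: intensity field + population +
# vulnerability curves -> injured population and relative economic damage.
# All curves and inputs are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
intensity = rng.uniform(6.0, 9.0, (50, 50))    # macroseismic intensity per cell
population = rng.integers(0, 5000, (50, 50))   # residents per cell
building_value = population * 20.0             # relative asset-value proxy

def damage_ratio(i):
    """Toy vulnerability curve: mean damage ratio vs intensity."""
    return np.clip((i - 5.0) / 5.0, 0.0, 1.0) ** 2

def injury_rate(i):
    """Toy casualty model: fraction of residents injured vs intensity."""
    return 0.001 * 10 ** (0.5 * (i - 6.0))

injured = population * injury_rate(intensity)
damage = building_value * damage_ratio(intensity)

print(f"Expected injured population: {injured.sum():.0f}")
print(f"Relative economic damage: {damage.sum():.0f} units")
```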
