Sample records for hazard characterization methodology

  1. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Results are presented of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property. Hazard characterization includes the definition of a hazard, hazard levels, and hazard groups. The analysis methodology is described in detail. The methodology was used to prepare top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. Results of all 39 individual hazard studies are presented.

  2. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boissonnade, A.; Hossain, Q.; Kimball, J.

    Since the mid-1980s, assessment of the wind and tornado risks at Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels at a given location based on the methodology in UCRL-53526 differ from those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  3. Characterizing crown fuel distribution for conifers in the interior western United States

    Treesearch

    Seth Ex; Frederick W. Smith; Tara Keyser

    2015-01-01

    Canopy fire hazard evaluation is essential for prioritizing fuel treatments and for assessing potential risk to firefighters during suppression activities. Fire hazard is usually expressed as predicted potential fire behavior, which is sensitive to the methodology used to quantitatively describe fuel profiles: methodologies that assume that fuel is distributed...

  4. Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site

    NASA Astrophysics Data System (ADS)

    Albarello, D.; Mucciarelli, M.

    A new approach is proposed for seismic hazard estimation based on documentary data concerning the local history of seismic effects. The adopted methodology allows for the use of "poor" data, such as macroseismic observations, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.

  5. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  6. SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel

    2013-04-01

    The SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project will elaborate methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards; (2) analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, together with the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project lie in combining different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and in gathering interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could potentially lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land-use modifications (reforestation, abandonment of traditional agricultural practices) and human interferences (urban expansion, construction of ski resorts) over the last century.

  7. Screening tests for hazard classification of complex waste materials - Selection of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weltens, R., E-mail: reinhilde.weltens@vito.be; Vanermen, G.; Tirez, K.

    In this study we describe the development of an alternative methodology for hazard characterization of waste materials. Such an alternative methodology for hazard assessment of complex waste materials is urgently needed, because the lack of a validated instrument leads to arbitrary hazard classification of such complex waste materials. False classification can lead to human and environmental health risks and also has important financial consequences for the waste owner. The Hazardous Waste Directive (HWD) describes the methodology for hazard classification of waste materials. For mirror entries the HWD classification is based upon the hazardous properties (H1-15) of the waste, which can be assessed from the hazardous properties of individual identified waste compounds or - if not all compounds are identified - from the results of hazard assessment tests performed on the waste material itself. For the latter, the HWD recommends toxicity tests that were initially designed for risk assessment of chemicals in consumer products (pharmaceuticals, cosmetics, biocides, food, etc.). These tests (often using mammals) are neither designed for nor suitable for the hazard characterization of waste materials. With the present study we want to contribute to the development of an alternative and transparent test strategy for hazard assessment of complex wastes that is in line with the HWD principles for waste classification. It is necessary to address this important shortcoming in hazardous waste classification and to demonstrate that alternative methods are available that can be used for hazard assessment of waste materials. Next, by describing the pros and cons of the available methods, and by identifying the needs for additional or further development of test methods, we hope to stimulate research efforts and development in this direction. In this paper we describe promising techniques and argue for the test selection of the pilot study that we have performed on different types of waste materials. Test results are presented in a second paper. As the application of many of the proposed test methods is new to the field of waste management, the principles of the tests are described. The selected tests tackle important hazardous properties, but refinement of the test battery is needed to fulfil the a priori conditions.

  8. Seismic Hazard and Ground Motion Characterization at the Itoiz Dam (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Santoyo, M. A.; Luzón, F.; Benito, B.; Gaspar-Escribano, J. M.; García-Jerez, A.

    2012-08-01

    This paper presents a new hazard-consistent ground motion characterization of the Itoiz dam site, located in Northern Spain. Firstly, we propose a methodology with different levels of approximation to the expected ground motion at the dam site. Secondly, we apply this methodology taking into account the particular characteristics of the site and of the dam. Hazard calculations were performed following the Probabilistic Seismic Hazard Assessment method using a logic tree, which accounts for different seismic source zonings and different ground-motion attenuation relationships. The study was done in terms of peak ground acceleration and several spectral accelerations at periods coinciding with the fundamental vibration periods of the dam. In order to estimate these ground motions we consider two different dam conditions: when the dam is empty (T = 0.1 s) and when it is filled with water to its maximum capacity (T = 0.22 s). Additionally, seismic hazard analysis is done for two return periods: 975 years, related to the project earthquake, and 4,975 years, identified with an extreme event. Soil conditions at the dam site were also taken into account. Through the proposed methodology we deal with different ways of characterizing ground motion at the study site. In a first step, we obtain the uniform hazard response spectra for the two return periods. In a second step, a disaggregation analysis is done in order to obtain the controlling earthquakes that can affect the dam. Subsequently, we characterize the ground motion at the dam site in terms of specific response spectra for target motions defined by the expected values SA(T) for T = 0.1 and 0.22 s, for the return periods of 975 and 4,975 years, respectively. Finally, synthetic acceleration time histories for earthquake events matching the controlling parameters are generated using the discrete wave-number method and subsequently analyzed. Because of the short relative distances between the controlling earthquakes and the dam site, we considered finite sources in these computations. We conclude that directivity effects should be taken into account as an important variable in this kind of study of ground motion characteristics.
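
    As a side note on the two return periods quoted above: under a Poisson occurrence model, the 975- and 4,975-year return periods correspond to roughly 5% and 1% probabilities of exceedance over a 50-year exposure time. A minimal illustrative check (not code from the study):

    ```python
    import math

    def return_period(p_exceed: float, exposure_years: float) -> float:
        """Return period T such that the probability of at least one
        exceedance in `exposure_years` equals p_exceed (Poisson model)."""
        return -exposure_years / math.log(1.0 - p_exceed)

    print(return_period(0.05, 50))  # ~975 years  (project earthquake)
    print(return_period(0.01, 50))  # ~4975 years (extreme event)
    ```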

  9. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near-and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
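
    A hedged sketch of the core PTHA aggregation implied above: each source contributes its mean annual rate times the chance its modeled tsunami exceeds a given amplitude at the site, and the 100- and 500-year amplitudes (1% and 0.2% annual exceedance) are read off the resulting curve. All rates and amplitudes below are invented for illustration, not values from the study:

    ```python
    import math

    # Hypothetical sources: (mean annual rate, modeled amplitude at the site, m)
    sources = [
        (1.0 / 300.0, 3.5),   # far-field subduction-zone event (assumed)
        (1.0 / 500.0, 10.5),  # local subduction-zone event (assumed)
    ]

    def annual_rate_of_exceedance(amplitude_m: float) -> float:
        # Deterministic amplitude per source for brevity; a real PTHA also
        # integrates over amplitude uncertainty for each source.
        return sum(rate for rate, amp in sources if amp > amplitude_m)

    # Annual probability of at least one exceedance (Poisson assumption):
    print(1.0 - math.exp(-annual_rate_of_exceedance(4.0)))  # ~0.2% -> "500-year"
    ```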

  10. A non-intrusive screening methodology for environmental hazard assessment at waste disposal sites for water resources protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, B.A.; Woldt, W.E.; Jones, D.D.

    The environmental and health risks posed by unregulated waste disposal sites are potential concerns of Pacific Rim regions and island areas because of the need to protect aquifers and other valuable water resources. A non-intrusive screening methodology to determine site characteristics, including possible soil and/or groundwater contamination, areal extent of waste, etc., is being developed and tested at waste disposal sites in Nebraska. This type of methodology would be beneficial to Pacific Rim regions in investigating and/or locating unknown or poorly documented contamination areas for hazard assessment and groundwater protection. Traditional assessment methods are generally expensive, time consuming, and potentially exacerbate the problem. Ideally, a quick and inexpensive assessment method to reliably characterize these sites is desired. Electromagnetic (EM) conductivity surveying and soil-vapor sampling techniques, combined with innovative three-dimensional geostatistical methods, are used to map the data, develop a site characterization of the subsurface, and aid in tracking any contaminant plumes. The EM data are analyzed to determine/estimate the extent and volume of waste and/or leachate. Soil-vapor data are analyzed to estimate a site's volatile organic compound (VOC) emission rate to the atmosphere. The combined information could then be incorporated as one part of an overall hazard assessment system.

  11. Final Report: Seismic Hazard Assessment at the PGDP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhenming

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it depends not only on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) the difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how the input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  12. Waste certification program plan for Oak Ridge National Laboratory. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrin, R.C.

    1997-05-01

    This document defines the waste certification program developed for implementation at Oak Ridge National Laboratory (ORNL). The document describes the program structure, logic, and methodology for certification of ORNL wastes. The purpose of the waste certification program is to provide assurance that wastes are properly characterized and that the Waste Acceptance Criteria (WAC) for receiving facilities are met. The program meets the waste certification requirements outlined in US Department of Energy (DOE) Order 5820.2A, Radioactive Waste Management, and ensures that 40 CFR documentation requirements for waste characterization are met for mixed (both radioactive and hazardous) and hazardous (including polychlorinated biphenyls) waste. Program activities will be conducted according to ORNL Level 1 document requirements.

  13. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background: Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives: The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods: This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results: Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision-analysis components. Methods for addressing data gaps remain an issue. Discussion: Greater consistency in methods and evaluation metrics is needed, but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion: Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation: Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265-280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778

  14. Waste certification program plan for Oak Ridge National Laboratory. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1997-09-01

    This document defines the waste certification program (WCP) developed for implementation at Oak Ridge National Laboratory (ORNL). The document describes the program structure, logic, and methodology for certification of ORNL wastes. The purpose of the WCP is to provide assurance that wastes are properly characterized and that the Waste Acceptance Criteria (WAC) for receiving facilities are met. The program meets the waste certification requirements for mixed (both radioactive and hazardous) and hazardous [including polychlorinated biphenyls (PCB)] waste. Program activities will be conducted according to ORNL Level 1 document requirements.

  15. Processing LiDAR Data to Predict Natural Hazards

    NASA Technical Reports Server (NTRS)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
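
    To give a flavor of the edge-detection step mentioned above, here is a generic gradient-magnitude (Sobel) filter over a DTM grid; large values flag abrupt elevation breaks such as fault traces or headwall scarps. This is a textbook sketch, not the ELF algorithms themselves:

    ```python
    import numpy as np

    def sobel_edges(dtm: np.ndarray) -> np.ndarray:
        """Gradient magnitude of a DTM via Sobel kernels; large values flag
        abrupt elevation changes (candidate fault traces / headwall scarps)."""
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
        ky = kx.T
        pad = np.pad(dtm, 1, mode="edge")
        gx = np.zeros_like(dtm, dtype=float)
        gy = np.zeros_like(dtm, dtype=float)
        for i in range(dtm.shape[0]):
            for j in range(dtm.shape[1]):
                win = pad[i:i + 3, j:j + 3]
                gx[i, j] = np.sum(win * kx)
                gy[i, j] = np.sum(win * ky)
        return np.hypot(gx, gy)

    # Toy 1-m DTM with a sharp 5-m scarp down the middle:
    dtm = np.hstack([np.full((5, 5), 105.0), np.full((5, 5), 100.0)])
    print(sobel_edges(dtm).round(1))
    ```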

  16. Proportional exponentiated link transformed hazards (ELTH) models for discrete time survival data with application

    PubMed Central

    Joeng, Hee-Koung; Chen, Ming-Hui; Kang, Sangwook

    2015-01-01

    Discrete survival data are routinely encountered in many fields of study including behavior science, economics, epidemiology, medicine, and social science. In this paper, we develop a class of proportional exponentiated link transformed hazards (ELTH) models. We carry out a detailed examination of the role of links in fitting discrete survival data and estimating regression coefficients. Several interesting results are established regarding the choice of links and baseline hazards. We also characterize the conditions for improper survival functions and the conditions for existence of the maximum likelihood estimates under the proposed ELTH models. An extensive simulation study is conducted to examine the empirical performance of the parameter estimates under the Cox proportional hazards model by treating discrete survival times as continuous survival times, and the model comparison criteria, AIC and BIC, in determining links and baseline hazards. A SEER breast cancer dataset is analyzed in detail to further demonstrate the proposed methodology. PMID:25772374
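
    For orientation, in discrete-time survival models the survival function is the running product of one minus the per-interval hazard, with a link function mapping the linear predictor to each hazard. A minimal sketch under a logit link, one member of the link families such models consider (illustrative code, not the authors'):

    ```python
    import math

    def hazard_logit(eta: float) -> float:
        """Discrete-time hazard under a logit link: h = 1/(1 + exp(-eta))."""
        return 1.0 / (1.0 + math.exp(-eta))

    def survival(etas):
        """S(t) = prod_{j<=t} (1 - h_j) over discrete intervals 1..t."""
        s, out = 1.0, []
        for eta in etas:
            s *= 1.0 - hazard_logit(eta)
            out.append(s)
        return out

    # Linear predictors x'beta + gamma_j for 5 intervals (made-up numbers):
    print(survival([-2.0, -1.5, -1.2, -1.0, -0.8]))
    ```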

  17. The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Thomas, Loic; Bernardie, Severine

    2016-04-01

    The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards; (2) analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, together with the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lies in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographic and social) scenarios are taken into account. This tool, dedicated to stakeholders, should finally be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.

  18. Microseismic techniques for avoiding induced seismicity during fluid injection

    DOE PAGES

    Matzel, Eric; White, Joshua; Templeton, Dennise; ...

    2014-01-01

    The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  19. EVALUATING ROBOT TECHNOLOGIES AS TOOLS TO EXPLORE RADIOLOGICAL AND OTHER HAZARDOUS ENVIRONMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis W. Nielsen; David I. Gertman; David J. Bruemmer

    2008-03-01

    There is a general consensus that robots could be beneficial in performing tasks within hazardous radiological environments. Most control of robots in hazardous environments involves master-slave or teleoperation relationships between the human and the robot. While teleoperation-based solutions keep humans out of harm's way, they also change the training requirements to accomplish a task. In this paper we present a research methodology that allowed scientists at Idaho National Laboratory to identify, develop, and prove a semi-autonomous robot solution for search and characterization tasks within a hazardous environment. Two experiments are summarized that validated the use of semi-autonomy and show that robot autonomy can help mitigate some of the performance differences between operators who have different levels of robot experience, and can improve performance over teleoperated systems.

  20. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
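
    As a rough illustration of the adaptively smoothed seismicity component mentioned above, here is a kernel-smoothed rate estimate in which each event's smoothing distance adapts to the distance to its Nth nearest neighbor. This is a generic sketch of the idea under assumed parameters, not the NSHM implementation:

    ```python
    import math

    def smoothed_rate(events, grid_point, n_neighbor=2):
        """events: list of (x_km, y_km) epicenters. Returns a relative
        seismicity-rate density at grid_point using Gaussian kernels whose
        bandwidth per event is the distance to its n-th nearest neighbor."""
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        total = 0.0
        for e in events:
            others = sorted(dist(e, o) for o in events if o is not e)
            h = max(others[n_neighbor - 1], 1.0)  # adaptive bandwidth, 1 km floor
            d = dist(e, grid_point)
            total += math.exp(-d * d / (2.0 * h * h)) / (2.0 * math.pi * h * h)
        return total

    # Hypothetical catalog (km coordinates):
    catalog = [(0.0, 0.0), (5.0, 2.0), (6.0, -1.0), (40.0, 35.0)]
    print(smoothed_rate(catalog, (3.0, 0.0)))
    ```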

  1. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisi, Carlo; Prescott, Steve; Ma, Zhegang

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of a SAPHIRE code PRA model for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  2. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastics. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.

  3. Risk Assessment Methodology for Hazardous Waste Management (1998)

    EPA Pesticide Factsheets

    A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.

  4. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  5. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

    for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative... methodology report. The report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT...

  6. SAR/multispectral image fusion for the detection of environmental hazards with a GIS

    NASA Astrophysics Data System (ADS)

    Errico, Angela; Angelino, Cesario Vincenzo; Cicala, Luca; Podobinski, Dominik P.; Persechino, Giuseppe; Ferrara, Claudia; Lega, Massimiliano; Vallario, Andrea; Parente, Claudio; Masi, Giuseppe; Gaetano, Raffaele; Scarpa, Giuseppe; Amitrano, Donato; Ruello, Giuseppe; Verdoliva, Luisa; Poggi, Giovanni

    2014-10-01

    In this paper we propose a GIS-based methodology, using optical and SAR remote sensing data together with more conventional sources, for the detection of small cattle breeding areas potentially responsible for hazardous littering. This specific environmental problem is very relevant for the Caserta area, in southern Italy, where many small buffalo breeding farms exist that are not even known to the productive activity register and are not easily monitored and surveyed. Experiments on a test area, with available specific ground truth, prove that the proposed system is characterized by a very large detection probability and a negligible false alarm rate.

  7. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
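
    A minimal sketch of the empirical variant of a tsunami hazard curve described above: given simulated events, each with an annual occurrence rate and a computed intensity measure (here inundation depth) at the site, the mean annual rate of exceedance is accumulated per threshold. The inputs are hypothetical, and the paper's robust Bayesian fitting is not reproduced:

    ```python
    def empirical_hazard_curve(events, thresholds):
        """events: iterable of (annual_rate, inundation_depth_m).
        Returns the mean annual rate of exceedance at each threshold."""
        return [sum(rate for rate, depth in events if depth > h)
                for h in thresholds]

    # Hypothetical simulated catalog: (annual rate, depth at the site, m)
    simulated = [(1e-3, 0.5), (5e-4, 1.8), (2e-4, 3.2), (1e-4, 6.0)]
    print(empirical_hazard_curve(simulated, thresholds=[1.0, 2.0, 4.0]))
    ```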

  8. Evaluation of biological methods for a future methodological implementation of the Hazard criterion H14 'ecotoxic' in the European waste list (2000/532/EC).

    PubMed

    Moser, Heidrun; Roembke, Joerg; Donnevert, Gerhild; Becker, Roland

    2011-02-01

    The ecotoxicological characterization of waste is part of its assessment as hazardous or non-hazardous according to the European Waste List. For this classification, 15 hazard criteria are derived from Council Directive 91/689/EEC on hazardous waste. Some of the hazard criteria are based on the content of dangerous substances. The criterion H14 'ecotoxic' lacks an assessment and testing strategy, and no specific threshold values have been defined so far. Based on the recommendations of CEN guideline 14735 (2005), an international round robin test (ring test) was organized by the German Federal Environment Agency in order to define suitable test methods for the biological assessment of waste and waste eluates. A basic test battery, consisting of three aquatic and three terrestrial tests, was compiled. In addition, data were submitted for ten additional tests (five aquatic, including a genotoxicity test, and five terrestrial ones). The tests were performed with three representative waste types: an ash from an incineration plant, a soil containing high concentrations of organic contaminants (polycyclic aromatic hydrocarbons) and a preserved wood waste. The results of this ring test confirm that a combination of a battery of biological tests and chemical residual analysis is needed for an ecotoxicological characterization of wastes. With small modifications the basic test battery is considered to be well suited for the hazard and risk assessment of wastes and waste eluates. All results and documents are accessible via a web-based databank application.

  9. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g. winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first-guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source location estimate by several hundred percent (normalized by the distance from source to the closest sampler) and to improve mass estimates by several orders of magnitude. It also has the ability to operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations and to adjust the wind to provide a better match between the hazard prediction and the observations.
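
    To make the "first guess plus iterative refinement" idea concrete, a heavily simplified sketch follows: a steady-state Gaussian plume forward model that is linear in the release rate, and a one-parameter least-squares fit of that rate to sensor readings. VIRSA itself also estimates location, timing, and winds; the plume coefficients, sensor positions, and readings below are all illustrative assumptions:

    ```python
    import math

    def gaussian_plume_ground(q, u, x, y, a=0.08, b=0.9):
        """Ground-level concentration from a ground-level steady release.
        q: release rate (g/s), u: wind speed (m/s), x: downwind, y: crosswind (m).
        Dispersion sigmas follow a simple power law (assumed coefficients)."""
        if x <= 0.0:
            return 0.0
        sigma_y = sigma_z = a * x**b
        return (q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-y * y / (2.0 * sigma_y * sigma_y)))

    def fit_release_rate(observations, u):
        """Least-squares release rate: the model is linear in q, so
        q_hat = sum(c_obs * c_unit) / sum(c_unit^2) with unit-rate predictions."""
        unit = [gaussian_plume_ground(1.0, u, x, y) for x, y, _ in observations]
        num = sum(cu * c for cu, (_, _, c) in zip(unit, observations))
        den = sum(cu * cu for cu in unit)
        return num / den

    # Hypothetical sensors: (downwind m, crosswind m, measured g/m^3)
    obs = [(200.0, 0.0, 3.1e-4), (400.0, 50.0, 6.0e-5), (800.0, 0.0, 4.2e-5)]
    print(fit_release_rate(obs, u=4.0))
    ```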

  10. Analysis report for 241-BY-104 Auger samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, M.A.

    1994-11-10

    This report describes the analysis of the surface crust samples taken from single-shell tank (SST) BY-104, suspected of containing ferrocyanide wastes. This sampling and analysis will assist in ascertaining whether there is any hazard due to combustion (burning) or explosion of these solid wastes. These characteristics are important to future efforts to characterize the salt and sludge in this type of waste tank. This report will outline the methodology and detail the results of analyses performed during the characterization of this material. All analyses were performed by Westinghouse Hanford Company at the 222-S laboratory unless stated otherwise.

  11. 29 CFR 1926.64 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...

  12. 29 CFR 1926.64 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...

  13. Application of a new methodology for coastal multi-hazard-assessment & management on the state of Karnataka, India.

    PubMed

    Rosendahl Appelquist, Lars; Balstrøm, Thomas

    2015-04-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment & management under a changing global climate to the state of Karnataka, India. The recently published methodology, termed the Coastal Hazard Wheel (CHW), is designed for local, regional and national hazard screening in areas with limited data availability, and covers the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding. The application makes use of published geophysical data and remote sensing information and showcases how the CHW framework can be applied at a scale relevant for regional planning purposes. It uses a GIS approach to develop regional and sub-regional hazard maps as well as to produce relevant hazard risk data, and includes a discussion of uncertainties, limitations and management perspectives. The hazard assessment shows that 61 percent of Karnataka's coastline has a high or very high inherent hazard of erosion, making erosion the most prevalent coastal hazard. The hazards of flooding and salt water intrusion are also relatively widespread, as 39 percent of Karnataka's coastline has a high or very high inherent hazard for both of these hazard types. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. An integrated dispersion preparation, characterization and in vitro dosimetry methodology for engineered nanomaterials

    PubMed Central

    DeLoid, Glen M.; Cohen, Joel M.; Pyrgiotakis, Georgios; Demokritou, Philip

    2018-01-01

    Summary: Evidence continues to grow of the importance of in vitro and in vivo dosimetry in the hazard assessment and ranking of engineered nanomaterials (ENMs). Accurate dose metrics are particularly important for in vitro cellular screening to assess the potential health risks or bioactivity of ENMs. In order to ensure meaningful and reproducible quantification of in vitro dose, with consistent measurement and reporting between laboratories, it is necessary to adopt standardized and integrated methodologies for 1) generation of stable ENM suspensions in cell culture media, 2) colloidal characterization of suspended ENMs, particularly properties that determine particle kinetics in an in vitro system (size distribution and formed agglomerate effective density), and 3) robust numerical fate and transport modeling for accurate determination of ENM dose delivered to cells over the course of the in vitro exposure. Here we present such an integrated comprehensive protocol based on such a methodology for in vitro dosimetry, including detailed standardized procedures for each of these three critical steps. The entire protocol requires approximately 6-12 hours to complete. PMID:28102836

  15. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
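
    A minimal data-structure sketch of one of the interaction networks described above: hazards as nodes, typed directed edges for the interaction relationships (triggering, increased probability), and a traversal that enumerates possible cascades from a primary hazard. The hazard names and links here are illustrative assumptions, not the paper's catalogue:

    ```python
    # Directed interaction network: hazard -> list of (relationship, secondary hazard)
    interactions = {
        "earthquake": [("triggers", "landslide"), ("triggers", "tsunami")],
        "landslide":  [("triggers", "flood")],  # e.g. valley damming (assumed)
        "storm":      [("increases_probability", "landslide")],
    }

    def cascades(primary, chain=None, seen=None):
        """Enumerate interaction chains (cascades) reachable from a primary hazard."""
        chain = chain or [primary]
        seen = seen or {primary}
        out = [chain]
        for rel, nxt in interactions.get(primary, []):
            if nxt not in seen:  # avoid cycles
                out += cascades(nxt, chain + [f"--{rel}-->", nxt], seen | {nxt})
        return out

    for c in cascades("earthquake"):
        print(" ".join(c))
    ```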

  16. Pareto frontier analyses based decision making tool for transportation of hazardous waste.

    PubMed

    Das, Arup; Mazumder, T N; Gupta, A K

    2012-08-15

    Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional hazardous waste management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that accounts for the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology. Copyright © 2012 Elsevier B.V. All rights reserved.
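
    At its core, the a posteriori step above reduces to filtering the non-dominated alternatives; a minimal Pareto filter over hypothetical (risk, cost) route options (illustrative, not the authors' implementation):

    ```python
    def dominates(b, a):
        """True if option b is at least as good as a in every objective
        (here: risk, cost; both minimized) and strictly better in one."""
        return (all(x <= y for x, y in zip(b, a))
                and any(x < y for x, y in zip(b, a)))

    def pareto_frontier(options):
        """Return the non-dominated subset of the candidate options."""
        return [a for a in options
                if not any(dominates(b, a) for b in options if b is not a)]

    # Hypothetical routes: (population risk score, transport cost)
    routes = [(0.9, 120.0), (0.4, 150.0), (0.6, 140.0), (0.5, 130.0)]
    print(pareto_frontier(routes))  # -> [(0.9, 120.0), (0.4, 150.0), (0.5, 130.0)]
    ```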

  17. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    The island of Ischia is a large, complex, partly submerged, active volcanic field located about 20 km west of Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline with alternating beaches and tuff/lava cliffs continuously reshaped by weather and sea erosion. Volcano-tectonic processes are a main factor for slope stability, as they produce seismic activity and have generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) characterized by variable strength. In the Campi Flegrei and surrounding areas, the possible occurrence of a moderate/large seismic event represents a serious threat for the inhabitants, for infrastructure, and for the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5 km long fault located below the island's north coast. Both sources are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined; the second, characterized by only a few large historical events, because it is difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the presence of much infrastructure, and relevant archaeological sites of natural and artistic value make this area a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, allowing the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas with the highest susceptibility to landslide occurrence due to seismic effects. The PSLHA combines probability-of-exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, in terms of critical acceleration and dynamic stability factor. Generally, such maps are evaluated for peak ground acceleration, velocity or intensity, which correlate well with damage to anthropic infrastructure (e.g. streets, buildings, etc.). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many works have pointed out that other GM parameters, such as the Arias and Housner intensities and the absolute displacement, could represent a better choice for analysing, for example, cliff stability. The selection of the GM parameter is of crucial importance to obtain the most useful hazard maps. In the last decades, ground motion prediction equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced. The new maps will lead to the identification of the areas with the highest probability of landslides induced by an earthquake. In a strategic site like Ischia, these new methodologies will represent an innovative and advanced tool for landslide hazard mitigation.

  18. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform research and development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all-hazards/all-modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome of this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D and conclusions about the PRISM design.

  19. Determination of Particular Endogenous Fires Hazard Zones in Goaf with Caving of Longwall

    NASA Astrophysics Data System (ADS)

    Tutak, Magdalena; Brodny, Jaroslaw

    2017-12-01

    The hazard of endogenous fires is one of the basic and most commonly present occupational safety hazards in coal mines in Poland and worldwide. This hazard denotes the possibility of coal self-ignition as the result of a self-heating process in a mine heading or its surroundings. In underground coal mining, during the ventilation of operating longwalls, part of the airflow migrates into the goaf with caving. When coal susceptible to self-ignition is present in this goaf, the airflow through it may create favourable conditions for coal oxidation and subsequently for its self-heating and self-ignition. An endogenous fire formed in such conditions can pose a serious hazard to the crew and to the continuity of operation of the mining plant. From the practical point of view, it is very significant to determine the zone in the goaf with caving in which the necessary conditions for the occurrence of an endogenous fire are fulfilled. In real conditions, determination of such a zone is practically impossible. Therefore, the authors developed a methodology for determining this zone based on the results of modelling tests. This methodology includes development of a model of the tested area, determination of boundary conditions, and execution of simulation calculations. Based on the obtained results, the zone particularly endangered by endogenous fire is determined. The basis for developing the model of the investigated region and selecting the boundary conditions are the results of tests in real conditions. In the paper, the fundamental assumptions of the developed methodology are discussed, particularly the assumed hazard criterion and the sealing coefficient of the goaf with caving. A mathematical model of gas flow through porous media is also characterized. An example is presented of the determination of a zone particularly endangered by endogenous fire for a real system of mine headings in one hard coal mine. A longwall ventilated in the "Y" system was subjected to the tests. For the given mining-geological conditions, the critical values of airflow velocity and oxygen concentration in the goaf that condition the initiation of the coal oxidation process were determined. The calculations used ANSYS Fluent software, based on the finite volume method, which makes it possible to determine very precisely the physical and chemical parameters of the air at any point of the tested mine heading and goaf with caving. Such precise determination of these parameters on the basis of tests in real conditions is practically impossible. The obtained results allow proper actions to be taken early in order to limit the occurrence of endogenous fires. One can conclude that the presented methodology creates great possibilities for the practical application of modelling tests to improve the state of occupational safety in mines.

  20. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.
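
    As context for the 100- and 500-year events mentioned above, return periods convert to exceedance probabilities by elementary arithmetic; the following sketch uses the standard assumption of independent annual trials and is not specific to the Seaside study.

    ```python
    def exceedance_probability(return_period_yr, horizon_yr=1):
        """Probability of at least one exceedance within the horizon,
        assuming independent annual trials with p = 1/T."""
        p_annual = 1.0 / return_period_yr
        return 1.0 - (1.0 - p_annual) ** horizon_yr

    # A "100-year" event has a ~26% chance of occurring at least once
    # over 30 years; a "500-year" event, ~6%.
    print(round(exceedance_probability(100, 30), 3))  # 0.26
    print(round(exceedance_probability(500, 30), 3))  # 0.058
    ```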

  1. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site-specific data.

  2. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    NASA Astrophysics Data System (ADS)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  3. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
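
    The interaction networks (cascades) described here map naturally onto a directed graph whose edges are triggering relationships. The sketch below is illustrative only; the hazards and links are invented examples, not entries from the authors' matrices.

    ```python
    # Directed triggering relationships: hazard -> hazards it can trigger.
    # Illustrative entries only; a real matrix would be populated from
    # literature review, field observations and interviews as in the paper.
    triggers = {
        "earthquake": ["landslide", "tsunami"],
        "landslide": ["flood"],          # e.g. via landslide dam failure
        "storm": ["flood", "landslide"],
    }

    def cascades(start, triggers, path=None):
        """Enumerate all triggering chains (cascades) from a primary hazard."""
        path = (path or []) + [start]
        chains = [path]
        for nxt in triggers.get(start, []):
            if nxt not in path:  # avoid cycles
                chains += cascades(nxt, triggers, path)
        return chains

    for chain in cascades("earthquake", triggers):
        print(" -> ".join(chain))
    ```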

  4. SOCIOECONOMIC ANALYSIS OF HAZARDOUS WASTE MANAGEMENT ALTERNATIVES: METHODOLOGY AND DEMONSTRATION

    EPA Science Inventory

    A methodology for analyzing economic and social effects of alternatives in hazardous waste management is presented and demonstrated. The approach includes the use of environmental threat scenarios and evaluation of effects on and responses by parties-at-interest. The methodology ...

  5. Evaluation of Fast-Time Wake Vortex Models using Wake Encounter Flight Test Data

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat N.; VanValkenburg, Randal L.; Bowles, Roland L.; Limon Duparcmeur, Fanny M.; Gloudesman, Thijs; van Lochem, Sander; Ras, Eelco

    2014-01-01

    This paper describes a methodology for the integration and evaluation of fast-time wake models with flight data. The National Aeronautics and Space Administration conducted detailed flight tests in 1995 and 1997 under the Aircraft Vortex Spacing System Program to characterize wake vortex decay and wake encounter dynamics. In this study, data collected during Flight 705 were used to evaluate NASA's fast-time wake transport and decay models. Deterministic and Monte-Carlo simulations were conducted to define wake hazard bounds behind the wake generator. The methodology described in this paper can be used for further validation of fast-time wake models using en-route flight data, and for determining wake turbulence constraints in the design of air traffic management concepts.

  6. Installation Restoration Program Records Search for Kingsley Field, Oregon.

    DTIC Science & Technology

    1982-06-01

    Hazard Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review...Installation History D. Industrial Facilities E. POL Storage Tanks F. Abandoned Tanks G. Oil/Water Separators H. Site Hazard Rating Methodology I. Site...and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.

  7. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzel, Eric M.

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  8. A Guide to the Application of Probability Risk Assessment Methodology and Hazard Risk Frequency Criteria as a Hazard Control for the Use of the Mobile Servicing System on the International Space Station

    NASA Astrophysics Data System (ADS)

    D'silva, Oneil; Kerrison, Roger

    2013-09-01

    A key feature for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned space. The principal scope of the paper is to evaluate the use of industry-accepted probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probabilistic risk assessment methodology and hazard risk frequency criteria in order to apply the necessary safety controls that allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probability risk assessments. The paper concludes with suggestions for the incorporation of existing industry risk/safety plans to create an applicable safety process for future activities/programs.

  9. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  10. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software was either a potential cause or was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.

  11. Methodology for the Assessment of the Ecotoxicological Potential of Construction Materials

    PubMed Central

    Rodrigues, Patrícia; Silvestre, José D.; Flores-Colen, Inês; Viegas, Cristina A.; de Brito, Jorge; Kurad, Rawaz; Demertzi, Martha

    2017-01-01

    Innovation in construction materials (CM) implies changing their composition by incorporating raw materials, usually non-traditional ones, which confer the desired characteristics. However, this practice may carry unknown risks. This paper discusses the ecotoxicological potential associated with raw and construction materials, and proposes and applies a methodology for the assessment of that potential. This methodology is based on existing legislation, such as Regulation (European Commission) No. 1907/2006 (REACH—Registration, Evaluation, Authorization and Restriction of Chemicals) and Regulation (European Commission) No. 1272/2008 (CLP—Classification, Labelling and Packaging). Its application and validation showed that a raw material without clear evidence of ecotoxicological potential, but with some ability to release chemicals, can lead to the formulation of a CM that is slightly less hazardous in terms of chemical characterization but has a slightly higher ecotoxicological potential than the raw materials. The proposed methodology can be a useful tool for the development and manufacturing of products and for the design-stage choice of the most appropriate CM, aiming at the reduction of their environmental impact and contributing to construction sustainability. PMID:28773011

  12. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.; Winner, David R.

    2010-01-01

    This article describes the processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI): specific components, sub-assemblies, and/or piece parts of the NASA Ares I Upper Stage (US) project. SCIs will be designed, developed and/or procured by Boeing as end items or off-the-shelf items. Objectives include explaining the methodology, tools, stakeholders and products involved in the development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the Ares I Upper Stage project.

  13. Occupational hazards among the abattoir workers associated with noncompliance to the meat processing and waste disposal laws in Malaysia

    PubMed Central

    Abdullahi, Auwalu; Hassan, Azmi; Kadarman, Norizhar; Junaidu, Yakubu Muhammad; Adeyemo, Olanike Kudrat; Lua, Pei Lin

    2016-01-01

    Purpose This study investigates the occupational hazards among abattoir workers associated with noncompliance with the meat processing and waste disposal laws in Terengganu State, Malaysia. Occupational hazards are a major source of morbidity and mortality among animal workers because of exposure to many hazardous situations in their daily practices. Occupational infections contracted by abattoir workers are mostly caused by iatrogenic or transmissible agents, including viruses, bacteria, fungi, and parasites, and the toxins produced by these organisms. Materials and methods The methodology was based on a cross-sectional survey using a cluster sampling technique in the four districts of Terengganu State, Malaysia. One hundred and twenty-one abattoir workers from five abattoirs were assessed using a validated structured questionnaire and an observation checklist. Results The mean (standard deviation) occupational hazard score of the workers was 2.32 (2.721). Physical, chemical, biological, psychosocial, musculoskeletal, and ergonomic hazards were the major findings of this study. The most prevalent occupational hazards identified among the workers were injury by sharp equipment such as knives (20.0%), noise exposure (17.0%), and offensive odor within the abattoir premises (12.0%). Conclusion The major occupational hazards encountered by the workers in the study area were physical, chemical, biological, psychosocial, musculoskeletal, and ergonomic hazards. To ensure proper control of occupational health hazards among abattoir workers, standard design and good environmental hygiene must be taken into consideration at all times. An exposure control plan, which includes risk identification, risk characterization, assessment of workers at risk, risk control, workers’ education/training, and implementation of safe work procedures, should be implemented by the government, and all existing laws governing abattoir operation in the country should be enforced. PMID:27471416

  14. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  15. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  16. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  17. Concept Attainment Teaching Methodology (CATM)--An Effective Approach for Training Workers on Chemicals Health Hazards

    ERIC Educational Resources Information Center

    Suleiman, Abdulqadir Mohamad

    2016-01-01

    Workers handling chemicals need to understand the health risks involved in their work, and this requires training. In this study, the effectiveness of the concept attainment teaching methodology (CATM) as a training strategy for cleaning workers was assessed. CATM was used to train workers on chemical information and health hazards. Pictures, illustrations,…

  18. SCI Hazard Report Methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  19. [A preliminary mapping methodology for occupational hazards and biomechanical risk evaluation: presentation of a simple, computerized tool kit for ergonomic hazards identification and risk assessment].

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2011-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. The possible users of the toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; occupational health and safety specialists. In accordance with the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (an application document guiding the potential user), our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel). The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  1. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
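
    Step (c), determining component damage states by Monte Carlo simulation against fragility curves, can be sketched as follows. A lognormal fragility form is assumed here for illustration, and the medians, dispersions, and demand distribution are invented numbers, not values from the paper.

    ```python
    import numpy as np
    from scipy.stats import lognorm

    rng = np.random.default_rng(42)

    def damage_probability(demand, median, beta):
        """Lognormal fragility: P(damage | demand) with median capacity
        `median` and logarithmic standard deviation `beta`."""
        return lognorm.cdf(demand, s=beta, scale=median)

    # Illustrative component: median capacity 1.5 g floor acceleration,
    # dispersion 0.4; demands would come from response-history analyses.
    demands = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=10_000)
    p_fail = damage_probability(demands, median=1.5, beta=0.4)
    damaged = rng.random(10_000) < p_fail  # Monte Carlo damage-state draw
    print(f"Estimated damage probability: {damaged.mean():.3f}")
    ```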

  2. A methodology for physically based rockfall hazard assessment

    NASA Astrophysics Data System (ADS)

    Crosta, G. B.; Agliardi, F.

    Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls makes hazard definition more difficult than for other slope instabilities with minimal runout, and rockfall hazard assessment involves complex definitions of "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at regional and local scales, both along linear features and within exposed areas. An objective approach based on three-dimensional matrixes providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrixes has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.
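
    The GIS combination described above, overlaying per-cell maps of transit frequency, velocity, and block height into a positional hazard index, reduces to a raster operation. The sketch below is schematic: the class thresholds and the digit-encoding of the index are placeholders, not the classes of the paper's 3-D matrixes.

    ```python
    import numpy as np

    def classify(raster, thresholds):
        """Map continuous values to ordinal classes 1..len(thresholds)+1."""
        return np.digitize(raster, thresholds) + 1

    def rockfall_hazard_index(frequency, velocity, height):
        """Positional index from three per-cell 3-D model outputs.
        Thresholds are illustrative placeholders."""
        f = classify(frequency, [1, 10])       # passages per cell
        v = classify(velocity, [5, 15])        # m/s
        h = classify(height, [1, 3])           # m
        return f * 100 + v * 10 + h            # e.g. 231 = f2, v3, h1

    freq = np.array([[0.0, 4.0], [12.0, 50.0]])
    vel = np.array([[0.0, 8.0], [20.0, 25.0]])
    hgt = np.array([[0.0, 0.5], [2.0, 5.0]])
    print(rockfall_hazard_index(freq, vel, hgt))
    ```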

  3. An index-based method to assess risks of climate-related hazards in coastal zones: The case of Tetouan

    NASA Astrophysics Data System (ADS)

    Satta, Alessio; Snoussi, Maria; Puddu, Manuela; Flayou, Latifa; Hout, Radouane

    2016-06-01

    The regional risk assessment carried out within the ClimVar & ICZM Project identified the coastal zone of Tetouan as a hotspot of the Mediterranean Moroccan coast, so it was chosen for the application of the Multi-Scale Coastal Risk Index for Local Scale (CRI-LS). The local-scale approach provides a useful tool for local coastal planning and management by exploring the effects and extents of the hazards and combining hazard, vulnerability and exposure variables in order to identify areas where the risk is relatively high. The coast of Tetouan is one of the most rapidly and densely urbanized coastal areas in Morocco and is characterized by an erosive shoreline. Local authorities are facing the complex task of balancing development and managing coastal risks, especially coastal erosion and flooding, while preparing for the unavoidable impacts of climate change. The first phase of the application of the CRI-LS methodology to Tetouan consisted of defining the coastal hazard zone, which results from overlaying the erosion hazard zone and the flooding hazard zone. Nineteen variables were chosen to describe the hazard, vulnerability and exposure factors. The scores corresponding to each variable were calculated and the weights assigned through expert judgement elicitation. The resulting values are hosted in a geographic information system (GIS) platform that enables the individual variables and aggregated risk scores to be color-coded and mapped across the coastal hazard zone. The results indicated that 10% and 27% of the investigated littoral fall under very high and high vulnerability, respectively, because of the combination of high erosion rates with high capital land use. The risk map showed that some areas, especially the flood plains of Restinga, Smir and Martil-Alila, extending more than 5 km from the coast, are characterized by high levels of risk due to the low topography of the flood plains and the high values of exposure. The CRI-LS provides a set of maps that allow identifying areas within the coastal hazard zone with relatively higher risk from climate-related hazards. The method can be used to support the coastal planning and management process in selecting the most suitable adaptation measures.
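
    The core aggregation step of an index such as the CRI-LS, expert-weighted variable scores combined into hazard, vulnerability, and exposure factors, reduces to a weighted average. The sketch below is schematic; the variable names, scores, weights, and the final multiplicative aggregation are illustrative assumptions, not the published CRI-LS formulation.

    ```python
    def weighted_score(scores, weights):
        """Weighted average of variable scores (all on a common scale)."""
        total_w = sum(weights.values())
        return sum(scores[k] * weights[k] for k in scores) / total_w

    # Illustrative subset of the 19 variables used at Tetouan
    hazard = weighted_score({"wave_exposure": 4, "erosion_rate": 5},
                            {"wave_exposure": 0.4, "erosion_rate": 0.6})
    vulnerability = weighted_score({"elevation": 5, "land_cover": 3},
                                   {"elevation": 0.7, "land_cover": 0.3})
    exposure = weighted_score({"population": 4, "land_use_value": 5},
                              {"population": 0.5, "land_use_value": 0.5})

    # Risk as the product of the three factors (one common convention;
    # the exact CRI-LS aggregation function is not reproduced here).
    risk = hazard * vulnerability * exposure
    print(round(risk, 2))
    ```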

  4. Comparative risk assessments for the city of Pointe-à-Pitre (French West Indies): earthquakes and storm surge

    NASA Astrophysics Data System (ADS)

    Reveillere, A. R.; Bertil, D. B.; Douglas, J. D.; Grisanti, L. G.; Lecacheux, S. L.; Monfort, D. M.; Modaressi, H. M.; Müller, H. M.; Rohmer, J. R.; Sedan, O. S.

    2012-04-01

    In France, risk assessments for natural hazards are usually carried out separately, and decision makers lack comprehensive information. Moreover, since the cause of the hazard (e.g. meteorological, geological) and the physical phenomenon that causes damage (e.g. inundation, ground shaking) may be fundamentally different, the quantitative comparison of single-risk assessments that were not conducted in a compatible framework is not straightforward. Comprehensive comparative risk assessments exist in a few other countries. For instance, the Risk Map Germany project has developed and applied a methodology for quantitatively comparing the risk of relevant natural hazards at various scales (city, state) in Germany. The present on-going work applies a similar methodology to the Pointe-à-Pitre urban area, which represents more than half of the population of Guadeloupe, an overseas region in the French West Indies. Relevant hazards as well as hazard intensity levels differ from continental Europe, which leads to different conclusions. The French West Indies are prone to a large number of hazards, among which hurricanes, volcanic eruptions and earthquakes dominate. Hurricanes cause damage through three phenomena: wind, heavy rainfall and storm surge, the latter having had a preeminent role during the largest historical event in 1928. Seismic risk is characterized by many induced phenomena, among which earthquake shocks dominate. This study proposes a comparison of earthquake and cyclonic storm surge risks. Losses corresponding to hazard intensities having the same probability of occurrence are calculated. They are quantified in a common loss unit, chosen to be direct economic losses. Intangible or indirect losses are not considered. The methodology therefore relies on (i) a probabilistic hazard assessment, (ii) a loss ratio estimation for the exposed elements and (iii) an economic estimation of these assets. Storm surge hazard assessment is based on the selection of relevant historical cyclones and on the simulation of the associated wave and cyclonic surge. The combined local sea elevations, called "set-up", are then fitted with a statistical distribution in order to obtain their return-period characteristics. Several run-ups are then extracted, the inundation areas are calculated and the relative losses of the affected assets are deduced. The probabilistic seismic hazard assessment and the location and seismic vulnerability of the exposed elements result from past public risk assessment studies. The loss estimations are computed for several return periods, expressed as the percentage of buildings in a given EMS-98 damage state per grid block, which is then converted into a loss ratio. In parallel, an asset estimation is conducted. It is mainly focused on private housing, but it considers some major public infrastructures as well. The final outcome of this work is a direct economic loss-frequency plot for earthquake and storm surge. The Probable Maximum Loss and the Average Annual Loss derive from this risk curve. In addition, different sources of uncertainty are identified through the loss estimation process. The full propagation of these uncertainties can provide an interval of confidence, which can be assigned to the risk curve, and we show how such additional information can be useful for risk comparison.
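
    The Average Annual Loss mentioned at the end is the area under the loss-frequency curve. A minimal numeric sketch follows; the frequencies and losses are invented, not results of the Pointe-à-Pitre study.

    ```python
    import numpy as np

    # Annual exceedance frequencies and corresponding losses (M EUR).
    # Values are illustrative only.
    freq = np.array([1/50, 1/100, 1/500, 1/1000])   # events per year
    loss = np.array([10.0, 80.0, 400.0, 700.0])

    # AAL = integral of loss over exceedance frequency (trapezoidal rule);
    # frequency decreases along the curve, so take the absolute value.
    aal = abs(np.sum((loss[1:] + loss[:-1]) / 2 * np.diff(freq)))
    print(f"Average Annual Loss ~ {aal:.2f} M EUR/yr")
    ```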

  5. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
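
    The per-pixel computation described above, a sliding-block displacement estimate for each peak-ground-acceleration bin of the hazard curve, accumulated into exceedance rates for each displacement threshold, can be sketched as follows. The displacement predictor, bin rates, and critical acceleration are generic placeholders, not the regression or data used in the study.

    ```python
    import numpy as np

    def displacement_cm(pga_g, ac_g):
        """Placeholder Newmark-type predictor: displacement grows with the
        ratio of PGA to the slope's critical (yield) acceleration ac."""
        ratio = np.maximum(pga_g / ac_g, 1.0)
        return 5.0 * (ratio - 1.0) ** 2  # illustrative functional form

    # Hazard curve discretized into PGA bins with annual occurrence rates
    pga_bins = np.array([0.1, 0.2, 0.4, 0.8])      # g
    rates = np.array([1e-2, 3e-3, 8e-4, 1e-4])     # events/yr per bin

    ac = 0.15  # critical acceleration of the pixel's slope, g (assumed)
    disp = displacement_cm(pga_bins, ac)

    for threshold in (10.0, 30.0, 100.0):          # cm (0.1, 0.3, 1.0 m)
        rate_exceed = rates[disp > threshold].sum()
        print(f"rate(D > {threshold:5.1f} cm) ~ {rate_exceed:.1e} per year")
    ```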

  6. Risk assessment of oil and gas well drilling activities in Iran - a case study: human factors.

    PubMed

    Amir-Heidari, Payam; Farahani, Hadi; Ebrahemzadih, Mehrzad

    2015-01-01

    Oil and gas well drilling activities are associated with numerous hazards that have the potential to cause injury or harm to people, property and the environment. These hazards are also a threat to the reputation of drilling companies. To prevent accidents and undesired events in drilling operations, it is essential to identify, evaluate, assess and control the attendant risks. In this work, a structured methodology is proposed for the risk assessment of drilling activities. A case study is performed to identify, analyze and assess the risks arising from human factors in one of the onshore drilling sites in southern Iran. A total of 17 major hazards were identified and analyzed using the proposed methodology. The results showed that the residual risks of 100% of these hazards were in the acceptable or transitional zone, and their levels were expected to be lowered further by proper controls. This structured methodology may also be used in other drilling sites and companies for assessing risks.
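
    A sketch of the kind of semi-quantitative scoring such structured methodologies rely on is shown below; the 5x5 matrix, the acceptability bands, and the example hazard are generic illustrations, not the exact scheme of the cited study.

    ```python
    def risk_level(likelihood, severity):
        """Classic 5x5 risk matrix: score = likelihood x severity,
        banded into acceptable / transitional (ALARP) / unacceptable."""
        score = likelihood * severity
        if score <= 6:
            return score, "acceptable"
        if score <= 12:
            return score, "transitional (ALARP)"
        return score, "unacceptable"

    # Hypothetical human-factor hazard on a drilling site, before and
    # after controls (training, supervision) reduce the likelihood.
    print(risk_level(likelihood=4, severity=4))  # (16, 'unacceptable')
    print(risk_level(likelihood=2, severity=4))  # (8, 'transitional (ALARP)')
    ```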

  7. Assessment of the vulnerability and the resilience of the population at risk of multi-hazard: a support to geo-risk management in Central Africa

    NASA Astrophysics Data System (ADS)

    Michellier, Caroline; Kervyn, François; Tréfon, Théodore; Wolff, Eléonore

    2013-04-01

    GeoRisCA is a project that aims at studying geo-risk in the Kivu region (DRC, Rwanda, Burundi) in order to support risk management. The approach developed in GeoRisCA combines methodologies from various disciplines, allowing the analysis of seismic, volcanic and mass-movement hazards and the assessment of the vulnerability of the threatened elements. Vulnerability is a complex concept that is commonly defined as the susceptibility of the population, the infrastructures and the natural ecosystems to suffer damage if a hazard occurs. The densely populated area extending from the North Kivu province in the Democratic Republic of the Congo (DRC) to northern Burundi and eastern Rwanda is vulnerable to several geohazards, such as landslides triggered by geodynamic processes (climate, seismicity, volcanism) and possibly worsened by anthropogenic actions. Located in the East African rift valley, the region is also characterized by strong seismicity, with increasing numbers of people and infrastructure exposed. In addition, eastern DRC hosts the two most active African volcanoes: Nyiragongo and Nyamulagira. Their activity can have serious impacts, as in 2002 when Nyiragongo directly endangered the ~800,000 inhabitants of Goma city, located ~15 km to the south. Linked to passive volcanic degassing, SO2 and CO2 discharge may also increase the population's vulnerability (morbidity, mortality). Focusing specifically on this region, the vulnerability assessment methodology developed in GeoRisCA takes into account the "exposure to perturbations" and the "adaptive capacity or resilience" of the vulnerable systems. On the one hand, the exposure is identified as the potential degree of loss of a given element or set of elements at risk, i.e., the susceptibility of people, infrastructures and buildings with respect to a hazard (social vulnerability). It focuses mainly on land use and on demographic and socio-economic factors that increase or attenuate the impacts of hazard events on local populations. On the other hand, the resilience of the individual, the household or the community is its adaptive capacity to absorb disturbance and reorganize into a fully functioning system through anticipation, response, adaptation and recovery. A key contribution of the GeoRisCA project is to assess the vulnerability to different geohazards by integrating geographic and temporal variability. This methodology takes into account the specificities highlighted at the regional and the local scale (urban sites). It also considers that vulnerability evolves with time, e.g. due to improved education, increased income, denser social networks and the evolution of coping mechanisms. Using the methodology described above, one of the main objectives of GeoRisCA is to develop vulnerability maps that, once associated with geohazard data, will provide decision-making tools for existing preparedness and mitigation institutions.

  8. Differences in experiences in rockfall hazard mapping in Switzerland and Principality of Andorra

    NASA Astrophysics Data System (ADS)

    Abbruzzese, J.; Labiouse, V.

    2009-04-01

    The need to cope with rockfall hazard and risk has led many countries to adopt their own strategies for hazard mapping and risk management, based on their specific social and political constraints. The experience of each country in facing this challenge provides useful information and possible approaches for evaluating rockfall hazard and risk. Moreover, with particular regard to the hazard mapping process, some important points are common to many methodologies in Europe, especially the use of rockfall intensity-frequency diagrams to define specific hazard levels. This aspect could suggest a starting point for comparing and possibly harmonising existing methodologies. On the other hand, the results obtained from methodologies used in different countries may be difficult to compare, first because the existing national guidelines are established as a consequence of what has been learned in each country from dealing with past rockfall events. In particular, diverse social and political considerations influence the definition of the threshold values of the parameters that determine a given degree of hazard, and eventually the type of land use accepted for each hazard level. Therefore, a change in the threshold values for rockfall intensity and frequency is already enough to produce completely different zoning results, even if the same methodology is applied. In relation to this issue, the paper introduces some of the current challenges and difficulties in comparing hazard mapping results in Europe and, subsequently, in developing a common standard procedure to assess rockfall hazard. The present work is part of an on-going research project whose aim is to improve methodologies for rockfall hazard and risk mapping at the local scale, in the framework of the European Project "Mountain Risks: from prediction to management and governance", funded by the European Commission. As a reference, two approaches are considered, proposed in Switzerland and in the Principality of Andorra, respectively. At first, the guidelines applied in the two countries are outlined, showing in which ways the corresponding procedures differ. For this purpose, in both cases, the main philosophy in facing rockfall hazard is discussed, together with its consequences in terms of the resulting intensity-frequency threshold values proposed to determine different classes of hazard. Then, a simple case study carried out in Switzerland, in the Canton of Valais, shows an application of the discussed theoretical issues by means of a comparison between the two approaches. A rockfall hazard mapping is performed on a 2D slope profile, following both the Swiss energy-probability threshold values and the ones used in the Principality of Andorra. The analysis of the results introduces some consequences that the criteria for defining classes of hazard may have on land-use planning, depending on which guidelines are applied in a study site. This aspect involves not only differences in zoning concerning the extent of the areas in danger, but also the influence on land use that the same hazard level may have, depending on which threshold values for rockfall intensity and frequency are used. These considerations underline the role that social and political decisions can play in the hazard assessment process, on the basis of the experiences and understanding of each country in this field. More precisely, it is rather evident that a possible comparison and/or harmonisation of hazard mapping results is closely linked to this aspect as well, and not only to more technical matters, such as computing and mapping techniques.

  9. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  10. Ten Years Experience In Geo-Databases For Linear Facilities Risk Assessment (Lfra)

    NASA Astrophysics Data System (ADS)

    Oboni, F.

    2003-04-01

    Keywords: geo-environmental, database, ISO14000, management, decision-making, risk, pipelines, roads, railroads, loss control, SAR, hazard identification ABSTRACT: During the past decades, characterized by the development of the Risk Management (RM) culture, a variety of different RM models have been proposed by governmental agencies in various parts of the world. The most structured models appear to have originated in the field of environmental RM. These models are briefly reviewed in the first section of the paper, focusing attention on the difference between hazard management and risk management and on the need to use databases in order to allow retrieval of specific information and effective updating. The core of the paper reviews a number of different RM approaches, based on extensions of geo-databases, specifically developed for linear facilities (LF) in transportation corridors since the early 1990s in Switzerland, Italy, Canada, the US and South America. The applications are compared in terms of methodology, capabilities and the resources necessary for their implementation. The paper then focuses on the level of detail that applications and related data have to attain. Common pitfalls related to decision making based on hazards rather than on risks are discussed. The last sections describe the next generation of linear facility RA applications, including examples of results and a discussion of future methodological research. It is shown that geo-databases should be linked to loss control and accident reports in order to maximize their benefits. The links between RA and ISO 14000 (the environmental management code) are explicitly considered.

  11. Software System Architecture Modeling Methodology for Naval Gun Weapon Systems

    DTIC Science & Technology

    2010-12-01

    Weapon System HAR Hazard Action Report HERO Hazards of Electromagnetic Radiation to Ordnance IOC Initial Operational Capability... radiation to ordnance; and combinations therein. Equipment, systems, or procedures and processes whose malfunction would hazard the safe manufacturing...NDI Non-Development Item OPEVAL Operational Evaluation ORDALTS Ordnance Alterations O&SHA Operating and Support Hazard Analysis PDA

  12. Lawrence Livermore National Laboratory Site Seismic Safety Program: Summary of Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J B; Foxall, W

    The Lawrence Livermore National Laboratory (LLNL) Site Seismic Safety Program was conceived in 1979 during the preparation of the site Draft Environmental Impact Statement. The impetus for the program came from the development of new methodologies and geologic data that affect assessments of geologic hazards at the LLNL site; it was designed to develop a new assessment of the seismic hazard to the LLNL site and LLNL employees. Secondarily, the program was also intended to provide the technical information needed to make ongoing decisions about design criteria for future construction at LLNL and about the adequacy of existing facilities. This assessment was intended to be of the highest technical quality and to make use of the most recent and accepted hazard assessment methodologies. The basic purposes and objectives of the current revision are similar to those of the previous studies. Although all the data and experience assembled in the previous studies were utilized to their fullest, the large quantity of new information and new methodologies led to the formation of a new team that includes LLNL staff and outside consultants from academia and private consulting firms. A peer-review panel composed of individuals from academia (A. Cornell, Stanford University), the Department of Energy (DOE; Jeff Kimball), and consulting (Kevin Coppersmith) provided review and guidance. This panel was involved from the beginning of the project in a "participatory" type of review. The Senior Seismic Hazard Analysis Committee (SSHAC, a committee sponsored by the U.S. Nuclear Regulatory Commission, DOE, and the Electric Power Research Institute) strongly recommends the use of participatory reviews, in which the reviewers follow the progress of a project from the beginning, rather than waiting until the end to provide comments (Budnitz et al., 1997). Following the requirements for probabilistic seismic hazard analysis (PSHA) stipulated in the DOE standard DOE-STD-1023-95, a special effort was made to identify and quantify all types of uncertainties. The final seismic hazard estimates were de-aggregated to determine the contribution of all the seismic sources as well as the relative contributions of potential future earthquakes in terms of their magnitudes and distances from the site. It was found that, in agreement with previous studies, the Greenville Fault system contributes the most to the estimate of the seismic hazard expressed in terms of the probability of exceedance of the peak ground acceleration (PGA) at the center of the LLNL site (i.e., at high frequencies). It is followed closely by the Calaveras and Corral Hollow faults. The Mount Diablo thrust and the Springtown and Livermore faults were not considered in the hazard calculations in the 1991 study. In this study they contributed together approximately as much as the Greenville fault. At lower frequencies, more distant faults such as the Hayward and San Andreas faults begin to appear as substantial contributors to the total hazard. The results of this revision are presented in Figures 1 and 2. Figure 1 shows the estimated mean hazard curve in terms of the annual probability of exceedance of the peak ground acceleration (average of the two horizontal orthogonal components) at the LLNL site, assuming that the local site conditions are similar to those of a generic soil. Figure 2 shows the results in terms of the uniform hazard spectra (pseudo-spectral accelerations for 5% damping) for five return periods.
Although this latest revision is based on a completely independent and in many respects very different set of data and methodology from the previous one, it gives essentially the same results for the prediction of the peak ground acceleration (PGA), albeit with a reduced uncertainty. The Greenville fault being a dominant contributor to the hazard, a field investigation was performed to better characterize the probability distribution of the rate of slip on the fault. Samples were collected from a trench located on the northern segment of the Greenville fault, and are in the process of being dated at the LLNL Center for Accelerator Mass Spectrometry (CAMS) using carbon-14. Preliminary results from the dating corroborate the range of values used in the hazard calculations. A final update after completion and qualification (quality assurance) of the dating measurements, in the near future, will finalize the distribution of this important parameter, probably using Bayesian updating.
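
    Hazard curves like the one in Figure 1 are typically interrogated by interpolating the annual probability of exceedance at a target return period. A minimal sketch follows, with an invented hazard curve standing in for the study's results.

    ```python
    import numpy as np

    # Illustrative mean hazard curve: annual probability of exceedance
    # versus PGA (values invented, not those of the LLNL study).
    pga = np.array([0.1, 0.2, 0.4, 0.6, 0.8])            # g
    apoe = np.array([2e-2, 6e-3, 1e-3, 3e-4, 1e-4])      # 1/yr

    def pga_at_return_period(T_yr):
        """Log-log interpolation of the hazard curve at APoE = 1/T."""
        target = 1.0 / T_yr
        return np.exp(np.interp(np.log(target),
                                np.log(apoe[::-1]), np.log(pga[::-1])))

    for T in (500, 1000, 2500):
        print(f"{T}-yr PGA ~ {pga_at_return_period(T):.2f} g")
    ```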

  13. Development of CNC prototype for the characterization of the nanoparticle release during physical manipulation of nanocomposites.

    PubMed

    Gendre, Laura; Marchante, Veronica; Abhyankar, Hrushikesh A; Blackburn, Kim; Temple, Clive; Brighton, James L

    2016-01-01

    This work focuses on the release of nanoparticles from commercially used nanocomposites during machining operations. A reliable and repeatable method was developed to assess the unintentional exposure to nanoparticles, in particular during drilling. This article presents the description and validation of results obtained from a new prototype used for the measurement and monitoring of nanoparticles in a controlled environment. This methodology was compared with the methodologies applied in other studies, and some preliminary experiments on drilling nanocomposites are included. The size, shape and chemical composition of the released nanoparticles were investigated in order to understand their hazard potential. No significant differences were found in the amount of nanoparticles released between samples with and without nanoadditives, and no chemical alteration was observed between the dust generated and the bulk material. Finally, further developments of the prototype are proposed.

  14. Tsunami and shelf resonance on the northern Chile coast

    NASA Astrophysics Data System (ADS)

    Cortés, Pablo; Catalán, Patricio A.; Aránguiz, Rafael; Bellotti, Giorgio

    2017-09-01

    This work presents an analysis of long-wave resonance in two of the main cities along the northern coast of Chile, Arica and Iquique, where a large tsunamigenic potential remains despite recent earthquakes. By combining a modal analysis, solving the equation of free surface oscillations, with the analysis of background spectra derived from in situ measurements, the spatial and temporal structures of the modes are recovered. Comparison with spectra from three tsunamis of different characteristics shows that the modes found have been excited by past events. Moreover, the two locations show different response patterns: Arica is more sensitive to the characteristics of the tsunami source, whereas Iquique shows a smaller dependency and a similar response for different tsunami events. Results are further compared with other methodologies, with good agreement. These findings are relevant for characterizing the tsunami hazard in the area, and the methodology can be extended to other regions along the Chilean coast.

  15. An integrated study to evaluate debris flow hazard in alpine environment

    NASA Astrophysics Data System (ADS)

    Tiranti, Davide; Crema, Stefano; Cavalli, Marco; Deangeli, Chiara

    2018-05-01

    Debris flows are among the most dangerous natural processes affecting the alpine environment, owing to their magnitude (volume of transported material) and long runout. The presence of structures and infrastructure on alluvial fans can lead to severe problems in terms of interactions between debris flows and human activities. Risk mitigation in these areas requires identifying the magnitude, triggers, and propagation of debris flows. Here, we propose an integrated methodology to characterize these phenomena, consisting of three complementary procedures. First, we adopt a classification method based on the propensity of the catchment bedrock to produce clayey-grained material; the classification allows us to identify the most likely rheology of the process. Second, we calculate a sediment connectivity index to estimate the topographic control on the possible coupling between the sediment source areas and the catchment channel network. This step allows for the assessment of the debris supply most likely available for the channelized processes. Finally, with the data obtained in the previous steps, we model the propagation and depositional pattern of debris flows with a 3D code based on Cellular Automata. The results of the numerical runs allow us to identify the depositional patterns and the areas potentially involved in the flow processes. This integrated methodology is applied to a test-bed catchment located in the Northwestern Alps. The results indicate that this approach can be regarded as a useful tool to estimate debris-flow-related hazard scenarios in an alpine environment in an expeditious way, without requiring exhaustive knowledge of the investigated catchment, including data on historical debris flow events.
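    Sediment connectivity indices of the kind referred to above are commonly computed following Borselli et al. (2008) and Cavalli et al. (2013); whether the paper uses exactly this variant is an assumption here. In that formulation,

    $$IC = \log_{10}\!\left(\frac{D_{up}}{D_{dn}}\right), \qquad D_{up} = \bar{W}\,\bar{S}\,\sqrt{A}, \qquad D_{dn} = \sum_i \frac{d_i}{W_i\, S_i},$$

    where $A$ is the upslope contributing area, $W$ a terrain-roughness weighting factor, $S$ the slope, and $d_i$ the length of the $i$-th cell along the downslope flow path; higher $IC$ indicates stronger coupling between source areas and the channel network.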

  16. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, many of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling; examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advances in both software and hardware, and is well structured for implementation with conventional GIS tools.
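    As an illustration of the cell-based source model idea, a hedged sketch (the coordinates, grid, and catalogue duration below are invented, and this is not the author's code): an earthquake catalogue can be rasterized into a per-cell activity-rate layer that later steps of the hazard calculation consume.

```python
import numpy as np

# Hypothetical inputs: epicentre longitudes/latitudes and a study-area grid.
lons = np.array([51.2, 51.4, 51.9, 52.3, 51.5])   # example epicentres
lats = np.array([35.1, 35.6, 35.3, 35.9, 35.2])
catalogue_years = 50.0                             # assumed catalogue duration

# A 0.1-degree raster covering the (made-up) study area.
lon_edges = np.arange(51.0, 53.0 + 0.1, 0.1)
lat_edges = np.arange(35.0, 36.5 + 0.1, 0.1)

# Count events per cell and convert to an annual activity-rate layer.
counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
rate_layer = counts / catalogue_years  # events per cell per year

print(rate_layer.shape, rate_layer.sum())
```

    In a GIS, each such raster layer (activity rate, focal depth, attenuation-model choice, and so on) occupies the same cell grid, so spatial variation and its uncertainty can be handled cell by cell.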

  17. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
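    The risk integral referred to above has a standard generic form; as a hedged sketch in conventional notation (the symbols are illustrative, not taken from the paper),

    $$\lambda_{\mathrm{collapse}} \;=\; \int_0^{\infty} P\!\left(\mathrm{collapse} \mid IM = im\right)\, \left|\, d\lambda_{IM}(im) \,\right|,$$

    where $\lambda_{IM}(im)$ is the ground motion hazard curve, adjusted post-mainshock for the time-decaying aftershock rate, and $P(\mathrm{collapse} \mid im)$ is the structural fragility, adjusted for mainshock-induced damage and its uncertainty.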

  18. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example of the application of an optimization process to select earthquake scenarios that best represent probabilistic earthquake hazard in a given region. The method is based on the simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for Tehran city as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could shorten run times for fully probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computation power. The authors have used this approach for risk assessment towards identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
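    A minimal sketch of the scenario-reduction idea described above, not the authors' formulation: pick at most K scenarios and re-weight their annual rates so the reduced hazard curve tracks the full one. All data are synthetic, PuLP stands in for whatever solver the paper used, and the toy sizes replace the ~84,000-event catalogue.

```python
import numpy as np
import pulp

rng = np.random.default_rng(0)
n_scen, n_lvl, K = 200, 12, 20            # toy sizes (hypothetical)
rates = rng.uniform(1e-4, 1e-2, n_scen)   # synthetic annual rates
im = rng.lognormal(-1.5, 0.6, n_scen)     # synthetic site intensities
levels = np.quantile(im, np.linspace(0.3, 0.99, n_lvl))

# Exceedance indicator and the "full" hazard curve H_full(a) = sum_j rate_j * 1[im_j >= a].
X = (im[:, None] >= levels[None, :]).astype(float)
h_full = [float(v) for v in rates @ X]

prob = pulp.LpProblem("scenario_reduction", pulp.LpMinimize)
y = pulp.LpVariable.dicts("pick", range(n_scen), cat="Binary")
w = pulp.LpVariable.dicts("rate", range(n_scen), lowBound=0)
e = pulp.LpVariable.dicts("err", range(n_lvl), lowBound=0)

prob += pulp.lpSum(e[k] / h_full[k] for k in range(n_lvl))  # relative curve error
prob += pulp.lpSum(y[j] for j in range(n_scen)) <= K        # cardinality budget
big_m = float(rates.sum())
for j in range(n_scen):
    prob += w[j] <= big_m * y[j]          # a scenario carries weight only if selected
for k in range(n_lvl):
    h_red = pulp.lpSum(w[j] * float(X[j, k]) for j in range(n_scen))
    prob += e[k] >= h_full[k] - h_red     # linearize |H_full - H_reduced|
    prob += e[k] >= h_red - h_full[k]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
picked = [j for j in range(n_scen) if y[j].value() > 0.5]
print(len(picked), "scenarios retained")
```

    The two inequalities on e[k] are the standard linearization of the absolute error, which keeps the formulation a mixed-integer *linear* program as the abstract describes.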

  19. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantity the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida’s (the County) residential solid waste (characterized in this study as municipal s...

  20. 12th meeting of the Scientific Group on Methodologies for the Safety Evaluation of Chemicals: susceptibility to environmental hazards.

    PubMed Central

    Barrett, J C; Vainio, H; Peakall, D; Goldstein, B D

    1997-01-01

    The 12th meeting of the Scientific Group on Methodologies for the Safety Evaluation of Chemicals (SGOMSEC) considered the topic of methodologies for determining human and ecosystem susceptibility to environmental hazards. The report prepared at the meeting describes measurement of susceptibility through the use of biological markers of exposure, biological markers of effect, and biomarkers directly indicative of susceptibility of humans or of ecosystems. The utility and validity of these biological markers for the study of susceptibility are evaluated, as are opportunities for developing newer approaches for the study of humans or of ecosystems. For the first time a SGOMSEC workshop also formally considered the issue of ethics in relation to methodology, an issue of particular concern for studies of susceptibility. PMID:9255554

  1. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  2. A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)

    NASA Astrophysics Data System (ADS)

    Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.

    2006-05-01

    Volcanic hazard assessment is of paramount importance for safeguarding the resources exposed to volcanic hazards. In this paper we present ELFM, a lava flow simulation model for the evaluation of lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to the specific volcano. Concerning the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we argue that model validation is needed to assess the effectiveness of the lava flow simulation model; to that end, a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.

  3. Multi-scale landslide hazard assessment: Advances in global and regional methodologies

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang

    2010-05-01

    The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher-resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model at larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contributions and limitations of model inputs and to more effectively communicate model skill for improved landslide hazard assessment.
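    Forecasting frameworks of this kind typically couple a susceptibility map with an empirical rainfall intensity-duration threshold. As a hedged generic form (the coefficients shown are Caine's classic global values, which this particular framework may or may not use),

    $$I = \alpha\, D^{-\beta}, \qquad \text{e.g., } I = 14.82\, D^{-0.39},$$

    with rainfall intensity $I$ in mm/h and duration $D$ in hours; cells whose susceptibility class and observed rainfall jointly exceed the threshold are flagged as potential landslide conditions.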

  4. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5 and 1x10^-5.

  6. A review of multi-risk methodologies for natural hazards: Consequences and challenges for a climate change impact assessment.

    PubMed

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio

    2016-03-01

    This paper presents a review of existing multi-risk assessment concepts and tools, applied by organisations and projects, that provide the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to synthesize information about multiple climate impacts, together with the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process.

  7. A Methodology for Determining Statistical Performance Compliance for Airborne Doppler Radar with Forward-Looking Turbulence Detection Capability

    NASA Technical Reports Server (NTRS)

    Bowles, Roland L.; Buck, Bill K.

    2009-01-01

    The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew, and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effects of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.

  8. When gender bumps into health and safety training: working conditions, readings and challenges drawn from a case study in an industrial chemicals company.

    PubMed

    Vasconcelos, Ricardo; Teixeira, Sandra; Castelhano, Joana; Lacomblez, Marianne

    2012-01-01

    Health, safety and environmental issues are at present a social concern and an increasingly discussed topic in so-called gender studies. This paper focuses on the relations between training, gender and risk perception in an industrial chemicals company in Portugal, characterized by a mainly male population and by the presence of high occupational and environmental hazards. After characterizing the company and the training project that prompted this reflection, the paper presents the reasons for its focus on gender, followed by the essential methodological explanations: 14 interviews were conducted with male and female workers from the company; their content was transcribed from the audio recordings and systematically analyzed. A gender-attentive sociodemographic analysis was also undertaken. Although at the beginning the company did not consider gender issues a problem, nor was gender the central topic of the training, which focused on the prevention of occupational and environmental hazards, the results reveal that the gender factor brought to light some working conditions which so far had not been properly discussed within the group meetings. As a consequence, there is now room for the transformation of the representations of those working conditions.

  9. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    PubMed

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.

  10. Initial source and site characterization studies for the U.C. Santa Barbara campus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archuleta, R.; Nicholson, C.; Steidl, J.

    1997-12-01

    The University of California Campus-Laboratory Collaboration (CLC) project is an integrated 3-year effort involving Lawrence Livermore National Laboratory (LLNL) and four UC campuses - Los Angeles (UCLA), Riverside (UCR), Santa Barbara (UCSB), and San Diego (UCSD) - plus additional collaborators at San Diego State University (SDSU), at Los Alamos National Laboratory, and in industry. The primary purpose of the project is to estimate potential ground motions from large earthquakes and to predict site-specific ground motions for one critical structure on each campus. This project thus combines the disciplines of geology, seismology, geodesy, soil dynamics, and earthquake engineering into a fully integrated approach. Once completed, the CLC project will provide a template to evaluate other buildings at each of the four UC campuses, as well as a methodology for evaluating seismic hazards at other critical sites in California, including other UC locations at risk from large earthquakes. Another important objective of the CLC project is the education of students and other professionals in the application of this integrated, multidisciplinary, state-of-the-art approach to the assessment of earthquake hazard. For each campus targeted by the CLC project, the seismic hazard study will consist of four phases: Phase I - initial source and site characterization; Phase II - drilling, logging, seismic monitoring, and laboratory dynamic soil testing; Phase III - modeling of predicted site-specific earthquake ground motions; and Phase IV - calculations of 3D building response. This report covers Phase I for the UCSB campus and includes results through March 1997.

  11. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events and conduct statistical analysis. The estimation of hurricane landfall probability and hazards is combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.

  12. A method to analyze territory resilience to natural hazards, the example of the French Riviera against tsunami

    NASA Astrophysics Data System (ADS)

    Boschetti, Laurie; Provitolo, Damienne; Tric, Emmanuel

    2017-04-01

    Climate change and major crises have increased during the 21st century, impacting people and leading them to realize that they must protect themselves. That is why scientists, practitioners and institutions are increasingly exploring the concept and methodology of resilience (Dauphiné, Provitolo, 2013). Resilience originally came from material physics and is now developed in different disciplines, such as psychology, ecology, economics and, more recently, geography, and more specifically natural risk analysis. The downside of this multidisciplinary interest is that resilience has become a polysemous concept, with the result that the scientific community has difficulty agreeing on a single definition to characterize it (Reghezza et al. 2015). Our presentation proposes a resilience analysis model for a territory subject to natural hazards; the methodology is then demonstrated on a specific territory, the French Riviera, more precisely the Alpes-Maritimes. As the natural hazard for our study, we chose a tsunami impacting the Alpes-Maritimes coast. This choice was made for several reasons: - This natural hazard is almost absent from the various studies and French official documents, whereas the risk is real in the Mediterranean. Two significant events have occurred in our study area: the first in 1887, following the Ligurian earthquake (Ioualalen et al. 2010); the second in 1979, off Nice airport, produced by a submarine landslide (Migeon, 2011b). These events share a crucial particularity: being near the source, the arrival time is quite short, making any planned escape impossible. We can describe them as flash risks. - The study area, containing the coastal cities of the Alpes-Maritimes, presents many key human and economic assets. - The region has a specific geography, with a territory developed between sea and mountains, a high density along the coast, and an anisotropy of the networks (infrastructure, communication, etc.). Yet we know how essential it is to maintain networks during the recovery after disasters (Lhomme et al. 2010). For this purpose, we relied on the resilience analysis method suggested by the scientific group Resilience Alliance (2010), which came from the human and social sciences. This methodology caught our interest because it takes a systemic approach and allows the temporal dimension of an event (prevention and crisis management) to be included. However, the model presented some limits when translated into the field of risks and disasters. In order to create a model fully functional in this domain, we proposed some changes. The new methodology not only provides a grid for evaluating the reactions of the territory and the population to an event, but also helps determine preventive strategies (ante-catastrophe) and after-disaster recovery strategies (post-catastrophe).

  13. Contributions to the Characterization and Mitigation of Rotorcraft Brownout

    NASA Astrophysics Data System (ADS)

    Tritschler, John Kirwin

    Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.

  14. A national scale flood hazard mapping methodology: The case of Greece - Protection and adaptation policy approaches.

    PubMed

    Kourgialas, Nektarios N; Karatzas, George P

    2017-12-01

    The present work introduces a national-scale flood hazard assessment methodology using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor-maps (directly related to flood generation) were combined in a GIS environment. These factor-maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C), giving the proposed method its name, "FLASERC". The flood hazard for each of these factors is classified into five categories: very low, low, moderate, high, and very high. The factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distribution of historical flood points in Greece was combined with the five flood hazard categories of the aforementioned seven factor-maps. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very high flood hazard areas.
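    A minimal sketch of the training step as described above, under stated assumptions: scikit-learn's MLPClassifier stands in for the paper's unspecified ANN tool, and all arrays are illustrative random data rather than the Greek factor-maps.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_pts = 500  # hypothetical training locations sampled from the factor-maps

# Each row holds the hazard category (1=very low ... 5=very high) of the seven
# factor-maps F, L, A, S, E, R, C at one location (illustrative random data).
X = rng.integers(1, 6, size=(n_pts, 7))

# Target: overall flood hazard class at historical flood / non-flood points.
y = rng.integers(1, 6, size=n_pts)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Classify new grid cells by stacking their seven factor categories.
new_cells = rng.integers(1, 6, size=(3, 7))
print(clf.predict(new_cells))
```

    Applied cell by cell over the national grid, the trained classifier yields the overall flood hazard map that is then checked against independent historical events.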

  15. Exposure to fall hazards and safety climate in the aircraft maintenance industry.

    PubMed

    Neitzel, Richard L; Seixas, Noah S; Harris, Michael J; Camp, Janice

    2008-01-01

    Falls represent a significant occupational hazard, particularly in industries with dynamic work environments. This paper describes rates of noncompliance with fall hazard prevention requirements, perceived safety climate and worker knowledge and beliefs, and the association between fall exposure and safety climate measures in commercial aircraft maintenance activities. Walkthrough observations were conducted on aircraft mechanics at two participating facilities (Sites A and B) to ascertain the degree of noncompliance. Mechanics at each site completed questionnaires concerning fall hazard knowledge, personal safety beliefs, and safety climate. Questionnaire results were summarized into safety climate and belief scores by workgroup and site. Noncompliance rates observed during walkthroughs were compared to the climate-belief scores, and were expected to be inversely associated. Important differences were seen in fall safety performance between the sites. The study provided a characterization of aircraft maintenance fall hazards, and also demonstrated the effectiveness of an objective hazard assessment methodology. Noncompliance varied by height, equipment used, location of work on the aircraft, shift, and by safety system. Although the expected relationship between safety climate and noncompliance was seen for site-average climate scores, workgroups with higher safety climate scores had greater observed noncompliance within Site A. Overall, use of engineered safety systems had a significant impact on working safely, while safety beliefs and climate also contributed, though inconsistently. The results of this study indicate that safety systems are very important in reducing noncompliance with fall protection requirements in aircraft maintenance facilities. Site-level fall safety compliance was found to be related to safety climate, although an unexpected relationship between compliance and safety climate was seen at the workgroup level within site. Finally, observed fall safety compliance was found to differ from self-reported compliance.

  16. Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco

    2017-11-01

    Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows for a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveform modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first-ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with an input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology has evolved with respect to the original formulation used by Parvez et al.: the computer codes were improved to better fit the need of producing realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant advances in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.
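    As a rough illustration of how Dmax, Vmax and a peak-acceleration proxy can be extracted from a simulated signal (a toy analytic waveform stands in for an NDSHA synthetic seismogram, and this is not the study's code):

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Illustrative synthetic acceleration time series (m/s^2), 0.01 s sampling.
dt = 0.01
t = np.arange(0.0, 20.0, dt)
acc = np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.5 * t)

# Integrate acceleration -> velocity -> displacement.
vel = cumulative_trapezoid(acc, dx=dt, initial=0.0)
disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)

# Peak hazard parameters extracted from the signal, one per grid node.
a_peak = np.abs(acc).max()   # peak acceleration (DGA is derived from spectra in practice)
vmax = np.abs(vel).max()     # maximum velocity
dmax = np.abs(disp).max()    # maximum displacement
print(a_peak, vmax, dmax)
```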

  17. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate the damage produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment, and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated with the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment and complemented with an analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in the event of an eruption from either the Teide-Pico Viejo volcanic complex or the North-West basaltic rift. The results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
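    The overlay step can be pictured as a lookup in a damage matrix. A hedged sketch follows; the classes, ratings, and rasters are invented for illustration and are not the paper's calibration.

```python
import numpy as np

# Qualitative damage rating for each (hazard class, vulnerability class) pair.
# Rows: hazard intensity 0..3 (none..high); columns: vulnerability 0..2 (low..high).
# Values 0..3 stand for none/light/moderate/heavy damage (illustrative only).
damage_matrix = np.array([
    [0, 0, 0],
    [0, 1, 2],
    [1, 2, 3],
    [2, 3, 3],
])

# Illustrative raster layers for one hazardous phenomenon (e.g., lava flow).
hazard = np.array([[0, 1, 3],
                   [2, 2, 1]])
vulnerability = np.array([[1, 2, 2],
                          [0, 1, 2]])

# GIS-style cell-by-cell overlay: index the matrix with both rasters at once.
expected_damage = damage_matrix[hazard, vulnerability]
print(expected_damage)
```

    Repeating this overlay for each simulated phenomenon and each exposed element yields the expected damage maps the tool produces.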

  18. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect: the former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example, national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor, and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of the increase of two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of analysis permits accounting for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
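    In generic quantitative-risk-analysis terms (a hedged sketch; the notation is illustrative rather than the authors'), the collective risk aggregated over hazard types at a roadway section can be written as

    $$R \;=\; \sum_i P(H_i)\,\big(C_i^{\mathrm{dir}} + C_i^{\mathrm{ind}}\big),$$

    where $P(H_i)$ is the annual probability of occurrence of hazard $i$ (rockfall, debris flow, retaining-wall failure) and $C_i^{\mathrm{dir}}$, $C_i^{\mathrm{ind}}$ are the direct and indirect consequences described above.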

  19. Volcanic hazard management in dispersed volcanism areas

    NASA Astrophysics Data System (ADS)

    Marrero, Jose Manuel; Garcia, Alicia; Ortiz, Ramon

    2014-05-01

    Traditional volcanic hazard methodologies were developed mainly to deal with large stratovolcanoes. For this type of volcano, the hazard map is an important tool for decision-makers, not only during a volcanic crisis but also for territorial planning. According to the past and recent eruptions of a volcano, all possible volcanic hazards are modelled and included in the hazard map. Combining the hazard map with an event tree, the impact area can be zoned and the likely eruptive scenarios defined for use during a real volcanic crisis. In areas of dispersed volcanism, however, it is very complex to apply the same volcanic hazard methodologies. The event tree does not take into account unknown vents, because the spatial concepts included in it relate only to the distance reached by volcanic hazards. Volcanic hazard simulation is also difficult, because vent scatter modifies the results. Volcanic susceptibility tries to solve this problem by calculating the areas most likely to host an eruption, but the differences between the low and high values obtained are often very small. Under these conditions, the effectiveness of the traditional hazard map can be questioned, making a change in the concept of the hazard map necessary. Instead of delimiting the potential impact areas, the hazard map should show the expected behaviour of the volcanic activity and how differences in the landscape and internal geo-structures could condition that behaviour. This approach has been carried out in La Palma (Canary Islands), combining the concept of a long-term hazard map with a short-term volcanic scenario to show the expected behaviour of the volcanic activity. The objective is for decision-makers to understand what a volcanic crisis could look like and what kinds of mitigation measures and strategies could be used.

  20. Total Risk Integrated Methodology (TRIM) - TRIM.Risk

    EPA Pesticide Factsheets

    TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment, and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.

  1. GENETIC ACTIVITY PROFILES AND HAZARD ASSESSMENT

    EPA Science Inventory

    A methodology has been developed to display and evaluate multiple-test quantitative information on genetic toxicants for purposes of hazard/risk assessment. Dose information is collected from the open literature: either the lowest effective dose (LED) or the highest ineffective do...

  2. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  3. Methodologies For A Physically Based Rockfall Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Crosta, G. B.; Guzzetti, F.; Marian, M.

    Rockfall hazard assessment is an important land planning tool in alpine areas, where settlements progressively expand across rockfall-prone areas, raising the vulnerability of the elements at risk, the worth of potential losses and the restoration costs. Nevertheless, hazard definition is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. In addition, the high mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities for which runout is minimal. When coping with rockfalls, hazard assessment involves complex definitions of "occurrence probability" and "intensity". The local occurrence probability must derive from the combination of the triggering probability (related to the geomechanical susceptibility of rock masses to fail) and the transit or impact probability at a given location (related to the motion of falling blocks). The intensity (or magnitude) of a rockfall is a complex function of the mass, velocity and fly height of the involved blocks, and can be defined in many different ways depending on the adopted physical description and "destructiveness" criterion. This work is an attempt to evaluate rockfall hazard using the results of numerical modelling performed by an original 3D rockfall simulation program. This is based on a kinematic algorithm and allows the spatially distributed simulation of rockfall motion on a three-dimensional topography described by a DTM. The code provides raster maps portraying the maximum frequency of transit, velocity and height of blocks at each model cell, easily combined in a GIS in order to produce physically based rockfall hazard maps. The results of some three-dimensional rockfall models, performed at both regional and local scale in areas where rockfall-related problems are well known, have been used to assess rockfall hazard by adopting an objective approach based on three-dimensional matrices providing a positional "hazard index". Different hazard maps have been obtained by combining and classifying the variables in different ways. The performance of the different hazard maps has been evaluated on the basis of past rockfall events and compared to the results of existing methodologies. The sensitivity of the hazard index with respect to the included variables and their combinations is discussed in order to constrain assessment criteria that are as objective as possible.
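    A hedged sketch of how the three raster outputs might be classified and combined into a positional hazard index (the class breaks and the combination rule below are invented for illustration; the paper uses three-dimensional lookup matrices):

```python
import numpy as np

# Illustrative per-cell outputs of a 3D rockfall simulation.
transit_freq = np.array([[0.0, 0.2, 0.9],
                         [0.1, 0.6, 0.4]])   # transits per simulated block
velocity = np.array([[0.0, 6.0, 18.0],
                     [3.0, 12.0, 9.0]])      # m/s
height = np.array([[0.0, 1.0, 4.0],
                   [0.5, 2.5, 1.5]])         # m

def classify(raster, breaks):
    """Map continuous values to classes 0..len(breaks) using fixed breaks."""
    return np.digitize(raster, breaks)

f_cls = classify(transit_freq, [0.05, 0.3, 0.7])
v_cls = classify(velocity, [5.0, 10.0, 15.0])
h_cls = classify(height, [1.0, 2.0, 3.0])

# One possible positional hazard index: the sum of the three class values,
# rescaled to 0..1 (a stand-in for a calibrated 3D hazard matrix).
hazard_index = (f_cls + v_cls + h_cls) / 9.0
print(hazard_index)
```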

  4. The Hollin Hill Landslide Observatory - a decade of geophysical characterization and monitoring

    NASA Astrophysics Data System (ADS)

    Uhlemann, S.; Wilkinson, P. B.; Meldrum, P.; Smith, A.; Dixon, N.; Merritt, A.; Swift, R. T.; Whiteley, J.; Gunn, D.; Chambers, J. E.

    2017-12-01

    Landslides are major and frequent natural hazards. They shape the Earth's surface and endanger communities and infrastructure worldwide. Within the last decade, landslides caused more than 28,000 fatalities and direct damage exceeding $1.8 billion. Climate change, causing more frequent weather extremes, is likely to increase occurrences of shallow slope failures worldwide. Thus, there is a need to improve our understanding of these shallow, rainfall-induced landslides. In this context, integrated geophysical characterization and monitoring can play a crucial role by providing volumetric data that can be linked to the hydrological and geotechnical conditions of a slope. This enables understanding of the complex hydrological processes most often associated with landslides. Here we present a review of a decade of characterizing and monitoring a complex, inland, clayey landslide, forming the "Hollin Hill Landslide Observatory". Within the last decade, this landslide has experienced different activity characteristics, including creep, flow, and rotational failures, thereby providing an excellent testbed for the development of geophysical and geotechnical monitoring instrumentation and methodologies. These include developments of 4D geoelectrical monitoring techniques to estimate electrode positions from the resistivity data, incorporating these into a time-lapse inversion, and imaging moisture dynamics that control the landslide behaviour. Other developments include acoustic emission monitoring, and active and passive seismic monitoring. This work is underpinned by detailed characterization of the landslide, using geomorphological and geological mapping, geotechnical investigations, and a thorough geoelectrical and seismic characterization of the landslide mass. Hence, the data gained from the Hollin Hill Landslide Observatory have improved our understanding of shallow landslide dynamics in response to climate change, and of landslide mechanics and evolution. The methodological and technical developments achieved at this site are suitable and applicable for implementation at other landslides worldwide.

  5. K Basin Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PECH, S.H.

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  6. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (González et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); and far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. References: Berger, M.J., D.L. George, R.J. LeVeque, and K.T. Mandli (2011): The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res., 34, 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A geo-referenced 3D model of the Juan de Fuca slab and associated seismicity, U.S. Geological Survey Data Series 633, v1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C.A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E.L., and T. Parsons (2005): Probabilistic analysis of tsunami hazards, Nat. Hazards, 37(3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone, U.S. Geological Survey Professional Paper 1661-F, 170 p., available at http://pubs.usgs.gov/pp/pp1661f/. González, F.I., E.L. Geist, B. Jaffe, U. Kânoglu, H. Mofjeld, C.E. Synolakis, V.V. Titov, D. Arcas, D. Bellomo, D. Carlton, T. Horning, J. Johnson, J. Newman, T. Parsons, R. Peters, C. Peterson, G. Priest, A. Venturato, J. Weber, F. Wong, and A. Yalciner (2009): Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources, J. Geophys. Res., 114, C11023, doi:10.1029/2008JC005132. Japan Earthquake Research Committee (2011): http://www.jishin.go.jp/main/p_hyoka02.htm. Mai, P.M., and G.C. Beroza (2002): A spatial random field model to characterize complexity in earthquake slip, J. Geophys. Res., 107(B11), 2308, doi:10.1029/2001JB000588. SSHAC (Senior Seismic Hazard Analysis Committee) (1997): Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, Main Report, NUREG/CR-6372, UCRL-ID-122160, Vol. 1, 256 pp., U.S. Nuclear Regulatory Commission.
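
    The hazard-curve arithmetic at the core of a PTHA of this kind can be illustrated compactly. The toy sketch below assumes invented scenario rates and modelled flow depths (not values from the Seaside or Crescent City studies) and treats occurrences as Poissonian.

        import numpy as np

        # Each source scenario: mean annual rate and modelled max flow depth at the site.
        rates = np.array([1 / 500, 1 / 1000, 1 / 3000, 1 / 100])   # events per year
        depths = np.array([2.1, 3.5, 5.0, 0.8])                    # metres

        thresholds = np.linspace(0.0, 6.0, 13)
        # Annual rate of exceeding each threshold, then annual exceedance probability.
        rate_exc = np.array([rates[depths > d].sum() for d in thresholds])
        aep = 1.0 - np.exp(-rate_exc)

        for d, p in zip(thresholds, aep):
            print(f"P(depth > {d:4.1f} m in a year) = {p:.5f}")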

  7. Incorporating advanced EMI technologies in operational munitions characterization surveys

    NASA Astrophysics Data System (ADS)

    Miller, Jonathan S.; Shubiditze, Fridon; Pasion, Leonard; Schultz, Gregory; Chung, Heesoo

    2011-06-01

    The presence of unexploded ordnance (UXO), discarded military munitions (DMM), and munitions constituents (MC) at both active and formerly used defense sites (FUDS) has created a necessity for production-level efforts to remove these munitions and explosives of concern (MEC). Ordnance and explosives (OE) and UXO removal operations typically employ electromagnetic induction (EMI) or magnetometer surveys to identify potential MEC hazards in previously determined areas of interest. A major cost factor in these operations is the significant allocation of resources to the excavation of harmless objects associated with fragmentation, scrap, or geological clutter. Recent advances in classification and discrimination methodologies, as well as the development of sensor technologies that fully exploit physics-based analysis, have shown promise for significantly reducing the false alarm rate due to MEC-related clutter. This paper identifies some of the considerations for, and the challenges associated with, implementing these discrimination methodologies and advanced sensor technologies in production-level surveys. Specifically, we evaluate the implications of deploying an advanced multi-axis EMI sensor at a variety of MEC sites, the discrimination methodologies that leverage the data produced by this sensor, and the potential productivity increase that could be realized by incorporating this advanced technology as part of the production protocol.

  8. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage, and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
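
    The hazard-indicator step described here (comparing flood depth and velocity with user-defined thresholds or penalty matrices) can be sketched as follows; the band edges and penalty matrix are illustrative assumptions, not the paper's calibration.

        import numpy as np

        depth_edges = [0.1, 0.5, 1.0]   # m
        vel_edges = [0.5, 1.5]          # m/s
        # rows: depth band, cols: velocity band -> hazard level 0 (none) to 3 (high)
        penalty = np.array([
            [0, 0, 1],
            [1, 1, 2],
            [2, 2, 3],
            [3, 3, 3],
        ])

        def hazard_level(depth, velocity):
            return penalty[np.digitize(depth, depth_edges), np.digitize(velocity, vel_edges)]

        print(hazard_level(0.3, 2.0))   # shallow but fast flow -> level 2
        print(hazard_level(1.2, 0.2))   # deep, slow ponding   -> level 3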

  9. Canister Storage Building (CSB) Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: the first is a complete list of the team members involved over the two-year process; the second is the subset of members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and the environment.

  10. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.
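
    The feedback idea (risk assessment interactively linked to successive safety measures) reduces, at its core, to an iterative loop over candidate measures. A toy sketch with invented frequencies and reduction factors, purely to illustrate the loop rather than the SCAP tool itself:

        # Candidate safety measures with assumed top-event frequency reduction factors.
        measures = [
            ("gas detector + alarm", 0.5),
            ("emergency isolation valve", 0.4),
            ("water curtain", 0.7),
        ]
        frequency = 1e-3   # assumed initial top-event frequency from a fault tree (/yr)
        target = 1e-4      # assumed acceptance criterion

        for name, factor in measures:
            if frequency <= target:
                break
            frequency *= factor
            print(f"after adding {name}: {frequency:.2e} /yr")

        print("unit acceptable" if frequency <= target else "further measures needed")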

  11. Assessment of Component-level Emission Measurements ...

    EPA Pesticide Factsheets

    Oil and natural gas (ONG) production facilities have the potential to emit a substantial amount of greenhouse gases, hydrocarbons and hazardous air pollutants into the atmosphere. These emissions come from a wide variety of sources, including engine exhaust, combustor gases, atmospheric venting from uncontrolled tanks, and leaks. Engine exhaust, combustor gases and atmospheric tank venting are included in the initial estimate of a production facility's cumulative emissions. However, there is a large amount of uncertainty associated with the magnitude and composition of leaks at these facilities. In order to understand the environmental impacts of these emissions, we must first be able to characterize the emission flow rate and chemical composition of these leaks and vents. A number of recent publications on emission flow rate measurements of components at ONG production facilities have called into question the validity of such measurements and the sampling methodology. An accurate methodology for quantifying hydrocarbon leaks and venting is needed to support both emission inventories and environmental compliance. This interim report summarizes recent results from a small leak survey completed at ONG production facilities in Utah to characterize their flow rate and chemical composition using a suite of instruments, including a high-volume sampler (Bacharach Hi Flow Sampler; Bacharach, Inc.), infrared (IR) cameras, a photoionization detector (PID), and a fl

  12. Hazard identification and pre-map with a simple specific tool: synthesis of application experience in handicrafts in various productive sectors.

    PubMed

    Colombini, Daniela; Occhipinti, Enrico; Peluso, Raffaele; Montomoli, Loretta

    2012-01-01

    In August 2009, an international group was founded under the IEA, in collaboration with the World Health Organization, with the task of developing a "toolkit for MSD prevention". According to the ISO 11228 standard series and the new draft ISO TR 12259, "Application document guides for the potential user", our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel®, free download at: www.epmresearch.org). The possible users of toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. The proposed methodology, using specific key entries and quick assessment criteria, allows a simple ergonomics hazard identification and risk estimation to be made. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment will be necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.). The methodology has been applied in different situations in small and medium-sized Italian craft enterprises: leather goods, food, technical dental work, production of artistic ceramics and stained glass, and beekeeping activities. The results are synthetically reported and discussed in this paper.

  13. A Subject Reference: Benefit-Cost Analysis of Toxic Substances, Hazardous Materials and Solid Waste Control (1977)

    EPA Pesticide Factsheets

    Discusses methodological issues in conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control.

  14. APPLICATION OF A GEOGRAPHIC INFORMATION SYSTEM FOR A CONTAINMENT SYSTEM LEAK DETECTION

    EPA Science Inventory

    The use of physical and hydraulic containment systems for the isolation of contaminated ground water associated with hazardous waste sites has increased during the last decade. Existing methodologies for monitoring and evaluating leakage from hazardous waste containment systems ...

  15. Analyzing the sensitivity of a flood risk assessment model towards its input data

    NASA Astrophysics Data System (ADS)

    Glas, Hanne; Deruyter, Greet; De Maeyer, Philippe; Mandal, Arpita; James-Williamson, Sherene

    2016-11-01

    The Small Island Developing States are characterized by unstable economies and low-lying, densely populated cities, resulting in a high vulnerability to natural hazards. Flooding affects more people than any other hazard. To limit the consequences of these hazards, adequate risk assessments are indispensable. Satisfactory input data for these assessments are hard to acquire, especially in developing countries. Therefore, in this study, a methodology was developed and evaluated to test the sensitivity of a flood model to its input data in order to determine a minimum set of indispensable data. In a first step, a flood damage assessment model was created for the case study of Annotto Bay, Jamaica. This model generates a damage map for the region based on the flood extent map of the 2001 inundations caused by Tropical Storm Michelle. Three damage types were taken into account: building, road and crop damage. Twelve scenarios were generated, each with a different combination of input data, testing one of the three damage calculations for its sensitivity. One main conclusion was that population density, in combination with an average number of people per household, is a good parameter for determining building damage when exact building locations are unknown. Furthermore, the importance of roads for an accurate visual result was demonstrated.
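
    The population-density proxy in the conclusion above can be made concrete with a few lines of arithmetic. All numbers below are invented for illustration, not the Annotto Bay values.

        # Estimate affected buildings from population density when footprints are unknown.
        pop_density = 350.0          # inhabitants per km^2 in the flooded zone (assumed)
        flooded_area = 12.5          # km^2 (assumed)
        persons_per_household = 4.2  # average household size (assumed)
        unit_damage = 15000.0        # average damage per flooded building, USD (assumed)

        buildings = pop_density * flooded_area / persons_per_household
        print(f"~{buildings:.0f} buildings, est. damage ${buildings * unit_damage:,.0f}")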

  16. Review of Natural Phenomena Hazard (NPH) Assessments for the DOE Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.

    2011-09-15

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the DOE's Hanford Site, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. This review is an update and expansion to the September 2010 review of PNNL-19751, Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic).

  17. The Development of a Tri-Service Notification System for Type 1 Medical Materiel Complaints.

    DTIC Science & Technology

    1992-09-01

    Excerpt (assembled from the report's table of contents and text snippets): chapter headings include "Hazardous Food and Nonprescription Drug Recall System" and "Methodology". The report examines an existing DoD notification process for hazardous food and nonprescription drugs, noting that a comparable notification process for defective medical materiel has not been accomplished.

  18. Safety assessment methodology in management of spent sealed sources.

    PubMed

    Mahmoud, Narmine Salah

    2005-02-14

    Environmental hazards can arise from radioactive waste after its disposal. It is therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards, and to institute safety measures that prevent those hazards from evolving. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are sources no longer used because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur from the moment they become spent until their disposal. For that reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding of and confidence in this study, a validation analysis was undertaken by considering the scenario of an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident, involving an iridium-192 source). The text of this work includes considerations related to the safety assessment approaches for spent sealed sources, comprising the assessment context, the processes leading an active source to become spent, accident scenarios, mathematical models for dose calculations, and radiological consequences and regulatory criteria. The text also includes a validation study, carried out by evaluating a theoretical scenario against the real scenario of the Meet-Halfa accident, relying on the clinical assessment of affected individuals.

  19. A comprehensive multi-scenario based approach for a reliable flood-hazard assessment: a case-study application

    NASA Astrophysics Data System (ADS)

    Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi

    2015-04-01

    Flood hazard is generally assessed by assuming the return period of the rainfall as a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can differ significantly from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rational method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied, considering the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach to the assessment of the flood hazard around Lake Idro (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.
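
    In a multi-scenario assessment of this kind, the overall inundation probability is the probability-weighted combination of the scenario outcomes. A toy illustration with invented probabilities (not the Lake Idro study values):

        # (P(scenario), P(inundation | scenario)) for initial/boundary-condition scenarios
        scenarios = [
            (0.60, 0.001),   # low initial lake level, spillway fully operational
            (0.25, 0.010),   # high initial level, typical hydrograph
            (0.10, 0.080),   # high initial level, long-duration hydrograph
            (0.05, 0.300),   # high initial level with spillway gate failure
        ]
        p_flood = sum(p_s * p_f for p_s, p_f in scenarios)
        print(f"combined annual inundation probability = {p_flood:.4f}")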

  20. A new approach to characterize very-low-level radioactive waste produced at hadron accelerators.

    PubMed

    Zaffora, Biagio; Magistris, Matteo; Chevalier, Jean-Pierre; Luccioni, Catherine; Saporta, Gilbert; Ulrici, Luisa

    2017-04-01

    Radioactive waste is produced as a consequence of preventive and corrective maintenance during the operation of high-energy particle accelerators, or during associated dismantling campaigns. Its radiological characterization must be performed to ensure appropriate routing to disposal facilities. The radiological characterization of waste includes establishing the list of produced radionuclides, called the "radionuclide inventory", and estimating their activity. The present paper describes the process adopted at CERN to characterize very-low-level radioactive waste, with a focus on activated metals. The characterization method consists of measuring and estimating the activity of produced radionuclides either by experimental methods or by statistical and numerical approaches. We adapted the so-called Scaling Factor (SF) and Correlation Factor (CF) techniques to the needs of hadron accelerators, and applied them to very-low-level metallic waste produced at CERN. For each type of metal we calculated the radionuclide inventory and identified the radionuclides that contribute most to hazard factors. The methodology proposed is of general validity, can be extended to other activated materials, and can be used for the characterization of waste produced in particle accelerators and research centres where the activation mechanisms are comparable to those occurring at CERN.
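
    The Scaling Factor technique mentioned above correlates a difficult-to-measure nuclide with an easy-to-measure key nuclide. A minimal sketch using an invented data set and the geometric-mean form of the scaling factor (the nuclide pair and all numbers are illustrative, not CERN data):

        import numpy as np

        co60 = np.array([12.0, 40.0, 7.5, 90.0])     # key nuclide, gamma-measured (Bq/g)
        ni63 = np.array([30.0, 110.0, 16.0, 240.0])  # hard-to-measure nuclide, lab-analysed (Bq/g)

        # Geometric-mean scaling factor between the two activities.
        sf = np.exp(np.mean(np.log(ni63 / co60)))
        print(f"SF(Ni-63/Co-60) = {sf:.2f}")

        # Estimate the hard-to-measure activity of a new item from its Co-60 result.
        print(f"estimated Ni-63 = {sf * 25.0:.1f} Bq/g")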

  1. CHARACTERIZATION OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...

  2. A PROBABILISTIC METHOD FOR ESTIMATING MONITORING POINT DENSITY FOR CONTAINMENT SYSTEM LEAK DETECTION

    EPA Science Inventory

    The use of physical and hydraulic containment systems for the isolation of contaminated ground water and aquifer materials associated with hazardous waste sites has increased during the last decade. The existing methodologies for monitoring and evaluating leakage from hazardous w...

  3. [Working hypothesis of simplified techniques for the first mapping of occupational hazards in handicraft. First part: ergonomics hazards].

    PubMed

    Colombini, D; Di Leone, G; Occhipinti, E; Montomoli, L; Ruschioni, A; Giambartolomei, M; Ardissone, S; Fanti, M; Pressiani, S; Placci, M; Cerbai, M; Preite, S

    2009-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, China, August 2009, an international group for developing a "toolkit for MSD prevention" was founded in collaboration with the World Health Organization. Possible users of toolkits are: members of a health and safety committee; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers implementing basic occupational health services; and occupational health and safety specialists. According to the ISO 11228 standard series and their ISO application document for the key entries and quick assessment (green/red conditions), our group developed a first mapping methodology for occupational hazards in handicraft, supported by information technology (Excel). This methodology, utilizing specific key entries and quick evaluation, allows a simple risk estimation. It thus becomes possible to decide which occupational hazards will require an exhaustive assessment and which professional consultant they are best directed to (occupational physician, engineer, chemist, etc.).

  4. Regional ash fall hazard II: Asia-Pacific modelling results and implications

    NASA Astrophysics Data System (ADS)

    Jenkins, Susanna; McAneney, John; Magill, Christina; Blong, Russell

    2012-09-01

    In a companion paper (this volume), the authors propose a methodology for assessing ash fall hazard on a regional scale. In this study, the methodology is applied to the Asia-Pacific region, determining the hazard from 190 volcanoes to over one million square kilometres of urban area. Ash fall hazard is quantified for each square-kilometre grid cell of urban area in terms of the annual exceedance probability (AEP), and its inverse, the average recurrence interval (ARI), for ash falls exceeding 1, 10 and 100 mm. A surrogate risk variable, the Population-Weighted Hazard Score (the product of AEP and population density), approximates the relative risk for each grid cell. Within the Asia-Pacific region, urban areas in Indonesia are found to have the highest levels of hazard and risk, while Australia has the lowest. A clear demarcation emerges between the hazard in countries close to and farther from major subduction plate boundaries, with the latter having ARIs at least 2 orders of magnitude longer for the same thickness thresholds. Countries with no volcanoes, such as North Korea and Malaysia, also face ash falls from volcanoes in neighbouring countries. Ash falls exceeding 1 mm are expected to affect more than one million people living in urban areas within the study region; in Indonesia, Japan and the Philippines, this situation could occur with ARIs of less than 40 years.
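
    The per-cell metrics named here (AEP, its inverse the ARI, and the Population-Weighted Hazard Score) relate by simple arithmetic. A toy illustration with invented cell values, not the published Asia-Pacific results:

        import numpy as np

        aep_1mm = np.array([0.025, 0.004, 1e-4])     # P(ash >= 1 mm) per year, per cell
        pop_density = np.array([9000, 1200, 15000])  # people per km^2, per cell

        ari = 1.0 / aep_1mm              # average recurrence interval, years
        pwhs = aep_1mm * pop_density     # Population-Weighted Hazard Score
        for a, r, s in zip(aep_1mm, ari, pwhs):
            print(f"AEP={a:.4f}/yr  ARI={r:8.0f} yr  PWHS={s:8.1f}")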

  5. Probabilistic Hazard Estimation at a Densely Urbanised Area: the Naples Volcanoes

    NASA Astrophysics Data System (ADS)

    de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Troise, C.

    2005-12-01

    The Neapolitan volcanic area (Southern Italy), including Vesuvius, the Campi Flegrei caldera and Ischia island, is the highest-risk volcanic area in the world: more than 2 million people live within about 10 km of an active volcanic vent. Such an extreme risk calls for accurate methodologies aimed at quantifying it, in a probabilistic way, considering all the available volcanological information as well as modelling results. In fact, simple hazard maps based on the observation of deposits from past eruptions suffer from the major problem that eruptive history generally samples a very limited number of possible outcomes, and are thus almost meaningless for estimating event probabilities in the area. This work describes a methodology making the best use (from a Bayesian point of view) of volcanological data and modelling results to compute probabilistic hazard maps for multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic flow hazard, has been improved and extended here to compute fall-out hazard as well. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, provides, for the first time, a global picture of the areal distribution of the main hazards from multi-vent explosive eruptions. From a joint consideration of the hazard contributions from all three volcanic areas, new insight into the volcanic hazard distribution emerges, which will have strong implications for urban and emergency planning in the area.

  6. Uncertainty factors in screening ecological risk assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, L.D.; Taggart, M.

    2000-06-01

    The hazard quotient (HQ) method is commonly used in screening ecological risk assessments (ERAs) to estimate risk to wildlife at contaminated sites. Many ERAs use uncertainty factors (UFs) in the HQ calculation to incorporate the uncertainty associated with predicting wildlife responses to contaminant exposure from laboratory toxicity data. The overall objective was to evaluate the current UF methodology as applied to screening ERAs in California, USA. Specific objectives included characterizing current UF methodology, evaluating the degree of conservatism in UFs as applied, and identifying limitations of the current approach. Twenty-four of 29 evaluated ERAs used the HQ approach; 23 of these used UFs in the HQ calculation. All 24 made interspecies extrapolations, and 21 compensated for its uncertainty, most using allometric adjustments and some using RFs. Most also incorporated uncertainty for same-species extrapolations. Twenty-one ERAs used UFs extrapolating from the lowest observed adverse effect level (LOAEL) to the no observed adverse effect level (NOAEL), and 18 used UFs extrapolating from subchronic to chronic exposure. Values and application of all UF types were inconsistent. Maximum cumulative UFs ranged from 10 to 3,000. Results suggest UF methodology is widely used but inconsistently applied and is not uniformly conservative relative to UFs recommended in regulatory guidelines and academic literature. The method is limited by a lack of consensus among scientists, regulators, and practitioners about the magnitudes, types, and conceptual underpinnings of the UF methodology.
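
    The HQ-with-UFs calculation the review examines is short enough to sketch end to end. The factor values and doses below are illustrative assumptions, not values from the reviewed ERAs.

        # Hazard quotient with a cumulative uncertainty factor.
        exposure_dose = 0.8        # estimated receptor dose, mg/kg-bw/day (assumed)
        laboratory_loael = 12.0    # subchronic LOAEL from a lab study (assumed)

        ufs = {
            "LOAEL_to_NOAEL": 10.0,
            "subchronic_to_chronic": 10.0,
            "interspecies": 3.0,
        }
        cumulative_uf = 1.0
        for value in ufs.values():
            cumulative_uf *= value

        trv = laboratory_loael / cumulative_uf     # toxicity reference value
        hq = exposure_dose / trv
        print(f"cumulative UF = {cumulative_uf:.0f}, TRV = {trv:.3f}, HQ = {hq:.1f}")
        print("potential risk flagged" if hq >= 1.0 else "below screening level")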

  7. Adding seismic broadband analysis to characterize Andean backarc seismicity in Argentina

    NASA Astrophysics Data System (ADS)

    Alvarado, P.; Giuliano, A.; Beck, S.; Zandt, G.

    2007-05-01

    Characterization of the highly seismically active Andean backarc is crucial for the assessment of earthquake hazards in western Argentina. Moderate-to-large crustal earthquakes have caused several deaths, damage and drastic economic consequences in Argentinean history. We have studied the Andean backarc crust between 30°S and 36°S using seismic broadband data from a previous IRIS-PASSCAL experiment ("CHARGE"). We collected more than 12 terabytes of continuous seismic data from 22 broadband instruments deployed across Chile and Argentina during 1.5 years. Using free software, we modeled full regional broadband waveforms and obtained seismic moment tensor inversions of crustal earthquakes, testing for the best focal depth for each event. We also mapped differences in the Andean backarc crustal structure and found a clear correlation between different types of crustal seismicity (i.e. focal depths, focal mechanisms, magnitudes and frequencies of occurrence) and previously mapped terrane boundaries. We now plan to use the same methodology to study other regions in Argentina using near-real-time broadband data available from the national seismic (INPRES) network and global seismic networks operating in the region. We will re-design the national seismic network to optimize short-period and broadband seismic station coverage for different network purposes. This work is an international effort that involves researchers and students from universities and national government agencies, with the goal of providing more information about earthquake hazards in western Argentina.

  8. [Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].

    PubMed

    2012-01-01

    The article covers topical problems of workers' health preservation. Results of a complex research programme enabled evaluation and analysis of occupational risks in Kazakhstan's leading industries, with the aim of improving scientific and methodological approaches to the medical management of workers exposed to hazardous conditions.

  9. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable seismic history is as yet a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD TO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there is: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to available geological, geomorphologic, seismic, and tectonic evidence and data, combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks and faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This proves that contemporary science can do a better job of disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW-PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate shifting the minds of the community from pessimistic disbelief to the optimistic, challenging issues of neo-deterministic hazard predictability.

  10. Characterizing the nature and variability of avalanche hazard in western Canada

    NASA Astrophysics Data System (ADS)

    Shandro, Bret; Haegeli, Pascal

    2018-04-01

    The snow and avalanche climate types maritime, continental and transitional are well established and have been used extensively to characterize the general nature of avalanche hazard at a location, to study inter-seasonal and large-scale spatial variability, and to provide context for the design of avalanche safety operations. While researchers and practitioners have an experience-based understanding of the avalanche hazard associated with the three climate types, no studies have described the hazard character of an avalanche climate in detail. Since the 2009/2010 winter, the consistent use of the Statham et al. (2017) conceptual model of avalanche hazard in public avalanche bulletins in Canada has created a new quantitative record of avalanche hazard that offers novel opportunities for addressing this knowledge gap. We identified typical daily avalanche hazard situations using self-organizing maps (SOMs) and then calculated seasonal prevalence values of these situations. This approach produces a concise characterization that is conducive to statistical analyses, but still provides a comprehensive picture that is informative for avalanche risk management due to its link to avalanche problem types. Hazard situation prevalence values for individual seasons, elevation bands and forecast regions provide unprecedented insight into the inter-seasonal and spatial variability of avalanche hazard in western Canada.
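
    The SOM-plus-prevalence pipeline can be sketched in a few dozen lines. Below, a minimal one-dimensional self-organizing map clusters toy daily hazard vectors into prototype "hazard situations" and then counts their prevalence; the data, map size and training schedule are illustrative assumptions, not the authors' setup.

        import numpy as np

        rng = np.random.default_rng(0)
        days = rng.random((300, 4))        # toy encoded daily hazard vectors

        # 1-D SOM with 6 prototype hazard situations.
        nodes = rng.random((6, 4))
        epochs = 50
        for epoch in range(epochs):
            lr = 0.5 * (1 - epoch / epochs)                # decaying learning rate
            radius = max(1.0, 3.0 * (1 - epoch / epochs))  # shrinking neighbourhood
            for x in days:
                bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
                dist = np.abs(np.arange(6) - bmu)
                h = np.exp(-(dist ** 2) / (2 * radius ** 2))      # neighbourhood kernel
                nodes += lr * h[:, None] * (x - nodes)

        # Prevalence: fraction of days assigned to each typical hazard situation.
        assigned = np.argmin(((days[:, None] - nodes[None]) ** 2).sum(axis=2), axis=1)
        print(np.bincount(assigned, minlength=6) / len(days))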

  11. Dynamic evaluation of seismic hazard and risks based on the Unified Scaling Law for Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2016-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A + B•(6 - M) + C•log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L, A characterizes the average annual rate of strong (M = 6) earthquakes, B determines the balance between magnitude ranges, and C estimates the fractal dimension of the seismic locus in projection onto the Earth's surface. The parameters A, B, and C of the USLE are used to assess, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground shaking parameters. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macro-seismic intensity or paleo data), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure. The hazard maps for a given territory change dramatically when the methodology is applied to a moving time window of a certain size, e.g. about a decade long for an intermediate-term regional assessment, or exponentially increasing intervals for daily local strong-aftershock forecasting. This dynamical assessment of seismic hazard and risk is illustrated by applications to the territory of the Greater Caucasus and Crimea, and to the two-year series of aftershocks of the 11 October 2008 Kurchaloy, Chechnya earthquake, whose case history appears encouraging for further systematic testing as a potential short-term forecasting tool.
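
    The USLE relation is directly computable. A minimal sketch with placeholder coefficients (A, B and C must be fitted per region; the values here are not estimates for any territory):

        import math

        def usle_annual_number(m, ell, a=0.5, b=0.9, c=1.2):
            """Expected annual number of earthquakes of magnitude >= m within an
            area of linear dimension ell (km), from
            log10 N(M, L) = A + B*(6 - M) + C*log10 L."""
            return 10 ** (a + b * (6 - m) + c * math.log10(ell))

        for m in (5.0, 6.0, 7.0):
            print(f"M >= {m}: N = {usle_annual_number(m, 100.0):.3f} events/yr")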

  12. The NHERI RAPID Facility: Enabling the Next-Generation of Natural Hazards Reconnaissance

    NASA Astrophysics Data System (ADS)

    Wartman, J.; Berman, J.; Olsen, M. J.; Irish, J. L.; Miles, S.; Gurley, K.; Lowes, L.; Bostrom, A.

    2017-12-01

    The NHERI post-disaster, rapid response research ("RAPID") facility, headquartered at the University of Washington (UW), is a collaboration between UW, Oregon State University, Virginia Tech, and the University of Florida. The RAPID facility will enable natural hazards researchers to conduct next-generation quick-response research through reliable acquisition and community sharing of high-quality, post-disaster data sets that will enable characterization of civil infrastructure performance under natural hazard loads, evaluation of the effectiveness of current and previous design methodologies, understanding of socio-economic dynamics, calibration of computational models used to predict civil infrastructure component and system response, and development of solutions for resilient communities. The facility will provide investigators with the hardware, software and support services needed to collect, process and assess perishable interdisciplinary data following extreme natural hazard events. Support to the natural hazards research community will be provided through training and educational activities, field deployment services, and by promoting public engagement with science and engineering. Specifically, the RAPID facility is undertaking the following strategic activities: (1) acquiring, maintaining, and operating state-of-the-art data collection equipment; (2) developing and supporting mobile applications to support interdisciplinary field reconnaissance; (3) providing advisory services and basic logistics support for research missions; (4) facilitating the systematic archiving, processing and visualization of acquired data in DesignSafe-CI; (5) training a broad user base through workshops and other activities; and (6) engaging the public through citizen science, as well as through community outreach and education. The facility commenced operations in September 2016 and will begin field deployments in September 2018. This poster provides an overview of the vision for the RAPID facility, the equipment that will be available for use, the facility's operations, and opportunities for user training and facility use.

  13. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  14. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  15. TESTING OF TOXICOLOGY AND EMISSIONS SAMPLING METHODOLOGY FOR OCEAN INCINERATION OF HAZARDOUS WASTES

    EPA Science Inventory

    The report addresses the development and testing of a system to expose marine organisms to hazardous waste emissions in order to assess the potential toxicity of incinerator plumes at sea as they contact the marine environment through air-sea exchange and initial mixing. A sampli...

  16. A methodology for the assessment of flood hazards at the regional scale

    NASA Astrophysics Data System (ADS)

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio

    2013-04-01

    In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damage. Specifically, floods are the most threatening water-related disaster affecting humans, their lives and their properties. Within the KULTURisk project (FP7) a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) and areas at risk from floods in the considered region, by comparing the baseline scenario (i.e. current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of the RRA is to help decision-makers in examining the possible environmental risks associated with uncertain future flood hazards and in identifying which prevention scenario could be the most suitable one. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GISs) are used to manage, process, analyze, and map data to facilitate the analysis and information sharing with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. slope of the territory, land cover, population density, economic activities) of several case studies, in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based maps of risks, useful for communicating the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks as they can provide information about the indicative number of inhabitants, the types of economic activities, and the natural systems and cultural heritage potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures and for land use planning and management. Finally, the outputs of the RRA methodology can be used as data input in the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies. Moreover, its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project are presented and discussed here.

  17. A Vulnerability Index and Analysis for the Road Network of Rural Chile

    NASA Astrophysics Data System (ADS)

    Braun, Andreas; Stötzer, Johanna; Kubisch, Susanne; Dittrich, Andre; Keller, Sina

    2017-04-01

    Natural hazards impose considerable threats to the physical and socio-economic wellbeing of people, a fact which is well understood and investigated for many regions. However, not only people are vulnerable. During the last decades, a considerable amount of literature has focused on the particular vulnerability of critical infrastructure: for example, road networks. Considering critical infrastructure, far less reliable information exists for many regions worldwide, particularly regions outside the so-called developed world. Critical infrastructure is destroyed in many disasters, causing cascade and follow-up effects, for instance, impediments during evacuation and rescue and during the recovery phase. These circumstances, which are general enough to apply to most regions, are aggravated in regions characterized by high disparities between the urban and the rural sphere. Peripheral rural areas are especially prone to becoming isolated due to failures of the few roads that connect them to larger urban centres (where, frequently, disaster and emergency actors are situated). The rural area of Central Chile is an appropriate example of these circumstances. It is prone to destruction by several geo-hazards and is, furthermore, characterized by the aforementioned disparities. Past disasters, e.g. the 1991 Cerro Hudson eruption and the 2010 Maule earthquake, have led to follow-up effects (farmers unable to evacuate their animals due to road failures in the first case, and difficulties evacuating people from places such as Caleta Tumbes or Dichato, connected by just a single road, in the second). This contribution develops a methodology to investigate the critical infrastructure of such places. It develops a remoteness index for Chile, which identifies remote, peripheral rural areas prone to isolation due to road network failures during disasters. The approach is graph-based. It offers particular advantages for regions like rural Chile since 1. it does not require traffic flow data, which do not exist there; 2. it identifies peripheral areas particularly well; 3. it identifies both nodes (places) prone to isolation and edges (roads) critical for the connectivity of rural areas; and 4. being based on a mathematical structure, it implies several possible planning solutions to reduce the vulnerability of the critical infrastructure and the people dependent on it. The methodology is presented and elaborated theoretically. Afterwards, it is demonstrated on an actual dataset from central Chile, showing how the methodology can be applied to derive planning solutions for peripheral rural areas.
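
    The graph-theoretic screening described here (nodes prone to isolation, edges critical for connectivity) maps onto standard graph primitives. A minimal sketch over an invented toy network (place names are hypothetical), using the networkx library:

        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("urban_centre", "town_a"), ("urban_centre", "town_b"),
            ("town_a", "town_b"),                  # redundant loop road
            ("town_b", "coastal_village"),         # single access road
            ("coastal_village", "fishing_hamlet"),
        ])

        # Nodes whose failure disconnects part of the network (isolation-prone places).
        print("cut vertices:", sorted(nx.articulation_points(G)))
        # Roads whose failure disconnects the network (critical edges).
        print("bridge roads:", sorted(nx.bridges(G)))
        # A simple remoteness proxy: network distance to the urban centre.
        print(nx.single_source_shortest_path_length(G, "urban_centre"))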

  18. Concerns related to Safety Management of Engineered Nanomaterials in research environment

    NASA Astrophysics Data System (ADS)

    Groso, A.; Meyer, Th

    2013-04-01

    Since the rise of occupational safety and health research on nanomaterials, a lot of progress has been made in generating health effects and exposure data. However, when detailed quantitative risk analysis is in question, more research is needed, especially quantitative measures of workers' exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university, where nanomaterials are widely used and produced, we are also faced with the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies. The challenges and issues related to the process are discussed. Since exact data on nanomaterials' hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same with a simple methodology that determines only exposure potential as with one taking into account the hazardousness of ENPs. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, toxicity, etc.) become available, more differentiation will be possible in determining the risk for different materials. On the protective measures side, all CB methodologies lean toward overprotection; some of them suggest comprehensive protective/preventive measures while others offer only basic advice. The implementation and control of protective measures in the research environment are also discussed.
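
    Control banding itself reduces to a small lookup: a hazard band crossed with an exposure band yields a required control level. A generic sketch in the spirit of CB tools, with the bands, matrix and control levels invented for illustration (not the in-house EPFL methodology):

        # Precautionary default: unknown hazard is treated as the highest band.
        HAZARD_BANDS = {"low": 0, "medium": 1, "high": 2, "unknown": 2}
        EXPOSURE_BANDS = {"mg_closed_system": 0, "gram_open_bench": 1,
                          "powder_aerosolized": 2}
        # rows: hazard band, cols: exposure band -> control level 1-4
        CONTROL_MATRIX = [
            [1, 1, 2],
            [1, 2, 3],
            [2, 3, 4],
        ]
        CONTROLS = {1: "good laboratory practice", 2: "local exhaust ventilation",
                    3: "containment / glovebox", 4: "seek expert review"}

        def control_band(hazard, exposure):
            level = CONTROL_MATRIX[HAZARD_BANDS[hazard]][EXPOSURE_BANDS[exposure]]
            return level, CONTROLS[level]

        print(control_band("unknown", "powder_aerosolized"))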

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeRosa, C.T.; Choudhury, H.; Schoeny, R.S.

    Risk assessment can be thought of as a conceptual approach to bridge the gap between the available data and the ultimate goal of characterizing the risk or hazard associated with a particular environmental problem. To lend consistency to and to promote quality in the process, the US Environmental Protection Agency (EPA) published Guidelines for Risk Assessment of Carcinogenicity, Developmental Toxicity, Germ Cell Mutagenicity and Exposure Assessment, and Risk Assessment of Chemical Mixtures. The guidelines provide a framework for organizing the information, evaluating data, and carrying out the risk assessment in a scientifically plausible manner. In the absence of sufficient scientific information, or when abundant data are available, the guidelines provide alternative methodologies that can be employed in the risk assessment. 4 refs., 3 figs., 2 tabs.

  20. Model-Data Fusion and Adaptive Sensing for Large Scale Systems: Applications to Atmospheric Release Incidents

    NASA Astrophysics Data System (ADS)

    Madankan, Reza

    All across the world, toxic material clouds emitted from sources such as industrial plants, vehicular traffic, and volcanic eruptions can contain chemical, biological or radiological material. With the growing fear of natural, accidental or deliberate release of toxic agents, there is tremendous interest in precise source characterization and in generating accurate hazard maps of toxic material dispersion for appropriate disaster management. In this dissertation, an end-to-end framework has been developed for probabilistic source characterization and forecasting of atmospheric release incidents. The proposed methodology consists of three major components that are combined to perform the task of source characterization and forecasting: uncertainty quantification, optimal information collection, and data assimilation. Precise approximation of prior statistics is crucial to ensure the performance of the source characterization process. In this work, an efficient quadrature-based method is utilized for the quantification of uncertainty in plume dispersion models that are subject to uncertain source parameters. In addition, a fast and accurate approach is utilized for the approximation of probabilistic hazard maps, based on a combination of polynomial chaos theory and the method of quadrature points. Besides precise quantification of uncertainty, having useful measurement data is also highly important to warrant accurate source parameter estimation. The performance of source characterization is strongly affected by the sensor configuration used for data observation. Hence, a general framework has been developed for the optimal allocation of data observation sensors to improve the performance of the source characterization process. The key goal of this framework is to optimally locate a set of mobile sensors such that the measurement of better data is guaranteed. This is achieved by maximizing the mutual information between model predictions and observed data, given a set of kinetic constraints on the mobile sensors. The dynamic programming method is utilized to solve the resulting optimal control problem. To complete the loop of the source characterization process, two estimation techniques, a minimum variance estimation framework and a Bayesian inference method, have been developed to fuse model forecasts with measurement data. Incomplete information regarding the distribution of the noise in measurement data is another major challenge in the source characterization of plume dispersion incidents. This frequently happens in the assimilation of atmospheric data from satellite imagery, since satellite imagery data can be polluted with noise depending on weather conditions, clouds, humidity, etc. Unfortunately, there is no accurate procedure to quantify the error in recorded satellite data, so using classical data assimilation methods in this situation is not straightforward. In this dissertation, the basic idea of a novel approach is proposed to tackle these types of real-world problems with more accuracy and robustness. A simple example demonstrating the real-world scenario is presented to validate the developed methodology.
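
    The quadrature-based uncertainty quantification step can be illustrated compactly: propagate an uncertain source parameter through a toy dispersion model with Gauss-Hermite quadrature instead of Monte Carlo sampling. The model and all numbers below are invented for illustration, not the dissertation's plume model.

        import numpy as np

        mu_q, sigma_q = 100.0, 15.0        # assumed Gaussian source strength (kg/s)

        def concentration(q, distance=500.0):
            # Toy ground-level concentration model with exponential decay.
            return q * np.exp(-distance / 1000.0) / distance

        # Probabilists' Gauss-Hermite rule (weight exp(-x^2/2)) with 7 nodes.
        nodes, weights = np.polynomial.hermite_e.hermegauss(7)
        w = weights / np.sqrt(2 * np.pi)   # normalize to standard-normal weights
        c = concentration(mu_q + sigma_q * nodes)

        mean_c = np.sum(w * c)
        std_c = np.sqrt(np.sum(w * (c - mean_c) ** 2))
        print(f"E[C] = {mean_c:.5f}, Std[C] = {std_c:.5f}")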

  1. Hazard and risk assessment strategies for nanoparticle exposures: how far have we come in the past 10 years?

    PubMed Central

    Warheit, David B

    2018-01-01

    Nanotechnology is an emerging, cross-disciplinary technology designed to create and synthesize new materials at the nanoscale (generally defined as a particle size range of ≤10⁻⁹ meters) to generate innovative or altered material properties. The particle properties can be modified to promote different and more flexible applications, resulting in consumer benefits, particularly in medical, cosmetic, and industrial applications. As this applied science matures and flourishes, concerns have arisen regarding potential health effects of exposures to untested materials, as many newly developed products have not been adequately evaluated. Indeed, it is necessary to ensure that societal and commercial advantages are not outweighed by potential human health or environmental disadvantages. Therefore, a variety of international planning activities or research efforts have been proposed or implemented, particularly in the European Union and United States, with the expectation that significant advances will be made in understanding potential hazards related to exposures in the occupational and/or consumer environments. One of the first conclusions reached regarding hazardous effects of nanoparticles stemmed from the findings of early pulmonary toxicology studies, suggesting that lung exposures to ultrafine particles were more toxic than those to larger, fine-sized particles of similar chemistry. This review documents some of the conceptual planning efforts, implementation strategies/activities, and research accomplishments over the past 10 years or so. It also highlights (in this author’s opinion) some shortcomings in the research efforts and accomplishments over the same duration. In general, much progress has been made in developing and implementing environmental, health, and safety research-based protocols for addressing nanosafety issues. However, challenges remain in adequately investigating health effects given 1) many different nanomaterial types, 2) various potential routes of exposure, 3) nanomaterial characterization issues, 4) limitations in research methodologies, such as time-course and dose-response issues, and 5) inadequate in vitro methodologies for in vivo standardized, guideline toxicity testing. PMID:29636906

  3. Use of Electrical Conductivity Logging to Characterize the Geological Context of Releases at UST Sites

    EPA Science Inventory

    Risk is the combination of hazard and exposure. Risk characterization at UST release sites has traditionally emphasized hazard (presence of residual fuel) with little attention to exposure. Exposure characterization is often limited to a one-dimensional model such as the RBCA equa...

  4. Review article: the use of remotely piloted aircraft systems (RPASs) for natural hazards monitoring and management

    NASA Astrophysics Data System (ADS)

    Giordan, Daniele; Hayakawa, Yuichi; Nex, Francesco; Remondino, Fabio; Tarolli, Paolo

    2018-04-01

    The number of scientific studies that consider possible applications of remotely piloted aircraft systems (RPASs) for managing the effects of natural hazards and identifying the resulting damage has increased strongly in the last decade. Nowadays, the use of these systems is not a novelty in the scientific community, but a deeper analysis of the literature shows a lack of codified, complex methodologies that can be used not only for scientific experiments but also for routine emergency operations. RPASs can acquire on-demand ultra-high-resolution images that can be used to identify active processes such as landslides or volcanic activity, and also to map the effects of earthquakes, wildfires and floods. In this paper, we present a review of published literature describing experimental methodologies developed for the study and monitoring of natural hazards.

  5. 'Worst case' methodology for the initial assessment of societal risk from proposed major accident installations.

    PubMed

    Carter, D A; Hirst, I L

    2000-01-07

    This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology, described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis necessary to enable HSE to give appropriate advice.
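    For readers unfamiliar with risk integrals, the sketch below computes a generic frequency-weighted societal-risk indicator of the form RI = Σ f·N^a with a risk-aversion exponent a > 1. This form is common in the societal-risk literature and stands in for, rather than reproduces, MHAU's Risk Integral; the scenario numbers are invented.

```python
def risk_integral(scenarios, aversion=1.4):
    """Societal-risk indicator; scenarios = (frequency per year, fatalities)."""
    return sum(f * n ** aversion for f, n in scenarios)

worst_case = [(1e-5, 80)]                       # single bounding event
detailed = [(3e-5, 5), (1e-5, 20), (2e-6, 80)]  # fuller event set

print(risk_integral(worst_case))  # screening-level estimate
print(risk_integral(detailed))    # compare once a full analysis is warranted
```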

  6. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
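    A minimal sketch of this prioritization idea; the scoring rule and the scenario entries are illustrative assumptions, as the paper only proposes combining the three metrics.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    severity: int             # 1 (minor) .. 5 (catastrophic)
    likelihood: int           # 1 (improbable) .. 5 (frequent)
    modeling_difficulty: int  # 1 (easy to quantify) .. 5 (intractable)

def priority(s: Scenario) -> float:
    # High-risk, easy-to-model scenarios float to the top (assumed rule).
    return s.severity * s.likelihood / s.modeling_difficulty

scenarios = [
    Scenario("wake encounter on parallel approach", 4, 3, 2),
    Scenario("simultaneous missed approaches", 5, 2, 4),
    Scenario("blunder across runway centerlines", 5, 1, 3),
]
for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{s.name}: {priority(s):.2f}")
```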

  7. Modelling the changing cumulative vulnerability to climate-related hazards for river basin management using a GIS-based multicriteria decision approach

    NASA Astrophysics Data System (ADS)

    Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan

    2017-04-01

    1. Background: The Asia-Pacific region is one of the most vulnerable areas of the world to climate-related hazards and extremes, due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked. This is especially critical from the perspective of integrated river basin management. A range of studies has addressed existing vulnerability assessment; however, limited attention has been paid to the cumulative effects of multiple vulnerability factors and their dynamics as faced by local communities. This study proposes a novel methodology to assess changing cumulative vulnerability to climate-related hazards, and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability change, with a focus on a river basin management unit. 2. Methods and data: The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built with the goal of characterizing and comparing vulnerability to climate-related hazards within river basin regions based on a composition of multiple indicators. We organize these indicators into three categories: (1) hazard exposure; (2) socioeconomic sensitivity; and (3) adaptive capacity. Second, the CVAF is applied by combining a geographical information system (GIS)-based spatial statistics technique with a multicriteria decision analysis (MCDA) to assess and map the changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors driving vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how the process of urban development leads to vulnerability change. The factors are transport networks, land uses, production values of industries, and infrastructures. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as from the National Land Use Investigation and official census statistics. 3. Results and policy implications: Results of the CVAF analysis demonstrate heterogeneous patterns of vulnerability in the region and highlight trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in the densely populated, downstream reaches of the Danshui River (such as Taipei City) in both time periods. The VCM indicates that upstream and more remote areas generally show low vulnerability, while increases are observed in some areas between 1996 and 2006 due to land use intensification and industrial and infrastructure expansion. These findings suggest that land use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl, and address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and to contribute towards improving river basin management.
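    To make the composite-index step concrete, here is a hedged sketch of a cumulative vulnerability score built from min-max-normalized indicators in the three categories named above. The indicator values and the equal within-category weights are invented; the study derives its weights through GIS-based MCDA.

```python
import numpy as np

# Rows: communities; columns: indicator values (invented).
exposure    = np.array([[0.8, 120.0], [0.3, 40.0], [0.6, 90.0]])
sensitivity = np.array([[0.35, 0.12], [0.20, 0.30], [0.50, 0.22]])
capacity    = np.array([[0.7, 0.4], [0.4, 0.6], [0.3, 0.2]])  # higher = better

def normalize(x):
    """Min-max normalize each indicator column to [0, 1]."""
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

# Adaptive capacity reduces vulnerability, so it enters with a minus sign.
V = (normalize(exposure).mean(axis=1)
     + normalize(sensitivity).mean(axis=1)
     - normalize(capacity).mean(axis=1))
print(V)  # higher = more cumulatively vulnerable community
```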

  8. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    There are many technical methods to integrate various factors for flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi-Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and the spatial elements to which they apply are considered as alternatives. A scheme that finds the efficient alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, since the simulation results vary with the flood scenario and topographical conditions. This kind of ambiguity in the criteria can cause uncertainty in the flood hazard map. To handle the ambiguity and uncertainty of the criteria, fuzzy logic, which can represent ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the resulting integrated flood hazard map and compared them with those indicated in the existing flood risk maps. We also expect that applying the flood hazard mapping methodology suggested in this paper to the production of current flood risk maps would yield maps that rank hazard areas by priority and carry more varied and important information than before. Keywords: flood hazard map; levee breach analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
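    For orientation, the following sketch applies crisp TOPSIS to three grid cells under the three criteria named above; the paper's fuzzy variant replaces the crisp entries and weights with fuzzy numbers. All values and weights here are invented.

```python
import numpy as np

# Rows = grid cells (alternatives); columns = max depth (m),
# max velocity (m/s), max travel time (h).
X = np.array([[2.1, 1.5, 0.5],
              [0.8, 0.4, 3.0],
              [1.5, 2.2, 1.0]])
w = np.array([0.5, 0.3, 0.2])              # criterion weights (assumed)
hazardous = np.array([True, True, False])  # short travel time = more hazardous

R = X / np.linalg.norm(X, axis=0)          # vector normalization
V = R * w
ideal = np.where(hazardous, V.max(axis=0), V.min(axis=0))  # most hazardous
anti  = np.where(hazardous, V.min(axis=0), V.max(axis=0))  # least hazardous
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)   # 1 = closest to most-hazardous profile
print(closeness)
```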

  9. The availability of public information for insurance risk decision-making in the UK

    NASA Astrophysics Data System (ADS)

    Davis, Nigel; Gibbs, Mark; Chadwick, Ben; Foote, Matthew

    2010-05-01

    At present, there is a wealth of hazard and exposure data which cannot be, or is not being, fully used by the risk modelling community. The reasons for this under-utilisation of data are many: restrictive and complex data policies and pricing, risks involved in information sharing, technological shortcomings, and the variable resolution of data, particularly with catastrophe models only recently having been adjusted to consume high-resolution exposure data. There is therefore an urgent need for the development of common modelling practices and applications for climate and geo-hazard risk assessment, all of which would be highly relevant to the public policy, disaster risk management and financial risk transfer communities. This paper will present a methodology to overcome these obstacles and to review the availability of hazard data at research institutions in a consistent format. Such a methodology would facilitate the collation of hazard and other auxiliary data, as well as present data within a geo-spatial framework suitable for public and commercial use. The methodology would also review the suitability of datasets and how these could be made more freely available in conjunction with other research institutions in order to present a consistent data standard. It is clear that an understanding of these different issues of data and data standards has significant ramifications for natural hazard risk assessment. Scrutinising the issue of data standards also allows the data to be evaluated and re-evaluated for gaps, omissions, fitness for purpose, availability and precision. Not only would there be a quality check on data, but it would also help develop and fine-tune the tools used for decision-making and the assessment of risk.

  10. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller

    2012-01-31

    This report describes a new seismic source characterization (SSC) model for the Central and Eastern United States (CEUS). It will replace the Seismic Hazard Methodology for the Central and Eastern United States, EPRI Report NP-4726 (July 1986), and the Seismic Hazard Characterization of 69 Nuclear Plant Sites East of the Rocky Mountains, Lawrence Livermore National Laboratory Model (Bernreuter et al., 1989). The objective of the CEUS SSC Project is to develop a new seismic source model for the CEUS using a Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 assessment process. The goal of the SSHAC process is to represent the center, body, and range of technically defensible interpretations of the available data, models, and methods. Input to a probabilistic seismic hazard analysis (PSHA) consists of both seismic source characterization and ground motion characterization. These two components are used to calculate probabilistic hazard results (or seismic hazard curves) at a particular site. This report provides a new seismic source model. Results and Findings: The product of this report is a regional CEUS SSC model. This model includes consideration of an updated database, full assessment and incorporation of uncertainties, and the range of diverse technical interpretations from the larger technical community. The SSC model will be widely applicable to the entire CEUS, so this project uses a ground motion model that includes generic variations to allow for a range of representative site conditions (deep soil, shallow soil, hard rock). Hazard and sensitivity calculations were conducted at seven test sites representative of different CEUS hazard environments. Challenges and Objectives: The regional CEUS SSC model will be of value to readers who are involved in PSHA work and who wish to use an updated SSC model. This model is based on a comprehensive and traceable process, in accordance with SSHAC guidelines in NUREG/CR-6372, Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. The model will be used to assess the present-day composite distribution of seismic sources, along with their characterization in the CEUS and its uncertainty. In addition, this model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permits (ESPs) and Combined Operating License Applications (COLAs). Applications, Values, and Use: Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants; and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a previous study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. Thus, a new regional SSC model provides a consistent, stable basis for computing PSHA for a future time span. Use of a new SSC model reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature. Perspective: The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC, was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project has solicited input from the present-day larger technical community, it is not likely that there will be a need for significant revision for a number of years. See also the Sponsors' Perspective for more details. The goal of this project was to implement the CEUS SSC work plan for developing a regional CEUS SSC model. The work plan, formulated by the project manager and a technical integration team, consists of a series of tasks designed to meet the project objectives. This report was reviewed by a participatory peer review panel (PPRP), sponsor reviewers, the NRC, the U.S. Geological Survey, and other stakeholders. Comments from the PPRP and other reviewers were considered when preparing the report. The SSC model was completed at the end of 2011.

  11. Methodology for assessing the safety of Hydrogen Systems: HyRAM 1.1 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina; Hecht, Ethan; Reynolds, John Thomas

    The HyRAM software toolkit provides a basis for conducting quantitative risk assessment and consequence modeling for hydrogen infrastructure and transportation systems. HyRAM is designed to facilitate the use of state-of-the-art science and engineering models to conduct robust, repeatable assessments of hydrogen safety, hazards, and risk. HyRAM is envisioned as a unifying platform combining validated, analytical models of hydrogen behavior, a standardized, transparent QRA approach, and engineering models and generic data for hydrogen installations. HyRAM is being developed at Sandia National Laboratories for the U.S. Department of Energy to increase access to technical data about hydrogen safety and to enable the use of that data to support development and revision of national and international codes and standards. This document provides a description of the methodology and models contained in HyRAM version 1.1. HyRAM 1.1 includes generic probabilities for hydrogen equipment failures, probabilistic models for the impact of heat flux on humans and structures, and computationally and experimentally validated analytical and first order models of hydrogen release and flame physics. HyRAM 1.1 integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, characterizing hydrogen hazards (thermal effects from jet fires, overpressure effects from deflagrations), and assessing impact on people and structures. HyRAM is a prototype software in active development and thus the models and data may change. This report will be updated at appropriate developmental intervals.
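    As one example of the probabilistic harm models this class of toolkit describes, here is a sketch of the classical Eisenberg thermal-fatality probit; HyRAM 1.1's exact model set and coefficients may differ, so treat this as a representative stand-in rather than the toolkit's implementation.

```python
from math import log
from statistics import NormalDist

def eisenberg_fatality_probability(heat_flux_w_m2: float, exposure_s: float) -> float:
    """Fatality probability from thermal radiation (Eisenberg probit).

    Thermal dose V = t * I**(4/3); probit Y = -14.9 + 2.56*ln(V / 1e4);
    fatality probability = Phi(Y - 5).
    """
    dose = exposure_s * heat_flux_w_m2 ** (4.0 / 3.0)
    y = -14.9 + 2.56 * log(dose / 1e4)
    return NormalDist().cdf(y - 5.0)

print(eisenberg_fatality_probability(25_000.0, 30.0))  # 25 kW/m2 for 30 s
```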

  12. Use of quantified risk assessment techniques in relation to major hazard installations

    NASA Astrophysics Data System (ADS)

    Elliott, M. J.

    Over the past decade, industry and regulatory authorities have expressed interest in the development and use of hazard assessment techniques, particularly in relation to the control of major hazards. However, misconceptions about the methodology and role of quantified hazard assessment techniques in decision-making have hindered productive dialogue on the use and value of these techniques, both within industry and between industry and regulatory authorities. This paper outlines the nature, role and current uses of hazard assessment as perceived by the author, and identifies and differentiates between those areas and types of decisions where quantification should prove beneficial, and those where it is unwarranted and should be discouraged.

  13. An integrated quality function deployment and capital budgeting methodology for occupational safety and health as a systems thinking approach: the case of the construction industry.

    PubMed

    Bas, Esra

    2014-07-01

    In this paper, an integrated methodology combining Quality Function Deployment (QFD) and a 0-1 knapsack model is proposed for occupational safety and health as a systems thinking approach. The House of Quality (HoQ) in the QFD methodology is a systematic tool for considering the inter-relationships between two sets of factors. In this paper, three HoQs are used to consider the interrelationships between tasks and hazards, hazards and events, and events and preventive/protective measures. The final priority weights of events are defined by considering their project-specific preliminary weights, probability of occurrence, and effects on the victim and the company. The priority weights of the preventive/protective measures obtained in the last HoQ are fed into a 0-1 knapsack model for the investment decision. The selected preventive/protective measures can then be adapted to the task design. The proposed step-by-step methodology can be applied at any stage of a project to design the workplace for occupational safety and health, and continuous improvement for safety is endorsed by the closed-loop characteristic of the integrated methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
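    The budget-allocation step can be sketched as a textbook 0-1 knapsack over the HoQ priority weights; the costs, weights, and budget below are invented.

```python
def knapsack(costs, values, budget):
    """0-1 knapsack via dynamic programming; returns chosen item indices."""
    n = len(costs)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                cand = best[i - 1][b - costs[i - 1]] + values[i - 1]
                best[i][b] = max(best[i][b], cand)
    chosen, b = [], budget
    for i in range(n, 0, -1):           # backtrack to recover the selection
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return chosen[::-1]

costs = [40, 25, 10, 30]       # cost of each preventive/protective measure
priorities = [90, 55, 30, 65]  # priority weights from the final HoQ
print(knapsack(costs, priorities, budget=70))  # -> indices of measures to fund
```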

  14. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards affecting humans. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps for use in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFMs) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on decision-making under uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the south of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
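    A one-cell toy version of the VOI computation (all costs and probabilities invented) shows why locations where the flood probability leaves the decision ambiguous carry the highest information value:

```python
def voi(p_flood, damage=100.0, protect_cost=30.0):
    """Value of perfect information at one floodplain location.

    Actions: 'do nothing' costs `damage` if flooding occurs;
    'protect' always costs `protect_cost` and averts all damage (assumed).
    """
    prior_best = min(p_flood * damage, protect_cost)
    # With perfect information, protect only when flooding will occur.
    perfect = p_flood * min(damage, protect_cost)
    return prior_best - perfect

for p in (0.1, 0.3, 0.9):
    print(p, voi(p))  # VOI peaks near the decision boundary p = 0.3
```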

  15. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  16. An approach to trial design and analysis in the era of non-proportional hazards of the treatment effect.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2014-08-07

    Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
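    A rough sketch of a joint test in this spirit, using the lifelines package on synthetic data. This is an illustrative construction, summing the logrank and Grambsch-Therneau chi-square statistics and treating them as independent χ²(1) components; it is not the authors' exact derivation, and their advocated flexible parametric model is not shown.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test, proportional_hazard_test

rng = np.random.default_rng(1)
n = 200
arm = rng.integers(0, 2, n)
time = rng.exponential(10.0 / (1.0 + 0.5 * arm))  # arm 1 fails faster
event = (time < 15.0).astype(int)                 # administrative censoring
time = np.minimum(time, 15.0)
df = pd.DataFrame({"time": time, "event": event, "arm": arm})

a, b = df[df.arm == 0], df[df.arm == 1]
lr = logrank_test(a["time"], b["time"], a["event"], b["event"])

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
ph = proportional_hazard_test(cph, df, time_transform="rank")

joint = lr.test_statistic + float(ph.summary["test_statistic"].iloc[0])
print("joint p-value:", chi2.sf(joint, df=2))
```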

  17. New mapping technologies - mapping and imaging from space

    NASA Technical Reports Server (NTRS)

    Blom, R. G.

    2000-01-01

    New and significantly enhanced space-based observational capabilities are available which are of potential use to the hazards community. In combination with existing methodologies, these instruments and data can significantly enhance and extend current procedures for seismic zonation and hazards evaluation. This paper provides a brief overview of several of the more useful data sets available.

  18. Characterisation of Liquefaction Effects for Beyond-Design Basis Safety Assessment of Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Bán, Zoltán; Győri, Erzsébet; János Katona, Tamás; Tóth, László

    2015-04-01

    The preparedness of nuclear power plants for beyond-design-basis external effects became highly important after the Great Tohoku earthquake of 11 March 2011. For some nuclear power plants constructed on soft soil sites, liquefaction should be considered as a beyond-design-basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities and planning the necessary measures for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The case of the nuclear power plant at Paks, Hungary is used as an example to demonstrate the practical importance of the presented results and considerations. In contrast to design analyses, the conservatism of the methodology for evaluating beyond-design-basis liquefaction effects at an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered to obtain a best estimate. The adequacy and conclusiveness of the results is mainly limited by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with conflicting requirements: they have to be consistent and widely accepted and used in practice; they have to be based on comprehensive databases; and they have to provide a basis for evaluating the dominant engineering parameters that control the post-liquefaction response of the plant structures. Experience at the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of the site conditions and plant layout at the Paks plant have shown that differential settlement is the dominant effect in the case considered. The methods also have to be based on probabilistic seismic hazard assessment and allow integration into a logic-tree procedure. Earlier studies have shown that the potentially liquefiable layer at Paks Nuclear Power Plant is situated at relatively large depth, so the applicability and adequacy of the methods at high overburden pressure is important. For existing facilities, the geotechnical data gained before construction are not sufficient for a comprehensive liquefaction analysis, and the scope for new geotechnical surveys is limited. Consequently, the availability of data has to be accounted for when selecting the analysis methods, and consideration has to be given to the aleatory uncertainty related to knowledge of the soil conditions. As shown in the paper, a careful comparison and analysis of the results obtained by different methodologies provides the basis for selecting practicable methods for the safety analysis of a nuclear power plant under a beyond-design-basis liquefaction hazard.

  19. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to a situation where the frequent causal relationships between different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of its efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe, or MATRIX, project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms, and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and the harmonization of single-type methods; an examination of the consequences of cascade effects within a multi-hazard environment; time-dependent vulnerability; decision making and support for multi-hazard mitigation and adaptation; and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program involving national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  20. A balanced hazard ratio for risk group evaluation from survival data.

    PubMed

    Branders, Samuel; Dupont, Pierre

    2015-07-30

    Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high- or low-risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated with it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. These issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve these issues. This new performance metric keeps an intuitive interpretation and is just as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic marker. Copyright © 2015 John Wiley & Sons, Ltd.
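    To see pitfall (1) concretely, the sketch below scans cut-offs on a synthetic continuous risk score and fits a Cox model per split with lifelines; naively maximizing the group hazard ratio tends to favour unbalanced splits. The balanced hazard ratio proposed in the paper is designed to remove this artifact; its formula is not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
score = rng.normal(size=n)                    # continuous risk score
time = rng.exponential(np.exp(-0.7 * score))  # higher score -> earlier event
event = np.ones(n, dtype=int)                 # no censoring, for simplicity

def group_hazard_ratio(cut):
    df = pd.DataFrame({"time": time, "event": event,
                       "high": (score > cut).astype(int)})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    return float(np.exp(cph.params_["high"]))

for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    cut = float(np.quantile(score, q))
    print(f"cut at quantile {q:.2f}: HR = {group_hazard_ratio(cut):.2f}")
```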

  1. Assessing the impacts induced by global climate change through a multi-risk approach: lessons learned from the North Adriatic coast (Italy)

    NASA Astrophysics Data System (ADS)

    Gallina, Valentina; Torressan, Silvia; Zabeo, Alex; Critto, Andrea; Glade, Thomas; Marcomini, Antonio

    2015-04-01

    Climate change is expected to pose a wide range of impacts on natural and human systems worldwide, increasing risks from long-term climate trends and from disasters triggered by weather extremes. Accordingly, in the future, one region could potentially be affected by the interactions, synergies and trade-offs of multiple hazards and impacts. A multi-risk approach is needed to effectively address the multiple threats posed by climate change across regions and targets, supporting decision-makers toward a new paradigm of multi-hazard and risk management. Relevant initiatives have already been developed for the assessment of multiple hazards and risks affecting the same area in a defined timeframe by means of quantitative and semi-quantitative approaches. Most of them address the relations between different natural hazards; however, the effect of future climate change is usually not considered. In order to fill this gap, an advanced multi-risk methodology was developed at the Euro-Mediterranean Centre on Climate Change (CMCC) for estimating cumulative impacts related to climate change at the regional (i.e. sub-national) scale. This methodology was implemented in an assessment tool which allows natural systems and human assets at risk from different interacting hazards to be scanned and classified quickly. A multi-hazard index is proposed to evaluate the relationships between different climate-related hazards (e.g. sea-level rise, coastal erosion, storm surge) occurring in the same spatial and temporal window, by means of an influence matrix and the disjoint probability function. Future hazard scenarios provided by regional climate models are used as input for this step in order to consider the possible effects of future climate change scenarios. Then, the multi-vulnerability of the different exposed receptors (e.g. natural systems, beaches, agricultural and urban areas) is estimated through a variety of vulnerability indicators (e.g. vegetation cover, sediment budget, percentage of urbanization), tailored case by case to the different sets of natural hazards and elements at risk. Finally, the multi-risk assessment integrates the multi-hazard index with the multi-vulnerability index of the exposed receptors, providing a relative ranking of the areas and targets potentially affected by multiple risks in the considered region. The methodology was applied to the North Adriatic coast (Italy), producing a range of GIS-based multi-hazard, exposure, multi-vulnerability and multi-risk maps that can be used by policy-makers to define risk management and adaptation strategies. Results show that the areas with higher multi-hazard scores are located close to the coastline, where all the investigated hazards are present. Multi-vulnerability assumes relatively high scores across the whole case study area, showing that beaches, wetlands, protected areas and river mouths are the most sensitive targets. The final estimate of multi-risk for coastal municipalities provides useful information for local public authorities to set future priorities for adaptation and to define future plans for shoreline and coastal management in view of climate change.
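    Under an independence assumption, the disjoint-probability aggregation mentioned above reduces to the familiar union formula. Below is a minimal sketch with invented per-cell hazard scores; the CMCC methodology additionally weights hazard interactions through an influence matrix, which is omitted here.

```python
import numpy as np

# Per-cell occurrence scores in [0, 1] for three climate-related hazards.
sea_level_rise = np.array([0.30, 0.10, 0.05])
erosion        = np.array([0.20, 0.40, 0.10])
storm_surge    = np.array([0.50, 0.20, 0.05])

hazards = np.vstack([sea_level_rise, erosion, storm_surge])

# Probability that at least one hazard affects each cell.
multi_hazard = 1.0 - np.prod(1.0 - hazards, axis=0)
print(multi_hazard)  # e.g. cell 1: 1 - 0.7*0.8*0.5 = 0.72
```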

  2. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grivas, D.A.; Schultz, B.C.; O'Neil, G.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and in the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30-40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. The availability of information ranges from relatively well studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.

  3. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in the safety assessments of future reactors, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology for estimating dust source terms, starting from broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools, using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
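    The screening step could resemble the following Morris elementary-effects sketch, assuming the SALib package is available; the input names, bounds, and the toy response function are purely illustrative and do not come from the paper.

```python
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze

# Toy stand-in for a dust source-term model (names are illustrative only).
problem = {
    "num_vars": 3,
    "names": ["plasma_energy", "wall_temp", "turbulence"],
    "bounds": [[0.5, 2.0], [400.0, 900.0], [0.0, 1.0]],
}

def dust_production(x):
    e, t, u = x
    return 0.8 * e ** 2 + 0.001 * t + 5.0 * e * u  # arbitrary response

X = morris_sample(problem, N=50, num_levels=4)
Y = np.apply_along_axis(dust_production, 1, X)
res = morris_analyze(problem, X, Y, num_levels=4)
for name, mu, sg in zip(problem["names"], res["mu_star"], res["sigma"]):
    print(f"{name}: mu* = {mu:.3f}, sigma = {sg:.3f}")  # drop low-mu* inputs
```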

  4. Hazardous Waste Clean-Up Information (CLU-IN) On-line Characterization and Remediation Databases Fact Sheet

    EPA Pesticide Factsheets

    This fact sheet provides an overview of the 10 on-line characterization and remediation databases available on the Hazardous Waste Clean-Up Information (CLU-IN) website sponsored by the U.S. Environmental Protection Agency.

  5. Seismic risk assessment of Navarre (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.

    2009-04-01

    The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment, and damage estimation for structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground motion and source zoning models. As the region under study is located on the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. The seismic hazard can be described as low to moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils respectively, with the softest soils located along river valleys. Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps on rock conditions (for the same probability level). Again, the highest hazard is found in the northeastern part of the region, and the lowest hazard is obtained along major river valleys. The vulnerability assessment of the Navarre building stock is accomplished using as proxies a combination of building age, location, number of floors and the implementation of building codes. Field surveys help constrain the extent of traditional and technological construction types. The vulnerability characterization is carried out following three methods: the European Macroseismic Scale (EMS-98), the RISK-UE vulnerability index, and the capacity spectrum method implemented in Hazus. Vulnerability distribution maps for each municipality of Navarre are provided, adapted to the EMS-98 vulnerability classes. The vulnerability of Navarre is medium to high, except for recent urban, highly populated developments. For each vulnerability class and expected ground motion, the damage distribution is estimated by means of damage probability matrices. Several damage indices, embracing relative and absolute damage estimates, are used. The expected average damage is low. Whereas the largest numbers of damaged structures are found in the big cities, the highest percentages are obtained in some municipalities of northeastern Navarre. Additionally, the expected percentages and numbers of people affected by earthquake damage are calculated for each municipality. The expected numbers of affected people are low, reflecting the low expected damage.
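    As a pocket illustration of the damage-estimation step, a damage probability matrix maps each vulnerability class to a distribution over EMS-98 damage grades for a given intensity; every number below is invented.

```python
import numpy as np

# Rows = vulnerability classes A, B, C; columns = damage grades D0..D5
# for one macroseismic intensity level (illustrative probabilities).
dpm = np.array([[0.10, 0.20, 0.30, 0.25, 0.10, 0.05],
                [0.20, 0.30, 0.25, 0.15, 0.08, 0.02],
                [0.40, 0.30, 0.18, 0.08, 0.03, 0.01]])
grades = np.arange(6)

mean_damage_grade = dpm @ grades            # relative damage index per class
buildings = np.array([120, 800, 400])       # building stock per class
expected_counts = buildings[:, None] * dpm  # expected buildings per grade
print(mean_damage_grade)
print(expected_counts.round(1))
```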

  6. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
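    A compressed sketch of the two-stage idea with statsmodels: a negative binomial fit over route-dependent covariates gives the base frequency, and route-independent effects enter as a multiplicative modifier. The data, covariates, and the constant standing in for the fuzzy-logic output are all invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy route segments: annual truck volume, length (km), observed accidents.
df = pd.DataFrame({
    "volume": [1200, 800, 3000, 1500, 2500],
    "length_km": [10, 25, 5, 12, 18],
    "accidents": [2, 1, 4, 2, 5],
})
X = sm.add_constant(np.log(df[["volume", "length_km"]]))
model = sm.GLM(df["accidents"], X,
               family=sm.families.NegativeBinomial()).fit()
base_rate = model.predict(X)  # route-dependent base accident frequency

# Stand-in for the fuzzy-logic modifier from route-independent variables
# (weather, driver condition, ...); the paper derives this with fuzzy sets.
fuzzy_modifier = 1.2
print(base_rate * fuzzy_modifier)
```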

  7. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
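    The hazard-curve assembly step can be sketched in a few lines; the toy rupture rates and simulated intensity measures below are invented, whereas the real calculation draws on roughly 415,000 rupture variations per site.

```python
import numpy as np

# For each rupture: an annual rate and the intensity measures (e.g. 2 s
# spectral acceleration, in g) simulated for its rupture variations.
rupture_rates = [1e-3, 4e-4]
rupture_ims = [np.array([0.05, 0.08, 0.12, 0.20]),
               np.array([0.15, 0.25, 0.40, 0.60])]

im_levels = np.logspace(-2, 0, 50)  # hazard-curve abscissa

# Annual exceedance rate: sum over ruptures of rate * P(IM > x | rupture),
# estimating the conditional probability from the rupture variations.
hazard = np.zeros_like(im_levels)
for rate, ims in zip(rupture_rates, rupture_ims):
    hazard += rate * (ims[:, None] > im_levels).mean(axis=0)

# Probability of at least one exceedance in 50 years (Poisson assumption).
p50 = 1.0 - np.exp(-hazard * 50.0)
print(p50[:5])
```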

  9. The effect of directivity in a PSHA framework

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Herrero, A.; Cultrera, G.

    2012-09-01

    We propose a method to introduce a refined representation of the ground motion in the framework of Probabilistic Seismic Hazard Analysis (PSHA). This study is especially oriented to the incorporation of a priori information about source parameters, focusing on the directivity effect and its influence on seismic hazard maps. Two strategies have been followed. The first considers the seismic source as an extended source, and is valid when the PSHA seismogenetic sources are represented as fault segments. We show that the incorporation of variables related to the directivity effect can lead to variations of up to 20 per cent in the hazard level for dip-slip faults with a uniform distribution of hypocentre locations, in terms of the 5 s spectral acceleration response with a 10 per cent probability of exceedance in 50 yr. The second concerns the more general problem of seismogenetic areas, where each point is a seismogenetic source with the same chance of nucleating a seismic event. In our proposition the point source is associated with rupture-related parameters defined using a statistical description. As an example, we consider a source point in an area characterized by a strike-slip faulting style. With the introduction of the directivity correction, the modulation of the hazard map reaches values up to 100 per cent (for strike-slip, unilateral faults). The introduction of directivity does not increase the hazard level uniformly, but acts more like a redistribution of the estimate that is consistent with the fault orientation. A general increase appears only when no a priori information is available. However, good a priori knowledge now exists on the style of faulting, dip and orientation of the faults associated with the majority of the seismogenetic zones in present seismic hazard maps. The percentage of variation obtained is strongly dependent on the type of model chosen to represent the directivity effect analytically. Therefore, our aim is to emphasize the methodology, by which all the information collected may easily be incorporated to obtain a more comprehensive and meaningful probabilistic seismic hazard formulation.

  10. Landscape Hazards in Yukon Communities: Geological Mapping for Climate Change Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Kennedy, K.; Kinnear, L.

    2010-12-01

    Climate change is considered to be a significant challenge for northern communities where the effects of increased temperature and climate variability are beginning to affect infrastructure and livelihoods (Arctic Climate Impact Assessment, 2004). Planning for and adapting to ongoing and future changes in climate will require the identification and characterization of social, economic, cultural, political and biophysical vulnerabilities. This pilot project addresses physical landscape vulnerabilities in two communities in the Yukon Territory through community-scale landscape hazard mapping and focused investigations of community permafrost conditions. Landscape hazards are identified by combining pre-existing data from public utilities and private-sector consultants with new geophysical techniques (ground penetrating radar and electrical resistivity), shallow drilling, surficial geological mapping, and permafrost characterization. Existing landscape vulnerabilities are evaluated based on their potential for hazard (low, medium or high) under current climate conditions, as well as under future climate scenarios. Detailed hazard maps and landscape characterizations for both communities will contribute to overall adaptation plans and allow for informed development, planning and mitigation of potentially threatening hazards in and around the communities.

  11. Microbial Characterization and Comparison of Isolates During the Mir and ISS Missions

    NASA Technical Reports Server (NTRS)

    Fontenot, Sondra L.; Castro, Victoria; Bruce, Rebekah; Ott, C. Mark; Pierson, Duane L.

    2004-01-01

    Spacecraft represent a semi-closed ecosystem that provides a unique model of microbial interaction with other microbes, potential hosts, and their environment. Environmental samples from the Mir Space Station (1995-1998) and the International Space Station (ISS) (2000-Present) were collected and processed to provide insight into the characterization of microbial diversity aboard spacecraft over time and to assess any potential health risks to the crew. All microbiota were isolated using standard media-based methodologies. Isolates from Mir and ISS were processed using various methods of analysis, including VITEK biochemical analysis, 16S ribosomal identification, and fingerprinting using rep-PCR analysis. Over the first 41 months of habitation, the diversity of the microbiota from air and surface samples aboard ISS increased from an initial 6 to 53 different bacterial species. During the same period, fungal diversity increased from 2 to 24 species. Based upon rep-PCR analysis, the majority of isolates were unique, suggesting the need for increased sampling frequency and a more thorough analysis of samples to properly characterize the ISS microbiota. The limited fungal and bacterial data from environmental samples acquired during monitoring do not currently indicate a microbial hazard to ISS or any trends suggesting potential health risks.

  12. Reviewing and visualizing the interactions of natural hazards

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-12-01

    This paper presents a broad overview, characterization, and visualization of the interaction relationships between 21 natural hazards, drawn from six hazard groups (geophysical, hydrological, shallow Earth, atmospheric, biophysical, and space hazards). A synthesis is presented of the identified interaction relationships between these hazards, using an accessible visual format particularly suited to end users. Interactions considered are primarily those where a primary hazard triggers or increases the probability of secondary hazards occurring. In this paper we do the following: (i) identify, through a wide-ranging review of grey and peer-reviewed literature, 90 interactions; (ii) subdivide the interactions into three levels, based on how well we can characterize secondary hazards, given information about the primary hazard; (iii) determine the spatial overlap and temporal likelihood of the triggering relationships occurring; and (iv) examine the relationship between primary and secondary hazard intensities for each identified hazard interaction and group these into five possible categories. In this study we have synthesized, using accessible visualization techniques, large amounts of information drawn from many scientific disciplines. We outline the importance of constraining hazard interactions and reinforce the importance of a holistic (or multihazard) approach to natural hazard assessment. This approach allows those undertaking research into single hazards to place their work within the context of other hazards. It also communicates important aspects of hazard interactions, facilitating an effective analysis by those working on reducing and managing disaster risk within both the policy and practitioner communities.
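
    A triggering-relationship inventory of this kind reduces naturally to an adjacency structure: primary hazard mapped to (secondary hazard, spatial overlap, temporal likelihood) triples. A hedged sketch with invented entries, purely to illustrate the data structure, not the paper's actual 90 interactions:

```python
# Hypothetical encoding of hazard-interaction records:
# primary -> list of (secondary, spatial_overlap, temporal_likelihood).
interactions = {
    "earthquake": [("landslide", "high", "likely"),
                   ("tsunami", "medium", "possible")],
    "storm": [("flood", "high", "likely")],
}

def secondary_hazards(primary):
    """Secondary hazards that a given primary hazard may trigger."""
    return [sec for sec, _, _ in interactions.get(primary, [])]

print(secondary_hazards("earthquake"))  # ['landslide', 'tsunami']
```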

  13. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex, laterally heterogeneous, anelastic media. In this way a set of ground motion scenarios can be defined at both national and local scales, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can easily be performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters and Cloud computing. In this way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed to facilitate access to the NDSHA methodology and its outputs by end-users interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of NDSHA approach to the territory of India, zooming from large scale hazard maps of ground shaking at bedrock, to the definition of local scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  14. Tsunami risk zoning in south-central Chile

    NASA Astrophysics Data System (ADS)

    Lagos, M.

    2010-12-01

    The recent 2010 Chilean tsunami revealed the need to optimize methodologies for assessing the risk of disaster. In this context, modern techniques and criteria for the evaluation of the tsunami phenomenon were applied in the coastal zone of south-central Chile as a specific methodology for the zoning of tsunami risk. This methodology allows the identification and validation of a tsunami hazard scenario; the spatialization of factors that have an impact on the risk; and the zoning of the tsunami risk. For the hazard evaluation, different scenarios were modeled by means of numerical simulation techniques, selecting and validating the results that best fit the observed tsunami data. Hydrodynamic parameters of the inundation as well as physical and socioeconomic vulnerability aspects were considered for the spatialization of the factors that affect the tsunami risk. The tsunami risk zoning was integrated into a Geographic Information System (GIS) by means of multicriteria evaluation (MCE). The results of the tsunami risk zoning show that the local characteristics and their location, together with the concentration of poverty, establish spatially differentiated risk levels. This information builds the basis for future applied studies in land use planning that tend to minimize the risk levels associated with the tsunami hazard. This research is supported by Fondecyt 11090210.
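
    The multicriteria evaluation step can be pictured as a weighted overlay of normalized raster layers. The following sketch assumes two toy layers (hazard and vulnerability) and illustrative weights; it is not the study's actual criteria set:

```python
import numpy as np

def mce_risk(hazard, vulnerability, weights=(0.5, 0.5)):
    """Hypothetical multicriteria evaluation: a weighted linear combination
    of min-max normalized raster layers, the core of a GIS-based MCE."""
    layers = [np.asarray(hazard, float), np.asarray(vulnerability, float)]
    norm = [(x - x.min()) / (np.ptp(x) or 1.0) for x in layers]
    return sum(w * x for w, x in zip(weights, norm))

# Toy 2x2 rasters: inundation depth (m) and a vulnerability index (0-1).
risk = mce_risk([[0.0, 1.5], [3.0, 0.5]], [[0.2, 0.9], [0.7, 0.1]])
print(risk)  # higher values = higher combined tsunami risk
```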

  15. A New Lifetime Distribution with Bathtub and Unimodal Hazard Function

    NASA Astrophysics Data System (ADS)

    Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.

    2008-11-01

    In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered to estimate the three parameters of the model. The methodology is illustrated on a real data set of industrial devices under life testing.
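
    As a hedged illustration of the kind of estimation involved, the sketch below fits the exponentiated Weibull, a standard three-parameter family whose hazard can likewise be bathtub-shaped, unimodal, increasing, or decreasing; it is not necessarily the paper's proposed distribution, and the data are toy values:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t):
    """Negative log-likelihood of the exponentiated Weibull
    F(t) = (1 - exp(-(t/alpha)^beta))^theta, with a log-parametrization
    to keep all three parameters positive."""
    alpha, beta, theta = np.exp(params)
    z = (t / alpha) ** beta
    logf = (np.log(theta * beta / alpha)
            + (beta - 1.0) * np.log(t / alpha)
            - z
            + (theta - 1.0) * np.log1p(-np.exp(-z)))
    return -logf.sum()

t = np.array([0.3, 0.8, 1.2, 2.5, 3.1, 4.0, 6.2])  # toy failure times
fit = minimize(neg_loglik, x0=np.zeros(3), args=(t,), method="Nelder-Mead")
alpha, beta, theta = np.exp(fit.x)                 # fitted parameters
```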

  16. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporate the small and medium events, the latter take into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. In the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
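
    At each site and ground motion level, a logic tree of this kind reduces to a weighted average of branch exceedance rates. A minimal sketch with invented weights and rates (not the study's values), including the Poisson conversion to a 50-year exceedance probability:

```python
import numpy as np

# Hypothetical logic-tree branches: alternative attenuation relationships
# or source models, each giving annual exceedance rates for three PGA levels.
branch_weights = [0.5, 0.3, 0.2]                 # must sum to 1
branch_rates = [np.array([1.0e-2, 2.0e-3, 4.0e-4]),
                np.array([8.0e-3, 1.0e-3, 2.0e-4]),
                np.array([1.2e-2, 3.0e-3, 6.0e-4])]

mean_rate = sum(w * r for w, r in zip(branch_weights, branch_rates))
poe_50yr = 1.0 - np.exp(-mean_rate * 50.0)       # Poisson, 50-year window
print(poe_50yr)
```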

  17. A good practice guide for safe work with nanoparticles: The Quebec approach

    NASA Astrophysics Data System (ADS)

    Ostiguy, C.; Roberge, B.; Ménard, L.; Endo, C. A.

    2009-02-01

    A new industrial revolution has already begun around nanotechnologies, letting us anticipate major scientific breakthroughs that will affect every economic activity sector and whose expected global economic impacts will exceed 1000 billion annually by 2012. Simultaneously, many studies reveal that nanoparticles present occupational health and safety (OHS) risks that are unique to them and that often differ from the risks related to the same chemical substances at larger dimensions. As the number of potentially exposed workers increases and much uncertainty persists about OHS risks, this extended abstract proposes a framework for occupational risk management aimed at controlling exposure to NPs in a context of a major lack of specific data on the hazards of these substances and on the levels of occupational exposure. The framework takes into consideration the equal representation of both employers and workers in Québec legislation, accounts for the potential routes of exposure, and focuses on a structured approach dealing with hazard identification, exposure characterization, risk assessment and risk management through different control methodologies. These are included in a prevention program that must be followed up once it has been implemented, and refined through an iterative approach as new data become available.

  18. Space vehicle propulsion systems: Environmental space hazards

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Bahr, G. K.

    1990-01-01

    The hazards that exist in geolunar space which may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards in relation to their importance in specified regions of geolunar space. Additionally, the effects of the various space hazards on spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.

  19. Assessing the Climate Resilience of Transport Infrastructure Investments in Tanzania

    NASA Astrophysics Data System (ADS)

    Hall, J. W.; Pant, R.; Koks, E.; Thacker, S.; Russell, T.

    2017-12-01

    Whilst there is an urgent need for infrastructure investment in developing countries, there is a risk that poorly planned and built infrastructure will introduce new vulnerabilities. As climate change increases the magnitude and frequency of natural hazard events, disruptive infrastructure failures are likely to become more frequent. It is therefore important that infrastructure planning and investment be underpinned by climate risk assessment that can inform adaptation planning. Tanzania's rapid economic growth is placing considerable strain on the country's transportation infrastructure (roads, railways, shipping and aviation), especially at the port of Dar es Salaam and its linking transport corridors. A growing number of natural hazard events, in particular flooding, are impacting the reliability of this already over-used network. Here we report on a new methodology to analyse vulnerabilities and risks due to failures of key locations in the intermodal transport network of Tanzania, including strategic connectivity to neighboring countries. To perform the national-scale risk analysis we utilize a system-of-systems methodology. The main components of this general risk assessment, when applied to transportation systems, include: (1) assembling data on spatially coherent extreme hazards and intermodal transportation networks; (2) intersecting hazards with transport network models to initiate failure conditions that trigger failure propagation across interdependent networks; (3) quantifying failure outcomes in terms of social impacts (customers/passengers disrupted) and/or macroeconomic consequences (across multiple sectors); and (4) simulating, testing and collecting multiple failure scenarios to perform an exhaustive risk assessment in terms of probabilities and consequences. The methodology is being used to pinpoint vulnerability and reduce climate risks to transport infrastructure investments.
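
    Steps (2) and (3) above amount to removing hazard-exposed elements from a network model and counting who loses connectivity. A toy sketch using networkx, with an invented three-node network, user counts, and flood footprint:

```python
import networkx as nx

# Hypothetical intermodal network: two cities reach the port via one hub.
G = nx.Graph()
G.add_edges_from([("port", "hub_A"), ("hub_A", "city_B"), ("hub_A", "city_C")])
users = {"city_B": 50_000, "city_C": 30_000}   # illustrative passenger counts
flooded = {"hub_A"}                            # nodes inside the hazard footprint

# Step (2): fail every node intersected by the hazard.
G.remove_nodes_from(flooded)

# Step (3): count users who can no longer reach the port.
disrupted = sum(u for n, u in users.items()
                if n not in G or not nx.has_path(G, n, "port"))
print(disrupted)  # 80000: both cities lose their route to the port
```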

  20. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  1. A methodological proposal to evaluate the cost of duration moral hazard in workplace accident insurance.

    PubMed

    Martín-Román, Ángel; Moral, Alfonso

    2017-12-01

    The cost of duration moral hazard in workplace accident insurance has been amply explored by North American scholars. Given the current context of financial constraints in public accounts, and particularly in the Social Security system, we feel that the issue merits inquiry in the case of Spain. The present research posits a methodological proposal using the econometric technique of stochastic frontiers, which allows us to break down the duration of work-related leave into what we term "economic days" and "medical days". Our calculations indicate that during the 9-year period spanning 2005-2013, the cost of sick leave amongst full-time salaried workers amounted to 6920 million Euros (in constant 2011 Euros). Of this total, and bearing in mind that "economic days" are those attributable to duration moral hazard, over 3000 million Euros might be linked to workplace absenteeism. It is on this figure that economic policy measures might prove most effective.

  2. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report is concentrated more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  3. 75 FR 8139 - Biweekly Notice; Applications and Amendments to Facility Operating Licenses Involving No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... the large break loss-of-coolant accident (LOCA) analysis methodology with a reference to WCAP-16009-P... required by 10 CFR 50.91(a), the licensee has provided its analysis of the issue of no significant hazards... Section 5.6.5 to incorporate a new large break LOCA analysis methodology. Specifically, the proposed...

  4. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    The report describes an uncertainty quantification methodology based on polynomial chaos quadrature combined with data integration to complete the DDDAS loop. The polynomial basis is chosen to match the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed ones), and uncertain source parameters and windfields drive the simulations.
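
    To make the quadrature idea concrete: for a normally distributed input, Gauss-Hermite nodes and weights propagate the distribution through the model and recover output moments. A toy sketch in which the model function and input parameters are invented stand-ins for an ash-transport run:

```python
import numpy as np

# Gauss-Hermite nodes/weights (physicists' convention): integrates
# against exp(-t^2), so a change of variables handles N(mu, sigma^2).
nodes, weights = np.polynomial.hermite.hermgauss(7)

def model(x):
    return x**2 + 1.0     # toy stand-in for an ash-transport simulation

mu, sigma = 0.0, 0.5      # hypothetical input distribution
vals = model(mu + np.sqrt(2.0) * sigma * nodes)
mean = (weights * vals).sum() / np.sqrt(np.pi)
print(mean)               # ~ mu^2 + sigma^2 + 1 = 1.25
```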

  5. Food-safety hazards in the pork chain in Nagaland, North East India: implications for human health.

    PubMed

    Fahrion, Anna Sophie; Jamir, Lanu; Richa, Kenivole; Begum, Sonuwara; Rutsa, Vilatuo; Ao, Simon; Padmakumar, Varijaksha P; Deka, Ram Pratim; Grace, Delia

    2013-12-24

    Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard characterization and exposure assessment for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time, including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.

  6. Local SPTHA through tsunami inundation simulations: a test case for two coastal critical infrastructures in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.

    2016-12-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to seismic sources. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for the inundation calculations. We here use a method based on an event tree for the treatment of the seismic source aleatory variability; a cluster analysis on the offshore results to define the important sources; and an ensemble modeling approach for the treatment of epistemic uncertainty. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, a few hundred representative scenarios are filtered out of all the potential seismic sources for each target site, and the tsunami inundation is explicitly modeled, yielding a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.
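
    The filtering step can be sketched as clustering the offshore hazard footprints and keeping one representative scenario per cluster for explicit inundation modelling. The sketch below uses k-means as a stand-in for the paper's cluster analysis, with synthetic data; sizes and parameters are illustrative only:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic offshore results: rows = scenarios, columns = offshore points.
rng = np.random.default_rng(0)
footprints = rng.lognormal(size=(5000, 40))

# Cluster footprints and keep the scenario nearest each cluster centre.
km = KMeans(n_clusters=300, n_init=3, random_state=0).fit(footprints)
dists = km.transform(footprints)          # scenario-to-centre distances
representatives = dists.argmin(axis=0)    # one scenario index per cluster
print(len(representatives))               # 300 scenarios to model explicitly
```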

  7. Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

    USGS Publications Warehouse

    Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.

    2007-01-01

    We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
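
    The quoted annual probabilities follow from the Poisson relation between a probability of exceedance over a time window and an annualized rate; a quick check reproduces the map levels:

```python
import math

def annual_rate(prob, years=50.0):
    """Annual exceedance rate implied by a probability of exceedance over
    a window of `years`, assuming Poisson occurrences."""
    return -math.log(1.0 - prob) / years

print(round(annual_rate(0.02), 6))   # 0.000404
print(round(annual_rate(0.05), 6))   # 0.001026
print(round(annual_rate(0.10), 6))   # 0.002107 (~0.0021)
```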

  8. The Cellular Automata for modelling of spreading of lava flow on the earth surface

    NASA Astrophysics Data System (ADS)

    Jarna, A.

    2012-12-01

    Volcanic risk assessment is a very important scientific, political and economic issue in densely populated areas close to active volcanoes. The development of effective tools for early prediction of a potential volcanic hazard and for the management of crises is paramount. To date, volcanic hazard maps represent the most appropriate way to illustrate the geographical area that can potentially be affected by a volcanic event. Volcanic hazard maps are usually produced by mapping out old volcanic deposits; however, dynamic lava flow simulation is gaining popularity and can give crucial information to corroborate other methodologies. The methodology used here for the generation of volcanic hazard maps is based on numerical simulation of eruptive processes by the principle of Cellular Automata (CA). The Python script is integrated into ArcToolbox in ArcMap (ESRI), and the user can select several input and output parameters that influence surface morphology, size and shape of the flow, flow thickness, flow velocity and length of lava flows. Once the input parameters are selected, the software computes and generates hazard maps on the fly. The results can be exported to Google Maps (.kml format) for visualization. Data from a real lava flow are used to validate the simulation code. A comparison of the simulation results with real lava flows mapped from satellite images will be presented.
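
    The core CA idea can be sketched in a few lines: lava advances into the steepest-descent neighbour cell while a finite volume budget lasts. The grid, vent location, and budget below are illustrative, and this is far simpler than the ArcToolbox implementation described:

```python
import numpy as np

# Toy digital elevation model (DEM); higher numbers = higher ground.
dem = np.array([[5, 4, 3, 2],
                [5, 4, 3, 1],
                [6, 5, 4, 2],
                [7, 6, 5, 3]], dtype=float)
flow = np.zeros_like(dem, dtype=bool)   # cells covered by lava
r, c = 0, 0                             # vent cell (illustrative)

for _ in range(6):                      # volume budget: cells to cover
    flow[r, c] = True
    # Unflooded 4-neighbours inside the grid.
    nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= r + dr < dem.shape[0] and 0 <= c + dc < dem.shape[1]
            and not flow[r + dr, c + dc]]
    if not nbrs:
        break
    r, c = min(nbrs, key=lambda rc: dem[rc])  # steepest-descent neighbour

print(flow.astype(int))                 # 1 = cell reached by the toy flow
```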

  9. Evaluation of subsidence hazard in mantled karst setting: a case study from Val d'Orléans (France)

    NASA Astrophysics Data System (ADS)

    Perrin, Jérôme; Cartannaz, Charles; Noury, Gildas; Vanoudheusden, Emilie

    2015-04-01

    Soil subsidence/collapse is a major geohazard occurring in karst regions. It occurs as suffosion or dropout sinkholes developing in the soft cover. Less frequently, it corresponds to the breakdown of a karst void ceiling (i.e., a collapse sinkhole). This hazard can cause significant engineering challenges, and decision-makers therefore require the elaboration of methodologies for reliable predictions of such hazards (e.g., karst subsidence susceptibility and hazard maps, early-warning monitoring systems). A methodological framework was developed to evaluate relevant conditioning factors favouring subsidence (Perrin et al., submitted) and then to combine these factors to produce karst subsidence susceptibility maps. This approach was applied to a mantled karst area south of Paris (Val d'Orléans). Results show the significant roles of the overburden lithology (presence/absence of a low-permeability layer) and of the position of the karst aquifer piezometric surface within the overburden. In parallel, an experimental site has been set up to improve the understanding of key processes leading to subsidence/collapse; it includes piezometers for measurements of water levels and physico-chemical parameters in both the alluvial and karst aquifers, as well as surface deformation monitoring. Results should help in designing monitoring systems to anticipate the occurrence of subsidence/collapse. Perrin J., Cartannaz C., Noury G., Vanoudheusden E. 2015. A multicriteria approach to karst subsidence hazard mapping supported by Weights-of-Evidence analysis. Submitted to Engineering Geology.

  10. The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines

    NASA Astrophysics Data System (ADS)

    Wesseloo, Johan

    2018-06-01

    Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.

  11. A tiered asthma hazard characterization and exposure assessment approach for evaluation of consumer product ingredients.

    PubMed

    Maier, Andrew; Vincent, Melissa J; Parker, Ann; Gadagbui, Bernard K; Jayjock, Michael

    2015-12-01

    Asthma is a complex syndrome with significant consequences for those affected. The number of individuals affected is growing, although the reasons for the increase are uncertain. Ensuring the effective management of potential exposures follows from substantial evidence that exposure to some chemicals can increase the likelihood of asthma responses. We have developed a safety assessment approach tailored to the screening of asthma risks from residential consumer product ingredients as a proactive risk management tool. Several key features of the proposed approach advance the assessment resources often used for asthma issues. First, a quantitative health benchmark for asthma or related endpoints (irritation and sensitization) is provided that extends qualitative hazard classification methods. Second, a parallel structure is employed to include dose-response methods for asthma endpoints and methods for scenario specific exposure estimation. The two parallel tracks are integrated in a risk characterization step. Third, a tiered assessment structure is provided to accommodate different amounts of data for both the dose-response assessment (i.e., use of existing benchmarks, hazard banding, or the threshold of toxicological concern) and exposure estimation (i.e., use of empirical data, model estimates, or exposure categories). Tools building from traditional methods and resources have been adapted to address specific issues pertinent to asthma toxicology (e.g., mode-of-action and dose-response features) and the nature of residential consumer product use scenarios (e.g., product use patterns and exposure durations). A case study for acetic acid as used in various sentinel products and residential cleaning scenarios was developed to test the safety assessment methodology. In particular, the results were used to refine and verify relationships among tiered approaches such that each lower data tier in the approach provides a similar or greater margin of safety for a given scenario. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  12. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology formore » this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.« less

  13. 78 FR 69745 - Safety and Security Plans for Class 3 Hazardous Materials Transported by Rail

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Security Plans for Class 3 Hazardous Materials Transported by Rail AGENCY: Pipeline and Hazardous Materials... characterization, classification, and selection of a packing group for Class 3 materials, and the corresponding...

  14. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.

  15. Environmental characteristics, agricultural land use, and vulnerability to degradation in Malopolska Province (Poland).

    PubMed

    Nowak, Agnieszka; Schneider, Christian

    2017-07-15

    Environmental degradation encompasses multiple processes that are rarely combined in analyses. This study refers to three types of environmental degradation resulting from agricultural activity: soil erosion, nutrient loss, and groundwater pollution. The research was conducted in seven distinct study areas in the Malopolska Province, Poland, each characterized by different environmental properties. Calculations were made on the basis of common models, i.e., USLE (soil erosion), InVEST (nutrient loss), and DRASTIC (groundwater pollution). Two scenarios were calculated to identify the areas contributing to potential and actual degradation. For the potential degradation scenario, all study areas were treated as arable land. To identify the areas actually contributing to all three types of degradation, the de facto land use pattern was used for a second scenario. The results show that the areas most endangered by agricultural activity are located in the mountainous region, whereas most of the degraded zones were located in valley bottoms and areas with intensive agriculture. The different hazards rarely overlap spatially in the given study areas, meaning that different areas require different management approaches. The distribution of arable land was negatively correlated with soil erosion hazard, whereas no linkage was found between nutrient loss or groundwater pollution hazards and the proportion of arable land. This indicates that the soil erosion hazard is the most influential factor in the distribution of arable land, whereas nutrient loss and groundwater pollution are widely ignored during land use decision-making. Slope most strongly and most frequently influences all hazard types, whereas land use also plays an important role in the case of soil and nutrient losses. In this study we present a consistent methodology to capture complex degradation processes and provide robust indicators that can be included in existing impact assessment approaches such as Life Cycle Assessments and Grey Water Footprint analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
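
    Of the three models, USLE is compact enough to state outright: average annual soil loss is the product of five factors. A one-function sketch with illustrative (not study-specific) factor values:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: average annual soil loss
    A = R * K * LS * C * P, where R is rainfall erosivity, K soil
    erodibility, LS slope length/steepness, C cover management,
    and P support practice."""
    return R * K * LS * C * P

# Illustrative values for a single arable cell.
A = usle_soil_loss(R=60.0, K=0.3, LS=1.4, C=0.25, P=1.0)
print(A)  # estimated soil loss for this cell (t/ha/yr)
```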

  16. Resource Reallocation Methodology for the U.S. Army Center for Health Promotion and Preventive Medicine.

    DTIC Science & Technology

    1996-01-01

    [Abstract unavailable; only fragments of a ranking table survive, listing program areas with scores, e.g., Industrial Hygiene Field Services (field study 8.8; ergonomic study 8.2), Pest Management (training classes and materials 8.0), and Hazardous and Medical Waste.]

  17. A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume II.

    DTIC Science & Technology

    1982-04-01

    Technical Center. The report was divided into two parts: Part I described the improved technology investigated to upgrade existing methods for testing... proper implementation of the computerized data acquisition and reduction programs will improve materials hazards measurement precision. Thus, other... the hold chamber before and after injection of a sample, will improve precision and repeatability of measurement. The listed data acquisition and...

  18. Assessing the Fire Risk for a Historic Hangar

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Morrison, Richard S.

    2010-01-01

    NASA Ames Research Center (ARC) is evaluating options for the reuse of its historic Hangar 1. As part of this evaluation, a qualitative fire risk assessment study was performed to evaluate the potential combustion threat to the historic hangar. The study focused on the fire risk trade-off of either installing or not installing a Special Hazard Fire Suppression System in the Hangar 1 deck areas. The assessment methodology was useful for discussing the important issues among various groups within the Center. Once the methodology was deemed acceptable, the results were assessed. The results showed that the risk remained in the same risk category whether or not Hangar 1 has a Special Hazard Fire Suppression System. Note that the methodology assessed the risk to Hangar 1 and not the risk to an aircraft in the hangar. For a high-value aircraft, an aircraft risk analysis could potentially show a different result. The assessed risk results were then communicated to management and other stakeholders.

  19. Integrate urban‐scale seismic hazard analyses with the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Luco, Nicolas; Frankel, Arthur; Petersen, Mark D.; Aagaard, Brad T.; Baltay, Annemarie S.; Blanpied, Michael; Boyd, Oliver; Briggs, Richard; Gold, Ryan D.; Graves, Robert; Hartzell, Stephen; Rezaeian, Sanaz; Stephenson, William J.; Wald, David J.; Williams, Robert A.; Withers, Kyle

    2018-01-01

    For more than 20 yrs, damage patterns and instrumental recordings have highlighted the influence of the local 3D geologic structure on earthquake ground motions (e.g., M 6.7 Northridge, California, Gao et al., 1996; M 6.9 Kobe, Japan, Kawase, 1996; M 6.8 Nisqually, Washington, Frankel, Carver, and Williams, 2002). Although this and other local-scale features are critical to improving seismic hazard forecasts, historically they have not been explicitly incorporated into the U.S. National Seismic Hazard Model (NSHM, national model and maps), primarily because the necessary basin maps and methodologies were not available at the national scale. Instead,...

  20. Coastal erosion hazard and vulnerability using GIS tools. Comparison between La Barra town, Buenaventura (Pacific Ocean of Colombia) and Providence - Santa Catalina islands (Colombian Caribbean Sea)

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza; Morales-Giraldo, David; Rangel-Buitrago, Nelson

    2014-05-01

    The analysis of hazards and vulnerability associated with coastal erosion along coastlines is a primary issue in establishing plans for adaptation to climate change in coastal areas. La Barra town, Buenaventura (Pacific Ocean of Colombia) and the Providence - Santa Catalina Islands (Colombian Caribbean) were selected for a detailed analysis of coastal erosion hazard and vulnerability from different perspectives: i) physical (hazard), ii) social, iii) conservation, and iv) cultural heritage (Raizal). The analysis was made by a semi-quantitative approximation method, applying variables associated with the intrinsic properties of the coastal zone (i.e. type of beach, exposure of the coast to waves, etc.). Coastal erosion data and associated variables, as well as land use, conservation and heritage data, were used to carry out a further detailed analysis of the human-structural vulnerability and exposure to hazards. The data show erosion rates close to -17 m yr-1 in La Barra town (highlighting its critical condition and the urgency of its relocation), while in some sectors of Providence Island, such as Old Town, the erosion rate was -5 m yr-1. The observed erosion process directly affects land use and the local and regional economy. The differences between the indexes and the structural and physical vulnerability, as well as the use of the methodological variables, are presented in the context of each region. All the information was managed in a GIS environment, which allows continuous editing and updating. The application of this methodology generates useful information to promote risk management as well as prevention, mitigation and reduction plans. In both areas adaptation must be a priority strategy, including relocation alternatives and sustainable protection supported by studies of coastal uses and future outlooks. The methodology is framed in the use of GIS tools and highlights their benefits in the analysis of information.

  1. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  2. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. An analysis was also performed to highlight the role of future trends in population and climate change, and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global assets exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis, and were applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
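
    The regression idea can be sketched as ordinary least squares on log-losses of individual events against hazard intensity, exposure, and vulnerability proxies. The data below are synthetic and the coefficients invented; this is only a shape-of-the-analysis illustration:

```python
import numpy as np

# Synthetic stand-ins for ~9000 individual past events.
rng = np.random.default_rng(1)
n = 9000
intensity = rng.normal(size=n)   # hazard intensity proxy
exposure = rng.normal(size=n)    # exposed population/assets proxy
poverty = rng.normal(size=n)     # vulnerability proxy
log_loss = (1.2 * intensity + 0.8 * exposure + 0.5 * poverty
            + rng.normal(size=n))

# OLS: log-losses explained by the three drivers plus an intercept.
X = np.column_stack([np.ones(n), intensity, exposure, poverty])
beta, *_ = np.linalg.lstsq(X, log_loss, rcond=None)
print(beta)  # recovers ~[0, 1.2, 0.8, 0.5]
```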

  3. Flows of Selected Hazardous Materials by Rail

    DOT National Transportation Integrated Search

    1990-03-01

    This report reviews the hazardous materials rail traffic of 33 selected hazardous materials commodities or commodity groups in 1986, a relatively typical recent year. The flows of the selected commodities by rail are characterized and their geographi...

  4. Capturing spatiotemporal variation in wildfires for improving postwildfire debris-flow hazard assessments: Chapter 20

    USGS Publications Warehouse

    Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.

    2017-01-01

    Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.

  5. ST-HASSET for volcanic hazard assessment: A Python tool for evaluating the evolution of unrest indicators

    NASA Astrophysics Data System (ADS)

    Bartolini, Stefania; Sobradelo, Rosa; Martí, Joan

    2016-08-01

    Short-term hazard assessment is an important part of the volcanic management cycle, above all at the onset of an episode of volcanic agitation (unrest). For this reason, one of the main tasks of modern volcanology is to use monitoring data to identify and analyse precursory signals and so determine where and when an eruption might occur. This work follows from Sobradelo and Martí [Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis. Journal of Volcanology and Geothermal Research 290, 111, 2015], who defined the principle of a new methodology for conducting short-term hazard assessment at unrest volcanoes. Using the same case study, the eruption of Pinatubo (15 June 1991), this work introduces a new free Python tool, ST-HASSET, which implements the Sobradelo and Martí (2015) methodology to track the time evolution of unrest indicators in short-term volcanic hazard assessment. Moreover, this tool is designed to complement long-term hazard assessment with continuous monitoring data when the volcano goes into unrest. It is based on Bayesian inference and transforms different pre-eruptive monitoring parameters into a common probabilistic scale for comparison among unrest episodes from the same volcano or from similar ones. This allows the identification of common pre-eruptive behaviours and patterns. ST-HASSET is especially designed to assist experts and decision makers as a crisis unfolds, and allows sudden changes in the activity of a volcano to be detected. It therefore makes an important contribution to the analysis and interpretation of relevant data for understanding the evolution of volcanic unrest.
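
    At its core, a Bayesian transformation of a monitoring signal into probability is a likelihood-ratio update. A toy sketch with invented prior and likelihoods (not the values used by ST-HASSET):

```python
# Hypothetical update: P(eruption | seismic anomaly) from Bayes' rule.
prior = 0.10                  # long-term P(eruption | unrest episode)
p_anom_given_erupt = 0.80     # P(anomaly | pre-eruptive unrest), assumed
p_anom_given_quiet = 0.20     # P(anomaly | non-eruptive unrest), assumed

evidence = p_anom_given_erupt * prior + p_anom_given_quiet * (1 - prior)
posterior = p_anom_given_erupt * prior / evidence
print(round(posterior, 3))    # 0.308: the anomaly raises the estimate
```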

  6. Regional risk assessment approaches to land planning for industrial polluted areas in China: the Hulunbeier region case study.

    PubMed

    Li, Daiqing; Zhang, Chen; Pizzol, Lisa; Critto, Andrea; Zhang, Haibo; Lv, Shihai; Marcomini, Antonio

    2014-04-01

    The rapid industrial development and urbanization processes that occurred in China over the past 30 years have dramatically increased the consumption of natural resources and raw materials, thus exacerbating human pressure on environmental ecosystems. As a result, large-scale environmental pollution of soil, natural waters and urban air has been recorded. The development of effective industrial planning to support regional sustainable economic development has become an issue of serious concern for local authorities, which need to select safe sites for new industrial settlements (i.e. industrial plants) according to assessment approaches that consider cumulative impacts, synergistic pollution effects and risks of accidental releases. In order to support decision makers in the development of efficient and effective regional land-use plans encompassing the identification of suitable areas for new industrial settlements and areas in need of intervention measures, this study provides a spatial regional risk assessment methodology which integrates relative risk assessment (RRA) and socio-economic assessment (SEA) and makes use of spatial analysis (GIS) methodologies and multicriteria decision analysis (MCDA) techniques. The proposed methodology was applied to the Chinese region of Hulunbeier, located in the eastern Inner Mongolia Autonomous Region, adjacent to the Republic of Mongolia. The application results demonstrated the effectiveness of the proposed methodology in identifying the most hazardous and risky industrial settlements, the most vulnerable regional receptors, and the regional districts most relevant for intervention measures, since they are characterized by high regional risk and excellent socio-economic development conditions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Probabilistic estimation of long-term volcanic hazard under evolving tectonic conditions in a 1 Ma timeframe

    NASA Astrophysics Data System (ADS)

    Jaquet, O.; Lantuéjoul, C.; Goto, J.

    2017-10-01

    Risk assessments in relation to the siting of potential deep geological repositories for radioactive wastes demand the estimation of long-term tectonic hazards such as volcanicity and rock deformation. Owing to their tectonic settings, such evaluations concern many industrialized regions around the world. For sites near volcanically active regions, a prevailing source of uncertainty is related to volcanic hazard. For specific situations, in particular in relation to geological repository siting, the requirements for the assessment of volcanic and tectonic hazards have to be expanded to 1 million years. At such time scales, tectonic changes are likely to influence volcanic hazard, and therefore a dedicated stochastic model needs to be developed for its estimation. The concepts and theoretical basis of the proposed model are given and a methodological illustration is provided using data from the Tohoku region of Japan.
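
    A simple way to express volcanic hazard under a slowly drifting event rate is a nonhomogeneous Poisson model, sketched below. The rates are hypothetical and the model is far simpler than the stochastic model proposed in the paper.

      # Sketch: probability of at least one volcanic event within a time
      # window under a nonhomogeneous Poisson model with a linearly drifting
      # rate, a stand-in for tectonically evolving conditions. Rate values
      # are hypothetical; the paper's actual stochastic model is richer.
      import math

      def prob_event(rate0, drift, t_years, n_steps=1000):
          """P(N >= 1) with intensity lambda(t) = rate0 + drift * t."""
          dt = t_years / n_steps
          integral = sum((rate0 + drift * (i + 0.5) * dt) * dt
                         for i in range(n_steps))
          return 1.0 - math.exp(-integral)

      # e.g. 1e-5 events/yr today, slowly increasing, over the next 10,000 yr
      print(prob_event(rate0=1e-5, drift=5e-10, t_years=1e4))  # ~0.12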

  8. Assessing qualitative long-term volcanic hazards at Lanzarote Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, Laura; Martí, Joan; Bartolini, Stefania; Geyer, Adelina

    2017-07-01

    Conducting long-term hazard assessment in active volcanic areas is of primary importance for land-use planning and for defining emergency plans that can be applied in case of a crisis. Defining scenario hazard maps helps to mitigate the consequences of future eruptions by anticipating the events that may occur. Lanzarote is an active volcanic island that hosted the largest (> 1.5 km3 DRE) and longest-lasting (6 years) eruption of the Canary Islands in historical times (the last 600 years): the Timanfaya eruption (1730-1736). This eruption caused severe economic losses and forced local people to migrate. In spite of these facts, no comprehensive hazard assessment or hazard maps have been developed for the island. In this work, we present an integrated long-term volcanic hazard evaluation using a systematic methodology that includes spatial analysis and simulations of the most probable eruptive scenarios.
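
    One common ingredient of such spatial analysis is a susceptibility surface estimated by kernel density over past vent locations. The sketch below assumes hypothetical vent coordinates and bandwidth; it is not the study's actual susceptibility model.

      # Sketch of a spatial susceptibility estimate via Gaussian kernel
      # density over past vent locations, a common ingredient of long-term
      # volcanic hazard mapping. Coordinates and bandwidth are hypothetical.
      import math

      vents = [(2.5, 1.0), (3.0, 1.4), (2.8, 0.9)]  # past vent coords (km)

      def susceptibility(x, y, vents, h=0.5):
          """Gaussian KDE at (x, y) with bandwidth h (km)."""
          k = 0.0
          for vx, vy in vents:
              d2 = (x - vx) ** 2 + (y - vy) ** 2
              k += math.exp(-d2 / (2.0 * h * h))
          return k / (2.0 * math.pi * h * h * len(vents))

      print(susceptibility(2.7, 1.1, vents))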

  9. Evolution of vulnerability of communities facing repeated hazards

    PubMed Central

    Guikema, Seth D.; Zhu, Laiyin; Igusa, Takeru

    2017-01-01

    The decisions that individuals make when recovering from and adapting to repeated hazards affect a region’s vulnerability in future hazards. As such, community vulnerability is not a static property but rather a dynamic property dependent on behavioral responses to repeated hazards and damage. This paper is the first of its kind to build a framework that addresses the complex interactions between repeated hazards, regional damage, mitigation decisions, and community vulnerability. The framework enables researchers and regional planners to visualize and quantify how a community could evolve over time in response to repeated hazards under various behavioral scenarios. An illustrative example using parcel-level data from Anne Arundel County, Maryland—a county that experiences fairly frequent hurricanes—demonstrates the methodology and shows how the interplay between individual choices and regional vulnerability is affected by the region’s hurricane experience. PMID:28953893

  10. Development of resting membrane potentials in differentiating murine neuroblastoma cells (N1E-115) evaluated by flow cytometry.

    PubMed

    Kisaalita, W S; Bowen, J M

    1997-09-01

    With the aid of a voltage-sensitive oxonol dye, flow cytometry was used to measure relative changes in the resting membrane potential (V(m)) and forward angle light scatter (FALS) profiles of a differentiating/differentiated murine neuroblastoma cell line (N1E-115). Electrophysiological differentiation was characterized by V(m) establishment. The V(m)-time profile was found to be seed cell concentration-dependent for cell densities of less than 2 × 10^4 cells/cm^2. At higher initial cell densities, under differentiating culture conditions, V(m) development commenced on day 2 and reached a steady state on day 12. The relative distribution of differentiated cells between low and high FALS has been proposed as a potential index of the electrophysiological differentiation state of a culture. These experiments offer a general methodology for characterizing cultured excitable cells of nervous system origin with respect to electrophysiological differentiation. This information is valuable in studies employing neuroblastoma cells as in vitro screening models for safety/hazard evaluation and/or risk assessment of therapeutic and industrial chemicals under development.

  11. Characterization of Buoyant Fluorescent Particles for Field Observations of Water Flows

    PubMed Central

    Tauro, Flavia; Aureli, Matteo; Porfiri, Maurizio; Grimaldi, Salvatore

    2010-01-01

    In this paper, the feasibility of off-the-shelf buoyant fluorescent microspheres as particle tracers in turbid water flows is investigated. Microspheres’ fluorescence intensity is experimentally measured and detected in placid aqueous suspensions of increasing concentrations of clay to simulate typical conditions occurring in natural drainage networks. Experiments are conducted in a broad range of clay concentrations and particle immersion depths by using photoconductive cells and image-based sensing technologies. Results obtained with both methodologies exhibit comparable trends and show that the considered particles are fairly detectable in critically turbid water flows. Further information on performance and integration of the studied microspheres in low-cost measurement instrumentation for field observations is obtained through experiments conducted in a custom built miniature water channel. This experimental characterization provides a first assessment of the feasibility of commercially available buoyant fluorescent beads in the analysis of high turbidity surface water flows. The proposed technology may serve as a minimally invasive sensing system for hazardous events, such as pollutant diffusion in natural streams and flash flooding due to extreme rainfall. PMID:22163540

  12. Characterization of buoyant fluorescent particles for field observations of water flows.

    PubMed

    Tauro, Flavia; Aureli, Matteo; Porfiri, Maurizio; Grimaldi, Salvatore

    2010-01-01

    In this paper, the feasibility of off-the-shelf buoyant fluorescent microspheres as particle tracers in turbid water flows is investigated. Microspheres' fluorescence intensity is experimentally measured and detected in placid aqueous suspensions of increasing concentrations of clay to simulate typical conditions occurring in natural drainage networks. Experiments are conducted in a broad range of clay concentrations and particle immersion depths by using photoconductive cells and image-based sensing technologies. Results obtained with both methodologies exhibit comparable trends and show that the considered particles are fairly detectable in critically turbid water flows. Further information on performance and integration of the studied microspheres in low-cost measurement instrumentation for field observations is obtained through experiments conducted in a custom built miniature water channel. This experimental characterization provides a first assessment of the feasibility of commercially available buoyant fluorescent beads in the analysis of high turbidity surface water flows. The proposed technology may serve as a minimally invasive sensing system for hazardous events, such as pollutant diffusion in natural streams and flash flooding due to extreme rainfall.

  13. Characterization of aerosols produced by cell sorters and evaluation of containment

    PubMed Central

    Holmes, Kevin L.

    2011-01-01

    In spite of the recognition by the flow cytometry community of the potential aerosol hazards associated with cell sorting, no previous study has thoroughly characterized the aerosols that can be produced by cell sorters. In this study, an Aerodynamic Particle Sizer was used to determine the concentration and aerodynamic diameter of aerosols produced by a FACS Aria II cell sorter under various conditions. Aerosol containment and evacuation were also evaluated using this novel methodology. The results showed that high concentrations of aerosols in the range of 1–3 μm can be produced in fail mode and that, with decreased sheath pressure, aerosol concentration decreased and aerodynamic diameter increased. Although the engineering controls of the FACS Aria II for containment were effective, sort chamber evacuation of aerosols following a simulated nozzle obstruction was ineffective. However, simple modifications to the FACS Aria II are described that greatly improved sort chamber aerosol evacuation. The results of this study will facilitate the risk assessment of cell sorting of potentially biohazardous samples by providing much-needed data regarding aerosol production and containment. PMID:22052694

  14. Critique on the use of the standardized avian acute oral toxicity test for first generation anticoagulant rodenticides

    USGS Publications Warehouse

    Vyas, Nimish B.; Rattner, Barnett A.

    2012-01-01

    Avian risk assessments for rodenticides are often driven by the results of standardized acute oral toxicity tests, without regard to a toxicant's mode of action and the time course of adverse effects. First-generation anticoagulant rodenticides (FGARs) generally require multiple feedings over several days to achieve a threshold concentration in tissue and cause adverse effects. This exposure regimen differs markedly from that used in the standardized acute oral toxicity test methodology. Median lethal dose values derived from standardized acute oral toxicity tests therefore underestimate the environmental hazard and risk of FGARs. Caution is warranted when FGAR toxicity, physiological effects, and pharmacokinetics derived from standardized acute oral toxicity testing are used for forensic confirmation of the cause of death in avian mortality incidents and when characterizing FGARs' risks to free-ranging birds.

  15. Simulation of Asymmetric Destabilization of Mine-void Rock Masses Using a Large 3D Physical Model

    NASA Astrophysics Data System (ADS)

    Lai, X. P.; Shan, P. F.; Cao, J. T.; Cui, F.; Sun, H.

    2016-02-01

    When mechanized sub-horizontal section top coal caving (SSTCC) is used as an underground mining method for exploiting extremely steep and thick coal seams (ESTCS), large-scale caving of the surrounding rock may occur violently and has the potential to induce asymmetric destabilization of mine voids. In this study, a methodology for assessing this destabilization was developed and applied to the Weihuliang coal mine in the Urumchi coalfield, China. Coal-rock mass and geological structure characterization were integrated with rock mechanics testing to assess the methodology and the factors influencing asymmetric destabilization. A porous rock-like composite material ensured accuracy in building a 3D geological physical model of mechanized SSTCC, combined with multiple means of real-time tracking and monitoring including acoustic emission, optical crack acquisition, roof separation observation, and close-range photogrammetry. An asymmetric 3D modeling analysis of the destabilization characteristics was completed. Data from the simulated hydraulic supports and buried pressure sensors provided effective information linked with the stress-strain behavior of the working face in ESTCS. The results of the 3D physical model experiments, combined with hybrid statistical methods, were effective for predicting dynamic hazards in ESTCS.

  16. Risk of fetal mortality after exposure to Listeria monocytogenes based on dose-response data from pregnant guinea pigs and primates.

    PubMed

    Williams, Denita; Castleman, Jennifer; Lee, Chi-Ching; Mote, Beth; Smith, Mary Alice

    2009-11-01

    One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration/the Food Safety Inspection Service of the U.S. Department of Agriculture/the Centers for Disease Control and Prevention (FDA/USDA/CDC) and the Food and Agricultural Organization/the World Health Organization (FAO/WHO) were based on dose-response data from mice. Recent animal studies using nonhuman primates and guinea pigs have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony-forming units (cfu). The FAO/WHO estimated a human LD50 of 1.9 × 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we looked at the mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 × 10^-1 compared to the FDA/USDA/CDC (fig. IV-12) predicted rate of 1.3 × 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC is underestimated for this susceptible population.
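
    The arithmetic of such a dose-response calculation can be sketched with a generic exponential model, with the parameter r back-calculated from an LD50 of roughly 10^7 cfu reported for the animal models. This is illustrative only and is not the dose-response curve fitted to the primate or guinea pig data.

      # Generic exponential dose-response calculation of the kind used in
      # microbial risk characterization. The parameter r is back-calculated
      # from an LD50 of ~1e7 cfu reported for the animal models; this is a
      # sketch, not the study's fitted primate or guinea pig curve.
      import math

      LD50 = 1e7                      # cfu
      r = math.log(2) / LD50          # exponential model: P = 1 - exp(-r * dose)

      for dose in (1e4, 1e6, 1e8):
          p = 1.0 - math.exp(-r * dose)
          print(f"dose {dose:.0e} cfu -> predicted mortality per serving {p:.2e}")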

  17. Grain-size analysis of volcanic ash for the rapid assessment of respiratory health hazard.

    PubMed

    Horwell, Claire J

    2007-10-01

    Volcanic ash has the potential to cause acute and chronic respiratory diseases if the particles are sufficiently fine to enter the respiratory system. Characterization of the grain-size distribution (GSD) of volcanic ash is, therefore, a critical first step in assessing its health hazard. Quantification of health-relevant size fractions is challenging without state-of-the-art technology, such as the laser diffractometer. Here, several methods for GSD characterization for health assessment are considered, the potential for low-cost measurements is investigated, and the first database of health-pertinent GSD data is presented for a suite of ash samples from around the world. Methodologies for accurate measurement of the GSD of volcanic ash by laser diffraction are presented, based on experimental analysis of optimal refractive indices for different magmatic compositions. Techniques for representative sampling of small quantities of ash are also experimentally investigated. GSD results for health-pertinent fractions for a suite of 63 ash samples show that the fraction of respirable (<4 µm) material ranges from 0 to 17 vol%, with the variation reflecting factors such as the style of the eruption and the distance from the source. A strong correlation between the amounts of <4 µm and <10 µm material is observed for all ash types. This relationship is stable at all distances from the volcano and for all eruption styles and can be applied to volcanic plume and ash fallout models. A weaker relationship between the <4 µm and <63 µm fractions provides a novel means of estimating the quantity of respirable material from data obtained by sieving.
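
    The reported correlation suggests a simple regression for estimating the respirable fraction from a coarser size fraction. The sketch below uses invented sample values, not the paper's 63-sample database or its fitted coefficients.

      # Sketch of the kind of regression behind estimating the respirable
      # (<4 um) fraction from a coarser fraction. The sample values below
      # are invented; the paper's database is not reproduced here.
      import numpy as np

      sub10 = np.array([5.0, 12.0, 20.0, 30.0, 40.0])   # vol% < 10 um (hypothetical)
      sub4 = np.array([1.8, 4.5, 7.6, 11.2, 15.1])      # vol% < 4 um (hypothetical)

      slope, intercept = np.polyfit(sub10, sub4, 1)
      print(f"respirable ~= {slope:.2f} * sub10 + {intercept:.2f}")
      print("predicted respirable at 25 vol% <10 um:", slope * 25 + intercept)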

  18. An experimental system for flood risk forecasting at global scale

    NASA Astrophysics Data System (ADS)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude, the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructure and cities. To further increase the reliability of the proposed methodology, we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.

  19. Fragility Analysis of Concrete Gravity Dams

    NASA Astrophysics Data System (ADS)

    Tekie, Paulos B.; Ellingwood, Bruce R.

    2002-09-01

    Concrete gravity dams are an important part of the nation's infrastructure. Many dams have been in service for over 50 years, during which time important advances in the methodologies for evaluating natural phenomena hazards have caused the design-basis events to be revised upward, in some cases significantly. Many existing dams fail to meet these revised safety criteria, and structural rehabilitation to meet newly revised criteria may be costly and difficult. A probabilistic safety analysis (PSA) provides a rational safety assessment and decision-making tool for managing the various sources of uncertainty that may impact dam performance. Fragility analysis, which depicts the uncertainty in the safety margin above specified hazard levels, is a fundamental tool in a PSA. This study presents a methodology for developing fragilities of concrete gravity dams to assess their performance against hydrologic and seismic hazards. Models of varying degrees of complexity and sophistication were considered and compared. The methodology is illustrated using the Bluestone Dam on the New River in West Virginia, which was designed in the late 1930s. The hydrologic fragilities showed that the Bluestone Dam is unlikely to become unstable at the revised probable maximum flood (PMF), but it is likely that there will be significant cracking at the heel of the dam. On the other hand, the seismic fragility analysis indicated that sliding is likely if the dam were subjected to a maximum credible earthquake (MCE). Moreover, there would likely be tensile cracking at the neck of the dam at this level of seismic excitation. Probabilities of relatively severe limit states appear to be only marginally affected by extremely rare events (e.g. the PMF and MCE). Moreover, the risks posed by extreme floods and earthquakes were not balanced for the Bluestone Dam, with the seismic hazard posing a relatively higher risk.
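
    A fragility curve of the kind used in such a PSA is commonly modeled as a lognormal distribution of capacity; the sketch below uses an illustrative median and dispersion, not the values derived for the Bluestone Dam.

      # Standard lognormal fragility curve: probability of reaching a damage
      # state given a hazard intensity measure. Median capacity and dispersion
      # below are illustrative, not the Bluestone Dam values.
      from math import log
      from statistics import NormalDist

      def fragility(im, median, beta):
          """P(damage state | intensity measure im), lognormal model."""
          return NormalDist().cdf(log(im / median) / beta)

      # e.g. probability of sliding at several peak ground accelerations (g)
      for pga in (0.1, 0.3, 0.6):
          print(pga, fragility(pga, median=0.4, beta=0.5))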

  20. Describing the association between socioeconomic inequalities and cancer survival: methodological guidelines and illustration with population-based data.

    PubMed

    Belot, Aurélien; Remontet, Laurent; Rachet, Bernard; Dejardin, Olivier; Charvat, Hadrien; Bara, Simona; Guizard, Anne-Valérie; Roche, Laurent; Launoy, Guy; Bossard, Nadine

    2018-01-01

    Describing the relationship between socioeconomic inequalities and cancer survival is important but methodologically challenging. We propose guidelines for addressing these challenges and illustrate their implementation on French population-based data. We analyzed 17 cancers. Socioeconomic deprivation was measured by an ecological measure, the European Deprivation Index (EDI). The Excess Mortality Hazard (EMH), i.e., the mortality hazard among cancer patients after accounting for other causes of death, was modeled using a flexible parametric model, allowing for a nonlinear and/or time-dependent association between the EDI and the EMH. The model included a cluster-specific random effect to deal with the hierarchical structure of the data. We reported the conventional age-standardized net survival (ASNS) and described the changes of the EMH over the time since diagnosis at different levels of deprivation. We illustrated nonlinear and/or time-dependent associations between the EDI and the EMH by plotting the excess hazard ratio according to EDI values at different times after diagnosis. The median excess hazard ratio quantified the general contextual effect. Lip-oral cavity-pharynx cancer in men showed the widest deprivation gap, with 5-year ASNS at 41% and 29% for deprivation quintiles 1 and 5, respectively, and we found a nonlinear association between the EDI and the EMH. The EDI accounted for a substantial part of the general contextual effect on the EMH. The association between the EDI and the EMH was time-dependent in stomach and pancreas cancers in men and in cervix cancer. The methodological guidelines proved efficient in describing the way socioeconomic inequalities influence cancer survival. Their use would allow comparisons between different health care systems.
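
    The excess-hazard idea itself is compact: the excess mortality hazard is the all-cause hazard observed among patients minus the expected background hazard from population life tables. The sketch below illustrates this with invented numbers; the paper fits a flexible parametric model with a cluster-specific random effect instead.

      # Minimal illustration of the excess mortality hazard idea. Numbers
      # are invented; the study's flexible parametric model is far richer.

      def excess_hazard(observed_hazard, expected_hazard):
          """EMH(t) = h_obs(t) - h_exp(t); clipped at zero for illustration."""
          return max(0.0, observed_hazard - expected_hazard)

      # hypothetical hazards at 1 year after diagnosis (per person-year)
      h_obs, h_exp = 0.25, 0.02
      print(f"EMH at 1 year: {excess_hazard(h_obs, h_exp):.2f} per person-year")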

  1. Installation Restoration Program. Phase 2. Confirmation/Quantification. Stage 1. Volume 2.

    DTIC Science & Technology

    1986-10-01

    Fragmentary abstract text; the legible portions concern projections of doses received daily over a lifetime, non-feeding experimental studies, and evaluation of health effects against the highest no-observed-adverse-effect level for a non-carcinogenic endpoint. Recoverable glossary entries: HARDFILL: disposal sites receiving construction debris, wood, and miscellaneous spoil material; HARM: Hazard Assessment Rating Methodology.

  2. Methodology for prediction of rip currents using a three-dimensional numerical, coupled, wave current model

    USGS Publications Warehouse

    Voulgaris, George; Kumar, Nirnimesh; Warner, John C.; Leatherman, Stephen; Fletemeyer, John

    2011-01-01

    Rip currents constitute one of the most common hazards in the nearshore, threatening the lives of the unaware public making recreational use of the coastal zone. Society responds to this danger through a number of measures that include: (a) the deployment of trained lifeguards; (b) public education related to the hidden hazards of the nearshore; and (c) the establishment of warning systems.

  3. Mapping Natech risk due to earthquakes using RAPID-N

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2013-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential for release of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU, the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, which estimates the overall risk of natural-hazard impact on industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps that can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, the use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of the spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry and rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of the U.S. EPA's Risk Management Program (RMP) Guidance for Offsite Consequence Analysis methodology. This custom implementation is based on the property estimation framework and allows the easy modification of model parameters and the substitution of equations with alternatives. RAPID-N can be applied at different stages of the Natech risk management process: on the one hand, it allows the analysis of hypothetical Natech scenarios, supporting prevention of and preparation for Natech accidents through land-use and emergency planning. On the other hand, once a natural disaster occurs, RAPID-N can be used to rapidly locate facilities with potential Natech accident damage based on actual natural-hazard information. This provides a means to warn the population in the vicinity of the facilities in a timely manner. This presentation will introduce the specific features of RAPID-N and show the use of the tool by application to a case-study area.
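
    The core probabilistic chain of such a tool can be sketched as a fragility evaluation followed by a conditional release probability. The sketch below is a conceptual illustration with hypothetical numbers, not RAPID-N code.

      # Conceptual sketch of chaining a plant-unit fragility curve with a
      # conditional release probability to get a Natech scenario probability,
      # in the spirit of the RAPID-N workflow. All numbers are hypothetical.
      from math import log
      from statistics import NormalDist

      def damage_prob(pga, median, beta):
          """Lognormal fragility: P(damage state | on-site PGA)."""
          return NormalDist().cdf(log(pga / median) / beta)

      p_damage = damage_prob(pga=0.35, median=0.5, beta=0.6)   # e.g. tank shell failure
      p_release_given_damage = 0.8                              # conditional risk state
      p_natech = p_damage * p_release_given_damage
      print(f"P(hazmat release scenario) = {p_natech:.3f}")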

  4. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  5. Hydrological modelling for flood forecasting: Calibrating the post-fire initial conditions

    NASA Astrophysics Data System (ADS)

    Papathanasiou, C.; Makropoulos, C.; Mimikou, M.

    2015-10-01

    Floods and forest fires are two of the most devastating natural hazards, with severe socioeconomic, environmental and aesthetic impacts on the affected areas. Traditionally, these hazards are examined from different perspectives and are thus investigated through different, independent systems, overlooking the fact that they are tightly interrelated phenomena. In fact, the same flood event is more severe, i.e. associated with increased runoff discharge and peak flow and decreased time to peak, if it occurs over a burnt area rather than over land not affected by fire. Mediterranean periurban areas, where forests covered with flammable vegetation coexist with agricultural land and urban zones, are typical areas particularly prone to the combined impact of floods and forest fires. Hence, the accurate assessment and effective management of post-fire flood risk become a priority. The research presented in this paper aims to develop a robust methodological framework, using state-of-the-art tools and modern technologies, to support the estimation of the change over time of five representative hydrological parameters under post-fire conditions. The proposed methodology considers both longer- and short-term initial conditions in order to assess the dynamic evolution of the selected parameters. The research focuses on typical Mediterranean periurban areas that are subject to both hazards and concludes with a set of equations that relate post-fire to pre-fire conditions for five Fire Severity (FS) classes and three soil moisture states. The methodology has been tested for several flood events in the Rafina catchment, a periurban catchment in Eastern Attica (Greece). In order to validate the methodology, simulated hydrographs were produced and compared against available observed data. The results indicate a close convergence of observed and simulated flows. The proposed methodology is particularly flexible and thus easily adaptable to catchments with similar hydrometeorological and geomorphological features.
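
    One simple way to encode such pre-fire/post-fire relations is to scale a hydrological parameter, for instance the SCS curve number, by a fire-severity-dependent factor. The factors below are hypothetical placeholders, not the equations derived in the paper.

      # Illustrative post-fire adjustment of the SCS curve number (CN) by
      # fire severity class. The adjustment factors are hypothetical; the
      # paper derives its own equations per severity class and soil
      # moisture state.

      FS_FACTOR = {1: 1.02, 2: 1.05, 3: 1.10, 4: 1.15, 5: 1.20}  # hypothetical

      def post_fire_cn(cn_pre, fire_severity):
          """Scale the pre-fire CN upward with severity, capped at 100."""
          return min(100.0, cn_pre * FS_FACTOR[fire_severity])

      for fs in range(1, 6):
          print(f"FS {fs}: CN {post_fire_cn(72.0, fs):.1f}")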

  6. Uncertainty in natural hazards, modeling and decision support: An introduction to this volume [Chapter 1

    Treesearch

    Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde

    2017-01-01

    Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...

  7. The work environment disability-adjusted life year for use with life cycle assessment: a methodological approach.

    PubMed

    Scanlon, Kelly A; Gray, George M; Francis, Royce A; Lloyd, Shannon M; LaPuma, Peter

    2013-03-06

    Life cycle assessment (LCA) is a systems-based method used to determine potential impacts to the environment associated with a product throughout its life cycle. Conclusions from LCA studies can be applied to support decisions regarding product design or public policy; therefore, all relevant inputs (e.g., raw materials, energy) and outputs (e.g., emissions, waste) of the product system should be evaluated to estimate impacts. Currently, work-related impacts are not routinely considered in LCA. The objectives of this paper are to: 1) introduce the work environment disability-adjusted life year (WE-DALY), one portion of a characterization factor used to express the magnitude of impacts to human health attributable to work-related exposures to workplace hazards; 2) outline the methods for calculating the WE-DALY; 3) demonstrate the calculation; and 4) highlight strengths and weaknesses of the methodological approach. The concept of the WE-DALY and the methodological approach to its calculation are grounded in the World Health Organization's disability-adjusted life year (DALY). Like the DALY, the WE-DALY equation considers the years of life lost due to premature mortality and the years of life lived with disability to estimate the total number of years of healthy life lost in a population. The equation requires input in the form of the number of fatal and nonfatal injuries and illnesses that occur in the industries relevant to the product system evaluated in the LCA study, the age of the worker at the time of the fatal or nonfatal injury or illness, the severity of the injury or illness, and the duration of time lived with its outcomes. The methodological approach for the WE-DALY requires data from various sources, multi-step instructions to determine each variable used in the WE-DALY equation, and assumptions based on professional opinion. The results support the use of the WE-DALY in a characterization factor in LCA. Integrating occupational health into LCA studies will provide opportunities to prevent the shifting of impacts between the work environment and the environment external to the workplace and to co-optimize human health, including worker health, and environmental health.
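
    The underlying DALY arithmetic is compact: healthy life years lost are the sum of years of life lost to premature mortality (YLL) and years lived with disability (YLD). The sketch below illustrates it with invented injury counts; it is not the full WE-DALY procedure.

      # Minimal sketch of the DALY arithmetic underlying the WE-DALY:
      # years of life lost (YLL) plus years lived with disability (YLD).
      # Input numbers are invented for illustration.

      def yll(deaths, life_expectancy_remaining):
          return deaths * life_expectancy_remaining

      def yld(cases, disability_weight, duration_years):
          return cases * disability_weight * duration_years

      # hypothetical work-related injuries in one life cycle stage
      total = (yll(deaths=2, life_expectancy_remaining=35.0)
               + yld(cases=120, disability_weight=0.1, duration_years=0.5))
      print(f"WE-DALY contribution: {total:.1f} healthy life years lost")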

  8. Analysis of occupational health hazards and associated risks in fuzzy environment: a case research in an Indian underground coal mine.

    PubMed

    Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar

    2017-09-01

    This paper presents a unique hierarchical structure of the various occupational health hazards, including physical, chemical, biological, ergonomic and psychosocial hazards, and the associated adverse consequences in relation to an underground coal mine. The study proposes a systematic health hazard risk assessment methodology for estimating the extent of hazard risk using three important measuring parameters: consequence of exposure, period of exposure and probability of exposure. An improved decision-making method using fuzzy set theory is employed for converting linguistic data into numeric risk ratings. The concept of the 'centre of area' method for generalized triangular fuzzy numbers is explored to quantify the 'degree of hazard risk' in terms of crisp ratings. Finally, a logical framework for categorizing health hazards into different risk levels is constructed on the basis of distinguished ranges of the evaluated (crisp) risk ratings. Subsequently, an action requirement plan is suggested, which could provide guidance to managers for successfully managing health hazard risks in the context of underground coal mining.
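
    The kind of calculation involved can be sketched as follows: the three measuring parameters expressed as triangular fuzzy numbers, combined by approximate fuzzy multiplication and defuzzified by the centre-of-area rule. The linguistic-scale values are hypothetical.

      # Sketch of a fuzzy risk rating: consequence, exposure period and
      # exposure probability as triangular fuzzy numbers (TFNs), combined by
      # approximate TFN multiplication and defuzzified by the centroid
      # ('centre of area'). Linguistic-scale values are hypothetical.

      def tfn_mul(x, y):
          """Approximate product of positive TFNs (component-wise)."""
          return tuple(a * b for a, b in zip(x, y))

      def centroid(t):
          """Centre-of-area defuzzification of a TFN (a, m, b)."""
          return sum(t) / 3.0

      consequence = (5.0, 7.0, 9.0)   # 'high' on a 0-10 linguistic scale
      exposure = (3.0, 5.0, 7.0)      # 'medium'
      probability = (0.3, 0.5, 0.7)   # 'possible'

      risk = tfn_mul(tfn_mul(consequence, exposure), probability)
      print(f"crisp risk rating: {centroid(risk):.1f}")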

  9. Quantitative Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helms, J.

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  10. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  11. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-06-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested, and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach can be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. The temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of the high and medium potential risk levels, based on a unit hypothetical release (e.g. 1 Bq), are performed. The analysis showed that possible deposition fractions of 10^-11 (Bq/m2) over the Kola Peninsula, and 10^-12 to 10^-13 (Bq/m2) for remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  12. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-03-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested, and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach can be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. The temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of the high and medium potential risk levels, based on a unit hypothetical release, are performed. The analysis showed that possible deposition fractions of 10^-11 over the Kola Peninsula, and 10^-12 to 10^-13 for remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  13. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey; et al.

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  14. A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.

    PubMed

    Das, Arup; Gupta, A K; Mazumder, T N

    2012-08-15

    A framework for risk assessment due to the offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident involving a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to the population associated with the offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts of a volatile cloud explosion based on the TNO Multi-Energy model. The methodology also estimates the vulnerable population in terms of disability-adjusted life years (DALY), which take into consideration the demographic profile of the population and the degree of mortality and morbidity sustained. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Volcanic hazard assessment in western Europe

    NASA Astrophysics Data System (ADS)

    Chester, David K.; Dibben, Christopher J. L.; Duncan, Angus M.

    2002-06-01

    Volcanology has been in the past, and in many respects remains, a subject dominated by pure research grounded in the earth sciences. Over the past 30 years a paradigm shift has occurred in hazard assessment, aided by significant changes in the social theory of natural hazards and by the first-hand experience gained in the 1990s by volcanologists working on projects conceived during the International Decade for Natural Disaster Reduction (IDNDR). Today much greater stress is placed on human vulnerability, the potential for marginalisation of disadvantaged individuals and social groups, and the requirement to make applied volcanology sensitive to the characteristics of local demography, economy, culture and politics. During the IDNDR a methodology, broadly similar to environmental impact analysis, emerged as the preferred method for studying human vulnerability and risk assessment in volcanically active regions. The characteristics of this new methodology are discussed, and the progress made in applying it to the European Union laboratory volcanoes located in western Europe is reviewed. Furnas (São Miguel, Azores) and Vesuvius in Italy are used as detailed case studies.

  16. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  17. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  18. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices. PMID:26301216

  19. Nanomaterial characterization: considerations and needs for hazard assessment and safety evaluation.

    PubMed

    Boverhof, Darrell R; David, Raymond M

    2010-02-01

    Nanotechnology is a rapidly emerging field of great interest and promise. As new materials are developed and commercialized, hazard information also needs to be generated to reassure regulators, workers, and consumers that these materials can be used safely. The biological properties of nanomaterials are closely tied to the physical characteristics, including size, shape, dissolution rate, agglomeration state, and surface chemistry, to name a few. Furthermore, these properties can be altered by the medium used to suspend or disperse these water-insoluble particles. However, the current toxicology literature lacks much of the characterization information that allows toxicologists and regulators to develop "rules of thumb" that could be used to assess potential hazards. To effectively develop these rules, toxicologists need to know the characteristics of the particle that interacts with the biological system. This void leaves the scientific community with no options other than to evaluate all materials for all potential hazards. Lack of characterization could also lead to different laboratories reporting discordant results on seemingly the same test material because of subtle differences in the particle or differences in the dispersion medium used that resulted in altered properties and toxicity of the particle. For these reasons, good characterization using a minimal characterization data set should accompany and be required of all scientific publications on nanomaterials.

  20. Program and plans of the U.S. Geological Survey for producing information needed in National Seismic hazards and risk assessment, fiscal years 1980-84

    USGS Publications Warehouse

    Hays, Walter W.

    1979-01-01

    In accordance with the provisions of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124), the U.S. Geological Survey has developed comprehensive plans for producing information needed to assess seismic hazards and risk on a national scale in fiscal years 1980-84. These plans are based on a review of the needs of Federal Government agencies, State and local government agencies, engineers and scientists engaged in consulting and research, professional organizations and societies, model code groups, and others. The Earthquake Hazards Reduction Act provided an unprecedented opportunity for participation in a national program by representatives of State and local governments, business and industry, the design professions, and the research community. The USGS and the NSF (National Science Foundation) have major roles in the national program. The ultimate goal of the program is to reduce losses from earthquakes. Implementation of USGS research in the Earthquake Hazards Reduction Program requires the close coordination of responsibility between Federal, State and local governments. The projected research plan in national seismic hazards and risk for fiscal years 1980-84 will be accomplished by USGS and non-USGS scientists and engineers. The latter group will participate through grants and contracts. The research plan calls for (1) national maps based on existing methods, (2) improved definition of earthquake source zones nationwide, (3) development of improved methodology, (4) regional maps based on the improved methodology, and (5) post-earthquake investigations. Maps and reports designed to meet the needs, priorities, concerns, and recommendations of various user groups will be the products of this research and provide the technical basis for improved implementation.

  1. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that the highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
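
    The landslide index step can be sketched compactly: each factor class is weighted by the log of its landslide density relative to the map-wide density. The cell counts below are invented for illustration.

      # Sketch of the landslide index method: each factor class gets a weight
      # equal to the log of its landslide density relative to the map-wide
      # density. Cell counts below are invented.
      import math

      classes = {                      # class -> (landslide cells, total cells)
          "steep_slope": (80, 1000),
          "moderate_slope": (30, 2000),
          "gentle_slope": (10, 3000),
      }
      tot_ls = sum(ls for ls, _ in classes.values())
      tot_cells = sum(n for _, n in classes.values())
      overall = tot_ls / tot_cells

      for name, (ls, n) in classes.items():
          w = math.log((ls / n) / overall)
          print(f"{name}: weight {w:+.2f}")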

  2. An operational-oriented approach to the assessment of low probability seismic ground motions for critical infrastructures

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose

    2018-01-01

    Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat: sudden-onset events with limited, if any, forecasting capability and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for managing these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. Interest focuses mainly on ground motions exceeding the original design values, which should correspond to low-probability occurrences. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows long-duration synthetic earthquake catalogs to be built and low-probability amplitudes to be derived. This approach does not affect the mean hazard values and yields a representation of maximum amplitudes that follow a generalized extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probabilities of exceedance arising from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs, including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The results provide an overview of the effects of different hazard modeling inputs on the generated extreme-motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide timely insight for making informed risk-management and regulatory decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against the benefits of the measures in question.
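
    A minimal sketch of the Monte Carlo idea described above: simulate a long synthetic catalog (Poisson occurrence, Gutenberg-Richter magnitudes), record one peak amplitude per year via a toy ground-motion proxy, and fit a generalized extreme-value distribution to the annual maxima. Every parameter value and the amplitude proxy are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def synthetic_annual_maxima(years, rate=5.0, b=1.0, m_min=4.5):
        """One peak amplitude per simulated year from a Poisson/G-R catalog."""
        maxima = []
        for _ in range(years):
            n = rng.poisson(rate)
            if n == 0:
                maxima.append(np.nan)
                continue
            # G-R magnitudes above m_min: exponential with rate b*ln(10)
            mags = m_min + rng.exponential(1 / (b * np.log(10)), size=n)
            # toy amplitude proxy: log-amplitude grows with magnitude plus noise
            amps = 10 ** (0.5 * mags - 3 + 0.3 * rng.standard_normal(n))
            maxima.append(amps.max())
        return np.array(maxima)

    ann_max = synthetic_annual_maxima(10_000)
    ann_max = ann_max[~np.isnan(ann_max)]
    shape, loc, scale = stats.genextreme.fit(ann_max)
    # amplitude with 1e-4 annual exceedance probability (10,000-yr motion)
    print(stats.genextreme.isf(1e-4, shape, loc, scale))
    ```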

  3. How well do we understand oil spill hazard mapping?

    NASA Astrophysics Data System (ADS)

    Sepp Neves, Antonio Augusto; Pinardi, Nadia

    2017-04-01

    In simple terms, the marine oil spill hazard can be described as related to three main factors: the spill event itself, the spill trajectory, and the arrival and adsorption of oil on the shore, or beaching. Regarding the first factor, spill occurrence rates and magnitude distributions, together with their uncertainties, have been estimated mainly from maritime casualty reports. Abascal et al. (2010) and Sepp Neves et al. (2015) demonstrated for the Prestige (Spain, 2002) and Jiyeh (Lebanon, 2006) spills that ensemble numerical oil spill simulations can generate reliable estimates of the most likely oil trajectories and impacted coasts. Although paramount to estimating spill impacts on coastal resources, the third component of the oil spill hazard (i.e., oil beaching) is still a subject of discussion. Analysts have employed different methodologies to estimate the coastal component of the hazard, relying, for instance, on the beaching frequency alone, the time during which a given coastal segment is subject to oil concentrations above a preset threshold, percentages of oil beached relative to the original spilled volume, and many others. Obviously, the results are not comparable and are sometimes inconsistent with present knowledge about the environmental impacts of oil spills. The observed inconsistency in hazard mapping methodologies suggests that the beaching component of the oil spill hazard is still poorly understood. A careful statistical description of the beaching process could finally set a common ground for oil spill hazard mapping studies, as observed for other hazards such as earthquakes and landslides. This paper is the last in a series of efforts to standardize oil spill hazard and risk assessments through an ISO-compliant framework (IT-OSRA; see Sepp Neves et al. (2015)). We performed two large ensemble oil spill experiments addressing uncertainties in the spill characteristics and location and in metocean conditions for two different areas (Algarve and Uruguay), aiming to quantify the hazard due to accidental (rare events involving large volumes) and operational (frequent events usually involving small volumes) spills associated with maritime traffic. In total, over 60,000 240-h-long simulations were run, and the statistical behavior of the beached concentrations was described. The concentration distributions for both study areas were successfully fit with a Gamma distribution, demonstrating the generality of our conclusions. The oil spill hazard and its uncertainties were quantified for accidental and operational events from the fitted distribution parameters. The hazard estimates were therefore comparable between areas and allowed priority coastal segments for protection to be identified and sources of hazard to be ranked.
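
    The Gamma fit reported above can be reproduced in outline with scipy. The synthetic concentrations below stand in for the beached-oil outputs of the ensemble runs, and the shape/scale values are arbitrary placeholders.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # stand-in for beached oil concentrations from the ensemble runs
    conc = rng.gamma(shape=0.8, scale=50.0, size=5000)

    # fit a Gamma distribution with location pinned at zero,
    # since concentrations are non-negative
    a, loc, scale = stats.gamma.fit(conc, floc=0)

    # a hazard-style summary: concentration exceeded in 1% of simulations
    c99 = stats.gamma.isf(0.01, a, loc=loc, scale=scale)
    print(a, scale, c99)
    ```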

  4. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... reduce risks from natural hazards and serves as a guide for State decision makers as they commit resources to reducing the effects of natural hazards. (b) Planning process. An effective planning process is... risk assessments must characterize and analyze natural hazards and risks to provide a statewide...

  5. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, with a minor percentage represented by all other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can differ or even invert, depending on the kinds of potential tsunamigenic sources that characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e., pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations hold, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task that will provide a tsunami risk quantification for this city in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  6. Reduced Risk of Importing Ebola Virus Disease because of Travel Restrictions in 2014: A Retrospective Epidemiological Modeling Study

    PubMed Central

    Otsuki, Shiori

    2016-01-01

    Background: An epidemic of Ebola virus disease (EVD) from 2013-16 posed a serious risk of global spread during its early growth phase. A post-epidemic evaluation of the effectiveness of travel restrictions has yet to be conducted. The present study aimed to estimate the effectiveness of travel restrictions in reducing the risk of importation from mid-August to September 2014, using a simple hazard-based statistical model. Methodology/Principal Findings: The hazard rate was modeled as an inverse function of the effective distance, an excellent predictor of disease spread, calculated from the airline transportation network. By analyzing datasets of the dates of EVD case importation from 15 July to 15 September 2014, and assuming that the network structure changed from 8 August 2014 because of travel restrictions, the parameters characterizing the hazard rate were estimated. The absolute and relative risk reductions due to travel restrictions were estimated to be less than 1% and about 20%, respectively, for all models tested. Effectiveness estimates among African countries were greater than those for countries outside Africa. Conclusions: The travel restrictions were not effective enough to prevent the global spread of Ebola virus disease. It is more efficient to control the spread of disease locally during the early phase of an epidemic than to attempt to control it at international borders. Capacity building for local containment and coordinated, expedited international cooperation are essential to reduce the risk of global transmission. PMID:27657544
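
    As a hedged illustration of the model class described above, the sketch below treats the importation hazard rate for each country as inversely proportional to its effective distance and compares importation probabilities with and without travel restrictions. The constant k, the exact 1/d functional form, and all distances are hypothetical stand-ins, not the paper's fitted values.

    ```python
    import numpy as np

    def importation_prob(d_eff, k, t_days):
        """Probability of at least one imported case by time t, with a
        constant hazard rate lam = k / effective distance (illustrative)."""
        lam = k / np.asarray(d_eff, dtype=float)  # hazard per day
        return 1.0 - np.exp(-lam * t_days)

    # hypothetical effective distances before/after travel restrictions
    before = {"UK": 6.0, "US": 7.5, "Ghana": 3.2}
    after = {"UK": 7.0, "US": 8.6, "Ghana": 3.6}
    for c in before:
        p0 = importation_prob(before[c], k=0.05, t_days=30)
        p1 = importation_prob(after[c], k=0.05, t_days=30)
        print(c, round(float(p0 - p1), 4))  # absolute risk reduction
    ```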

  7. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur to define the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all potential tsunamigenic sources, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  8. Geomorphological hazards and environmental impact: Assessment and mapping

    NASA Astrophysics Data System (ADS)

    Panizza, Mario

    In five sections the author develops the methods for the integration of geomorphological concepts into Environmental Impact and Mapping. The first section introduces the concepts of Impact and Risk through the relationships between Geomorphological Environment and Anthropical Element. The second section proposes a methodology for the determination of Geomorphological Hazard and the identification of Geomorphological Risk. The third section synthesizes the procedure for the compilation of a Geomorphological Hazards Map. The fourth section outlines the concepts of Geomorphological Resource Assessment for the analysis of the Environmental Impact. The fifth section considers the contribution of geomorphological studies and mapping in the procedure for Environmental Impact Assessment.

  9. Camera Image Transformation and Registration for Safe Spacecraft Landing and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Jones, Brandon M.

    2005-01-01

    Inherent geographical hazards of Martian terrain may impede a safe landing for science exploration spacecraft. Surface visualization software for hazard detection and avoidance may accordingly be applied in vehicles such as the Mars Exploration Rover (MER) to enable an autonomous, intelligent descent upon entering the planetary atmosphere. The focus of this project is to develop an image transformation algorithm for coordinate system matching between consecutive frames of terrain imagery taken throughout the descent. The methodology involves integrating computer vision and graphics techniques, including affine transformation and projective geometry of an object, with the intrinsic parameters governing spacecraft dynamic motion and camera calibration.

  10. Alert generation and cockpit presentation for an integrated microburst alerting system

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Hansman, R. John, Jr.

    1991-01-01

    Alert generation and cockpit presentation issues for low level wind shear (microburst) alerts are investigated. Alert generation issues center on the development of a hazard criterion which allows integration of both ground based and airborne wind shear detection systems to form an accurate picture of the aviation hazard posed by a particular wind shear situation. A methodology for testing hazard criteria through flight simulation has been developed and used to examine the effectiveness and feasibility of several possible criteria. Also, an experiment to evaluate candidate graphical cockpit displays for microburst alerts using a piloted simulator has been designed.

  11. A conflict model for the international hazardous waste disposal dispute.

    PubMed

    Hu, Kaixian; Hipel, Keith W; Fang, Liping

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  12. Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajiv N. Bhatt

    2003-02-01

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK of the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimating the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  13. Radiological Characterization Methodology for INEEL-Stored Remote-Handled Transuranic (RH TRU) Waste from Argonne National Laboratory-East

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuan, P.; Bhatt, R.N.

    2003-01-14

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK of the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimating the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  14. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application to an alpine test site. Mechanical approaches represent one solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take data uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. The program integrates mechanical stability analysis within GIS software, taking data uncertainty into account. The method provides a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, expert judgment is still necessary to finalize the maps; it is the only way to take into account influential factors in slope stability such as the heterogeneity of geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones according to geotechnical properties and hydrological contexts varying in time. This communication, realized within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".

  15. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
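
    For concreteness, a toy rendering of the multiplicative WMO-style form quoted above. Treating the inputs as pre-normalized indices in [0, 1] is an illustrative convention, not something prescribed by the cited sources.

    ```python
    def risk_index(hazard, vulnerability, exposure):
        """Multiplicative form of Crichton's risk triangle:
        Risk = Hazard x Vulnerability x Exposure (inputs in [0, 1])."""
        return hazard * vulnerability * exposure

    # toy comparison of two coastal blocks (illustrative numbers only):
    # same hazard and vulnerability, different exposure
    print(risk_index(hazard=0.8, vulnerability=0.6, exposure=0.9))
    print(risk_index(hazard=0.8, vulnerability=0.6, exposure=0.3))
    ```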

  16. 32 CFR 644.520 - Contaminated industrial property.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... hazardous materials reasonably possible to detect either by present state-of-the-art methodology or by a visual inspection. (5) Recommendation as to whether the land or structures may be used for any purpose...

  17. 32 CFR 644.520 - Contaminated industrial property.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... hazardous materials reasonably possible to detect either by present state-of-the-art methodology or by a visual inspection. (5) Recommendation as to whether the land or structures may be used for any purpose...

  18. 32 CFR 644.520 - Contaminated industrial property.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... hazardous materials reasonably possible to detect either by present state-of-the-art methodology or by a visual inspection. (5) Recommendation as to whether the land or structures may be used for any purpose...

  19. [Political dimensions of an epidemic: the case of influenza A (H1N1) in the Argentine press].

    PubMed

    Sy, Anahi; Spinelli, Hugo

    2016-03-01

    The current study addresses social representations of the influenza A (H1N1) epidemic in Argentina in 2009, in the country's mainstream newspapers. The methodology was twofold, qualitative and quantitative, with an analysis of two dimensions: the construction of the epidemic as an "object" (designation and characterization) and the sources of information in the news stories, seeking to identify the social actors involved in each case. The results show that designating the epidemic as "H1N1" rather than "swine flu" was a conscious political decision to exempt a hazardous form of livestock production from its role in the disease, while focusing responsibility on individual patients. The study addresses the relations between recommendations by policy spokespersons (especially at the international level), the pharmaceuticalization of the epidemic, shifting of the population's demands to validate biomedical hegemony, and local press coverage of the epidemic.

  20. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  1. ABM and GIS-based multi-scenarios volcanic evacuation modelling of Merapi

    NASA Astrophysics Data System (ADS)

    Jumadi; Carver, Steve; Quincey, Duncan

    2016-05-01

    Conducting an effective evacuation is one of the keys to dealing with such a crisis. A plan is therefore needed that considers the probability of the spatial extent of hazard occurrences. An evacuation plan for Merapi was already prepared before the 2010 eruption; however, the plan could not be executed because the eruption magnitude was larger than predicted, and the hazardous area extended beyond that of the prepared hazard model. Managing such an unpredicted situation requires adequate information that is flexible and adaptable to the current situation. We therefore applied an Agent-based Model (ABM) and a Geographic Information System (GIS), using a multi-scenario hazard model, to support evacuation management. The methodology and a case study for Merapi are provided.

  2. Hazardous waste management and weight-based indicators--the case of Haifa Metropolis.

    PubMed

    Elimelech, E; Ayalon, O; Flicstein, B

    2011-01-30

    The quantity control of hazardous waste in Israel relies primarily on the Environmental Services Company (ESC) reports. With limited management tools, the Ministry of Environmental Protection (MoEP) has no applicable methodology to confirm or monitor the actual amounts of hazardous waste produced by various industrial sectors. The main goal of this research was to develop a method for estimating the amounts of hazardous waste produced by various sectors. To achieve this goal, sector-specific indicators were tested on three hazardous-waste-producing sectors in the Haifa Metropolis: petroleum refineries, dry cleaners, and public hospitals. The findings reveal poor hazardous waste management practice in the dry cleaning and public hospital sectors. Large discrepancies were found in the dry cleaning sector between the quantities of hazardous waste reported and the corresponding indicator estimates. Furthermore, a lack of documentation of the hospitals' pharmaceutical and chemical waste production volumes was observed. Only in the case of the petroleum refineries was the reported amount consistent with the estimate.
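
    The indicator approach lends itself to a one-line estimate of the kind sketched below: a sector's expected waste is its activity level times a waste-generation factor, compared against the reported tonnage. Both the factor and the numbers are hypothetical placeholders, not figures from the study.

    ```python
    def indicator_estimate(activity, factor):
        """Sector-specific indicator estimate:
        waste (t/yr) = activity level x waste-generation factor."""
        return activity * factor

    # e.g. dry cleaners: number of machines x solvent waste per machine
    # (both values hypothetical)
    estimated = indicator_estimate(activity=120, factor=0.15)  # t/yr
    reported = 4.0  # hypothetical reported tonnage
    print(estimated, reported, estimated - reported)  # discrepancy check
    ```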

  3. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
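
    As a hedged numerical sketch of the idea, the code below computes the reliability (probability of no exceedance of a design threshold over a planning horizon) for a Poisson-GP hazard whose Generalized Pareto scale parameter drifts upward in time. The 1% per year trend, event rate, and shape value are illustrative assumptions, not the paper's fitted 2-parameter model.

    ```python
    import numpy as np

    def reliability(threshold, scale_fn, xi, rate, years):
        """Survivor function of the first exceedance of `threshold` under
        a Poisson-GP model with a time-varying GP scale parameter."""
        t = np.arange(1, years + 1)
        sigma = scale_fn(t)
        # GP exceedance probability: P(X > x) = (1 + xi*x/sigma)^(-1/xi)
        p_exc = (1 + xi * threshold / sigma) ** (-1 / xi)
        lam = rate * p_exc              # annual rate of damaging exceedances
        return np.exp(-np.cumsum(lam))  # reliability through time

    R = reliability(threshold=100.0,
                    scale_fn=lambda t: 20.0 * (1 + 0.01 * t),  # upward trend
                    xi=0.1, rate=2.0, years=50)
    print(R[9], R[49])  # reliability after 10 and 50 years
    ```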

  4. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 1: Physical-environmental assessment

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.

    2014-07-01

    In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase in water-related disasters can mainly be attributed to increased exposure (more elements potentially at risk in floodplain areas) and vulnerability (i.e., the economic, social, geographic, cultural, and physical/environmental characteristics of the exposed elements). Besides these factors, climate change is projected to radically modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at local, regional and global scales. Within this context, there is an urgent need to promote and develop effective, pro-active strategies, tools and actions to assess and (possibly) reduce the flood risks threatening different relevant receptors. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the European Flood Directive (FD). The present study introduces a state-of-the-art Regional Risk Assessment (RRA) methodology to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The methodology, developed within the recently concluded FP7 KULTURisk project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e., large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e., from the large river to the urban scale). The FD-compliant KR-RRA methodology is based on the concept of risk as a function of hazard, exposure and vulnerability. It integrates the outputs of various hydrodynamic models (hazard) with site-specific bio-geophysical and socio-economic indicators (e.g., slope, land cover, population density, economic activities) to develop tailored risk indexes and GIS-based maps for each of the selected targets (i.e., people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) in the considered region, comparing the baseline scenario with alternative scenarios in which different structural and/or non-structural mitigation measures are planned. As demonstrated in the companion paper (Part 2, Ronco et al., 2014), risk maps, along with related statistics, allow relative hotspots and targets that are more likely to be affected by floods to be identified and prioritized, and support the development of relevant strategic adaptation and prevention measures to minimize flood impacts. Moreover, the outputs of the RRA methodology can be used for the economic evaluation of different damages (e.g., tangible and intangible costs) and for social assessment considering the human dimension of vulnerability (i.e., adaptive and coping capacity).

  5. Integrating Hazardous Materials Characterization and Assessment Tools to Guide Pollution Prevention in Electronic Products and Manufacturing

    NASA Astrophysics Data System (ADS)

    Lam, Carl

    Due to technology proliferation, the environmental burden attributed to the production, use, and disposal of hazardous materials in electronics has become a worldwide concern. The major theme of this dissertation is to develop and apply hazardous materials assessment tools to systematically guide pollution prevention opportunities in the context of electronic product design, manufacturing and end-of-life waste management. To this end, a comprehensive review is first provided describing hazard traits and current assessment methods for evaluating hazardous materials. As a case study at the manufacturing level, life cycle impact assessment (LCIA)-based and risk-based screening methods are used to quantify chemical and geographic environmental impacts in the U.S. printed wiring board (PWB) industry. Results from this industrial assessment clarify priority waste streams and States in which to most effectively mitigate impact. With further knowledge of PWB manufacturing processes, selected alternative chemical processes (e.g., spent copper etchant recovery) and material options (e.g., lead-free etch resist) are discussed. In addition, an investigation of technology transition effects for computers and televisions in the U.S. market is performed by linking dynamic materials flow and environmental assessment models. The analysis forecasts the quantities of waste units generated and maps shifts in environmental impact potentials associated with metal composition changes due to product substitutions. This insight is important for understanding the timing and quantities of waste expected and the emerging toxic elements that need to be addressed as a consequence of technology transition. At the product level, electronic utility meter devices are evaluated to eliminate hazardous materials within product components. Development and application of a component Toxic Potential Indicator (TPI) assessment methodology highlights priority components requiring material alternatives. Alternative recommendations are provided, and substitute materials such as aluminum alloys for stainless steel and high-density polyethylene for polyvinyl chloride and acrylonitrile-based polymers show promise in meeting toxicity reduction, cost, and material functionality requirements. Furthermore, the TPI method, a European Union-focused screening tool, is customized to reflect regulated U.S. toxicity parameters. Results show that, although it is possible to adopt U.S. parameters into the TPI method, harmonization of toxicity regulations and standards across nations and regions is necessary to eliminate inconsistencies in the hazard screening of substances used globally. As a whole, the present work helps to assimilate material hazard assessment methods into the larger framework of design-for-environment strategies so that toxics use reduction can be achieved in the development and management of electronics and other consumer goods.

  6. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    NASA Astrophysics Data System (ADS)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity, V_ea, and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression modeling does not resolve significant effects due to site class at frequencies greater than approximately 5 Hz. Disaggregation of general seismic hazard models using V_ea indicates that the modal magnitudes for the higher frequency oscillators tend to be larger, and vary less with oscillator frequency, than those derived using PSV. Insofar as the elastic input energy may be a better parameter for quantifying the damage potential of ground motion, its use in probabilistic seismic hazard analysis could provide an improved means for selecting earthquake scenarios and establishing design earthquakes for many types of engineering analyses.

  7. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results, due to the intrinsically uncertain nature of fault data. This is especially the case in the low-to-moderate seismicity regions of Europe, where slow-slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow the important parameters that need to be better constrained to be identified, in order to reduce the resulting uncertainty in hazard, and will also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool is illustrated with the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that accommodates the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and for the multiplicity of scientifically defensible interpretations. At the nodes of the logic tree, different options that could be considered at each step of the fault-related seismic hazard computation are explored. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults, allowing for the possibility that several fault segments break together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e., minimum distance between faults) and a second that relies on physically-based simulations. The following nodes represent, for each rupture scenario, different rupture forecast models (i.e., characteristic or Gutenberg-Richter) and, for a given rupture forecast, two probability models commonly used in seismic hazard assessment: Poissonian or time-dependent. The final node represents an exhaustive set of ground motion prediction equations chosen to be compatible with the region. Finally, the expected probability of exceeding a given ground motion level is computed at each site. Results will be discussed for a few specific localities of the West Corinth Gulf.
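
    A minimal sketch of how such a logic tree is exhaustively enumerated: each end branch's hazard is weighted by the product of its branch weights, and the weighted results are aggregated. The level names, weights, and the placeholder hazard function are hypothetical, not the actual West Corinth tree.

    ```python
    from itertools import product

    # hypothetical branch weights per logic-tree level (placeholders)
    levels = {
        "geometry": [("G1", 0.6), ("G2", 0.4)],
        "scenarios": [("distance-rule", 0.5), ("physics-based", 0.5)],
        "recurrence": [("characteristic", 0.5), ("Gutenberg-Richter", 0.5)],
        "prob_model": [("poisson", 0.7), ("time-dependent", 0.3)],
        "gmpe": [("GMPE-A", 0.5), ("GMPE-B", 0.5)],
    }

    def hazard_for_branch(branch):
        """Stand-in for the hazard integral of one end branch; returns a
        fake annual probability of exceedance."""
        return 1e-3  # placeholder

    total = 0.0
    for combo in product(*levels.values()):
        names, weights = zip(*combo)
        w = 1.0
        for wi in weights:
            w *= wi  # branch weight = product of level weights
        total += w * hazard_for_branch(names)
    print(total)  # weighted mean hazard across all end branches
    ```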

  8. A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume I.

    DTIC Science & Technology

    1982-04-01

    As indicated in Figure 2, the dose of each hazard building up in CHI zone 13 is approaching an "effective dose" limit which prevents occupant escape ... per minute. During a test, flow into SATS was stopped when CO reached peak concentrations, to prevent dilution thereafter as sample CO decreased.

  9. Release of man-made radionuclides into seawater from dumped and sunken nuclear- and radiation-hazardous objects

    NASA Astrophysics Data System (ADS)

    Vysotsky, V. L.; Sivintsev, Yu. V.; Sotnikov, V. A.; Khokhlov, V. N.

    2014-12-01

    The methodology and results of a weighted average evaluation of the release of man-made radionuclides into seawater from nuclear- and radiation-hazardous objects located on the sea bottom over a long period of time (from dumping/sinking to complete destruction of their structural elements) are presented in the paper. The expected radioecological implications of environmental contamination are estimated for the main stages of destruction.

  10. Research, methodology, and applications of probabilistic seismic-hazard mapping of the Central and Eastern United States; minutes of a workshop on June 13-14, 2000, at Saint Louis University

    USGS Publications Warehouse

    Wheeler, Russell L.; Perkins, David M.

    2000-01-01

    The U.S. Geological Survey (USGS) is updating and revising its 1996 national seismic-hazard maps for release in 2001. Part of this process is the convening of four regional workshops with earth scientists and other users of the maps. The second of these workshops was sponsored by the USGS and the Mid-America Earthquake Center, and was hosted by Saint Louis University on June 13-14, 2000. The workshop concentrated on the central and eastern U.S. (CEUS) east of the Rocky Mountains. The tasks of the workshop were to (1) evaluate new research findings that are relevant to seismic hazard mapping, (2) discuss modifications in the inputs and methodology used in the national maps, (3) discuss concerns by engineers and other users about the scientific input to the maps and the use of the hazard maps in building codes, and (4) identify needed research in the CEUS that can improve the seismic hazard maps and reduce their uncertainties. These minutes summarize the workshop discussions. This is not a transcript; some individual remarks and short discussions of side issues and logistics were omitted. Named speakers were sent a draft of the minutes with a request for corrections of any errors in remarks attributed to them. Nine people returned corrections, amplifications, or approvals of their remarks as reported. The rest of this document consists of the meeting agenda, discussion summaries, and a list of the 60 attendees.

  11. Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)

    NASA Astrophysics Data System (ADS)

    Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.

    2014-05-01

    A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures for the mapping of earthquake-controlling structures (i.e., the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of such parameters, the pattern recognition algorithm defines a classification rule to discriminate seismogenic from non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that were subsequently struck by strong events and had not previously been considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by flat topography, to allow for the systematic identification of the nodes prone to earthquakes with magnitude larger than or equal to M = 5.0. The MSZ method differs from standard morphostructural analysis, in which the term "lineament" is used to define the complex of alignments detectable on topographic maps or satellite images; according to that definition, the lineament is locally defined and its existence does not depend on the surrounding areas. In MSZ, the primary element is the block - a relatively homogeneous area - while the lineament is a secondary element of the morphostructure. The identified earthquake-prone areas provide first-order systematic information that may significantly contribute to seismic hazard assessment in the Italian territory. The information about the possible location of strong earthquakes provided by the morphostructural analysis can, in fact, be naturally incorporated in the neo-deterministic procedure for seismic hazard assessment (NDSHA), so as to fill in possible gaps in known seismicity. Moreover, the spatial information about earthquake-prone areas can be fruitfully combined with the space-time information provided by the quantitative analysis of the seismic flow, so as to identify the priority areas (with linear dimensions of a few tens of kilometers) where the probability of a strong earthquake is relatively high, for detailed local-scale studies. The new indications of seismogenic potential obtained from this study, although less accurate than detailed fault studies, have the advantage of being independent of past seismicity information, since they rely on the systematic and quantitative analysis of the available geological and morphostructural data. This analysis thus appears particularly useful in areas where historical information is scarce; special attention should be paid to seismogenic nodes that are not related to known active faults or past earthquakes.

  12. Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.

    PubMed

    Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D

    2016-04-01

    Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using the subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using STEPP when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been demonstrated, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the STEPP analysis may fail to produce results, even with 100 patients within each subpopulation. Upon further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and occasionally singular variance-covariance matrix estimates that can prevent results from being produced. This is due to an insufficient number of events within the subpopulations, which we refer to as instability of the STEPP analysis. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves the stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including the subdistribution hazard ratio based on O-E estimation. This STEPP methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. The STEPP analysis showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the STEPP analysis, we recommend a minimum of 20 events within each subpopulation.
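
    For orientation, a sketch of the classical Peto-style O-E estimator of a log hazard ratio, ln(HR) ≈ (O − E)/V with var(ln HR) ≈ 1/V. The paper's contribution is the generalization of this idea to the subdistribution hazard ratio under competing risks, which is not reproduced here; the numbers below are hypothetical.

    ```python
    import math

    def oe_hazard_ratio(observed, expected, variance):
        """Classical O-E one-step estimate of a hazard ratio:
        ln(HR) ~ (O - E) / V, with standard error 1/sqrt(V)."""
        log_hr = (observed - expected) / variance
        se = 1.0 / math.sqrt(variance)
        ci = (math.exp(log_hr - 1.96 * se), math.exp(log_hr + 1.96 * se))
        return math.exp(log_hr), ci

    # hypothetical event counts in one treatment arm
    print(oe_hazard_ratio(observed=30, expected=40, variance=17.5))
    ```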

  13. Integrated approach for coastal hazards and risks in Sri Lanka

    NASA Astrophysics Data System (ADS)

    Garcin, M.; Desprats, J. F.; Fontaine, M.; Pedreros, R.; Attanayake, N.; Fernando, S.; Siriwardana, C. H. E. R.; de Silva, U.; Poisson, B.

    2008-06-01

    The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean underscored the importance of understanding and accounting for coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (e.g., 30,000 dead, 1 million people homeless and 70% of the fishing fleet destroyed). Following the tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. The project aims to define, at a pilot site, a methodology for multiple coastal hazard assessment that might be useful for post-tsunami reconstruction and for development planning, and that could be applied to the whole coastline of Sri Lanka. The multi-hazard approach deals with coastal processes that differ greatly in their dynamics and return periods. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study; although less dramatic than the tsunami, these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka. The GIS integrates the physical and demographic data available in Sri Lanka that are useful for assessing coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during monsoon periods as well as of the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated with field data acquired during the project. The results obtained from the models can be further integrated into the GIS, contributing to its enrichment and to better assessment and mitigation of these risks. The coastal-hazards-and-risks GIS coupled with modelling thus appears to be a very useful tool that can form the skeleton of a coastal zone management system. Decision makers will be able to make informed choices with regard to hazards during reconstruction and urban planning projects.

  14. Occupational-level interactions between physical hazards and cognitive ability and skill requirements in predicting injury incidence rates.

    PubMed

    Ford, Michael T; Wiggins, Bryan K

    2012-07-01

    Interactions between occupational-level physical hazards and cognitive ability and skill requirements were examined as predictors of injury incidence rates as reported by the U.S. Bureau of Labor Statistics. Based on ratings provided in the Occupational Information Network (O*NET) database, results across 563 occupations indicate that physical hazards at the occupational level were strongly related to injury incidence rates. Also, as expected, the physical hazard-injury rate relationship was stronger among occupations with high cognitive ability and skill requirements. In addition, there was an unexpected main effect such that occupations with high cognitive ability and skill requirements had lower injury rates even after controlling for physical hazards. The main effect of cognitive ability and skill requirements, combined with the interaction with physical hazards, resulted in unexpectedly high injury rates for low-ability and low-skill occupations with low physical hazard levels. Substantive and methodological explanations for these interactions and their theoretical and practical implications are offered. Results suggest that organizations and occupational health and safety researchers and practitioners should consider the occupational level of analysis and interactions between physical hazards and cognitive requirements in future research and practice when attempting to understand and prevent injuries.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldston, W.

    On April 21, 2009, the Energy Facilities Contractors Group (EFCOG) Waste Management Working Group (WMWG) provided a recommendation to the Department of Energy's Environmental Management program (DOE-EM) concerning supplemental guidance on blending methodologies used to classify waste forms, to determine whether a waste form meets the definition of Transuranic (TRU) Waste or can be classified as Low-Level Waste (LLW). The guidance provides specific examples and methods to allow DOE and its contractors to properly classify waste forms while reducing the generation of TRU wastes. TRU wastes are much more expensive to characterize at the generator's facilities, ship, and then dispose of at the Waste Isolation Pilot Plant (WIPP) than LLW is to dispose of. Reduced handling and packaging of LLW is also inherently less hazardous to the nuclear workforce. It is therefore important to perform the characterization properly, but in a manner that minimizes the generation of TRU wastes wherever possible. Indeed, given the additional volumes of radioactive wastes generated under the ARRA programs, this recommendation should improve the cost-effective implementation of DOE requirements while properly protecting human health and the environment. This paper describes how the message that appropriate, less expensive, less hazardous blending of radioactive waste is the 'right' thing to do in many cases can be confused with inappropriate 'dilution' that is frowned upon by regulators and public stakeholders. A proposal is made on how to communicate this complex and confusing technical issue to regulatory bodies and interested stakeholders to gain understanding and approval of the concept. The results of applying the proposed communication method and of the attempt to change the regulatory requirements in this area are discussed, including efforts by DOE and the NRC on this complex subject.

  16. Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2017-12-01

    We consider seismic events as a sequence of avalanches in a self-organized system of blocks-and-faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter η = τ × 10^(B(5−M)) × L^C of the Unified Scaling Law for Earthquakes, USLE (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, M is magnitude, L is the linear size of the territory considered, and C is the fractal dimension of the seismic locus). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different, rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. At such a transition, seismic activity may, in the short term, follow different scenarios, with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity in advance of and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints on modelling realistic earthquake sequences by geophysicists and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of the seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to make any definitive conclusions on the level of seismic hazard, which is evidently high at this particular moment in time in both regions. The study was supported by Russian Science Foundation Grant No. 16-17-00093.
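
    As a worked illustration of the formula above (an editorial sketch, not material from the record): the control parameter is a plain product of powers, so it can be evaluated directly once τ, B, M, L and C are known. All parameter values below are hypothetical.

    ```python
    def usle_eta(tau, B, M, L, C):
        """Control parameter of the Unified Scaling Law for Earthquakes:
        eta = tau * 10**(B * (5 - M)) * L**C, where tau is the inter-event
        time, B is analogous to the Gutenberg-Richter b-value, M is the
        magnitude, L is the linear size of the territory, and C is the
        fractal dimension of the seismic locus."""
        return tau * 10.0 ** (B * (5.0 - M)) * L ** C

    # Hypothetical values, for illustration only
    print(usle_eta(tau=0.2, B=1.0, M=6.0, L=100.0, C=1.2))
    ```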

  17. Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)

    EPA Science Inventory

    Margin of Exposure (MoE) is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard to an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
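
    Since the (truncated) abstract defines the metric as a simple ratio, a one-function sketch with hypothetical numbers makes it concrete (an editorial illustration, not from the record):

    ```python
    def margin_of_exposure(bmd, exposure):
        """Margin of Exposure: ratio of a dose associated with hazard
        (e.g., a benchmark dose, BMD) to an estimate of exposure,
        with both values in matching units (e.g., mg/kg bw/day)."""
        return bmd / exposure

    # Hypothetical: BMD = 0.5, estimated dietary exposure = 0.002
    print(margin_of_exposure(0.5, 0.002))  # MoE = 250.0
    ```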

  18. Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.

    PubMed

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. The objective was the development of high-resolution, heat-related urban risk maps regarding the elderly population (aged ≥ 65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using the "Crichton's Risk Triangle" hazard-risk methodology to obtain a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in coastal than in inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street-level detail. This information could support public health operators and facilitate coordination for heat-related emergencies.
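
    The record names Crichton's Risk Triangle (risk as the combination of hazard, exposure and vulnerability) but describes the HERI construction only at a high level, so the following is a deliberately simplified, hypothetical per-pixel sketch rather than the authors' method: normalize a hazard raster (LST) and an exposure/vulnerability raster (elderly density), multiply them, and bin the product into quantile risk classes.

    ```python
    import numpy as np

    def risk_classes(hazard, vulnerability, n_classes=4):
        """Toy Risk Triangle combination: normalized hazard times
        normalized vulnerability, binned into quantile classes
        (0 = lowest risk ... n_classes - 1 = highest risk)."""
        norm = lambda a: (a - a.min()) / (a.max() - a.min())
        risk = norm(hazard) * norm(vulnerability)
        bins = np.quantile(risk, np.linspace(0, 1, n_classes + 1)[1:-1])
        return np.digitize(risk, bins)

    # Hypothetical 100 m rasters
    rng = np.random.default_rng(3)
    lst = rng.normal(30.0, 3.0, (200, 200))        # daytime LST, deg C
    elderly = rng.gamma(2.0, 50.0, (200, 200))     # elderly per km^2
    print(np.bincount(risk_classes(lst, elderly).ravel()))
    ```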

  19. Urban-Hazard Risk Analysis: Mapping of Heat-Related Risks in the Elderly in Major Italian Cities

    PubMed Central

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Background: Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Objectives: Development of high-resolution, heat-related urban risk maps regarding the elderly population (aged ≥65). Methods: A long time-series (2001–2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using the “Crichton’s Risk Triangle” hazard-risk methodology to obtain a Heat-related Elderly Risk Index (HERI). Results: The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in coastal than in inland cities. Conclusions: This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street-level detail. This information could support public health operators and facilitate coordination for heat-related emergencies. PMID:25985204

  20. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  1. Global Natural Disaster Risk Hotspots: Transition to a Regional Approach

    NASA Astrophysics Data System (ADS)

    Lerner-Lam, A.; Chen, R.; Dilley, M.

    2005-12-01

    The "Hotspots Project" is a collaborative study of the global distribution and occurrence of multiple natural hazards and the associated exposures of populations and their economic output. In this study we assess the global risks of two disaster-related outcomes: mortality and economic losses. We estimate risk levels by combining hazard exposure with historical vulnerability for two indicators of elements at risk-gridded population and Gross Domestic Product (GDP) per unit area - for six major natural hazards: earthquakes, volcanoes, landslides, floods, drought, and cyclones. By calculating relative risks for each grid cell rather than for countries as a whole, we are able to estimate risk levels at sub-national scales. These can then be used to estimate aggregate relative multiple hazard risk at regional and national scales. Mortality-related risks are assessed on a 2.5' x 2.5' latitude-longitude grid of global population (GPW Version 3). Economic risks are assessed at the same resolution for gridded GDP per unit area, using World Bank estimates of GDP based on purchasing power parity. Global hazard data were compiled from multiple sources. The project collaborated directly with UNDP and UNEP, the International Research Institute for Climate Prediction (IRI) at Columbia, and the Norwegian Geotechnical Institute (NGI) in the creation of data sets for several hazards for which global data sets did not previously exist. Drought, flood and volcano hazards are characterized in terms of event frequency, storms by frequency and severity, earthquakes by frequency and ground acceleration exceedance probability, and landslides by an index derived from probability of occurrence. The global analysis undertaken in this project is clearly limited by issues of scale as well as by the availability and quality of data. For some hazards, there exist only 15- to 25-year global records with relatively crude spatial information. Data on historical disaster losses, and particularly on economic losses, are also limited. On one hand the data are adequate for general identification of areas of the globe that are at relatively higher single- or multiple-hazard risk than other areas. On the other hand they are inadequate for understanding the absolute levels of risk posed by any specific hazard or combination of hazards. Nevertheless it is possible to assess in general terms the exposure and potential magnitude of losses to people and their assets in these areas. Such information, although not ideal, can still be very useful for informing a range of disaster prevention and preparedness measures, including prioritization of resources, targeting of more localized and detailed risk assessments, implementation of risk-based disaster management and emergency response strategies, and development of long-term plans for poverty reduction and economic development. In addition to summarizing the results of the Hotspots Project, we discuss data collection issues and suggest methodological approaches for making the transition to more detailed regional and national studies. Preliminary results for several regional case studies will be presented.

  2. Development of methodologies to assess the relative hazards from thermal decomposition products of polymeric materials.

    PubMed

    Barrow, C S; Lucia, H; Stock, M F; Alarie, Y

    1979-05-01

    The physiological stress imposed upon mice by the irritating properties of thermal decomposition products of polymeric materials was evaluated. Acute lethality and histopathological evaluation were included in the study. The ranking of the polymeric materials studied, from most to least hazardous, was concluded to be: polytetrafluoroethylene > polyvinyl chloride > Douglas Fir and flexible polyurethane foam > fiber glass reinforced polyester > copper coated wire with mineral insulation.

  3. Active faulting in low- to moderate-seismicity regions: the SAFE project

    NASA Astrophysics Data System (ADS)

    Sebrier, M.; Safe Consortium

    2003-04-01

    SAFE (Slow Active Faults in Europe) is an EC-FP5 funded multidisciplinary effort which proposes an integrated European approach in identifying and characterizing active faults as input for evaluating seismic hazard in low- to moderate-seismicity regions. Seismically active western European regions are generally characterized by low hazard but high risk, due to the concentration of human and material properties with high vulnerability. Detecting, and then analysing, tectonic deformations that may lead to destructive earthquakes in such areas has to take into account three major limitations: - the typical climate of western Europe (heavy vegetation cover and/or erosion) ; - the subdued geomorphic signature of slowly deforming faults ; - the heavy modification of landscape by human activity. The main objective of SAFE, i.e., improving the assessment of seismic hazard through understanding of the mechanics and recurrence of active faults in slowly deforming regions, is achieved through four major steps : (1) extending geologic and geomorphic investigations of fault activity beyond the Holocene to take into account various time-windows; (2) developing an expert system that combines diverse lines of geologic, seismologic, geomorphic, and geophysical evidence to diagnose the existence and seismogenic potential of slow active faults; (3) delineating and characterising high seismic risk areas of western Europe, either from historical or geological/geomorphic evidence; (4) demonstrating and discussing the impact of the project results on risk assessment through a seismic scenario in the Basel-Mulhouse pilot area. To take properly into account known differences in source behavior, these goals are pursued both in extensional (Lower and Upper Rhine Graben, Catalan Coast) and compressional tectonic settings (southern Upper Rhine Graben, Po Plain, and Provence). Two arid compressional regions (SE Spain and Moroccan High Atlas) have also been selected to address the limitations imposed by vegetation and human modified landscapes. The first results demonstrate that the strong added value provided by SAFE consists in its integrated multidisciplinary and multiscalar approach that allows robust diagnostic conclusions on fault activity and on the associated earthquake potential. This approach will be illustrated through selected methodological results.

  4. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  5. Characterizing wood-plastic composites via data-driven methodologies

    Treesearch

    John G. Michopoulos; John C. Hermanson; Robert Badaliance

    2007-01-01

    The recent increase of wood-plastic composite materials in various application areas has underlined the need for an efficient and robust methodology to characterize their nonlinear anisotropic constitutive behavior. In addition, the multiplicity of various loading conditions in structures utilizing these materials further increases the need for a characterization...

  6. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  7. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
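
    The closed form of the hazard function in the Generalized Pareto case follows directly from h(x) = f(x) / S(x): with scale σ and shape ξ, the survival function S(x) = (1 + ξx/σ)^(−1/ξ) and density f(x) = (1/σ)(1 + ξx/σ)^(−1/ξ−1) give h(x) = 1/(σ + ξx). A short numerical check of that simplification (illustrative parameters, not values from the paper):

    ```python
    import numpy as np

    def gp_hazard(x, sigma, xi):
        """Hazard function of Generalized Pareto exceedances:
        f(x)/S(x) simplifies to 1/(sigma + xi*x)."""
        return 1.0 / (sigma + xi * x)

    sigma, xi = 2.0, 0.3          # hypothetical scale and shape (xi > 0)
    x = np.linspace(0.0, 10.0, 5)
    S = (1.0 + xi * x / sigma) ** (-1.0 / xi)                        # survival
    f = (1.0 / sigma) * (1.0 + xi * x / sigma) ** (-1.0 / xi - 1.0)  # density
    assert np.allclose(f / S, gp_hazard(x, sigma, xi))
    ```

    Note that for ξ > 0 this hazard decreases with x, while ξ < 0 gives a hazard that increases toward the distribution's finite upper endpoint.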

  8. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 2: Application to the Zurich case study

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Bullo, M.; Torresan, S.; Critto, A.; Olschewski, R.; Zappa, M.; Marcomini, A.

    2014-07-01

    The main objective of this paper is the application of the KULTURisk Regional Risk Assessment (KR-RRA) methodology, presented in the companion paper (Part 1, Ronco et al., 2014), to the Sihl River valley in Switzerland. Through a process of tuning the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors in the Sihl River valley, including the city of Zurich, which represents a typical case of river flooding in an urban area. After characterizing the peculiarities of the specific case study, risk maps have been developed under a 300-year return period scenario (selected as the baseline) for six identified relevant targets exposed to flood risk in the Sihl valley, namely: people; economic activities (including buildings, infrastructure and agriculture); natural and semi-natural systems; and cultural heritage. Finally, the total risk index map, which allows areas and hotspots at risk to be identified and ranked by means of Multi-Criteria Decision Analysis tools, has been produced to visualize the spatial pattern of flood risk within the study area. By means of a tailored participative approach, the total risk maps supplement the considerations of technical experts with the (essential) point of view of the relevant stakeholders for the appraisal of the specific scores and weights related to the receptor-specific risks. The total risk maps obtained for the Sihl River case study are associated with the lower classes of risk. In general, higher relative risks are concentrated in the deeply urbanized area within and around the Zurich city centre and in areas lying just behind the Sihl River course. Here, forecasted injuries and potential fatalities are mainly due to high population density and the high presence of old (vulnerable) people; inundated buildings are mainly classified as continuous and discontinuous urban fabric; and flooded roads, pathways and railways, the majority of them related to the Zurich main train station (Hauptbahnhof), are at high risk of inundation, causing huge indirect damage. The analysis of flood risk to agriculture, natural and semi-natural systems and cultural heritage has pointed out that these receptors could be relatively less impacted by the selected flood scenario, mainly because of their scattered presence. Finally, the application of the KR-RRA methodology to the Sihl River case study, as well as to several other sites across Europe (not presented here), has demonstrated its flexibility and adaptability to different geographical and socio-economic contexts, depending on data availability and the peculiarities of the sites, as well as to other hazard scenarios.

  9. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    NASA Astrophysics Data System (ADS)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  10. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier.

    PubMed

    Wolters, Mark A; Dean, C B

    2017-01-01

    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.
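
    A hedged sketch of the general idea (not the authors' novel estimation procedure): the autologistic model augments per-pixel logistic regression with an autocovariate built from neighboring labels, and a simple pseudolikelihood fit treats that autocovariate as an extra feature. Everything below, including the synthetic image, is illustrative only.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def neighbor_sum(labels):
        """Sum of the four nearest-neighbor labels for each pixel
        (zero padding at the image border)."""
        p = np.pad(labels, 1)
        return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]

    # Synthetic training image: 0/1 smoke labels y and a noisy covariate x
    rng = np.random.default_rng(0)
    y = (rng.random((50, 50)) < 0.3).astype(int)
    x = y + rng.normal(scale=0.8, size=y.shape)

    # Pseudolikelihood fit: each pixel's label is regressed on its own
    # covariate plus the autocovariate from its neighbors' observed labels.
    X = np.column_stack([x.ravel(), neighbor_sum(y).ravel()])
    model = LogisticRegression().fit(X, y.ravel())
    print(model.coef_)  # [covariate effect, spatial-dependence effect]
    ```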

  11. The Importance of Returned Martian Samples for Constraining Potential Hazards to Future Human Exploration

    NASA Astrophysics Data System (ADS)

    iMOST Team; Harrington, A. D.; Carrier, B. L.; Fernandez-Remolar, D. C.; Fogarty, J.; McCoy, J. T.; Rucker, M. A.; Spry, J. A.; Altieri, F.; Amelin, Y.; Ammannito, E.; Anand, M.; Beaty, D. W.; Benning, L. G.; Bishop, J. L.; Borg, L. E.; Boucher, D.; Brucato, J. R.; Busemann, H.; Campbell, K. A.; Czaja, A. D.; Debaille, V.; Des Marais, D. J.; Dixon, M.; Ehlmann, B. L.; Farmer, J. D.; Glavin, D. P.; Goreva, Y. S.; Grady, M. M.; Hallis, L. J.; Hausrath, E. M.; Herd, C. D. K.; Horgan, B.; Humayun, M.; Kleine, T.; Kleinhenz, J.; Mangold, N.; Mackelprang, R.; Mayhew, L. E.; McCubbin, F. M.; McLennan, S. M.; McSween, H. Y.; Moser, D. E.; Moynier, F.; Mustard, J. F.; Niles, P. B.; Ori, G. G.; Raulin, F.; Rettberg, P.; Schmitz, N.; Sefton-Nash, E.; Sephton, M. A.; Shaheen, R.; Shuster, D. L.; Siljestrom, S.; Smith, C. L.; Steele, A.; Swindle, T. D.; ten Kate, I. L.; Tosca, N. J.; Usui, T.; Van Kranendonk, M. J.; Wadhwa, M.; Weiss, B. P.; Werner, S. C.; Westall, F.; Wheeler, R. M.; Zipfel, J.; Zorzano, M. P.

    2018-04-01

    Thorough characterization and evaluation of returned martian regolith and airfall samples are critical to understanding the potential health and engineering system hazards during future human exploration.

  12. Fault2SHA- A European Working group to link faults and Probabilistic Seismic Hazard Assessment communities in Europe

    NASA Astrophysics Data System (ADS)

    Scotti, Oona; Peruzza, Laura

    2016-04-01

    The key questions we ask are: What is the best strategy to fill the gap in knowledge and know-how in Europe when considering faults in seismic hazard assessments? Are field geologists providing the relevant information for seismic hazard assessment? Are seismic hazard analysts interpreting field data appropriately? Is the full range of uncertainties associated with the characterization of faults correctly understood and propagated through the computations? How can fault modellers contribute to a better representation of the long-term behaviour of fault networks in seismic hazard studies? Providing answers to these questions is fundamental in order to reduce the consequences of future earthquakes and improve the reliability of seismic hazard assessments. An informal working group was thus created at a meeting in Paris in November 2014, partly financed by the Institute of Radioprotection and Nuclear Safety, with the aim of motivating exchanges between field geologists, fault modellers and seismic hazard practitioners. A variety of approaches were presented at the meeting, and a clear gap emerged between some field geologists, who are not necessarily familiar with probabilistic seismic hazard assessment methods and needs, and practitioners, who do not necessarily propagate the "full" uncertainty associated with the characterization of faults. The group thus decided to meet again a year later in Chieti (Italy) to share concepts and ideas through a specific exercise on a test case study. Some solutions emerged, but many problems of seismic source characterization remained, both for people working in the field and for people tackling models of interacting faults. Now, in Vienna, we want to open the group and launch a call for the European community at large to contribute to the discussion. The 2016 EGU session Fault2SHA is motivated by this urgency to increase the number of round tables on this topic and to debate the peculiarities of using faults in seismic hazard assessment in Europe. Europe is a continent dominated by slowly deforming regions where long histories of seismicity are the main source of information from which to infer fault behaviour. Geodetic, geomorphological and paleoseismological studies are welcome complementary data that are slowly filling in the database but are at present insufficient, by themselves, to characterize faults. Moreover, Europe is characterized by complex fault systems (Upper Rhine Graben, Central and Southern Apennines, Corinth, etc.), and the degree of uncertainty in the characterization of the faults can be very different from one country to another. This requires developing approaches and concepts that are adapted to the European context. It is thus the specificity of the European situation that motivates the creation of a predominantly European group where field geologists, fault modellers and fault-PSHA practitioners may exchange and learn from each other's experience.

  13. A methodology for the characterization and diagnosis of cognitive impairments-Application to specific language impairment.

    PubMed

    Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2014-06-01

    The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity of, and overlap between, associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which unimpaired individuals and individuals with the impairment exhibit behavioral differences. Next, we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit the behavior of each participant. Finally, we use the optimized parameter values to feed different machine-learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children), using clustering techniques for the characterization and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the decision tree (DT) produced sensitivity, specificity and area-under-curve values above 90%, reaching 100% in some cases. The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better diagnosis than existing techniques. It is also worth noting that the individualized characterization obtained using our methodology could be extremely helpful in designing individualized therapies. Moreover, the proposed methodology could be easily extended to other languages and even to other cognitive impairments not necessarily related to language. Copyright © 2014 Elsevier B.V. All rights reserved.
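
    As an editorial illustration of the leave-one-subject-out evaluation described above, with synthetic features standing in for the optimized cognitive-model parameters and an arbitrary classifier choice (the paper compares several):

    ```python
    import numpy as np
    from sklearn.model_selection import LeaveOneOut
    from sklearn.svm import SVC

    # Hypothetical data: one row per child (24 + 24, as in the study),
    # four fitted model parameters per child, label 1 = SLI.
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0.0, 1.0, (24, 4)),
                   rng.normal(1.0, 1.0, (24, 4))])
    y = np.repeat([0, 1], 24)

    correct = 0
    for train, test in LeaveOneOut().split(X):
        clf = SVC().fit(X[train], y[train])
        correct += int(clf.predict(X[test])[0] == y[test][0])
    print(f"leave-one-subject-out accuracy: {correct / len(y):.2f}")
    ```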

  14. Methodological challenges in assessing the environmental status of a marine ecosystem: case study of the Baltic Sea.

    PubMed

    Ojaveer, Henn; Eero, Margit

    2011-04-29

    Assessments of the environmental status of marine ecosystems are increasingly needed to inform management decisions and regulate human pressures to meet the objectives of environmental policies. This paper addresses some generic methodological challenges and related uncertainties involved in marine ecosystem assessment, using the central Baltic Sea as a case study. The objectives of good environmental status of the Baltic Sea largely focus on biodiversity, eutrophication and hazardous substances. In this paper, we conduct comparative evaluations of the status of these three segments by applying different methodological approaches. Our analyses indicate that the assessment results are sensitive to the selection of indicators for ecological quality objectives that are affected by a broad spectrum of human activities and natural processes (biodiversity), and less so for objectives that are influenced by a relatively narrow array of drivers (eutrophication, hazardous substances). The choice of indicator aggregation rule appeared to be of essential importance for the assessment results for all three segments, whereas the hierarchical structure of indicators had only a minor influence. Trend-based assessment was shown to be a useful supplement to reference-based evaluation, being independent of the problems related to defining reference values and indicator aggregation methodologies. Results of this study will help in setting priorities for future efforts to improve environmental assessments in the Baltic Sea and elsewhere, and to ensure the transparency of the assessment procedure.

  15. Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case

    NASA Astrophysics Data System (ADS)

    Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro

    2015-04-01

    The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web-services-based infrastructure for environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS and GMES. Existing web services have been customized to provide functionalities for supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for the assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements, with their specific vulnerability, in the pilot area. The landslide pilot and related scenario are focused on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines and best practices, and production of thematic maps. The LRA has been implemented in the Liguria region, Italy, in two different catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. During this event, over 600 landslides were triggered in the selected pilot area. Most of the landslides affected the diffuse system of anthropogenic terraces and caused the direct disruption of the walls, as well as the transport of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. A spatial analysis detected ca. 400 critical points along the road network, with an average affected length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved by the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park to the benefit of local authorities and the population. In addition, the results of the application will be used for: i) updating the land planning process in order to improve the resilience of local communities; ii) implementing cost-benefit analysis aimed at the definition of guidelines for sustainable landslide risk mitigation strategies; iii) suggesting a general road map for the implementation of a local adaptation plan.

  16. Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes

    NASA Astrophysics Data System (ADS)

    Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.

    2012-07-01

    Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.

  17. The 2014 update to the National Seismic Hazard Model in California

    USGS Publications Warehouse

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  18. Robust inference in discrete hazard models for randomized clinical trials.

    PubMed

    Nguyen, Vinh Q; Gillen, Daniel L

    2012-10-01

    Time-to-event data in which failures are only assessed at discrete time points are common in many clinical trials. Examples include oncology studies where events are observed through periodic screenings such as radiographic scans. When the survival endpoint is acknowledged to be discrete, common methods for the analysis of observed failure times include the discrete hazard models (e.g., the discrete-time proportional hazards and the continuation ratio model) and the proportional odds model. In this manuscript, we consider estimation of a marginal treatment effect in discrete hazard models where the constant treatment effect assumption is violated. We demonstrate that the estimator resulting from these discrete hazard models is consistent for a parameter that depends on the underlying censoring distribution. An estimator that removes the dependence on the censoring mechanism is proposed and its asymptotic distribution is derived. Basing inference on the proposed estimator allows for statistical inference that is scientifically meaningful and reproducible. Simulation is used to assess the performance of the presented methodology in finite samples.
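
    For readers unfamiliar with discrete hazard models, a generic construction (a sketch of the standard model class discussed above, not the robust estimator the paper proposes) expands each subject into one record per discrete interval at risk and fits a logistic regression, so the fitted probabilities are interval-specific hazards. Column names and data below are hypothetical.

    ```python
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def person_period(df):
        """One row per subject per discrete interval at risk; event = 1
        only in the final interval of subjects who failed."""
        rows = []
        for _, r in df.iterrows():
            for t in range(1, int(r.time) + 1):
                rows.append({"interval": t, "trt": r.trt,
                             "event": int(t == r.time and r.event == 1)})
        return pd.DataFrame(rows)

    # Hypothetical discrete follow-up data (e.g., from periodic scans)
    data = pd.DataFrame({"time": [2, 3, 3, 1, 2],
                         "event": [1, 0, 1, 1, 0],
                         "trt": [1, 1, 0, 0, 1]})
    pp = person_period(data)
    # Interval indicators play the role of a baseline hazard; trt is the
    # treatment effect on the (logit) discrete-time hazard.
    X = pd.get_dummies(pp["interval"], prefix="t").assign(trt=pp["trt"])
    fit = LogisticRegression().fit(X, pp["event"])
    print(dict(zip(X.columns, fit.coef_[0].round(2))))
    ```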

  19. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  20. Evaluating Developmental Neurotoxicity Hazard: Better than Before

    EPA Pesticide Factsheets

    EPA researchers grew neural networks in their laboratory that showed the promise of helping to screen thousands of chemicals in the environment that are yet to be characterized for developmental neurotoxicity hazard through traditional methods.

  1. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  2. Time-to-event methodology improved statistical evaluation in register-based health services research.

    PubMed

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by a trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
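
    A minimal sketch of the outcome-specific hazard modelling mentioned above, assuming the third-party lifelines package and hypothetical register columns (an editorial illustration, not the register's actual schema):

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical register extract: follow-up time, event indicator
    # (1 = basal insulin initiation), and one baseline covariate.
    df = pd.DataFrame({"followup_years": [1.2, 3.4, 0.8, 2.5, 4.1],
                       "initiated":      [1,   0,   1,   0,   1],
                       "baseline_hba1c": [8.1, 7.2, 9.0, 6.8, 8.5]})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="initiated")
    cph.print_summary()  # outcome-specific hazard ratios
    ```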

  3. Toward the characterization of biological toxins using field-based FT-IR spectroscopic instrumentation

    NASA Astrophysics Data System (ADS)

    Schiering, David W.; Walton, Robert B.; Brown, Christopher W.; Norman, Mark L.; Brewer, Joseph; Scott, James

    2004-12-01

    IR spectroscopy is a broadly applicable technique for the identification of covalent materials. Recent advances in instrumentation have made Fourier transform infrared (FT-IR) spectroscopy available for field characterization of suspect materials. Presently, this instrumentation is broadly deployed and used for the identification of potential chemical hazards. This discussion concerns work towards expanding the analytical utility of field-based FT-IR spectrometry to the characterization of biological threats. Two classes of materials were studied: biologically produced chemical toxins that were non-peptide in nature, and a peptide toxin. The IR spectroscopic identification of aflatoxin-B1, trichothecene T2 mycotoxin, and strychnine was evaluated using the approach of spectral searching against large libraries of materials. For pure components, the IR method discriminated the above toxins at better than the 99% confidence level. The ability to identify non-peptide toxins in mixtures was also evaluated using a "spectral stripping" search approach. For the mixtures evaluated, this method was able to identify the mixture components from ca. 32K spectral library entries. Castor bean extract containing ricin was used as a representative peptide toxin. Due to the similarity of protein spectra, a SIMCA pattern recognition methodology was evaluated for classifying peptide toxins. In addition to castor bean extract, the method was validated using bovine serum albumin and myoglobin as simulants. The SIMCA approach was successful in correctly classifying these samples at the 95% confidence level.
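
    The spectral-searching step can be illustrated with a simple correlation-based ranking (a toy sketch with three synthetic "spectra"; fielded instruments search libraries of ca. 32K entries with more elaborate match metrics):

    ```python
    import numpy as np

    def library_search(spectrum, library):
        """Rank library entries by correlation with an unknown spectrum;
        `library` maps names to spectra on a common wavenumber grid."""
        scores = {name: np.corrcoef(spectrum, ref)[0, 1]
                  for name, ref in library.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Synthetic 3-entry library and a noisy unknown
    grid = np.linspace(0.0, 1.0, 200)
    library = {"toxin_A": np.exp(-((grid - 0.3) / 0.05) ** 2),
               "toxin_B": np.exp(-((grid - 0.6) / 0.05) ** 2),
               "matrix":  grid}
    unknown = library["toxin_B"] + np.random.default_rng(1).normal(0, 0.02, 200)
    print(library_search(unknown, library)[0])  # best match: toxin_B
    ```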

  4. Characterization of Rocket Propellant Combustion Products. Chemical Characterization and Computer Modeling of the Exhaust Products from Four Propellant Formulations

    DTIC Science & Technology

    1990-12-31

    health hazards from weapons combustion products, to include rockets and missiles, became evident... Research to elucidate significant health effects of... CO/CO2 ratios was low for all but one of the formulations. In general, if the model were to be used in its present state for health risk assessments... Part 2: Modeling for Health Hazard Prediction (Introduction; Results and Discussion)

  5. Development of South Dakota accident reduction factors

    DOT National Transportation Integrated Search

    1998-08-01

    This report offers the methodology and findings of the first project to develop Accident Reduction Factors (ARFs) and Severity Reduction Ratios (SRRs) for the state of South Dakota. The ARFs and SRRs of this project focused on Hazard Elimination and ...

  6. Hazardous Materials Verification and Limited Characterization Report on Sodium and Caustic Residuals in Materials and Fuel Complex Facilities MFC-799/799A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary Mecham

    2010-08-01

    This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as the deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, “Environmental Management Cleanup Projects.” Using a tailored approach and based on information obtained through a combination of process knowledge, emergency management hazard assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also establishes that further characterization/verification of these materials is unnecessary.

  7. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa in a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. It is therefore important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute the regional exposure and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as for 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for return periods of 100 and 475 years, for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the west to the eastern part of the country. Uniform hazard spectra are estimated at several important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
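
    The logic-tree formulation referred to above amounts to computing a hazard curve under each alternative branch (seismotectonic model, recurrence relation, attenuation relationship) and combining the curves with branch weights. A minimal sketch with hypothetical numbers:

    ```python
    import numpy as np

    # Hypothetical annual exceedance probabilities vs PGA (g) for three
    # alternative logic-tree branches at one site.
    pga = np.array([0.05, 0.1, 0.2, 0.4])
    curves = np.array([[1e-2, 4e-3, 1.0e-3, 2e-4],
                       [2e-2, 6e-3, 1.5e-3, 3e-4],
                       [8e-3, 3e-3, 8.0e-4, 1e-4]])
    weights = np.array([0.5, 0.3, 0.2])   # branch weights, summing to 1

    mean_curve = weights @ curves         # weighted mean hazard curve
    for g, p in zip(pga, mean_curve):
        print(f"PGA {g:.2f} g: annual exceedance probability {p:.2e}")
    ```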

  8. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-05-01

    Egypt is located in the northeastern corner of Africa in a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions. It is therefore important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute the regional exposure and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as for 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for return periods of 100 and 475 years, for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the west to the eastern part of the country. Uniform hazard spectra are estimated at several important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.

  9. Assessing risks and preventing disease from environmental chemicals.

    PubMed

    Dunnette, D A

    1989-01-01

    In the last 25 years there has been considerable concern expressed about the extent to which chemical agents in the ambient and work environments are contributing to the causation of disease. This concern is a logical extension of our increased knowledge of the real and potential effects of environmental chemicals and the methodological difficulties in applying new knowledge that could help prevent environmentally induced disease. Chemical risk assessment offers an approach to estimating risks and involves consideration of relevant information including identification of chemical hazards, evaluation of the dose-response relationship, estimation of exposure and finally, risk characterization. Particularly significant uncertainties which are inherent in use of this and other risk models include animal-human and low dose-high dose extrapolation and estimation of exposure. Community public health risks from exposure to environmental chemicals appear to be small relative to other public health risks based on information related to cancer trends, dietary intake of synthetic chemicals, assessment data on substances such as DDT and "dioxin," public health effects of hazardous waste sites and contextual considerations. Because of inherent uncertainty in the chemical risk assessment process, however, we need to apply what methods are available in our efforts to prevent disease induced by environmental chemicals. There are a number of societal strategies which can contribute to overall reduction of risk from environmental chemicals. These include acquisition of information on environmental risk including toxicity, intensity and extensity of exposure, biological monitoring, disease surveillance, improvement in epidemiological methods, control of environmental chemical exposures, and dissemination of hazardous chemical information. Responsible environmental risk communication and information transfer appear to be among the most important of the available strategies for preventing disease induced by chemicals in the environment.
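
    As a rough illustration of the arithmetic behind the risk-assessment steps named above (hazard identification aside), the following sketch computes an exposure dose, a non-cancer hazard quotient, and a linear low-dose cancer risk. All parameter values are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of the dose-response / exposure / risk-characterization
# chain of chemical risk assessment. Values are hypothetical placeholders.

def average_daily_dose(conc_mg_per_L, intake_L_per_day, body_weight_kg):
    """Exposure assessment: chronic daily intake (mg/kg-day) from drinking water."""
    return conc_mg_per_L * intake_L_per_day / body_weight_kg

def hazard_quotient(dose, reference_dose):
    """Risk characterization for non-cancer effects: HQ > 1 flags potential concern."""
    return dose / reference_dose

def cancer_risk(dose, slope_factor):
    """Risk characterization for carcinogens under linear low-dose extrapolation."""
    return dose * slope_factor

dose = average_daily_dose(conc_mg_per_L=0.002, intake_L_per_day=2.0, body_weight_kg=70.0)
print(f"dose = {dose:.2e} mg/kg-day")
print(f"HQ   = {hazard_quotient(dose, reference_dose=0.005):.2f}")
print(f"risk = {cancer_risk(dose, slope_factor=0.1):.1e}")  # excess lifetime cancer risk
```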

  10. Multi-hazard national-level risk assessment in Africa using global approaches

    NASA Astrophysics Data System (ADS)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Murnane, Richard

    2016-04-01

    In recent years Sub-Saharan Africa has been characterized by unprecedented opportunity for transformation and sustained growth. However, natural disasters such as droughts, floods, cyclones, earthquakes, landslides, volcanic eruptions and extreme temperatures cause significant economic and human losses and pose major development challenges. Quantitative disaster risk assessments are an important basis for governments to understand disaster risk in their country and to develop effective risk management and risk financing solutions. However, the data-scarce nature of many Sub-Saharan African countries, as well as a lack of financing for risk assessments, has long prevented detailed analytics. Recent advances in globally applicable disaster risk modelling practices and in data availability offer new opportunities. In December 2013 the European Union approved a €60 million contribution to support the development of an analytical basis for risk financing and to accelerate the effective implementation of comprehensive disaster risk reduction. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) was selected as the implementing partner of the Program for Result Area 5: the "Africa Disaster Risk Assessment and Financing Program." As part of this effort, the GFDRR is overseeing the production of national-level multi-hazard risk profiles for a range of countries in Sub-Saharan Africa, using a combination of national and global datasets and state-of-the-art hazard and risk assessment methodologies. In this presentation, we will highlight the analytical approach behind these assessments and show results for the first five countries for which the assessment has been completed (Kenya, Uganda, Senegal, Niger and Ethiopia). The presentation will also demonstrate the visualization of the risk assessments in understandable and visually attractive risk profile documents.

  11. Landslide Risk: Economic Valuation in The North-Eastern Zone of Medellin City

    NASA Astrophysics Data System (ADS)

    Vega, Johnny Alexander; Hidalgo, César Augusto; Johana Marín, Nini

    2017-10-01

    Natural disasters of a geodynamic nature can cause enormous economic and human losses. The economic costs of a landslide disaster include relocation of communities and physical repair of urban infrastructure. However, when performing a quantitative risk analysis, the indirect economic consequences of such an event are generally not taken into account. A probabilistic methodology is proposed that considers several hazard and vulnerability scenarios to measure the magnitude of the landslide and to quantify the economic costs. With this approach, it is possible to carry out a quantitative evaluation of landslide risk, allowing the economic losses from a potential disaster to be calculated in an objective, standardized and reproducible way while taking into account the uncertainty of building costs in the study zone. The possibility of comparing different scenarios facilitates the urban planning process, the optimization of interventions to reduce risk to acceptable levels, and the assessment of economic losses according to the magnitude of the damage. To develop and explain the proposed methodology, a simple case study is presented, located in the north-eastern zone of the city of Medellín. This area has particular geomorphological characteristics and is also characterized by the presence of several buildings in poor structural condition. The proposed methodology yields an estimate of the probable economic losses due to earthquake-induced landslides, taking into account the uncertainty of building costs in the study zone. The estimate obtained shows that structural intervention of the buildings produces a reduction of the order of 21% in the total landslide risk.
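
    The scenario-based costing idea can be sketched as follows. This is not the authors' model: the scenario probabilities, damage fractions and building-cost distribution are invented placeholders that merely show how cost uncertainty propagates into an annualized loss estimate.

```python
# Minimal sketch: annualized economic risk as the sum over hazard scenarios of
# P(scenario) * P(damage | scenario) * loss, with building-cost uncertainty
# propagated by Monte Carlo. All inputs are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
scenarios = [  # (annual probability of triggering event, P(building damaged | event))
    (0.02, 0.10),   # moderate rainfall-induced landslide
    (0.005, 0.35),  # severe rainfall-induced landslide
    (0.001, 0.60),  # earthquake-induced landslide
]
n_buildings = 120
n_sims = 10_000

# Uncertain replacement cost per building (USD), lognormal around ~50k
costs = rng.lognormal(mean=np.log(50_000), sigma=0.3, size=(n_sims, n_buildings))

annual_loss = np.zeros(n_sims)
for p_event, p_damage in scenarios:
    annual_loss += p_event * p_damage * costs.sum(axis=1)

print(f"expected annual loss ~ {annual_loss.mean():,.0f} USD "
      f"(5-95%: {np.percentile(annual_loss, 5):,.0f}"
      f"-{np.percentile(annual_loss, 95):,.0f})")
```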

  12. Hollow-fiber flow field-flow fractionation and multi-angle light scattering investigation of the size, shape and metal-release of silver nanoparticles in aqueous medium for nano-risk assessment.

    PubMed

    Marassi, Valentina; Casolari, Sonia; Roda, Barbara; Zattoni, Andrea; Reschiglian, Pierluigi; Panzavolta, Silvia; Tofail, Syed A M; Ortelli, Simona; Delpivo, Camilla; Blosi, Magda; Costa, Anna Luisa

    2015-03-15

    Due to the increased use of silver nanoparticles in industrial-scale manufacturing, consumer products and nanomedicine, reliable measurement of properties such as the size, shape and distribution of these nanoparticles in aqueous medium is critical. These properties affect both functional properties and biological impacts, especially in quantifying associated risks and identifying suitable risk-mediation strategies. The feasibility of on-line coupling of a fractionation technique such as hollow-fiber flow field-flow fractionation (HF5) with a light scattering technique such as MALS (multi-angle light scattering) is investigated here for this purpose. Data obtained from such a fractionation technique, and from its combination with MALS, have been compared with those from more conventional but often complementary techniques, e.g. transmission electron microscopy, dynamic light scattering, atomic absorption spectroscopy, and X-ray fluorescence. The combination of fractionation and multi-angle light scattering techniques has been found to offer an ideal, hyphenated methodology for simultaneous size-separation and characterization of silver nanoparticles. The hydrodynamic radii determined by fractionation can be conveniently correlated to the mean average diameters determined by multi-angle light scattering, and reliable information on particle morphology in aqueous dispersion has been obtained. The ability to separate silver ions (Ag(+)) from silver nanoparticles (AgNPs) via membrane filtration during size analysis is an added advantage in obtaining quantitative insight into risk potential. Most importantly, the methodology developed in this article can potentially be extended to similar characterization of metal-based nanoparticles when studying their functional effectiveness and hazard potential. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. From leaves to landscape: A multiscale approach to assess fire hazard in wildland-urban interface areas.

    PubMed

    Ghermandi, Luciana; Beletzky, Natacha A; de Torres Curth, Mónica I; Oddi, Facundo J

    2016-12-01

    The overlapping zone between urbanization and wildland vegetation, known as the wildland urban interface (WUI), is often at high risk of wildfire. Human activities increase the likelihood of wildfires, which can have disastrous consequences for property and land use, and can pose a serious threat to lives. Fire hazard assessments depend strongly on the spatial scale of analysis. We assessed the fire hazard in a WUI area of a Patagonian city by working at three scales: landscape, community and species. Fire is a complex phenomenon, so we used a large number of variables that correlate a priori with the fire hazard. Consequently, we analyzed environmental variables together with fuel load and leaf flammability variables and integrated all the information in a fire hazard map with four fire hazard categories. The Nothofagus dombeyi forest had the highest fire hazard while grasslands had the lowest. Our work highlights the vulnerability of the wildland-urban interface to fire in this region and our suggested methodology could be applied in other wildland-urban interface areas. Particularly in high hazard areas, our work could help in spatial delimitation policies, urban planning and development of plans for the protection of human lives and assets. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Multi-Hazard Vulnerability Assessment Along the Coast of Visakhapatnam, North-East Coast of India

    NASA Astrophysics Data System (ADS)

    Vivek, G.; Grinivasa Kumar, T.

    2016-08-01

    The study area is the coastal zone of Visakhapatnam district, Andhra Pradesh, on the north-east coast of India. This area is vulnerable to many disasters such as storms, cyclones, floods, tsunamis and erosion, and is considered cyclone-prone because of the frequent occurrence of cyclones. Two recent tropical cyclones that formed in the Bay of Bengal, Hudhud (October 13, 2014) and Phailin (October 11, 2013), caused devastating impacts on the eastern coast and showed the country's lack of preparedness for cyclones, storm surge and related natural hazards. The present study aims to develop a methodology for coastal multi-hazard vulnerability assessment. The assessment was carried out using parameters such as coastal slope, tsunami arrival height, future sea-level rise, coastal erosion and tidal range. The multi-hazard vulnerability maps prepared here are a blended overlay of the multiple hazards affecting the coastal zone, and were further reproduced as risk maps by adding land-use information. The decision-making tools presented here can provide useful information during a disaster for the evacuation process and for evolving a management strategy.
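
    A minimal sketch of the overlay step, under the assumption (common in coastal vulnerability studies, though not necessarily the exact scheme used here) that each parameter is ranked 1-5 per grid cell and combined by a weighted sum into vulnerability classes:

```python
# Minimal sketch of a weighted multi-parameter overlay for a coastal
# vulnerability map. Ranks and weights are illustrative assumptions.
import numpy as np

# rows = grid cells; columns = coastal slope, tsunami arrival height,
# sea-level rise, erosion rate, tidal range (ranked 1 = low to 5 = high)
ranks = np.array([
    [1, 2, 3, 1, 2],
    [4, 5, 3, 4, 3],
    [2, 3, 2, 5, 4],
])
weights = np.array([0.25, 0.25, 0.2, 0.2, 0.1])   # must sum to 1

cvi = ranks @ weights                              # composite index in [1, 5]
labels = np.digitize(cvi, bins=[2.0, 3.0, 4.0])    # bin edges are assumptions
names = ["low", "moderate", "high", "very high"]
for score, lab in zip(cvi, labels):
    print(f"CVI={score:.2f} -> {names[lab]}")
```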

  15. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    NASA Astrophysics Data System (ADS)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    The year 2013 was characterized by strong storms and hurricanes, such as Hurricanes Barbara and Ingrid and the tropical storms Barry and Fernand, which occurred between June and November and especially affected the coastal regions of Mexico. The State of Veracruz experienced a series of intense rainfall events, and as a consequence over 780 landslides were registered and more than 45,000 people were affected by evacuations. Located on the coast of the Gulf of Mexico, Veracruz spans a wide range of altitudes. The zone with the highest elevations reaches from 5675 m.a.s.l. (Pico de Orizaba, the highest mountain of Mexico) down to approximately 3000 m.a.s.l. and is characterized by steep slopes and V-shaped valleys. The mountains are part of the Sierra Madre Oriental and the Trans-Mexican Volcanic Belt. Plateaus and rounded hills are typical of the intermediate zone (3000 - 500 m.a.s.l.). The lowest zone (from 500 m.a.s.l. to sea level) is defined by moderate slopes, large rivers and coastal plains. The geology comprises a variety of sedimentary and volcanic rocks: the sedimentary formations include claystones, siltstones, sandstones and calcareous rocks, while plateaus of basalts and andesites and deposits of ignimbrites are representative of the area. Even though Veracruz is a region highly endangered by landslides, there are currently no susceptibility maps or other relevant information with high spatial resolution. Because of this lack of detailed information about landslide hazards in the area, investigation of the controlling conditions (geology, geomorphology, thresholds, etc.) is indispensable. A doctoral grant from the German Academic Exchange Service (DAAD) made it possible to carry out investigations in areas affected by large landslides in 2013. The selected study sites comprise damaged infrastructure and settlements. With a multi-methodological and interdisciplinary approach, different processes and types of mass movements are analyzed in order to reconstruct the complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situation and dynamics of the slope movements; geomorphological mapping, sediment characterization and geophysical methods are applied. On the one hand, detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and movement processes within the slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models generated before the onset of the slope movements are integrated into the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, the field data will be used as basic information for further monitoring plans. The resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  16. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimation was quantified. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) discrimination of regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) assessment of the reliability of the procedure used for hazard assessment.
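
    The Monte Carlo core of such physically based assessments can be sketched with the classic infinite-slope model that underlies tools like SINMAP and SHALSTAB; the parameter distributions below are illustrative assumptions, not the study's fitted PDFs.

```python
# Minimal sketch: propagate PDFs of hillslope-deposit strength parameters
# through an infinite-slope stability model and report the probability of
# failure, P(FS < 1). All distributions and constants are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

slope = np.radians(35.0)       # hillslope angle
z = 1.5                        # soil depth (m)
gamma = 18.0                   # soil unit weight (kN/m3)
gamma_w = 9.81                 # water unit weight (kN/m3)
m = rng.uniform(0.0, 1.0, n)                        # relative saturated thickness
c = rng.normal(5.0, 1.5, n).clip(min=0.1)           # cohesion (kPa), assumed normal
phi = np.radians(rng.normal(32.0, 3.0, n))          # friction angle, assumed normal

# Infinite-slope factor of safety with pore pressure from saturation ratio m
fs = (c + (gamma * z * np.cos(slope)**2
           - m * gamma_w * z * np.cos(slope)**2) * np.tan(phi)) \
     / (gamma * z * np.sin(slope) * np.cos(slope))

print(f"P(FS < 1) = {np.mean(fs < 1):.3f}")
```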

  17. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
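
    The following sketch illustrates the general idea rather than the authors' derivation: a nonstationary annual exceedance probability, here driven by an assumed drift in the generalized Pareto scale parameter, yields reliability and an average return period via the failure-time distribution.

```python
# Minimal sketch of a nonstationary hazard: if p_t is the year-t probability
# of exceeding a design magnitude, reliability through year t is
# prod_{i<=t}(1 - p_i) and the failure-time pmf is f(t) = p_t * prod_{i<t}(1 - p_i).
# The GPD parameters and the trend in scale are illustrative assumptions.
import numpy as np

def p_exceed_gpd(x, scale, shape):
    """P(X > x) for a generalized Pareto excess magnitude above the POT threshold."""
    return (1.0 + shape * x / scale) ** (-1.0 / shape)

x_design = 50.0                            # design magnitude above threshold
shape = 0.1
years = np.arange(1, 201)
scale_t = 10.0 * (1.0 + 0.005 * years)     # assumed +0.5%/yr drift in GPD scale

p_t = p_exceed_gpd(x_design, scale_t, shape)        # nonstationary exceedance prob.
survival = np.cumprod(1.0 - p_t)                    # reliability through year t
f_t = p_t * np.concatenate(([1.0], survival[:-1]))  # failure-time pmf

print(f"50-yr reliability: {survival[49]:.3f}")
print(f"average return period: {np.sum(years * f_t) / f_t.sum():.1f} yr")
```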

  18. Hazard function theory for nonstationary natural hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Laura K.; Vogel, Richard M.

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  19. Value of Earth Observation for Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Shapiro, C. D.; Grasso, M.; Pearlman, J.; Adkins, J. E.; Pindilli, E.; Geppi, D.

    2017-12-01

    Societal benefits flowing from Earth observation are intuitively obvious, as we use the information to assess natural hazards (such as storm tracks), water resources (such as flooding and droughts in coastal and riverine systems), ecosystem vitality and other dynamics that impact the health and economic well-being of our population. The most powerful confirmation of these benefits would come from quantifying the impact and showing direct quantitative links in the value chain from data to decisions. However, identifying and quantifying those benefits is challenging. The impact of geospatial data on these types of decisions is not well characterized, and assigning a true value to the observations on a broad scale across disciplines remains to be done in a systematic way. This presentation provides the outcomes of a workshop held in October 2017, as a side event of the GEO Plenary, that addressed research on economic methodologies for the quantification of impacts. To achieve practical outputs during the meeting, the workshop focused on the use and value of Earth observations in risk mitigation, including ecosystem impacts, weather events, and other natural and man-made hazards. Case studies on approaches were discussed and will be part of this presentation. The presentation will also include the exchange of lessons learned and a discussion of gaps in the current understanding of the use and value of Earth observation information for risk mitigation.

  20. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    PubMed Central

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-01-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. The hazard posed by C. perfringens on cheese was identified through the literature, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. A simulation model was then developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed in the exposure assessment under simulated conditions of distribution and storage. These data were used for risk characterization in a simulation model, and the mean probabilities of C. perfringens foodborne illness from cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that the probability of C. perfringens foodborne illness from cheese consumption is low, and they can be used to establish microbial criteria for C. perfringens on natural and processed cheeses. PMID:26954204
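
    The simulation chain can be reproduced in outline without @RISK. In the sketch below the input distributions are simplified stand-ins for the paper's fitted ones, while the exponential dose-response parameter r = 1.82×10−11 is taken from the abstract.

```python
# Minimal QMRA sketch: sample a contamination level and a serving size,
# convert to an ingested dose, and apply the exponential dose-response model
# P(ill) = 1 - exp(-r * dose). Input distributions are simplified assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
r = 1.82e-11                                      # dose-response parameter (abstract)

log_conc = rng.uniform(-3.5, -2.0, n)             # log10 CFU/g, near the -2.35 estimate
serving_g = rng.normal(12.4, 5.0, n).clip(min=0)  # daily natural-cheese intake (g)

dose = (10.0 ** log_conc) * serving_g             # ingested CFU (no growth assumed)
p_ill = 1.0 - np.exp(-r * dose)                   # probability of illness per day

print(f"mean P(illness)/person/day ~ {p_ill.mean():.2e}")
```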

  1. The work environment disability-adjusted life year for use with life cycle assessment: a methodological approach

    PubMed Central

    2013-01-01

    Background Life cycle assessment (LCA) is a systems-based method used to determine potential impacts to the environment associated with a product throughout its life cycle. Conclusions from LCA studies can be applied to support decisions regarding product design or public policy; therefore, all relevant inputs (e.g., raw materials, energy) and outputs (e.g., emissions, waste) to the product system should be evaluated to estimate impacts. Currently, work-related impacts are not routinely considered in LCA. The objectives of this paper are to: 1) introduce the work environment disability-adjusted life year (WE-DALY), one portion of a characterization factor used to express the magnitude of impacts to human health attributable to work-related exposures to workplace hazards; 2) outline the methods for calculating the WE-DALY; 3) demonstrate the calculation; and 4) highlight strengths and weaknesses of the methodological approach. Methods The concept of the WE-DALY and the methodological approach to its calculation are grounded in the World Health Organization’s disability-adjusted life year (DALY). Like the DALY, the WE-DALY equation considers the years of life lost due to premature mortality and the years of life lived with disability outcomes to estimate the total number of years of healthy life lost in a population. The equation requires input in the form of the number of fatal and nonfatal injuries and illnesses that occur in the industries relevant to the product system evaluated in the LCA study, the age of the worker at the time of the fatal or nonfatal injury or illness, the severity of the injury or illness, and the duration of time lived with the outcomes of the injury or illness. Results The methodological approach for the WE-DALY requires data from various sources, multi-step instructions to determine each variable used in the WE-DALY equation, and assumptions based on professional opinion. Conclusions The results support the use of the WE-DALY in a characterization factor in LCA. Integrating occupational health into LCA studies will provide opportunities to prevent shifting of impacts between the work environment and the environment external to the workplace and to co-optimize human health, including worker health, and environmental health. PMID:23497039
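
    The WE-DALY arithmetic described in the Methods can be sketched as follows, using the WHO DALY structure of years of life lost (YLL) plus years lived with disability (YLD); the case counts, ages, severity weights and durations below are illustrative assumptions.

```python
# Minimal sketch of a DALY-style calculation: healthy life-years lost equal
# YLL from fatal work-related injuries plus YLD from nonfatal ones.
# All inputs are hypothetical; the paper's equation includes further detail.

LIFE_EXPECTANCY = 80.0  # simplified fixed life expectancy for every age

def yll(fatal_case_ages):
    """Years of life lost: remaining life expectancy summed over fatal cases."""
    return sum(max(LIFE_EXPECTANCY - age, 0.0) for age in fatal_case_ages)

def yld(nonfatal_cases):
    """Years lived with disability: severity weight times duration per case."""
    return sum(dw * dur for dw, dur in nonfatal_cases)

fatal = [34, 52, 61]                               # worker ages at fatal injury
nonfatal = [(0.2, 5.0), (0.05, 1.0), (0.4, 20.0)]  # (severity weight, years)

we_daly = yll(fatal) + yld(nonfatal)
print(f"WE-DALY = {we_daly:.1f} healthy life-years lost")
```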

  2. Hazardous-waste analysis plan for LLNL operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, R.S.

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  3. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
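
    The notional product at the heart of the framework, risk = threat × vulnerability × consequence, together with a first-order portfolio interdependency term, can be sketched as follows (all numbers are invented for illustration):

```python
# Minimal sketch of an all-hazards portfolio risk calculation in the CAPRA
# spirit: direct risk per asset plus a first-order interdependency term in
# which loss of one asset adds a fraction of dependent assets' consequence.
# All values are illustrative assumptions.
import numpy as np

threat = np.array([0.02, 0.01, 0.05])        # annual P(hazard event) per asset
vulnerability = np.array([0.5, 0.8, 0.3])    # P(loss | event)
consequence = np.array([10.0, 25.0, 4.0])    # direct loss (M$)

# interdependency[i, j]: fraction of asset j's consequence also incurred
# when asset i is lost (first-order effects only)
interdependency = np.array([
    [0.0, 0.2, 0.0],
    [0.1, 0.0, 0.3],
    [0.0, 0.0, 0.0],
])

direct = threat * vulnerability * consequence
indirect = threat * vulnerability * (interdependency @ consequence)
print(f"portfolio annual risk ~ {np.sum(direct + indirect):.2f} M$/yr")
```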

  4. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  5. Comparison of the historical record of earthquake hazard with seismic-hazard models for New Zealand and the continental United States

    USGS Publications Warehouse

    Stirling, M.; Petersen, M.

    2006-01-01

    We compare the historical record of earthquake hazard experienced at 78 towns and cities (sites) distributed across New Zealand and the continental United States with the hazard estimated from the national probabilistic seismic-hazard (PSH) models for the two countries. The two PSH models are constructed with similar methodologies and data. Our comparisons show a tendency for the PSH models to slightly exceed the historical hazard in New Zealand and westernmost continental United States interplate regions, but show lower hazard than that of the historical record in the continental United States intraplate region. Factors such as non-Poissonian behavior, parameterization of active fault data in the PSH calculations, and uncertainties in estimation of ground-motion levels from historical felt intensity data for the interplate regions may have led to the higher-than-historical levels of hazard at the interplate sites. In contrast, the less-than-historical hazard for the remaining continental United States (intraplate) sites may be largely due to site conditions not having been considered at the intraplate sites, and uncertainties in correlating ground-motion levels to historical felt intensities. The study also highlights the importance of evaluating PSH models at more than one region, because the conclusions reached on the basis of a solely interplate or intraplate study would be very different.

  6. Integrating asthma hazard characterization methods for consumer products.

    PubMed

    Maier, A; Vincent, M J; Gadagbui, B; Patterson, J; Beckett, W; Dalton, P; Kimber, I; Selgrade, M J K

    2014-10-01

    Despite extensive study, definitive conclusions regarding the relationship between asthma and consumer products remain elusive. Uncertainties reflect the multi-faceted nature of asthma (i.e., contributions of immunologic and non-immunologic mechanisms). Many substances used in consumer products are associated with occupational asthma or asthma-like syndromes. However, risk assessment methods do not adequately predict the potential for consumer product exposures to trigger asthma and related syndromes under lower-level end-user conditions. A decision tree system is required to characterize asthma and respiratory-related hazards associated with consumer products. A system can be built to incorporate the best features of existing guidance, frameworks, and models using a weight-of-evidence (WoE) approach. With this goal in mind, we have evaluated chemical hazard characterization methods for asthma and asthma-like responses. Despite the wealth of information available, current hazard characterization methods do not definitively identify whether a particular ingredient will cause or exacerbate asthma, asthma-like responses, or sensitization of the respiratory tract at lower levels associated with consumer product use. Effective use of hierarchical lines of evidence relies on consideration of the relevance and potency of assays, organization of assays by mode of action, and better assay validation. It is anticipated that the analysis of existing methods will support the development of a refined WoE approach. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Joseph Daniel; Anderson, Robert Stephen

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  8. Multiple Natural Hazards Assessment and Comparison to Planned Land Use in an Andean Touristic Site within the Riskscape Central Chile

    NASA Astrophysics Data System (ADS)

    Braun, Andreas; Jaque Castillo, Edilia

    2017-04-01

    The Andes of central Chile are a natural environment characterized by multiple natural hazards (mass movements, volcanic hazards, seismic hazards and snow avalanches, to name a few). The totality of these hazards, according to the notion of Müller-Mahn et al. and in relation to vulnerable entities, spans a riskscape. Spatial planning should take this riskscape into account in order to ensure safe and resilient regional development. However, as frequently observed in developing or newly industrialized countries, such precautionary measures are rarely realized. Spatial planning tends to be reactive to private investment, opportunistic and frequently clientelistic. This results in spatial structures whose future development is vulnerable to natural disasters. The contribution analyses these circumstances within a riskscape in central Chile. Within the VIII Region, close to the volcanic complex Nevados de Chillan, a touristic development around a winter-sports hotel has been established. However, the place is affected by a multitude of natural hazards. On the basis of primary and secondary data, the contribution first provides hazard maps for several natural hazards. Secondly, the individual hazard maps are merged into an overall hazard map. This overall hazard map is related to the vulnerable entities to span a riskscape. The vulnerable entities are settlements, but also tourist infrastructures. The contribution then examines how precautionary spatial planning could have avoided putting vulnerable entities at risk, which spatial structure - especially regarding tourism - is actually found, and which challenges for spatial development exist. It reveals that the most important tourist infrastructures are found particularly at places characterized by a high overall hazard. Furthermore, it shows that alternatives at economically equally attractive sites, but with a much smaller overall hazard, would have existed. It concludes by discussing possible reasons for this with reference to the Chilean planning system.

  9. LNG risk management

    NASA Astrophysics Data System (ADS)

    Martino, P.

    1980-12-01

    A general methodology is presented for conducting an analysis of the various aspects of the hazards associated with the storage and transportation of liquefied natural gas (LNG) which should be considered during the planning stages of a typical LNG ship terminal. The procedure includes the performance of a hazards and system analysis of the proposed site, a probability analysis of accident scenarios and safety impacts, an analysis of the consequences of credible accidents such as tanker accidents, spills and fires, the assessment of risks and the design and evaluation of risk mitigation measures.

  10. Implementing the effect of the rupture directivity on PSHA maps: Application to the Marmara Region (Turkey)

    NASA Astrophysics Data System (ADS)

    Herrero, Andre; Spagnuolo, Elena; Akinci, Aybige; Pucci, Stefano

    2016-04-01

    In the present study we attempted to improve seismic hazard assessment by taking into account possible sources of epistemic uncertainty and the azimuthal variability of ground motions which, at a particular site, is significantly influenced by the rupture mechanism and the rupture direction relative to the site. As a study area we selected the Marmara Region (Turkey), especially the city of Istanbul, which is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. The seismic hazard in the city is mainly associated with two active fault segments located about 20-30 km south of Istanbul. In this perspective, we first proposed a methodology to incorporate new information, such as the nucleation point, in a probabilistic seismic hazard analysis (PSHA) framework. Secondly, we introduced information about those fault segments by focusing on the rupture characteristics that affect the azimuthal variation of the ground motion spatial distribution, i.e. the source directivity effect, and its influence on the PSHA. An analytical model developed by Spudich and Chiou (2008) is used as a corrective factor that modifies the Next Generation Attenuation (NGA, Power et al. 2008) ground motion predictive equations (GMPEs) by introducing rupture-related parameters that are generally lumped together into the term directivity effect. We used the GMPEs derived by Abrahamson and Silva (2008) and Boore and Atkinson (2008); our results are given in terms of ground motions with 10% probability of exceedance in 50 years (at several periods from 0.5 s to 10 s) on rock site conditions. The correction for directivity makes a significant contribution to the percentage ratio between the seismic hazard computed using the directivity model and the seismic hazard from standard practice. In particular, we benefited from dynamic simulations in a previous study (Aochi & Ulrich, 2015) aimed at evaluating the seismic potential of the Marmara region to derive a statistical distribution for the nucleation position. Our results suggest that accounting for rupture-related parameters in a PSHA using deterministic information from dynamic models is feasible and, in particular, that the use of a non-uniform statistical distribution for the nucleation position has serious consequences for the hazard assessment. Since the directivity effect is conditional on the nucleation position, the hazard map changes with the assumptions made. A worst-case scenario (both faults rupturing towards the city of Istanbul) predicts up to 25% change relative to the standard formulation at 2 s, increasing at longer periods. This result differs markedly if a deterministically based nucleation position is assumed.

  11. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
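
    The step from fragility to vulnerability can be sketched as below: lognormal fragility curves conditioned on spectral acceleration are differenced into damage-state probabilities and weighted by loss ratios. The medians, dispersion and loss ratios are illustrative, not the HAZUS values.

```python
# Minimal sketch: vulnerability (mean repair-cost ratio vs Sa) from
# lognormal fragility curves and damage-state loss ratios. Requires scipy.
# All fragility parameters and loss ratios are illustrative assumptions.
import numpy as np
from scipy.stats import norm

sa = np.linspace(0.05, 2.0, 40)                  # spectral acceleration (g)
medians = np.array([0.2, 0.5, 0.9, 1.5])         # slight..complete damage (g)
beta = 0.6                                       # common lognormal dispersion
loss_ratio = np.array([0.02, 0.10, 0.40, 1.00])  # repair cost / replacement cost

# P(damage >= state) for each state, then discrete state probabilities
p_exceed = norm.cdf(np.log(sa[:, None] / medians) / beta)
p_state = -np.diff(np.hstack([p_exceed, np.zeros((sa.size, 1))]), axis=1)

mean_loss = p_state @ loss_ratio                 # vulnerability: E[loss ratio | Sa]
print(f"E[loss ratio] at Sa=0.5 g: {np.interp(0.5, sa, mean_loss):.3f}")
```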

  12. Interdependence and dynamics of essential services in an extensive risk context: a case study in Montserrat, West Indies

    NASA Astrophysics Data System (ADS)

    Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.

    2015-05-01

    The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.

  13. Interdependence and dynamics of essential services in an extensive risk context: a case study in Montserrat, West Indies

    NASA Astrophysics Data System (ADS)

    Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.

    2015-02-01

    The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard condition, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.

  14. Hazardous-waste-characterization survey of unknown drums at the 21st Tactical Fighter Wing, Elmendorf and Shemya Air Force Bases, and Galena and King Salmon Airports, Alaska. Final report 2-13 Aug 91

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bishop, M.S.

    1991-12-01

    At the request of the USAF Regional Hospital Elmendorf/SGPB (PACAF), the Armstrong Laboratory, Occupational and Environmental Health Directorate, conducted a hazardous waste characterization survey of unknown drums at Elmendorf AFB from 2 Aug - 13 Aug 91. The scope of the survey was to sample and characterize drums of unknown material stored at Elmendorf AFB, Shemya AFB, and Galena and King Salmon Airports. Several waste streams were sampled at Elmendorf AFB to revalidate sample results from a previous survey.

  15. TSCA Work Plan: 2012 Scoring of Potential Candidate Chemicals Entering Step 2

    EPA Pesticide Factsheets

    In 2012, EPA scored these chemicals based on hazard, exposure and persistence/bioaccumulation criteria as part of Step 2 in the Work Plan methodology in order to identify candidate chemicals for near-term review and assessment under TSCA.

  16. RAPID ASSESSMENT OF POTENTIAL GROUND-WATER CONTAMINATION UNDER EMERGENCY RESPONSE CONDITIONS

    EPA Science Inventory

    Emergency response actions at chemical spills and abandoned hazardous waste sites often require rapid assessment of the potential for groundwater contamination by the chemical or waste compound. This manual provides a rapid assessment methodology for performing such an evaluation...

  17. EPA'S LIFE CYCLE METHODOLOGY: GUIDELINES FOR USE IN DEVELOPMENT OF PACKAGING

    EPA Science Inventory

    Approaches to reducing environmental effects of products and processes have moved steadily upstream over the years from end-of-pipe controls to source reduction and recycling of hazardous waste, and more recently, toward multimedia pollution prevention. Life Cycle Assessment (LCA)...

  18. [Hazardous materials and work safety in veterinary practice. 1: Hazardous material definition and characterization, practice documentation and general rules for handling].

    PubMed

    Sliwinski-Korell, A; Lutz, F

    1998-04-01

    In recent years, the standards for professional handling of hazardous material and for health and safety in the veterinary practice have become considerably more stringent. This is expressed in various safety regulations, particularly the decree on hazardous material and the legislative directives concerning health and safety at work. In part 1, a definition of hazardous material based on the law is given and the potential risks are mentioned. The correct documentation regarding the purchase, storage, working conditions and disposal of hazardous material, and the protection of personnel, is explained. General rules for the handling of hazardous material are described. In part 2, particular emphasis is put on the handling of flammable liquids, disinfectants, cytostatic drugs, pressurised gas, liquid nitrogen and narcotics, the mailing of potentially infectious material, and the safe disposal of hazardous waste. Advice about possible unrecognized hazards is also given, together with references.

  19. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards but not all can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  20. Guidebook: Quality Assurance/Quality Control Procedures for Submission of Data for the Land Disposal Restrictions (LDR) Program

    EPA Pesticide Factsheets

    This document explains how to generate data which characterizes the performance of hazardous waste treatment systems in terms of the composition of treated hazardous waste streams plus treatment system operation and design.

  1. IMMUNE SYSTEM ONTOGENY AND DEVELOPMENTAL IMMUNOTOXICOLOGY

    EPA Science Inventory

    Animal testing for the identification and characterization of hazard(s), associated with exposure to toxic chemicals, is an accepted approach for identifying the potential risk to humans. The rodent, in particular the rat, has been the most commonly used species for routine toxi...

  2. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis; Negri, Jacquelyn; Kean, Jason

    2016-04-01

    Population expansion into fire-prone steeplands has resulted in an increase in post-fire debris-flow risk in the western United States. Logistic regression methods for determining debris-flow likelihood and the calculation of empirical rainfall intensity-duration thresholds for debris-flow initiation represent two common approaches for characterizing hazard and reducing risk. Logistic regression models are currently being used to rapidly assess debris-flow hazard in response to design storms of known intensities (e.g. a 10-year recurrence interval rainstorm). Empirical rainfall intensity-duration thresholds comprise a major component of the United States Geological Survey (USGS) and National Weather Service (NWS) debris-flow early warning system at a regional scale in southern California. However, these two modeling approaches remain independent, and each has limitations that prevent synergistic local-scale (e.g. drainage-basin scale) characterization of debris-flow hazard during intense rainfall. The current logistic regression equations treat rainfall as a unique independent variable, which prevents the direct calculation of the relation between rainfall intensity and debris-flow likelihood. Regional (e.g. mountain range or physiographic province scale) rainfall intensity-duration thresholds fail to provide insight into the basin-scale variability of post-fire debris-flow hazard and require an extensive database of historical debris-flow occurrence and rainfall characteristics. Here, we present a new approach that combines traditional logistic regression and intensity-duration threshold methodologies. This method allows for the local characterization of the likelihood that a debris flow will occur at a given rainfall intensity, the direct calculation of the rainfall rates that will result in a given likelihood, and the calculation of spatially explicit rainfall intensity-duration thresholds for debris-flow generation in recently burned areas. Our approach synthesizes the two methods by incorporating measured rainfall intensity into each model variable (based on measures of topographic steepness, burn severity and surface properties) within the logistic regression equation. This provides a more realistic representation of the relation between rainfall intensity and debris-flow likelihood, as likelihood values asymptotically approach zero as rainfall intensity approaches 0 mm/h and increase with more intense rainfall. Model performance was evaluated by comparing predictions to several existing regional thresholds. The model, based upon training data collected in southern California, USA, has proven to accurately predict rainfall intensity-duration thresholds for other areas in the western United States not included in the original training dataset. In addition, the improved logistic regression model shows promise for emergency planning purposes and real-time, site-specific early warning. With further validation, this model may permit the prediction of spatially explicit intensity-duration thresholds for debris-flow generation in areas where empirically derived regional thresholds do not exist. This improvement would permit the expansion of the early-warning system into other regions susceptible to post-fire debris flows.
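
    A minimal sketch of the combined formulation (with illustrative coefficients and basin attributes, not the fitted USGS model): because rainfall intensity multiplies every predictor term, the modeled likelihood tends to zero as intensity tends to zero, and the threshold intensity for any target likelihood has a closed form.

```python
# Minimal sketch: rainfall intensity R enters every predictor of a logistic
# regression, so likelihood is a smooth function of R and the intensity
# threshold for a target likelihood can be inverted analytically.
# Coefficients, scaling and basin attributes are illustrative assumptions.
import numpy as np

b0 = -3.6                          # intercept
b = np.array([0.41, 0.67, 0.70])   # weights on terrain, burn-severity, soil terms

def likelihood(R, basin):
    """P(debris flow) for a 15-min rainfall intensity R (mm/h) in a given basin."""
    x = b0 + np.dot(b, basin) * R / 10.0   # each predictor interacts with R
    return 1.0 / (1.0 + np.exp(-x))

def threshold(p_target, basin):
    """Rainfall intensity at which the modeled likelihood equals p_target."""
    logit = np.log(p_target / (1.0 - p_target))
    return 10.0 * (logit - b0) / np.dot(b, basin)

basin = np.array([0.6, 0.8, 0.5])  # steepness, burn, soil indices (0-1)
print(f"P at R=24 mm/h: {likelihood(24.0, basin):.2f}")
print(f"R at P=0.5:     {threshold(0.5, basin):.1f} mm/h")
```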

  3. Adaption of the Magnetometer Towed Array geophysical system to meet Department of Energy needs for hazardous waste site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cochran, J.R.; McDonald, J.R.; Russell, R.J.

    1995-10-01

    This report documents US Department of Energy (DOE)-funded activities that have adapted the US Navy's Surface Towed Ordnance Locator System (STOLS) to meet DOE needs for a "... better, faster, safer and cheaper ..." system for characterizing inactive hazardous waste sites. These activities were undertaken by Sandia National Laboratories (Sandia), the Naval Research Laboratory, Geo-Centers Inc., New Mexico State University and others under the title of the Magnetometer Towed Array (MTA).

  4. Ground motion models used in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.

    2015-01-01

    The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard using the new suite of ground motion models (GMMs) relative to hazard using the suite of GMMs applied in the previous version of the maps. The new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA), 0.2-s, and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. The GMM modifications in the updated version of the maps created changes in hazard within 5% to 20% in WUS; decreases within 5% to 20% in CEUS; changes within 5% to 15% for subduction interface earthquakes; and changes involving decreases of up to 50% and increases of up to 30% for deep intraslab earthquakes for most U.S. sites. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.

  5. A modeling approach to account for toxicokinetic interactions in the calculation of biological hazard index for chemical mixtures.

    PubMed

    Haddad, S; Tardif, R; Viau, C; Krishnan, K

    1999-09-05

    The biological hazard index (BHI) is defined as the tolerable biological level for exposure to a mixture, and is calculated by an equation similar to the conventional hazard index. The BHI calculation, at the present time, is advocated for use in situations where toxicokinetic interactions do not occur among mixture constituents. The objective of this study was to develop an approach for calculating an interactions-based BHI for chemical mixtures. The approach consisted of simulating the concentration of the exposure indicator in the biological matrix of choice (e.g. venous blood) for each component of the mixture to which workers are exposed, and then comparing these to the established BEI values to calculate the BHI. The simulation of biomarker concentrations was performed using a physiologically based toxicokinetic (PBTK) model which accounted for the mechanism of interactions among all mixture components (e.g. competitive inhibition). The usefulness of the present approach is illustrated by calculating the BHI for varying ambient concentrations of a mixture of three chemicals (toluene (5-40 ppm), m-xylene (10-50 ppm), and ethylbenzene (10-50 ppm)). The results show that the interactions-based BHI can be greater or smaller than that calculated on the basis of the additivity principle, particularly at high exposure concentrations. At lower exposure concentrations (e.g. 20 ppm each of toluene, m-xylene and ethylbenzene), the BHI values obtained using the conventional methodology are similar to those from the interactions-based methodology, confirming that the consequences of competitive inhibition are negligible at lower concentrations. The advantage of the PBTK model-based methodology developed in this study is that the concentrations of individual chemicals in mixtures that will not result in a significant increase in the BHI (i.e. a BHI > 1) can be determined by iterative simulation.
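
    A minimal sketch of the BHI arithmetic, assuming simulated biomarker levels (e.g. venous-blood concentrations from a PBTK model with competitive inhibition) are already available; all concentrations and BEI values below are illustrative placeholders, not the study's numbers:

    ```python
    # BHI = sum over mixture components of (simulated biomarker level / BEI).
    def biological_hazard_index(simulated_levels, bei_values):
        return sum(simulated_levels[c] / bei_values[c] for c in simulated_levels)

    # Hypothetical values for a toluene/m-xylene/ethylbenzene mixture:
    levels = {"toluene": 0.4, "m-xylene": 0.9, "ethylbenzene": 0.6}  # mg/L, assumed
    bei = {"toluene": 1.0, "m-xylene": 1.5, "ethylbenzene": 1.5}     # mg/L, assumed

    print(biological_hazard_index(levels, bei))  # a value > 1 flags exceedance
    ```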

  6. Development of a risk-based prioritisation methodology to inform public health emergency planning and preparedness in case of accidental spill at sea of hazardous and noxious substances (HNS).

    PubMed

    Harold, P D; de Souza, A S; Louchart, P; Russell, D; Brunt, H

    2014-11-01

    Hazardous and noxious chemicals are increasingly being transported by sea. Current estimates indicate some 2000 hazardous and noxious substances (HNS) are carried regularly by sea, with bulk trade of 165 million tonnes per year worldwide. Over 100 incidents involving HNS have been reported in EU waters. Incidents occurring in a port or coastal area can have potential and actual public health implications. A methodology has been developed for prioritisation of HNS, based upon potential public health risks. The work, undertaken for the Atlantic Region Pollution Response programme (ARCOPOL), aims to provide information for incident planning and preparedness. HNS were assessed using conventional methodology based upon acute toxicity, behaviour and reactivity. Tonnage was used as a proxy for likelihood, although other factors such as shipping frequency and local navigation may also contribute. Analysis of 350 individual HNS identified the highest priority HNS as being those that present an inhalation risk. Limitations were identified around obtaining accurate data on HNS handled at a local and regional level, due to a lack of port records and also to political and commercial confidentiality issues. To account for this, the project also developed a software tool capable of combining chemical data from the study with user-defined shipping data, to be used by operators to produce area-specific prioritisations. In conclusion, a risk prioritisation matrix has been developed to assess the acute risks to public health from the transportation of HNS. Its potential use in emergency planning and preparedness is discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.

    PubMed

    Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K

    2000-01-01

    Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
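
    One quantitative ingredient of the approach described above lends itself to a short sketch: classifying land probabilistically by the chance that the true (measured or kriged) Pb concentration exceeds a statutory threshold, given measurement uncertainty. This is a generic illustration under a Gaussian assumption, not the Lavrion computation itself; all numbers are illustrative:

    ```python
    from math import erf, sqrt

    def prob_exceeds(measured, std_dev, threshold):
        """P(true concentration > threshold), assuming Gaussian uncertainty
        around the measured or kriged value."""
        z = (threshold - measured) / (std_dev * sqrt(2.0))
        return 0.5 * (1.0 - erf(z))

    # Hypothetical soil-Pb measurement (mg/kg) against a statutory threshold:
    print(prob_exceeds(measured=450.0, std_dev=120.0, threshold=500.0))
    ```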

  8. Published methodological quality of randomized controlled trials does not reflect the actual quality assessed in protocols

    PubMed Central

    Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj

    2011-01-01

    Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups until year 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94, 95%CI: 0.88, 0.99) and 24% (RHR: 1.24, 95%CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. Conclusion: This largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. PMID:22424985

  9. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part I: Propagation Modelling and Tsunami Hazard Assessment at the Shoreline

    NASA Astrophysics Data System (ADS)

    Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip

    2013-09-01

    Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.

  10. System for ranking relative threats of U.S. volcanoes

    USGS Publications Warehouse

    Ewert, J.W.

    2007-01-01

    A methodology to systematically rank volcanic threat was developed as the basis for prioritizing volcanoes for long-term hazards evaluations, monitoring, and mitigation activities. A ranking of 169 volcanoes in the United States and the Commonwealth of the Northern Mariana Islands (U.S. volcanoes) is presented based on scores assigned for various hazard and exposure factors. Fifteen factors define the hazard: volcano type, maximum known eruptive explosivity, magnitude of recent explosivity within the past 500 and 5,000 years, average eruption-recurrence interval, presence or potential for a suite of hazardous phenomena (pyroclastic flows, lahars, lava flows, tsunami, flank collapse, hydrothermal explosion, primary lahar), and deformation, seismic, or degassing unrest. Nine factors define exposure: a measure of ground-based human population in hazard zones, past fatalities and evacuations, a measure of airport exposure, a measure of human population on aircraft, the presence of power, transportation, and developed infrastructure, and whether or not the volcano forms a significant part of a populated island. The hazard score and exposure score for each volcano are multiplied to give its overall threat score. Once scored, the ordered list of volcanoes is divided into five overall threat categories from very high to very low. © 2007 ASCE.
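
    Under the scoring scheme described above, the overall threat computation reduces to a product of factor sums; the factor scores below are hypothetical placeholders, not values for any actual volcano:

    ```python
    # threat = (sum of hazard factor scores) * (sum of exposure factor scores)
    def threat_score(hazard_factors, exposure_factors):
        return sum(hazard_factors) * sum(exposure_factors)

    # Hypothetical scored volcanoes: (hazard factor scores, exposure factor scores)
    volcanoes = {
        "Volcano A": ([1, 1, 1, 0.5, 1, 1, 0, 1], [2, 1, 1, 0.5, 1]),
        "Volcano B": ([1, 0, 0, 0.5, 0, 0, 0, 0], [1, 0, 0, 0.5, 0]),
    }

    # Rank from highest to lowest overall threat:
    for name, (h, e) in sorted(volcanoes.items(),
                               key=lambda kv: threat_score(*kv[1]), reverse=True):
        print(name, threat_score(h, e))
    ```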

  11. A risk assessment approach for fresh fruits.

    PubMed

    Bassett, J; McClure, P

    2008-04-01

    To describe the approach used in conducting a fit-for-purpose risk assessment of microbiological human pathogens associated with fresh fruit and the risk management recommendations made. A qualitative risk assessment for microbiological hazards in fresh fruit was carried out based on the Codex Alimentarius (Codex) framework, modified to consider multiple hazards and all fresh (whole) fruits. The assessment determines 14 significant bacterial, viral, protozoal and nematodal hazards associated with fresh produce, assesses the probable level of exposure from fresh fruit, concludes on the risk from each hazard, and considers and recommends risk management actions. A review of potential risk management options allowed the comparison of effectiveness with the potential exposure to each hazard. Washing to a recommended protocol is an appropriate risk management action for the vast majority of consumption events, particularly when good agricultural and hygienic practices are followed and with the addition of refrigerated storage for low acid fruit. Additional safeguards are recommended for aggregate fruits with respect to the risk from protozoa. The potentially complex process of assessing the risks of multiple hazards in multiple but similar commodities can be simplified in a qualitative assessment approach that employs the Codex methodology.

  12. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    NASA Astrophysics Data System (ADS)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    2015-06-01

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.

  13. Hazardous workplace review program in Taiwan.

    PubMed

    Chang, Yi-Kuo; Chuang, Kuen-Yuan; Tseng, Jo-Ming; Lin, Fang-Chen; Su, Teh-Sheng

    2013-01-01

    In Taiwan, relevant mid-term plans and projects for mitigating occupational hazards have been launched in recent years in the hope of lowering the incidence of occupational hazards. In light of the lack of objective methodologies for research on issues pertaining to occupational safety and health, this research aims to explore the priorities of safety and health issues through focus groups, expert questionnaires and interviews on relevant issues, such as the hazard installations identified in the R181 Prevention of Major Industrial Accidents Recommendation, 1993, proposed during the 18th World Congress on Safety and Health at Work in Seoul, 2008. Results revealed that distributing reports of major domestic/foreign occupational disasters to relevant sectors for the prevention of major accidents is needed, from both the importance and the feasibility analysis. It is the only topic that scored over 4 points on average for expert and focus group consensus. Furthermore, the experts and focus groups came to consensus on the ranking of priority for 4 items, namely: 1) installations containing/using large quantities of hazardous materials should be prioritized for inspection; 2) incorporation of hazard installation review/inspection into OSH management system accreditation; 3) imposition of operation shutdown as a means of penalty; and 4) prioritizing the promotion of preliminary PHA.

  14. Human health and safety risks management in underground coal mines using fuzzy TOPSIS.

    PubMed

    Mahdevari, Satar; Shahriar, Kourosh; Esfahanipour, Akbar

    2014-08-01

    The scrutiny of the health and safety of personnel working in underground coal mines is heightened because of the fatalities and disasters that occur every year worldwide. A methodology based on fuzzy TOPSIS was proposed to assess the risks associated with human health in order to manage control measures and support decision-making, which could provide the right balance between different concerns, such as safety and costs. For this purpose, information collected from three hazardous coal mines, namely Hashouni, Hojedk and Babnizu, located at the Kerman coal deposit, Iran, was used to manage the risks affecting the health and safety of their miners. Altogether 86 hazards were identified and classified under eight categories: geomechanical, geochemical, electrical, mechanical, chemical, environmental, personal, and social, cultural and managerial risks. To overcome the uncertainty of the qualitative data, the ranking process was accomplished by fuzzy TOPSIS. After running the model, twelve groups with different risks were obtained. Located in the first group, the most important risks with the highest negative effects are: materials falling, catastrophic failure, instability of the coalface and immediate roof, firedamp explosion, gas emission, misfire, stopping of the ventilation system, wagon separation at inclines, asphyxiation, inadequate training and poor site management. According to the results, the proposed methodology can be a reliable technique for managing the minatory hazards and coping with uncertainties affecting the health and safety of miners when performance ratings are imprecise. The proposed model is primarily designed to identify potential hazards and help in taking appropriate measures to minimize or remove the risks before accidents can occur. Copyright © 2014 Elsevier B.V. All rights reserved.
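
    As a rough illustration of the ranking mechanics, here is a compact crisp TOPSIS sketch; the study itself uses the fuzzy variant to handle imprecise ratings, and the decision matrix, criteria and weights below are all hypothetical:

    ```python
    import numpy as np

    # Rows = hazards, columns = criteria (e.g. severity, likelihood, exposure).
    X = np.array([[7, 9, 8],
                  [4, 6, 5],
                  [9, 8, 9]], dtype=float)
    w = np.array([0.5, 0.3, 0.2])                 # criterion weights (assumed)

    V = w * X / np.linalg.norm(X, axis=0)         # weighted, vector-normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)    # all criteria treated as "benefit"
    d_pos = np.linalg.norm(V - ideal, axis=1)     # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)      # distance to anti-ideal solution
    closeness = d_neg / (d_pos + d_neg)           # higher = riskier hazard

    print(np.argsort(-closeness))                 # hazard indices, riskiest first
    ```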

  15. Probabilistic Risk Analysis of Run-up and Inundation in Hawaii due to Distant Tsunamis

    NASA Astrophysics Data System (ADS)

    Gica, E.; Teng, M. H.; Liu, P. L.

    2004-12-01

    Risk assessment of natural hazards usually includes two aspects, namely, the probability of the natural hazard occurrence and the degree of damage caused by the natural hazard. Our current study is focused on the first aspect, i.e., the development and evaluation of a methodology that can predict the probability of coastal inundation due to distant tsunamis in the Pacific Basin. The calculation of the probability of tsunami inundation would be a simple statistical problem if a sufficiently long record of field data on inundation were available. Unfortunately, such field data are very limited in the Pacific Basin because field measurement of inundation requires the physical presence of surveyors on site. In some areas, no field measurements were ever conducted in the past. Fortunately, there are more complete and reliable historical data on earthquakes in the Pacific Basin, partly because earthquakes can be measured remotely. There are also numerical simulation models, such as the Cornell COMCOT model, that can predict tsunami generation by an earthquake, propagation in the open ocean, and inundation onto a coastal land. Our objective is to develop a methodology that can link the probability of earthquakes in the Pacific Basin with the inundation probability in a coastal area. The probabilistic methodology applied here involves the following steps: first, the Pacific Rim is divided into blocks of potential earthquake sources based on the past earthquake record and fault information. Then the COMCOT model is used to predict the inundation at a distant coastal area due to a tsunami generated by an earthquake of a particular magnitude in each source block. This simulation generates a response relationship between the coastal inundation and an earthquake of a particular magnitude and location. Since the earthquake statistics are known for each block, by summing the probability over all earthquakes in the Pacific Rim, the probability of inundation in a coastal area can be determined through the response relationship. Although the idea of the statistical methodology applied here is not new, this study is the first to apply it to the probability of inundation caused by earthquake-generated distant tsunamis in the Pacific Basin. As a case study, the methodology is applied to predict the tsunami inundation risk in Hilo Bay, Hawaii. Since relatively more field data on tsunami inundation are available for Hilo Bay, this case study can help to evaluate the applicability of the methodology for predicting tsunami inundation risk in the Pacific Basin. Detailed results will be presented at the AGU meeting.
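
    A minimal sketch of the summation step, assuming the COMCOT-derived response relationship has been precomputed into a lookup of inundation depth per source block and magnitude; all rates and depths below are hypothetical:

    ```python
    import math

    # Each Pacific Rim source block carries annual earthquake rates by magnitude
    # and a precomputed "response" (inundation depth at the site, in metres).
    source_blocks = {
        "block_1": {"rates": {8.0: 0.010, 8.5: 0.003},
                    "response": {8.0: 0.8, 8.5: 2.1}},
        "block_2": {"rates": {8.0: 0.020, 8.5: 0.005},
                    "response": {8.0: 0.3, 8.5: 1.2}},
    }
    design_depth = 1.0  # metres

    # Sum the annual rates of all source events whose response exceeds the depth:
    annual_rate = sum(
        rate
        for blk in source_blocks.values()
        for mag, rate in blk["rates"].items()
        if blk["response"][mag] > design_depth
    )
    print(1.0 - math.exp(-annual_rate * 50.0))  # 50-year exceedance, Poisson assumption
    ```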

  16. Lateral spread hazard mapping of the northern Salt Lake Valley, Utah, for a M7.0 scenario earthquake

    USGS Publications Warehouse

    Olsen, M.J.; Bartlett, S.F.; Solomon, B.J.

    2007-01-01

    This paper describes the methodology used to develop a lateral spread-displacement hazard map for the northern Salt Lake Valley, Utah, using a scenario M7.0 earthquake occurring on the Salt Lake City segment of the Wasatch fault. The mapping effort is supported by a substantial amount of geotechnical, geologic, and topographic data compiled for the Salt Lake Valley, Utah. ArcGIS routines created for the mapping project then input this information to perform site-specific lateral spread analyses using methods developed by Bartlett and Youd (1992) and Youd et al. (2002) at individual borehole locations. The distributions of predicted lateral spread displacements from the boreholes located spatially within a geologic unit were subsequently used to map the hazard for that particular unit. The mapped displacement zones consist of low hazard (0-0.1 m), moderate hazard (0.1-0.3 m), high hazard (0.3-1.0 m), and very high hazard (> 1.0 m). As expected, the produced map shows the highest hazard in the alluvial deposits at the center of the valley and in sandy deposits close to the fault. This mapping effort is currently being applied to the southern part of the Salt Lake Valley, Utah, and probabilistic maps are being developed for the entire valley. © 2007, Earthquake Engineering Research Institute.
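
    The displacement-to-zone classification stated in the abstract maps directly to a small helper; the class boundaries are exactly the four zones listed above, and the sample displacements are illustrative:

    ```python
    def hazard_zone(displacement_m):
        """Map a predicted lateral-spread displacement (m) to its hazard class."""
        if displacement_m <= 0.1:
            return "low (0-0.1 m)"
        if displacement_m <= 0.3:
            return "moderate (0.1-0.3 m)"
        if displacement_m <= 1.0:
            return "high (0.3-1.0 m)"
        return "very high (> 1.0 m)"

    print([hazard_zone(d) for d in (0.05, 0.2, 0.6, 1.4)])
    ```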

  17. Using the Triad Approach to Improve the Cost-effectiveness of Hazardous Waste Site Cleanups

    EPA Pesticide Factsheets

    U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, a paradigm based on using an integrated triad of systematic planning...

  18. Developmental Neurotoxicity Testing: A Path Forward

    EPA Science Inventory

    Great progress has been made over the past 40 years in understanding the hazards of exposure to a small number of developmental neurotoxicants. Lead, PCBs, and methylmercury are all good examples of science-based approaches to characterizing the hazard to the developing nervous s...

  19. [Work-related accidents on oil drilling platforms in the Campos Basin, Rio de Janeiro, Brazil].

    PubMed

    Freitas, C M; Souza, C A; Machado, J M; Porto, M F

    2001-01-01

    The offshore oil industry is characterized by complex systems in relation to technology and organization of work. Working conditions are hazardous, resulting in accidents and even occasional full-scale catastrophes. This article is the result of a study on work-related accidents in the offshore platforms in the Campos Basin, Rio de Janeiro State. The primary objective was to provide technical back-up for both workers' representative organizations and public authorities. As a methodology, we attempt to go beyond the immediate causes of accidents and emphasize underlying causes related to organizational and managerial aspects. The sources were used in such a way as to permit classification in relation to the type of incident, technological system, operation, and immediate and underlying causes. The results show the aggravation of safety conditions and the immediate need for public authorities and the offshore oil industry in Brazil to change the methods used to investigate accidents in order to identify the main causes in the organizational and managerial structure of companies.

  20. Framework Analysis for Determining Mode of Action & Human Relevance

    EPA Science Inventory

    The overall aim of a cancer risk assessment is to characterize the risk to humans from environmental exposures. This risk characterization includes a qualitative and quantitative risk characterization that relies on the development of separate hazard, dose- response and exposure...

  1. SITE CHARACTERIZATION LIBRARY: VOLUME 1 (RELEASE 2.5)

    EPA Science Inventory

    This CD-ROM, Volume 1, Release 2.5, of EPA's National Exposure Research Laboratory (NERL - Las Vegas) Site Characterization Library, contains additional electronic documents and computer programs related to the characterization of hazardous waste sites. EPA has produced this libr...

  2. Condition Assessment Modeling for Distribution Systems Using Shared Frailty Analysis

    EPA Science Inventory

    Condition Assessment (CA) modeling is drawing increasing interest as a methodology for managing drinking water infrastructure. This paper develops a Cox Proportional Hazard (PH)/shared frailty model and applies it to the problem of investment in the repair and replacement of dri...

  3. Integrating intelligent transportation systems within the transportation planning process : an interim handbook

    DOT National Transportation Integrated Search

    1999-06-01

    The main purpose of Phase I of this project was to develop a methodology for predicting consequences of hazardous material (HM) crashes, such as injuries and property damage. An initial step in developing a risk assessment is to reliably estimate the...

  4. GEOSTATISTICAL INTERPOLATION OF CHEMICAL CONCENTRATION. (R825689C037)

    EPA Science Inventory

    Measurements of contaminant concentration at a hazardous waste site typically vary over many orders of magnitude and have highly skewed distributions. This work presents a practical methodology for the estimation of solute concentration contour maps and volume...

  5. Mapping debris-flow hazard in Honolulu using a DEM

    USGS Publications Warehouse

    Ellen, Stephen D.; Mark, Robert K.; ,

    1993-01-01

    A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.

  6. Advances in early fetal loss research: importance for risk assessment.

    PubMed Central

    Sweeney, A M; LaPorte, R E

    1991-01-01

    The assessment of early fetal losses (EFLs) in relationship to environmental agents offers unique advantages compared to other end points for hazard assessment. There is a high incidence (greater than 20% of all pregnancies end in an EFL), and the interval between exposure and end point is the short duration between conception and event, i.e., approximately 12 weeks. In contrast, cancer, which is the primary end point evaluated in risk assessment models, occurs with much lower frequency, and the latency period is measured in years or decades. EFLs have not been used effectively for risk assessment because most of the events are not detected. Prospective studies provide the only approach whereby it is possible to link exposure to EFLs. Recent methodologic advancements have demonstrated that it is now possible to conduct population-based studies of EFLs. It is likely that EFLs could serve as sentinels to monitor adverse health effects of many potential environmental hazards. The methodology will be demonstrated using lead exposure in utero as an example. PMID:2050056

  7. The use of mental models in chemical risk protection: developing a generic workplace methodology.

    PubMed

    Cox, Patrick; Niewöhmer, Jörg; Pidgeon, Nick; Gerrard, Simon; Fischhoff, Baruch; Riley, Donna

    2003-04-01

    We adopted a comparative approach to evaluate and extend a generic methodology to analyze the different sets of beliefs held about chemical hazards in the workplace. Our study mapped existing knowledge structures about the risks associated with the use of perchloroethylene and rosin-based solder flux in differing workplaces. "Influence diagrams" were used to represent beliefs held by chemical experts; "user models" were developed from data elicited from open-ended interviews with the workplace users of the chemicals. The juxtaposition of expert and user understandings of chemical risks enabled us to identify knowledge gaps and misunderstandings and to reinforce appropriate sets of safety beliefs and behavior relevant to chemical risk communications. By designing safety information to be more relevant to the workplace context of users, we believe that employers and employees may gain improved knowledge about chemical hazards in the workplace, such that better chemical risk management, self-protection, and informed decision making develop over time.

  8. Lost in translation? The hazards of applying social constructionism to quantitative research on sexual orientation development.

    PubMed

    Robboy, Caroline Alex

    2002-01-01

    This article explores the hazards faced by social constructionists who attempt to conduct quantitative research on sexual orientation development. By critically reviewing two quantitative research studies, this article explores the ways in which the very nature of social constructionist arguments may be incongruous with the methodological requirements of quantitative studies. I suggest this conflict is a result of the differing natures of these two modes of scholarly inquiry. While research requires the acceptance of certain analytical categories, the strength of social constructionism comes from its reflexive scrutiny and problematization of those very categories. Ultimately, social constructionists who try to apply their theories/perspectives must necessarily conform to the methodological constraints of quantitative research. The intent of this article is not to suggest that it is futile or self-contradictory for social constructionists to attempt empirical research, but that these are two distinct modes of scholarly inquiry which can, and should, co-exist in a dialectical relationship to each other.

  9. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2014-05-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope-deformation measurements unsuitable for early warning systems. On the other hand, the use of acoustic emission records has often been a good tool for slope monitoring in underground mining. Here we aim to identify the characteristic signs of impending failure by deploying a "site specific" microseismic monitoring system on an unstable patch of the Madonna del Sasso landslide in the Italian Western Alps, designed to monitor subtle changes in the mechanical properties of the medium and installed as close as possible to the source region. The initial characterization, based on geomechanical and geophysical tests, allowed us to understand the instability mechanism and to design the monitoring systems to be placed. Stability analysis showed that the stability of the slope is due to rock bridges. Their failure progress can result in a global slope failure. Consequently, the rock bridges potentially generating dynamic ruptures need to be monitored. A first array, consisting of instruments provided by the University of Turin, was deployed in October 2013: 4 triaxial 4.5 Hz seismometers connected to a 12-channel data logger, arranged in a 'large aperture' configuration which encompasses the entire unstable rock mass. Preliminary data indicate the occurrence of microseismic swarms with different spectral contents. Two additional geophones and 4 triaxial piezoelectric accelerometers able to operate at frequencies up to 23 kHz will be installed during summer 2014. This will allow us to develop a network capable of recording events with Mw < 0.5 and frequencies between 700 Hz and 20 kHz. Rock physical and mechanical characterization, along with rock deformation laboratory experiments, during which the evolution of related physical parameters under simulated conditions of stress and fluid content will be studied, and theoretical modelling will allow us to come up with a full hazard assessment and to test new methodologies for a much wider scale of applications within the EU.

  10. A multi-objective optimization approach for the selection of working fluids of geothermal facilities: Economic, environmental and social aspects.

    PubMed

    Martínez-Gomez, Juan; Peña-Lamas, Javier; Martín, Mariano; Ponce-Ortega, José María

    2017-12-01

    The selection of the working fluid for Organic Rankine Cycles has traditionally been addressed through systematic heuristic methods, which perform a characterization and prior selection considering mainly one objective, thus precluding a selection that simultaneously considers objectives related to sustainability and safety. The objective of this work is to propose a methodology for the optimal selection of the working fluid for Organic Rankine Cycles. The model is presented as a multi-objective approach, which simultaneously considers economic, environmental and safety aspects. The economic objective function considers the profit obtained by selling the energy produced. Safety was evaluated in terms of individual risk for each of the components of the Organic Rankine Cycle, formulated as a function of the operating conditions and hazardous properties of each working fluid. The environmental function is based on carbon dioxide emissions, considering carbon dioxide mitigation, emissions due to the use of cooling water, as well as emissions due to material release. The methodology was applied to the case of geothermal facilities to select the optimal working fluid, although it can be extended to waste heat recovery. The results show that hydrocarbons represent better solutions; thus, among a list of 24 working fluids, toluene is selected as the best fluid. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.

  12. Landslide Hazard Mapping in Rwanda Using Logistic Regression

    NASA Astrophysics Data System (ADS)

    Piller, A.; Anderson, E.; Ballard, H.

    2015-12-01

    Landslides in the United States cause more than $1 billion in damages and 50 deaths per year (USGS 2014). Globally, figures are much more grave, yet monitoring, mapping and forecasting of these hazards are less than adequate. Seventy-five percent of the population of Rwanda earns a living from farming, mostly subsistence. Loss of farmland, housing, or life, to landslides is a very real hazard. Landslides in Rwanda have an impact at the economic, social, and environmental level. In a developing nation that faces challenges in tracking, cataloging, and predicting the numerous landslides that occur each year, satellite imagery and spatial analysis allow for remote study. We have focused on the development of a landslide inventory and a statistical methodology for assessing landslide hazards. Using logistic regression on approximately 30 test variables (i.e. slope, soil type, land cover, etc.) and a sample of over 200 landslides, we determine which variables are statistically most relevant to landslide occurrence in Rwanda. A preliminary predictive hazard map for Rwanda has been produced, using the variables selected from the logistic regression analysis.
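
    A minimal sketch of the statistical core, assuming predictor grids have been sampled at landslide and non-landslide locations; the random data below merely stand in for the ~200 mapped landslides and ~30 candidate variables (slope, soil type, land cover, etc.):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Placeholder samples: 400 locations, 5 candidate terrain predictors.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 5))
    # Synthetic presence/absence labels correlated with the first two predictors:
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=400) > 0.8).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    hazard_probability = model.predict_proba(X)[:, 1]  # per-cell landslide likelihood
    print(model.coef_)  # inspect which variables carry weight in the fit
    ```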

  13. Waste-water characterization and hazardous-waste technical assistance survey, Bergstrom AFB, Texas. Final report, 6-15 March 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hedgecock, N.S.

    1990-01-01

    At the request of 67 Combat Support Group/DEEV, the Air Force Occupational and Environmental Health Laboratory conducted a waste-water characterization and hazardous-waste technical assistance survey at Bergstrom AFB (BAFB) from 6-15 Mar 89. The scope of the waste-water survey was to characterize the effluent exiting the base and the effluent from 23 industrial facilities and 10 food-serving facilities. The scope of the hazardous-waste survey was to address hazardous-waste-management practices and explore opportunities for hazardous waste minimization. Specific recommendations from the survey include: (1) Accompany City of Austin personnel during waste-water sampling procedures; (2) Sample at the manhole exiting the main lift station rather than at the lift station wet well; (3) Split waste-water samples with the City of Austin for comparison of results; (4) Ensure that oil/water separators and grease traps are functioning properly and are cleaned out regularly; (5) Limit the quantity of soaps and solvents discharged down the drain to the sanitary sewer; (6) Establish a waste disposal contract for the removal of wastes in the Petroleum Oils and Lubricants underground storage tanks; (7) Remove, analyze, and properly dispose of oil-contaminated soil from accumulation sites; (8) Move indoors or secure, cover, and berm the aluminum sign reconditioning tank at 67 Civil Engineering Squadron Protective Coating; and (9) Connect 67 Combat Repair Squadron Test Cell floor drains to the sanitary sewer.

  14. Waste-water characterization and hazardous-waste technical assistance survey, Mather AFB California. Final report, 28 November-9 December 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S.P.; Hedgecock, N.S.

    1989-10-01

    Personnel from the AFOEHL conducted a waste-water characterization and hazardous-waste technical assistance survey at MAFB from 28 Nov to 9 Dec 1988. The scope of this survey was to characterize the waste-water, address hazardous-waste-management practices, and explore opportunities for hazardous waste minimization. The waste-water survey team analyzed the base's industrial effluent, effluent from oil/water separators, and storm water. The team performed a shop-by-shop evaluation of chemical-waste-management practices. Survey results showed that MAFB needs to improve its hazardous-waste-management program. Recommendations for improvement include: (1) Collecting two additional grab samples on separate days from the hospital discharge. Analyze for EPA Method 601 to determine if the grab sample from the survey gives a true indication of what is being discharged. (2) Locate the source and prevent mercury from the hospital from discharging into the sanitary sewer. (3) Dilute the soaps used for cleaning at the Fuels Lab, Building 7060. (4) Investigate the source of chromium from the Photo Lab. (5) Clean out the sewer system manhole directly downgradient from the Photo Lab. (6) Locate the source of contamination in the West Ditch Outfall. (7) Reconnect the two oil/water separators that discharge into the storm sewerage system. (8) Investigate the source of methylene chloride coming on the base. (9) Investigate the source of mercury at Fuel Cell Repair, Building 7005.

  15. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating the global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which need careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed, with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation regarding varied pollutants under varied circumstances before DALY calculation. Copyright © 2014 Elsevier B.V. All rights reserved.
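
    The core DALY arithmetic (DALY = YLL + YLD, years of life lost plus years lived with disability) can be sketched in a few lines; all input values below are illustrative only:

    ```python
    def daly(deaths, life_exp_at_death, cases, disability_weight, duration_years):
        """DALY = YLL + YLD, without age weighting or discounting."""
        yll = deaths * life_exp_at_death              # years of life lost
        yld = cases * disability_weight * duration_years  # years lived with disability
        return yll + yld

    # Hypothetical pollution-attributable disease burden:
    print(daly(deaths=10, life_exp_at_death=30.0,
               cases=500, disability_weight=0.2, duration_years=2.5))
    ```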

  16. Risk analysis of technological hazards: Simulation of scenarios and application of a local vulnerability index.

    PubMed

    Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A

    2018-06-15

    The potential impact of a technological accident can be assessed through risk estimation. Taking this into account, the latent or potential condition can be anticipated and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components. The first is the processing of meteorological databases to define the most probable and conservative scenario of study; the second is the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia at a meat-packing plant in the city of La Plata, Argentina. The method consists in integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with higher risks of exposure to ammonia, which are worth addressing for the prevention of disasters in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied to various scenarios based on the available information on both the exposed population and its meteorology. Furthermore, this methodology optimizes the processing of the input data and its calculation. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Photovoice in the workplace: A participatory method to give voice to workers to identify health and safety hazards and promote workplace change-a study of university custodians.

    PubMed

    Flum, Marian R; Siqueira, Carlos Eduardo; DeCaro, Anthony; Redway, Scott

    2010-11-01

    Photovoice, a photographic participatory action research methodology, was used in a workplace setting to assess the hazards that were creating extremely high injury and incident rates for university custodians and to promote the conditions to eliminate or reduce those hazards. University custodians participated in a Photovoice project to identify, categorize, and prioritize occupational hazards and to discuss and propose solutions to these problems. Results were presented to management and to all custodians for further discussion. The effort was led by a worker-based, union-sponsored participatory evaluation team in partnership with a university researcher. Visual depiction of hazardous tasks and exposures among custodians and management focused primarily on improper or unsafe equipment, awkward postures, lifting hazards, and electrical hazards. The process of taking pictures and presenting them created an ongoing discussion among workers and management regarding the need for change and for process improvements, and resulted in greater interest and activity regarding occupational health among the workers. In a follow-up evaluation 1 year later, a number of hazards identified through Photovoice had been corrected. Injury rates for custodians had decreased from 39% to 26%. Photovoice can be an important tool, not just for identifying occupational hazards, but also for empowering workers to be more active around health and safety, and it may facilitate important changes in the workplace. © 2010 Wiley-Liss, Inc.

  18. Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni

    2017-04-01

    A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Procedure (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of sites are without any documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards across Europe are: fire (wildfire), storm, flooding, earthquake and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, a technique for multi-attribute decision making that enables the decomposition of a problem into a hierarchy, based on the opinions of different experts about the dominance of risks. The weights are obtained by rescaling, between 0 and 1, the eigenvector associated with the maximum eigenvalue of the coefficient matrix. The internal coherence of the experts' attributions is assessed through the calculation of the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, in which the site most at risk proves to be the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results rests on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, the information in the Periodic Reports was compared with available high-quality datasets (earthquake, volcano and landslide) at the Italian scale. Sites properly classified by the Periodic Reports range from 65% (earthquake hazard) to 98% (volcano hazard), with a strong underestimation of landslide hazard. Due to this high uncertainty, we developed a new methodology to identify and rank the most critical UNESCO Heritage sites on the basis of three natural hazards (landslide, earthquake, and volcano) for which reliable European-scale hazard maps are available. For each UNESCO site, a potential risk was calculated as the product of hazard (from the available maps) and potential vulnerability. The latter is obtained considering the typology of the site (e.g. monument, cultural landscape, and cultural road), the presence or absence of residents and/or tourists, and the position of the site (underground/over-ground). Through this methodology, a new ranking of the European UNESCO sites has been obtained. In this ranking, the historic center of Naples emerges as the highest-risk site on the European continent.
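
    The weighting step described above (principal-eigenvector weights plus Saaty's consistency ratio) can be sketched compactly; the 3x3 pairwise-comparison matrix below is a hypothetical stand-in for the full 13-hazard comparison:

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix (Saaty's 1-9 scale, reciprocal).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # weights rescaled to sum to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
    print(w, ci / ri)                            # weights and consistency ratio (< 0.1 is acceptable)
    ```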

  19. Near surface geophysics techniques and geomorphological approach to reconstruct the hazard cave map in historical and urban areas

    NASA Astrophysics Data System (ADS)

    Lazzari, M.; Loperte, A.; Perrone, A.

    2010-03-01

    This work, carried out with an integrated methodological approach, focuses on the use of near-surface geophysics techniques, such as ground penetrating radar and electrical resistivity tomography (ERT), and geomorphological analysis, in order to reconstruct cave distribution and geometry in an urban context and, in particular, in historical centres. The interaction during recent centuries between human activity (cave excavation, the birth and growth of an urban area) and the character of the natural environment has caused a progressive increase in the hazard and vulnerability levels of several sites. The reconstruction of a detailed cave distribution map is the first step in defining the anthropic and geomorphological hazard in urban areas, a fundamental basis for planning and assessing the risk.

  20. A small business approach to nanomaterial environment, health, and safety.

    PubMed

    Gause, Charles B; Layman, Rachel M; Small, Aaron C

    2011-06-01

    Integral to the commercialization process for nanotechnology enabled products is the methodology for protecting workers potentially exposed to nanomaterials during product development. Occupational health surveillance is a key aspect of protecting employees and involves both hazard identification and surveillance of known medical data. However, when the health effects and exposure pathways of both new and existing "nano-scale" chemical substances are not yet well understood, conservative hazard controls and baseline data collection can facilitate both immediate and long-term worker protection. Luna Innovations uses a conservative approach based on risk assessment and the OSHA General Duty Clause. To date, Luna's approach has been effective for our business model. Understanding and managing potential hazards to our nanotechnology workers is key to the success and acceptance of nanotechnology enabled products.

  1. Developing a methodology for the national-scale assessment of rainfall-induced landslide hazard in a changing climate

    NASA Astrophysics Data System (ADS)

    Jurchescu, Marta; Micu, Dana; Sima, Mihaela; Bălteanu, Dan; Bojariu, Roxana; Dumitrescu, Alexandru; Dragotă, Carmen; Micu, Mihai; Senzaconi, Francisc

    2017-04-01

    Landslides, together with earthquakes and floods, represent the main natural hazards in Romania, causing major impacts on human activities. The RO-RISK (Disaster Risk Evaluation at a National Level) project is a flagship project aimed at strengthening risk prevention and management in Romania by evaluating - among the specific risks in the country - landslide hazard and risk at a national level. Landslide hazard is defined as "the probability of occurrence within a specified period of time and within a given area of a landslide of a given magnitude" (Varnes 1984; Guzzetti et al. 1999). Nevertheless, most landslide 'hazard' maps only consist of susceptibility (i.e. spatial probability) zonations without considering temporal or magnitude information on the hazard. This study proposes a methodology for the assessment of landslide hazard at the national scale on a scenario basis, while also considering changes in hazard patterns and levels under climate change conditions. A national landslide database consisting of more than 3,000 records has been analyzed against a meteorological observation dataset in order to assess the relationship between precipitation and landslides. Various extreme climate indices were computed in order to account for the different rainfall patterns able to prepare/trigger landslides (e.g. extreme levels of seasonal rainfall, 3-day rainfall or number of consecutive rainy days with different return periods). In order to derive national rainfall thresholds, i.e. thresholds valid for the diverse climatic environments across the country, values in the parameter maps were rendered comparable by means of normalization with the mean annual precipitation and the rainy-day normal. A hazard assessment builds on a frequency-magnitude relationship. In the current hazard scenario approach, frequency was kept constant for each single map, while the magnitude of the expected geomorphic event was modeled in relation to the distributed magnitude of the triggering factor. Given the small-scale context, landslides were interpreted as multiple-occurrence regional landslide events (MORLE) (Crozier 2005) and consequently their magnitude was expressed by means of the number of triggered processes. In order to achieve acceptable relations between the intensity of the trigger and the magnitude of the MORLE for different morphological and lithological conditions, a prior distinction of homogeneous territories in terms of landslide-predisposing characteristics was considered. Since landslide data were statistically insufficient, empirical knowledge gained on rainfall thresholds was used to modulate expert judgment and build semi-quantitative hazard matrices. Climate projections (2021-2050) from EURO-CORDEX regional models (downscaled to a 1 km resolution) under the RCP 4.5 and RCP 8.5 scenarios were considered to estimate future patterns and levels of landslide hazard across Romania and to investigate expected changes. The established hazard scenarios allow the identification of the high-hazard 'hotspot' regions across the country, as well as of those assigned to the medium-to-high hazard magnitudes under both current and future climates. Trends in the expected impact of climate change on landslide hazard are discussed with reference to related uncertainties. This study is part of the RO-RISK project coordinated by the Romanian General Inspectorate for Emergency Situations (IGSU) and supported by the European Social Fund through the Operational Programme for Administrative Capacity (POCA).
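
    The normalization step that makes rainfall thresholds comparable across climatic regions admits a one-function sketch; both normalizations below follow the abstract's description (division by mean annual precipitation, or by the rainy-day normal, i.e. MAP divided by the number of rainy days), with illustrative numbers:

    ```python
    def normalized_event_rainfall(event_mm, map_mm, rainy_days=None):
        """Event rainfall normalized by MAP, or by the rainy-day normal."""
        if rainy_days is None:
            return event_mm / map_mm                # fraction of mean annual precipitation
        return event_mm / (map_mm / rainy_days)     # multiples of the rainy-day normal

    # Hypothetical 3-day event of 80 mm in a region with MAP = 650 mm:
    print(normalized_event_rainfall(80.0, 650.0))
    print(normalized_event_rainfall(80.0, 650.0, rainy_days=110))
    ```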

  2. CHARACTERIZATION OF RISKS POSED BY COMBUSTOR EMISSIONS

    EPA Science Inventory

    Risk characterization is the final step of the risk assessment process as practiced in the U.S. EPA. In risk characterization, the major scientific evidence and "bottom-line" results from the other components of the risk assessment process, hazard identification, dose-response as...

  3. Seismic hazard assessment in the Catania and Siracusa urban areas (Italy) through different approaches

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria

    2010-05-01

    The seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study, a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard with the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted for the Esteva-Cornell procedure. Following current international conventions for PSHA (SSHAC, 1997), a logic tree approach was used to account for and reduce epistemic uncertainties in both the seismotectonic and site methods. The SASHA code handles intensity data, taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenic zonation, and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above-mentioned approaches provided useful elements for defining the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. When the site intensity data are used, the town of Catania shows hazard values higher than those found for Siracusa for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed for the Catania area with the site approach can be interpreted in terms of the greater damage historically observed in this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa with the Esteva-Cornell approach could be a consequence of that method spreading the intensities over a wide area. In SHA, the use of a combined approach is nevertheless recommended for mutual validation of the results, and any choice between the two approaches is strictly linked to knowledge of the local seismotectonic features. References Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM.
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.
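
    Where a logic tree is used, branch hazard curves are typically combined as a weighted mean of annual exceedance probabilities. A minimal sketch (intensity grid, branch curves, and weights are all invented, not values from this study):

      import numpy as np

      intensities = np.array([6.0, 7.0, 8.0, 9.0, 10.0])   # macroseismic intensity levels
      branch_curves = np.array([
          [2e-2, 8e-3, 2e-3, 4e-4, 5e-5],   # e.g. a 'site'-approach branch
          [3e-2, 1e-2, 3e-3, 6e-4, 8e-5],   # e.g. a 'seismotectonic' branch
      ])
      weights = np.array([0.5, 0.5])        # logic-tree branch weights (sum to 1)

      mean_curve = weights @ branch_curves  # weighted mean hazard curve
      for i, p in zip(intensities, mean_curve):
          print(f"I >= {i}: annual exceedance probability {p:.1e}")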

  4. An Integrated Science-based methodology

    EPA Pesticide Factsheets

    The data are secondary in nature, meaning that no data were generated as part of this review effort; rather, data available in the peer-reviewed literature were used. This dataset is associated with the following publication: Tolaymat, T., A. El Badawy, R. Sequeira, and A. Genaidy. An integrated science-based methodology to assess potential risks and implications of engineered nanomaterials. JOURNAL OF HAZARDOUS MATERIALS. Elsevier Science Ltd, New York, NY, USA, 298: 270-281, (2015).

  5. THERP and HEART integrated methodology for human error assessment

    NASA Astrophysics Data System (ADS)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    An integrated THERP and HEART methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The approach is modified on the basis of the fuzzy set concept, with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors, which is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process so that it can be properly monitored and appropriately managed.
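
    For context, the HEART side of such an approach scales a nominal human error probability (HEP) by error-producing-condition (EPC) multipliers, each weighted by an assessed proportion of affect; the paper's fuzzy extension replaces the crisp assessed proportions. A minimal crisp-HEART sketch with invented numbers:

      def heart_hep(nominal_hep, epcs):
          """assessed HEP = nominal HEP x prod over EPCs of ((EPC - 1) * proportion + 1)."""
          hep = nominal_hep
          for multiplier, proportion in epcs:        # proportion of affect in [0, 1]
              hep *= (multiplier - 1.0) * proportion + 1.0
          return min(hep, 1.0)                       # a probability cannot exceed 1

      # Example: a routine task (nominal HEP 0.003) under time pressure and distraction.
      print(heart_hep(0.003, [(11, 0.4), (4, 0.2)]))  # -> 0.024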

  6. Gasoline toxicology: overview of regulatory and product stewardship programs.

    PubMed

    Swick, Derek; Jaques, Andrew; Walker, J C; Estreicher, Herb

    2014-11-01

    Significant efforts have been made to characterize the toxicological properties of gasoline. There have been both mandatory and voluntary toxicology testing programs to generate hazard characterization data for gasoline, the refinery process streams used to blend gasoline, and individual chemical constituents found in gasoline. The Clean Air Act (CAA) (Clean Air Act, 2012: § 7401, et seq.) is the primary tool for the U.S. Environmental Protection Agency (EPA) to regulate gasoline, and this supplement presents the results of the Section 211(b) Alternative Tier 2 studies required for CAA Fuel and Fuel Additive registration. Gasoline blending streams have also been evaluated by EPA under the voluntary High Production Volume (HPV) Challenge Program, through which the petroleum industry provided data on over 80 refinery streams used in gasoline. Product stewardship efforts by companies and associations such as the American Petroleum Institute (API), Conservation of Clean Air and Water Europe (CONCAWE), and the Petroleum Product Stewardship Council (PPSC) have contributed a significant amount of hazard characterization data on gasoline and related substances. The hazard of gasoline and anticipated exposure to gasoline vapor have been well characterized for risk assessment purposes.

  7. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  8. Regulatory Issues and Challenges in Developing Seismic Source Characterizations for New Nuclear Power Plant Applications in the US

    NASA Astrophysics Data System (ADS)

    Fuller, C. W.; Unruh, J.; Lindvall, S.; Lettis, W.

    2009-05-01

    An integral component of the safety analysis for proposed nuclear power plants within the US is a probabilistic seismic hazard assessment (PSHA). Most applications currently under NRC review followed guidance provided within NRC Regulatory Guide 1.208 (RG 1.208) for developing seismic source characterizations (SSC) for their PSHA. Three key components of the RG 1.208 guidance are that applicants should: (1) use existing PSHA models and SSCs accepted by the NRC as a starting point for their own SSCs; (2) evaluate new information and data developed since acceptance of the starting model to determine if the model should be updated; and (3) follow guidelines set forth by the Senior Seismic Hazard Analysis Committee (SSHAC) (NUREG/CR-6372) in developing significant updates (i.e., updates should capture SSC uncertainty by representing the "center, body, and range of technical interpretations" of the informed technical community). Major motivations for following this guidance are to ensure accurate representations of hazard and regulatory stability in hazard estimates for nuclear power plants. All current applications with the NRC have used the EPRI-SOG source characterizations developed in the 1980s as their starting point model, and all applicants have followed RG 1.208 guidance in updating the EPRI-SOG model. However, there has been considerable variability in how applicants have interpreted the guidance, and thus considerable variability in the methodology used in updating the SSCs. Much of the variability can be attributed to how different applicants have interpreted the implications of new data, new interpretations of new and/or old data, and new "opinions" of members of the informed technical community. For example, many applicants and the NRC have wrestled with the challenge of whether or not to update SSCs in light of new opinions or interpretations of older data put forth by one member of the technical community. This challenge has been further complicated by: (1) a given applicant's uncertainty in how to revise the EPRI-SOG model, which was developed using a process similar to that dictated by SSHAC for a level 3 or 4 study, without conducting a resource-intensive SSHAC level 3 or higher study for their respective application; and (2) a lack of guidance from the NRC on acceptable methods of demonstrating that new data, interpretations, and opinions are adequately represented within the EPRI-SOG model. Partly because of these issues, the nuclear industry, NRC, and DOE have taken the initiative to develop a new base PSHA model for the central and eastern US. However, this new SSC model will not be completed for several years and does not resolve many of the fundamental regulatory and philosophical issues that have been raised during the current round of applications. To ensure regulatory stability and to provide accurate estimates of hazard for nuclear power plants, a dialog must be started between regulators and industry to resolve these issues. Two key issues that must be discussed are: (1) should new data and new interpretations or opinions of old data be treated differently in updated SSCs, and if so, how; and (2) how can new data or interpretations developed by a small subset of the technical community be weighed against, and potentially combined with, an SSC model that was originally developed to capture the "center, body and range" of the technical community?

  9. 78 FR 35363 - Marine Mammals; Incidental Take During Specified Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-12

    ... design criteria for any planned structure. Methodology for geotechnical surveys may vary between those..., depending on the type of activity. Offshore activities, such as exploration drilling, seismic surveys, and shallow hazards surveys, are expected to occur only during the open- water season (July-November). Onshore...

  10. Visualization and Hierarchical Analysis of Flow in Discrete Fracture Network Models

    NASA Astrophysics Data System (ADS)

    Aldrich, G. A.; Gable, C. W.; Painter, S. L.; Makedonska, N.; Hamann, B.; Woodring, J.

    2013-12-01

    Flow and transport in low-permeability fractured rock occurs primarily in interconnected fracture networks. Prediction and characterization of flow and transport in fractured rock have important implications for underground repositories for hazardous materials (e.g., nuclear and chemical waste), contaminant migration and remediation, groundwater resource management, and hydrocarbon extraction. We have developed methods to explicitly model flow in discrete fracture networks and track flow paths using passive particle tracking algorithms. Visualization and analysis of particle trajectories through the fracture network are important to understanding fracture connectivity, flow patterns, potential contaminant pathways, and fast paths through the network. However, occlusion due to the large number of highly tessellated and intersecting fracture polygons precludes the effective use of traditional visualization methods. Quantitative methods are also needed to characterize the trajectories of large numbers of particle paths. We have addressed these problems by defining a hierarchical flow network representing the topology of particle flow through the fracture network. This approach allows us to analyze the flow and the dynamics of the system as a whole. We can easily query the flow network and use a paint-and-link style framework to filter the fracture geometry and particle traces based on the flow analytics. This greatly reduces occlusion while emphasizing salient features such as the principal transport pathways. Examples are shown that demonstrate the methodology and highlight how this new method enables quantitative analysis and characterization of flow and transport in a number of representative fracture networks.
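
    A minimal sketch of the flow-network idea, assuming particle traces are available as sequences of fracture IDs (the data and the use of networkx are illustrative assumptions, not the authors' implementation):

      import networkx as nx

      traces = [                      # hypothetical particle traces (fracture IDs)
          ["F1", "F3", "F7", "F9"],
          ["F1", "F3", "F8", "F9"],
          ["F2", "F3", "F7", "F9"],
      ]

      G = nx.DiGraph()
      for trace in traces:
          for a, b in zip(trace, trace[1:]):  # count fracture-to-fracture transitions
              w = G.get_edge_data(a, b, {"weight": 0})["weight"]
              G.add_edge(a, b, weight=w + 1)

      # Edges carrying the most particles approximate the principal transport pathways.
      for a, b, d in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"]):
          print(f"{a} -> {b}: {d['weight']} particles")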

  11. Characterization of seismic hazard and structural response by energy flux

    USGS Publications Warehouse

    Şafak, E.

    2000-01-01

    Seismic safety of structures depends on the structure's ability to absorb the seismic energy that is transmitted from the ground to the structure. One parameter that can be used to characterize seismic energy is the energy flux. Energy flux is defined as the amount of energy transmitted per unit time through a cross-section of a medium, and is equal to the kinetic energy multiplied by the propagation velocity of seismic waves. The peak or the integral of energy flux can be used to characterize ground motions. By definition, energy flux automatically accounts for site amplification. Energy flux in a structure can be studied by formulating the problem as a wave propagation problem. For buildings founded on layered soil media and subjected to vertically incident plane shear waves, energy flux equations are derived by modeling the building as an extension of the layered soil medium, considering each story as another layer. The propagation of energy flux in the layers is described in terms of the upgoing and downgoing energy flux in each layer, and the energy reflection and transmission coefficients at each interface. The formulation results in a pair of simple finite-difference equations for each layer, which can be solved recursively starting from the bedrock. The upgoing and downgoing energy flux in the layers allows calculation of the energy demand and energy dissipation in each layer. The methodology is applicable to linear as well as nonlinear structures.
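
    As a simplified illustration of the energy bookkeeping (not the paper's full finite-difference recursion), the fraction of energy a vertically incident shear wave transmits upward at each interface follows from the impedance contrast Z = rho * beta; multiples and damping are ignored in this sketch, and all layer properties are invented:

      layers = [  # (density kg/m^3, shear-wave velocity m/s), bedrock first
          (2600.0, 1500.0),   # bedrock
          (2000.0, 400.0),    # stiff soil layer
          (1800.0, 200.0),    # softer near-surface layer
      ]

      flux = 1.0  # normalized upgoing energy flux entering from the bedrock
      for (rho1, b1), (rho2, b2) in zip(layers, layers[1:]):
          z1, z2 = rho1 * b1, rho2 * b2
          t_energy = 4.0 * z1 * z2 / (z1 + z2) ** 2   # transmitted energy fraction
          flux *= t_energy
          print(f"Z {z1:.2e} -> {z2:.2e}: transmitted fraction {t_energy:.3f}")
      print(f"first-pass energy flux reaching the surface: {flux:.3f}")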

  12. A socioeconomic assessment of climate change-enhanced coastal storm hazards in the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Baron, H. M.; Ruggiero, P.; Harris, E.

    2010-12-01

    Every winter, coastal communities in the U.S. Pacific Northwest are at risk from coastal change hazards caused by extreme storm events. These storms have the potential to erode large portions of the primary foredune that may be a community's only barrier from the ocean. Furthermore, the frequency and magnitude of significant erosion events appear to be increasing, likely due to climate-related processes such as sea level rise and increases in storm wave heights. To reduce risks posed by winter storms, it is important not only to determine the impending physical impacts but also to explore the vulnerability of the social-ecological system in the context of these hazards. Here we assess the exposure to both annually occurring and extreme storm events at various planning timelines using a methodology that incorporates the effect of a variable and changing climate on future total water levels. To do this, we have developed a suite of climate change scenarios involving a range of projections for the wave climate, global sea level rise, and the occurrence of El Niño events through 2100. Simple geometric models are then used to conservatively determine the extent of erosion that may occur for a given combination of these climatic factors. We integrate the physical hazards with socioeconomic data using a geographic information system (GIS) in order to quantify societal vulnerability, characterized by the exposure and sensitivity of a community, which is based on the distribution of people, property, and resources. Here we focus on a 14 km stretch of dune-backed coast in northwest Oregon, from Cascade Head to Cape Kiwanda, the location of two communities that have historically experienced problematic storm-induced coastal change, Pacific City and Neskowin. Although both of these communities have similar exposure to coastal change hazards at present, Neskowin is more than twice as sensitive to erosion because almost all of its residents and community assets are located within ~230 m of a narrow beach behind a riprap revetment. Clearly, any significant losses sustained during an extreme storm could be devastating to the community, and these impacts will likely be amplified in the future. This information is being used to inform land-use planners as well as coastal community residents and visitors about potential coastal change hazards in order to make communities more resistant to future extreme storm events as they are influenced by a changing climate.

  13. Exploring the potential utility of high-throughput bioassays associated with the US EPA ToxCast Program for effects-based monitoring and surveillance

    EPA Science Inventory

    Environmental monitoring and surveillance strategies are essential for identifying potential hazards of contaminant exposure to aquatic organisms. Chemical monitoring is effective for chemicals with well characterized hazards and for which sensitive analytical methods are availa...

  14. CHARACTERIZATION OF A LOSS OF HETEROZYGOSITY CANCER HAZARD IDENTIFICATION ASSAY.

    EPA Science Inventory

    Tumor development generally requires the loss of heterozygosity (LOH) at one or more loci. Thus, the ability to determine whether a chemical is capable of causing LOH is an important part of cancer hazard identification. The mouse lymphoma assay detects a broad spectrum of geneti...

  15. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2017-03-01

    This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.
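
    An illustrative aggregation in the spirit of this second-level screening, with invented indicator scores (ranked 1 = best to 5 = worst), frequency, and consequence values; the actual model's ranking criteria and combination rules are defined in the paper:

      vulnerability_indicators = {
          "robustness_buffer": 4, "protection_level": 3, "maintenance": 2,
          "operational_procedures": 3, "coupling_complexity": 4,
      }
      societal_indicators = {
          "redundancy": 5, "cascading_effects": 3, "preparedness": 2, "early_warning": 3,
      }

      v = sum(vulnerability_indicators.values()) / (5 * len(vulnerability_indicators))
      s = sum(societal_indicators.values()) / (5 * len(societal_indicators))
      annual_frequency = 0.05           # assumed natural-hazard frequency (events/year)
      outage_days, users = 3.0, 12000   # assumed restoration time and number of users

      risk_score = annual_frequency * v * s * outage_days * users
      print(f"vulnerability {v:.2f}, societal dependency {s:.2f}, risk score {risk_score:.0f}")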

  16. A simple tool for preliminary hazard identification and quick assessment in craftwork and small/medium enterprises (SME).

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2012-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded within IEA, in collaboration with the World Health Organization (WHO), to develop a "toolkit for MSD prevention". Possible users of the toolkit include members of health and safety committees, health and safety representatives, line supervisors, labor inspectors, health workers implementing basic occupational health services, and occupational health and safety specialists. Following the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (application document guides for the potential user), computer software (in Excel®) was created for hazard "mapping" in craftwork. The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It thus makes it possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which professional consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  17. Hazardous Waste Certification Plan: Hazardous Waste Handling Facility, Lawrence Berkeley Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-02-01

    The purpose of this plan is to describe the organization and methodology for the certification of hazardous waste (HW) handled in the Lawrence Berkeley Laboratory (LBL) Hazardous Waste Handling Facility (HWHF). The plan also incorporates the applicable elements of waste reduction, which include both up-front minimization and end-product treatment to reduce the volume and toxicity of the waste; segregation of the waste as it applies to certification; an executive summary of the Quality Assurance Program Plan (QAPP) for the HWHF; and a list of the current and planned implementing procedures used in waste certification. The plan provides guidance from the HWHF to waste generators, waste handlers, and the Systems Group Manager to enable them to conduct their activities and carry out their responsibilities in a manner that complies with the requirements of the federal Resource Conservation and Recovery Act (RCRA), the federal Department of Transportation (DOT), and the State of California Code of Regulations (CCR), Title 22.

  18. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    This project's Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, using a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors, or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  19. Metabarcoding avian diets at airports: implications for birdstrike hazard management planning

    PubMed Central

    2013-01-01

    Background Wildlife collisions with aircraft cost the airline industry billions of dollars per annum and represent a public safety risk. Clearly, adapting aerodrome habitats to become less attractive to hazardous wildlife will reduce the incidence of collisions. Formulating effective habitat management strategies relies on accurate identification of high-risk species. This can be achieved for all strikes through morphological and/or DNA-based identification. Beyond species identification, dietary analysis of birdstrike gut contents can provide valuable intelligence for airport hazard management practices with regard to what food is attracting which species to aerodromes. Here, we present birdstrike identification and dietary data from Perth Airport, Western Australia, an aerodrome that saw approximately 140,000 aircraft movements in 2012. Next-generation, high-throughput DNA sequencing was employed to investigate 77 carcasses from 16 bird species collected over a 12-month period. Five DNA markers, which broadly characterize vertebrates, invertebrates, and plants, were used to target three animal mitochondrial genes (12S rRNA, 16S rRNA, and COI) and a plastid gene (trnL) in DNA extracted from birdstrike carcass gastrointestinal tracts. Results Over 151,000 DNA sequences were generated, filtered, and analyzed by a fusion-tag amplicon sequencing approach. Across the 77 carcasses, the most commonly identified vertebrate was Mus musculus (house mouse), Acrididae (grasshoppers) was the most commonly identified invertebrate family, and Poaceae (grasses) was the most commonly identified plant family. The DNA-based dietary data has the potential to provide key insights into feeding ecologies within and around the aerodrome. Conclusions The data generated here, together with the methodological approach, will greatly assist in the development of hazard management plans and, in combination with existing observational studies, provide an improved way to monitor the effectiveness of mitigation strategies (for example, netting of water, grass type, insecticides, and so on) at aerodromes. It is hoped that with the insights provided by dietary data, airports will be able to allocate financial resources to the areas that will achieve the best outcomes for birdstrike reduction. PMID:24330620

  20. La Conchita Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Kropp, A.; Johnson, L.; Magnusen, W.; Hitchcock, C. S.

    2009-12-01

    Following the disastrous landslide in La Conchita in 2005 that resulted in ten deaths, the State of California selected our team to prepare a risk assessment for a committee of key stakeholders. The stakeholders represented the State of California, Ventura County, members of the La Conchita community, the railroad, and the upslope ranch owner (where the slide originated); a group with widely varying views and interests. Our team was charged with characterizing the major hazards, developing a series of mitigation concepts, evaluating the benefits and costs of mitigation, and gathering stakeholder input throughout the process. Two unique elements of the study were the methodologies utilized for the consequence assessment and for the decision-making framework. La Conchita is exposed to multiple slope hazards, each with differing geographical distributions, as well as depth and velocity characteristics. Three consequence matrices were developed so that the potential financial losses, structural vulnerabilities, and human safety exposure could be evaluated. The matrices utilized semi-quantitative loss evaluations (both financial and life safety) based on a generalized understanding of likely vulnerability and hazard characteristics. The model provided a quantitative estimate of cumulative losses over a 50-year period, including losses of life based on FEMA evaluation criteria. Conceptual mitigation options and loss estimates were developed to provide a range of risk management solutions that were feasible from a cost-benefit standpoint. A decision tree approach was adopted to focus on fundamental risk management questions rather than on specific outcomes since the committee did not have a consensus view on the preferred solution. These questions included: 1. Over what time period can risks be tolerated before implementation of decisions? 2. Whose responsibility is it to identify a workable risk management solution? 3. Who will own the project? The decision tree developed for assessment can also be used in the reverse direction to evaluate a project impasse or to evaluate owners and time-frames associated with a particular risk management outcome. Although the processes developed were specific to the La Conchita study, we believe that they are applicable elsewhere for localized multi-hazard assessments and/or committee-led risk management efforts.
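
    A toy version of the 50-year cumulative-loss arithmetic (all probabilities and losses invented; the actual study used calibrated consequence matrices and FEMA evaluation criteria, and treated life-safety exposure separately):

      hazards = [  # (name, annual probability, financial loss if it occurs, USD)
          ("debris flow", 0.02, 5_000_000),
          ("deep-seated slide", 0.005, 20_000_000),
      ]
      years = 50
      cumulative = sum(p * loss for _, p, loss in hazards) * years
      print(f"expected 50-year cumulative loss: ${cumulative:,.0f}")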

  1. Metabarcoding avian diets at airports: implications for birdstrike hazard management planning.

    PubMed

    Coghlan, Megan L; White, Nicole E; Murray, Dáithí C; Houston, Jayne; Rutherford, William; Bellgard, Matthew I; Haile, James; Bunce, Michael

    2013-12-11

    Wildlife collisions with aircraft cost the airline industry billions of dollars per annum and represent a public safety risk. Clearly, adapting aerodrome habitats to become less attractive to hazardous wildlife will reduce the incidence of collisions. Formulating effective habitat management strategies relies on accurate identification of high-risk species. This can be achieved for all strikes through morphological and/or DNA-based identification. Beyond species identification, dietary analysis of birdstrike gut contents can provide valuable intelligence for airport hazard management practices with regard to what food is attracting which species to aerodromes. Here, we present birdstrike identification and dietary data from Perth Airport, Western Australia, an aerodrome that saw approximately 140,000 aircraft movements in 2012. Next-generation, high-throughput DNA sequencing was employed to investigate 77 carcasses from 16 bird species collected over a 12-month period. Five DNA markers, which broadly characterize vertebrates, invertebrates, and plants, were used to target three animal mitochondrial genes (12S rRNA, 16S rRNA, and COI) and a plastid gene (trnL) in DNA extracted from birdstrike carcass gastrointestinal tracts. Over 151,000 DNA sequences were generated, filtered, and analyzed by a fusion-tag amplicon sequencing approach. Across the 77 carcasses, the most commonly identified vertebrate was Mus musculus (house mouse), Acrididae (grasshoppers) was the most commonly identified invertebrate family, and Poaceae (grasses) was the most commonly identified plant family. The DNA-based dietary data has the potential to provide key insights into feeding ecologies within and around the aerodrome. The data generated here, together with the methodological approach, will greatly assist in the development of hazard management plans and, in combination with existing observational studies, provide an improved way to monitor the effectiveness of mitigation strategies (for example, netting of water, grass type, insecticides, and so on) at aerodromes. It is hoped that with the insights provided by dietary data, airports will be able to allocate financial resources to the areas that will achieve the best outcomes for birdstrike reduction.

  2. A simple landslide susceptibility analysis for hazard and risk assessment in developing countries

    NASA Astrophysics Data System (ADS)

    Guinau, M.; Vilaplana, J. M.

    2003-04-01

    In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimizing the effects of catastrophic natural phenomena. Work with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photograph interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of the landslides were digitized to obtain a failure-zone digital map. A terrain-unit digital map, representing a series of physical-environmental terrain factors, was also used. Dividing the study area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure-zone digital map is superimposed onto the terrain-unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure-zone map. The validity of the methodology can be tested in this zone by the degree of superposition between the susceptibility map and the failure-zone map. Implementing the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows a landslide susceptibility analysis to be carried out in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
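
    A minimal sketch of the zone-A calibration step, assuming the GIS overlay yields, per terrain-factor class, the total area and the failure-zone area (classes and areas below are invented; ratios above 1 flag landslide-prone classes):

      classes = {  # class -> (total area km2, failure-zone area km2)
          "steep volcanic slopes": (60.0, 4.0),
          "moderate slopes": (150.0, 2.0),
          "valley floors": (263.0, 0.5),
      }
      total_area = sum(a for a, _ in classes.values())
      total_fail = sum(f for _, f in classes.values())

      for name, (area, fail) in classes.items():
          ratio = (fail / total_fail) / (area / total_area)   # frequency ratio
          print(f"{name}: susceptibility ratio {ratio:.2f}")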

  3. Late Lessons from Early Warnings: Toward Realism and Precaution with Endocrine-Disrupting Substances

    PubMed Central

    Gee, David

    2006-01-01

    The histories of selected public and environmental hazards, from the first scientifically based early warnings about potential harm to the subsequent precautionary and preventive measures, have been reviewed by the European Environment Agency. This article relates the “late lessons” from these early warnings to the current debates on the application of the precautionary principle to the hazards posed by endocrine-disrupting substances (EDSs). Here, I summarize some of the definitional and interpretative issues that arise. These issues include the contingent nature of knowledge; the definitions of precaution, prevention, risk, uncertainty, and ignorance; the use of differential levels of proof; and the nature and main direction of the methodological and cultural biases within the environmental health sciences. It is argued that scientific methods need to reflect better the realities of multicausality, mixtures, timing of dose, and system dynamics, which characterize the exposures and impacts of EDSs. This improved science could provide a more robust basis for the wider and wise use of the precautionary principle in the assessment and management of the threats posed by EDSs. The evaluation of such scientific evidence requires assessments that also account for multicausal reality. Two of the often used, and sometimes misused, Bradford Hill “criteria,” consistency and temporality, are critically reviewed in light of multicausality, thereby illustrating the need to review all of the criteria in light of 40 years of progress in science and policymaking. PMID:16818262

  4. Guidelines and Criteria for the Search Strategy, Evaluation, Selection, and Documentation of Key Data and Supporting Data Used for the Derivation of AEGL Values

    EPA Pesticide Factsheets

    This is Section 2.3 of the Standing Operating Procedures for Developing Acute Exposure Guideline Levels (AEGLs) for Hazardous Chemicals. It discusses methodologies used to search for and select data for development of AEGL values.

  5. Analysis of National Solid Waste Recycling Programs and Development of Solid Waste Recycling Cost Functions: A Summary of the Literature (1999)

    EPA Pesticide Factsheets

    Discusses methodological issues in conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control.

  6. Application of Benchmark Dose Methodology to a Variety of Endpoints and Exposures

    EPA Science Inventory

    This latest beta version (1.1b) of the U.S. Environmental Protection Agency (EPA) Benchmark Dose Software (BMDS) is being distributed for public comment. The BMDS system is being developed as a tool to facilitate the application of benchmark dose (BMD) methods to EPA hazardous p...

  7. Methodology for evaluating effectiveness of traffic-responsive systems on intersection congestion and traffic safety

    DOT National Transportation Integrated Search

    1997-01-01

    In 1986, the city of Milwaukee applied for and received approval for a hazard elimination grant to reduce congestion and traffic accidents at the intersection of two major and one minor arterial on the northwest side of the city. The intersection com...

  8. IN-SITU MONITORING OF INFILTRATION-INDUCED INSTABILITY OF I-70 EMBANKMENT WEST OF THE EISENHOWER-JOHNSON MEMORIAL TUNNELS, PHASE II

    DOT National Transportation Integrated Search

    2017-12-25

    Infiltration-induced landslides are common hazards to roads in Colorado. A new methodology that uses recent advances in unsaturated soil mechanics and hydrology was developed and tested. The approach consists of using soil suction and moisture conten...

  9. Screening Methodologies to Support Risk and Technology Reviews (RTR): A Case Study Analysis

    EPA Science Inventory

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have lar...

  10. Managing Risk in Systems Development.

    ERIC Educational Resources Information Center

    DePaoli, Marilyn M.; And Others

    Stanford University's use of a risk assessment methodology to improve the management of systems development projects is discussed. After examining the concepts of hazard, peril, and risk as they relate to the system development process, three ways to assess risk are covered: size, structure, and technology. The overall objective for Stanford…

  11. EcSL: Teaching Economics as a Second Language.

    ERIC Educational Resources Information Center

    Crowe, Richard

    Hazard Community College, in Kentucky, has implemented a new instructional methodology for economics courses called Economics as a Second Language (EcSL). This teaching approach, based on Rendigs Fels's theory that the best model for learning economics is the foreign language classroom, utilizes strategies similar to those employed in…

  12. In vitro Alternative Methodologies for Central Nervous System Assessment: A Critique using Nanoscale Materials as an Example.

    EPA Science Inventory

    Identifying the potential health hazards to the central nervous system of a new family of materials presents many challenges. Whole-animal toxicity testing has been the tradition, but in vitro methods have been steadily gaining popularity. There are numerous challenges in testing...

  13. 78 FR 1941 - Marine Mammals; Incidental Take During Specified Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-09

    ... design criteria for any planned structure. Methodology for geotechnical surveys may vary between those..., seismic surveys, and shallow hazards surveys, are expected to occur only during the open water season...., hydrological studies), or summer-fall (e.g., various fish and wildlife surveys). The petitioners have also...

  14. Photovoltaic system criteria documents. Volume 5: Safety criteria for photovoltaic applications

    NASA Technical Reports Server (NTRS)

    Koenig, John C.; Billitti, Joseph W.; Tallon, John M.

    1979-01-01

    A methodology is described for determining potential safety hazards involved in the construction and operation of photovoltaic power systems; guidelines are provided for implementing safety considerations in the specification, design, and operation of photovoltaic systems. Safety verification procedures for use in solar photovoltaic systems are established.

  15. Installation Restoration Program. Phase I. Records Search. Vandenberg Air Force Base, California.

    DTIC Science & Technology

    1984-12-01

    [Garbled OCR fragment: an acronym glossary (pH; POL: petroleum, oils, and lubricants; PVC: polyvinyl chloride plastic; RCRA: Resource Conservation and Recovery Act; RP-1: rocket propellant) and part of a Hazard Assessment Rating Methodology form listing physical-state multipliers and waste-characteristics subscores.]

  16. 40 CFR 721.4472 - Phenyl, alkyl, hydroxyalkyl substituted imidazole (generic).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... percent), and (c). (ii) Hazard communication program. Requirements as specified in § 721.72 (a), (b), (c... protocol. (3) TSCA Good Laboratory Practice Standards at 40 CFR part 792. (4) Using methodologies generally..., the person must obtain approval of test protocols from EPA by submitting written protocols. EPA will...

  17. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
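
    The snippet below is not the authors' LCA/loglinear/Bayesian-network pipeline; it only conveys the empirical flavor of the approach by estimating conditional release behavior from categorical incident records (hypothetical data):

      import pandas as pd

      incidents = pd.DataFrame({
          "failure": ["fitting", "valve", "fitting", "weld", "valve", "fitting"],
          "release": ["small", "large", "large", "large", "small", "small"],
      })
      # P(release | failure mode): the kind of conditional structure a Bayesian
      # network would encode for the most influential variables.
      print(pd.crosstab(incidents["failure"], incidents["release"], normalize="index"))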

  18. Exposure to hazardous substances and male reproductive health: a research framework.

    PubMed Central

    Moline, J M; Golden, A L; Bar-Chama, N; Smith, E; Rauch, M E; Chapin, R E; Perreault, S D; Schrader, S M; Suk, W A; Landrigan, P J

    2000-01-01

    The discovery in the mid-1970s that occupational exposures to pesticides could diminish or destroy the fertility of workers sparked concern about the effects of hazardous substances on male reproductive health. More recently, there is evidence that sperm quantity and quality may have declined worldwide, that the incidence of testicular cancer has progressively increased in many countries, and that other disorders of the male reproductive tract such as hypospadias and cryptorchidism may have also increased. There is growing concern that occupational factors and environmental chemical exposures, including in utero and childhood exposures to compounds with estrogenic activity, may be correlated with these observed changes in male reproductive health and fertility. We review the evidence and methodologies that have contributed to our current understanding of environmental effects on male reproductive health and fertility and discuss the methodologic issues which confront investigators in this area. One of the greatest challenges confronting researchers in this area is assessing and comparing results from existing studies. We elaborate recommendations for future research. Researchers in the field of male reproductive health should continue working to prioritize hazardous substances; elucidate the magnitude of male reproductive health effects, particularly in the areas of testicular cancer, hypospadias, and cryptorchidism; develop biomarkers of exposure to reproductive toxins and of reproductive health effects for research and clinical use; foster collaborative interdisciplinary research; and recognize the importance of standardized laboratory methods and sample archiving. PMID:11017884

  19. Published methodological quality of randomized controlled trials does not reflect the actual quality assessed in protocols.

    PubMed

    Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P; Kumar, Ambuj

    2012-06-01

    To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects their actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Systematic review. This is a retrospective analysis of all consecutive phase III RCTs published by eight National Cancer Institute Cooperative Groups up to 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Four hundred twenty-nine RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94; 95% confidence interval [CI]: 0.88, 0.99) and 24% (RHR: 1.24; 95% CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. The largest study to date shows that poor quality of reporting does not reflect actual methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results.
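
    For readers unfamiliar with the RHR metric, a hedged sketch of how a ratio of hazard ratios can be formed from pooled log hazard ratios (simple inverse-variance pooling; all numbers invented):

      import math

      def pooled_log_hr(log_hrs, variances):
          """Inverse-variance weighted mean of log hazard ratios."""
          w = [1.0 / v for v in variances]
          return sum(wi * x for wi, x in zip(w, log_hrs)) / sum(w)

      inadequate = pooled_log_hr([math.log(0.70), math.log(0.80)], [0.02, 0.03])
      adequate = pooled_log_hr([math.log(0.85), math.log(0.90)], [0.01, 0.02])
      print(f"RHR = {math.exp(inadequate - adequate):.2f}")  # < 1: effect exaggerated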

  20. Skin sensitization in chemical risk assessment: Report of a WHO/IPCS international workshop focusing on dose–response assessment

    EPA Science Inventory

    An international workshop was held in 2006 to evaluate experimental techniques for hazard identification and hazard characterization of sensitizing agents in terms of their ability to produce data, including dose–response information, to inform risk assessment. Human testing to i...

  1. Editor's highlight: Evaluation of a Microelectrode Array-based Assay for Neural Network Ontogeny using Training Set Chemicals

    EPA Science Inventory

    Thousands of compounds in the environment have not been characterized for developmental neurotoxicity (DNT) hazard. To address this issue, methods to screen compounds rapidly for DNT hazard evaluation are necessary and are being developed for key neurodevelopmental processes. In...

  2. CONCENTRATION - DURATION RELATIONSHIPS FOR NON-CANCER HEALTH EFFECTS OF EXPOSURE TO HAZARDOUS AIR POLLUTANTS.

    EPA Science Inventory

    EPA is charged with assessing the risks of both acute and chronic exposures to hazardous air pollutants
    (HAPs). The emissions from sources of HAPs are often characterized as temporally-averaged values,
    however, patterns of exposure not captured in such measures may infl...

  3. 29 CFR 1926.407 - Hazardous (classified) locations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Electrical Code, lists or defines hazardous gases, vapors, and dusts by “Groups” characterized by their... the class, group, and operating temperature or temperature range, based on operation in a 40-degree C... be marked to indicate the group. (C) Fixed general-purpose equipment in Class I locations, other than...

  4. 29 CFR 1926.407 - Hazardous (classified) locations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Electrical Code, lists or defines hazardous gases, vapors, and dusts by “Groups” characterized by their... the class, group, and operating temperature or temperature range, based on operation in a 40-degree C... be marked to indicate the group. (C) Fixed general-purpose equipment in Class I locations, other than...

  5. CHARACTERIZATION OF HAZARDOUS WASTE SITES, A METHODS MANUAL. VOLUME 2. AVAILABLE SAMPLING METHODS (SECOND EDITION)

    EPA Science Inventory

    Investigations at hazardous waste sites and sites of chemical spills often require on-site measurements and sampling activities to assess the type and extent of contamination. This document is a compilation of sampling methods and materials suitable to address most needs that ari...

  6. NATIONAL SURVEYS OF MULTIPLE ENVIRONMENTAL HAZARDS TO YOUNG CHILDREN IN HOMES AND CHILD CARE CENTERS

    EPA Science Inventory

    The Department of Housing and Urban Development (HUD) has teamed with other federal agencies to characterize young children's exposure to multiple environmental hazards in two main indoor environments, homes and daycare centers. Under the co-sponsorship of HUD and the Nationa...

  7. Proposed Methodology for Design of Carbon Fiber Reinforced Polymer Spike Anchors into Reinforced Concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacFarlane, Eric Robert

    The included methodology, calculations, and drawings support the design of Carbon Fiber Reinforced Polymer (CFRP) spike anchors for securing U-wrap CFRP onto reinforced concrete T-beams. This content pertains to an installation in one of Los Alamos National Laboratory's facilities. The anchors are part of a seismic rehabilitation of the subject facility. The information contained here is for information purposes only; the reader is encouraged to verify all equations, details, and methodology prior to usage in future projects. However, development of the content contained here complied with Los Alamos National Laboratory's NQA-1 quality assurance program for nuclear structures. Furthermore, the formulations and details came from the referenced published literature, which represents the current state of the art for FRP anchor design. Construction personnel tested the subject anchor design to the required demand level demonstrated in the calculation. The testing demonstrated the ability of the anchors to carry loads in excess of 15 kips in direct tension; the anchors were not tested to failure, in part because of the hazards associated with testing large-capacity tensile systems to failure. The calculation, methodology, and drawing originator was Eric MacFarlane of Los Alamos National Laboratory's (LANL) Office of Seismic Hazards and Risk Mitigation (OSHRM). The checker for all components was Mike Salmon of the LANL OSHRM. The independent reviewers of all components were Insung Kim and Loring Wyllie of Degenkolb Engineers. Note that Insung Kim contributed to the initial formulations in the calculations that pertain directly to his doctoral research.

  8. Multidimensional Approach for Tsunami Vulnerability Assessment: Framing the Territorial Impacts in Two Municipalities in Portugal.

    PubMed

    Tavares, Alexandre Oliveira; Barros, José Leandro; Santos, Angela

    2017-04-01

    This study presents a new multidimensional methodology for tsunami vulnerability assessment that combines the morphological, structural, social, and tax components of vulnerability. This new approach is distinguished from previous methodologies, which focused primarily on the evaluation of potentially affected buildings and did not use tsunami numerical modeling. The methodology was applied to the Figueira da Foz and Vila do Bispo municipalities in Portugal. For each area, the potential tsunami-inundated areas were calculated considering the 1755 Lisbon tsunami, the greatest natural-hazard disaster ever to occur in Portugal. Furthermore, the four components of vulnerability were calculated and combined to obtain a composite vulnerability index. This methodology enables us to differentiate the two areas in their vulnerability, highlighting the characteristics of the territory's components. The methodology can be a starting point for the creation of a local assessment framework at the municipal scale related to tsunami risk, and it is an important support for the different local stakeholders.
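
    A minimal sketch of a composite index of this kind, assuming the four components are already normalized to [0, 1] (the equal weights are an illustrative choice, not the paper's):

      components = {"morphological": 0.7, "structural": 0.5, "social": 0.6, "tax": 0.3}
      weights = {"morphological": 0.25, "structural": 0.25, "social": 0.25, "tax": 0.25}

      composite = sum(components[k] * weights[k] for k in components)
      print(f"composite vulnerability index: {composite:.2f}")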

  9. Multi-risk assessment at a national level in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Amiranashvili, Avtandil; Tsereteli, Emili; Elizbarashvili, Elizbar; Saluqvadze, Manana; Dolodze, Jemal

    2013-04-01

    The work presented here was initiated by the national GNSF project "Reducing natural disasters multiple risk: a positive factor for Georgia development" and two international projects: NATO SFP 983038 "Seismic hazard and Risk assessment for Southern Caucasus-Eastern Turkey Energy Corridors" and EMME "Earthquake Model for Middle East Region". A methodology for estimating "general" vulnerability, hazards, and multiple risk from natural hazards (namely earthquakes, landslides, snow avalanches, flash floods, mudflows, drought, hurricanes, frost, and hail) was developed for Georgia. Detailed electronic databases of natural disasters were created; these databases contain the parameters of the hazardous phenomena that caused the disasters. The magnitude and intensity scales of the mentioned disasters are reviewed, and new magnitude and intensity scales are suggested for disasters for which the corresponding formalization had not yet been performed. The associated economic losses were evaluated and presented in monetary terms for these hazards. Based on the hazard inventory, an approach was developed that allowed the calculation of an overall vulnerability value for each individual hazard type, using the Gross Domestic Product per unit area (applied to population) as the indicator for exposed elements at risk. The correlation between estimated economic losses, physical exposure, and magnitude for each of the six types of hazards was investigated in detail using multiple linear regression analysis. Economic losses for all past events and historical vulnerability were estimated. Finally, the spatial distribution of general vulnerability was assessed, the expected maximum economic loss was calculated, and a multi-risk map was produced.
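
    The loss analysis described above can be illustrated with a small regression sketch. The log-linear specification and all input numbers below are assumptions for illustration only; they are not the study's data or its fitted model.

      import numpy as np

      # Sketch: regress log economic loss on event magnitude and log
      # physical exposure (multiple linear regression, as in the abstract).
      magnitude = np.array([4.5, 5.1, 5.8, 6.2, 6.9])    # event magnitudes
      exposure  = np.array([1e3, 5e3, 2e4, 8e4, 3e5])    # people exposed
      loss      = np.array([2e5, 1.5e6, 9e6, 6e7, 4e8])  # losses, USD

      X = np.column_stack([np.ones_like(magnitude), magnitude,
                           np.log10(exposure)])
      coef, *_ = np.linalg.lstsq(X, np.log10(loss), rcond=None)
      print("intercept, magnitude, log-exposure coefficients:", coef)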

  10. Hazardous-Materials Robot

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Edmonds, Gary O.

    1995-01-01

    Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.

  11. Wastewater Characterization and Hazardous Waste Survey, Reese Air Force Base, Texas

    DTIC Science & Technology

    1988-04-01

    drain. The pH of caustic soda generally classifies this waste as hazardous. h. Personnel from the CE Power Production shop are neutralizing spent ...been changed out. A disposal practice for this waste will be formulated upon determination of whether or not the spent solvent is hazardous. Shop...sampling has been performed to determine the characteristics of this waste. Spent rags are also thrown in the trash. Personnel also perform cadmium

  12. Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)

    2015-01-01

    Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.
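
    A minimal sketch of the combination step follows: per-pixel safety probabilities are aggregated over the pixels a lander footprint would cover at a given aim point, and each aim point is scored by its worst case. The grid, the square footprint, and the candidate points are invented for illustration; they are not the patented algorithm's actual geometry handling.

      import numpy as np

      # Per-pixel probability that the terrain under that pixel is safe.
      rng = np.random.default_rng(0)
      safety = rng.uniform(0.9, 1.0, size=(50, 50))

      def aim_point_score(safety, row, col, radius=2):
          # Worst-case (minimum) safety probability under the footprint.
          patch = safety[row - radius:row + radius + 1,
                         col - radius:col + radius + 1]
          return patch.min()

      candidates = [(10, 10), (25, 30), (40, 5)]
      best = max(candidates, key=lambda rc: aim_point_score(safety, *rc))
      print("selected aim point:", best)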

  13. Fire and explosion hazards related to the industrial use of potassium and sodium methoxides.

    PubMed

    Kwok, Q; Acheson, B; Turcotte, R; Janès, A; Marlair, G

    2013-04-15

    Sodium and potassium methoxides are used as an intermediary for a variety of products in several industrial applications. For example, current production of so called "1G-biodiesel" relies on processing a catalytic reaction called "transesterification". This reaction transforms lipid resources from biomass materials into fatty acid methyl and ethyl esters. 1-G biodiesel processes imply the use of methanol, caustic potash (KOH), and caustic soda (NaOH) for which the hazards are well characterized. The more recent introduction of the direct catalysts CH3OK and CH3ONa may potentially introduce new process hazards. From an examination of existing MSDSs concerning these products, it appears that no consensus currently exists on their intrinsic hazardous properties. Recently, l'Institut National de l'Environnement Industriel et des Risques (France) and the Canadian Explosives Research Laboratory (Canada) have embarked upon a joint effort to better characterize the thermal hazards associated with these catalysts. This work employs the more conventional tests for water reactivity as an ignition source, fire and dust explosion hazards, using isothermal nano-calorimetry, isothermal basket tests, the Fire Propagation Apparatus and a standard 20 L sphere, respectively. It was found that these chemicals can become self-reactive close to room temperature under specific conditions and can generate explosible dusts. Copyright © 2013 Crown. Published by Elsevier B.V. All rights reserved.

  14. Determining the Financial Impact of Flood Hazards in Ungaged Basins

    NASA Astrophysics Data System (ADS)

    Cotterman, K. A.; Gutenson, J. L.; Pradhan, N. R.; Byrd, A.

    2017-12-01

    Many portions of the Earth lack adequate authoritative or in situ data of the kind that is of great value in determining natural hazard vulnerability, from both anthropogenic and physical perspectives. Such locations include the majority of developing nations, which do not possess adequate warning systems and protective infrastructure. The lack of warning and protection from natural hazards makes these nations vulnerable to the destructive power of events such as floods. The goal of this research is to demonstrate an initial workflow with which to characterize flood financial hazards with global datasets and crowd-sourced, non-authoritative data in ungaged river basins. This workflow includes the hydrologic and hydraulic response of the watershed to precipitation, characterized by the physics-based Gridded Surface-Subsurface Hydrologic Analysis (GSSHA) model. In addition, data infrastructure and resources are available to approximate the human impact of flooding. Open-source, volunteered geographic information (VGI) data can provide global coverage of elements at risk of flooding. Additional valuation mechanisms can then translate flood exposure into percentage and financial damage to each building. The combination of these tools allows the authors to remotely assess flood hazards with minimal computational, temporal, and financial overhead. This combination of deterministic and stochastic modeling provides the means to quickly characterize watershed flood vulnerability and will allow emergency responders and planners to better understand the implications of flooding, both spatially and financially. In a planning, real-time, or forecasting scenario, the system will assist the user in understanding basin flood vulnerability and increasing community resiliency and preparedness.
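
    The valuation step described above can be sketched as a depth-damage lookup: simulated flood depth at a building is translated into a damage fraction and a financial loss. The curve points and replacement value below are placeholders, not values used by the authors.

      import numpy as np

      depth_pts  = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # flood depth, m
      damage_pct = np.array([0.0, 0.15, 0.35, 0.60, 0.90])  # damage fraction

      def building_damage(depth_m, replacement_value_usd):
          frac = np.interp(depth_m, depth_pts, damage_pct)
          return frac, frac * replacement_value_usd

      frac, usd = building_damage(1.4, 120_000)
      print(f"damage fraction {frac:.2f}, estimated loss ${usd:,.0f}")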

  15. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 1: Physical-environmental assessment

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.

    2014-12-01

    In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people, and heavy economic losses. Flood disasters are growing in frequency as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase in water-related disasters can be mainly attributed to the increase of exposure (elements potentially at risk in flood-prone areas) and vulnerability (i.e. the economic, social, geographic, cultural, and physical/environmental characteristics of the exposure). Besides these factors, the undeniable effect of climate change is projected to strongly modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at the local, regional, and global scales. Within this context, there is an urgent need for effective and pro-active strategies, tools, and actions that allow one to assess and (possibly) reduce the flood risks that threaten the different relevant receptors. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the latest European Flood Directive (FD). This paper introduces and presents a state-of-the-art Regional Risk Assessment (RRA) methodology to appraise the risk posed by floods from a physical-environmental perspective. The methodology, developed within the recently completed FP7-KULTURisk Project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e. plain rivers, mountain torrents, urban and coastal areas) and spatial scales (i.e. from the catchment to the urban scale). The FD-compliant KR-RRA methodology is based on the concept of risk being a function of hazard, exposure, and vulnerability. It integrates the outputs of various hydrodynamic models with site-specific bio-geophysical and socio-economic indicators (e.g. slope, land cover, population density, economic activities, etc.) to develop tailored risk indexes and GIS-based maps for each of the selected receptors (i.e. people, buildings, infrastructure, agriculture, natural and semi-natural systems, cultural heritage) in the considered region. It further compares the baseline scenario with alternative scenarios in which different structural and/or non-structural mitigation measures are planned and eventually implemented. As demonstrated in the companion paper (Part 2, Ronco et al., 2014), risk maps, along with related statistics, allow one to identify and classify, on a relative scale, areas at risk which are more likely to be affected by floods, and support the development of strategic adaptation and prevention measures to minimize flood impacts. In addition, the outcomes of the RRA can eventually be used for a further socio-economic assessment, considering the tangible and intangible costs as well as the human dimension of vulnerability.
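
    To make the risk-as-a-function-of-hazard/exposure/vulnerability concept concrete, the sketch below multiplies normalized layers cell by cell and bins the result into relative risk classes, mirroring the relative-scale maps mentioned above. The layer values and class edges are illustrative assumptions, not the KR-RRA weighting functions.

      import numpy as np

      hazard        = np.array([0.8, 0.4, 0.1])  # e.g. normalized flood depth
      exposure      = np.array([0.9, 0.2, 0.7])  # e.g. population density
      vulnerability = np.array([0.6, 0.5, 0.3])  # receptor susceptibility

      risk = hazard * exposure * vulnerability
      classes = np.digitize(risk, bins=[0.05, 0.15, 0.3])  # 4 relative classes
      print("relative risk index:", risk.round(3), "class:", classes)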

  16. Towards a Methodology for the Characterization of Teachers' Didactic-Mathematical Knowledge

    ERIC Educational Resources Information Center

    Pino-Fan, Luis R.; Assis, Adriana; Castro, Walter F.

    2015-01-01

    This research study aims at exploring the use of some dimensions and theoretical-methodological tools suggested by the model of Didactic-Mathematical Knowledge (DMK) for the analysis, characterization and development of knowledge that teachers should have in order to efficiently develop within their practice. For this purpose, we analyzed the…

  17. Preliminary Validation of Composite Material Constitutive Characterization

    Treesearch

    John G. Michopoulos; Athanasios lliopoulos; John C. Hermanson; Adrian C. Orifici; Rodney S. Thomson

    2012-01-01

    This paper describes the preliminary results of an effort to validate a methodology developed for composite material constitutive characterization. This methodology involves using massive amounts of data produced from multiaxially tested coupons via a 6-DoF robotic system called NRL66.3, developed at the Naval Research Laboratory. The testing is followed by...

  18. Hazard Screening Methods for Nanomaterials: A Comparative Study

    PubMed Central

    Murphy, Finbarr; Mullins, Martin; Furxhi, Irini; Costa, Anna L.; Simeone, Felice C.

    2018-01-01

    Hazard identification is the key step in risk assessment and management of manufactured nanomaterials (NM). However, the rapid commercialisation of nano-enabled products continues to out-pace the development of a prudent risk management mechanism that is widely accepted by the scientific community and enforced by regulators. Nevertheless, a growing body of academic literature is developing promising quantitative methods. Two approaches have gained significant currency: Bayesian networks (BN), a probabilistic, machine-learning approach, and the weight-of-evidence (WoE) statistical framework, which is based on expert elicitation. This comparative study investigates the efficacy of quantitative WoE and Bayesian methodologies in ranking the potential hazard of metal and metal-oxide NMs—TiO2, Ag, and ZnO. This research finds that hazard ranking is consistent across both risk assessment approaches. The BN and WoE models both utilize physico-chemical, toxicological, and study-type data to infer the hazard potential. The BN exhibits more stability when the models are perturbed with new data. The BN has the significant advantage of self-learning with new data; however, this assumes all input data are equally valid. This research finds that a combination of WoE, which would rank input data, with the BN is the optimal hazard assessment framework. PMID:29495342
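
    A minimal sketch of a weight-of-evidence style score follows: expert weights are applied to normalized evidence attributes and the nanomaterials are ranked. The attributes, weights, and evidence values are invented for illustration; they are not the study's inputs or its results.

      weights = {"dissolution": 0.4, "reactivity": 0.35, "size": 0.25}
      evidence = {
          "TiO2": {"dissolution": 0.2, "reactivity": 0.5, "size": 0.6},
          "Ag":   {"dissolution": 0.8, "reactivity": 0.6, "size": 0.7},
          "ZnO":  {"dissolution": 0.9, "reactivity": 0.7, "size": 0.5},
      }
      scores = {nm: sum(weights[a] * v for a, v in attrs.items())
                for nm, attrs in evidence.items()}
      for nm, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{nm}: hazard score {s:.2f}")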

  19. Mission hazard assessment for STARS Mission 1 (M1) in the Marshall Islands area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outka, D.E.; LaFarge, R.A.

    1993-07-01

    A mission hazard assessment has been performed for the Strategic Target System Mission 1 (known as STARS M1) for hazards due to potential debris impact in the Marshall Islands area. The work was performed at Sandia National Laboratories as a result of discussions with Kwajalein Missile Range (KMR) safety officers. The STARS M1 rocket will be launched from the Kauai Test Facility (KTF), Hawaii, and deliver two payloads to within the viewing range of sensors located on the Kwajalein Atoll. The purpose of this work has been to estimate upper bounds for expected casualty rates and impact probability for the Marshall Islands areas which adjoin the STARS M1 instantaneous impact point (IIP) trace. This report documents the methodology and results of the analysis.

  20. A comprehensive review of the implementation of hazard analysis critical control point (HACCP) to the production of flour and flour-based products.

    PubMed

    Arvanitoyannis, Ioannis S; Traikou, Athina

    2005-01-01

    The production of flour and semolina and their ensuing products, such as bread, cake, spaghetti, noodles, and corn flakes, is of major importance, because these products constitute some of the main ingredients of the human diet. The Hazard Analysis Critical Control Point (HACCP) system aims at ensuring the safety of these products. HACCP has been implemented within the frame of this study on various products of both Asian and European origin; the hazards, critical control limits (CCLs), observation practices, and corrective actions have been summarized in comprehensive tables. Furthermore, the various production steps, packaging included, were thoroughly analyzed, and reference was made to both the traditional and new methodologies in an attempt to pinpoint the occurring differences (advantages and disadvantages) per process.

  1. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, A.

    2016-12-01

    The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Islands Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all earthquakes of considerable magnitude. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite-difference wave propagation routine. Using a range of rupture scenarios for all earthquakes of considerable magnitude throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all considered magnitudes to obtain ground-motion parameters. PSHA results are produced at the 2%, 10%, and 50% hazard levels for all studied sites in the Marmara region.
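
    The hazard-curve step shared by conventional and physics-based PSHA can be sketched as follows: each scenario's annual rate is combined with the fraction of its simulated ground motions that exceed each intensity level. The rates and simulated PGA values below are placeholders, not the study's simulations.

      import numpy as np

      levels = np.array([0.1, 0.2, 0.4, 0.8])    # PGA thresholds, g
      scenarios = [                              # (annual rate, simulated PGAs)
          (0.01,  np.array([0.12, 0.18, 0.25, 0.30])),
          (0.002, np.array([0.35, 0.50, 0.65, 0.90])),
      ]
      rate = sum(r * np.mean(sims[:, None] > levels[None, :], axis=0)
                 for r, sims in scenarios)
      prob_50yr = 1.0 - np.exp(-rate * 50.0)     # Poisson assumption
      print("annual exceedance rates:", rate)
      print("50-year exceedance probabilities:", prob_50yr.round(3))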

  2. Safety and Health Hazard Observations in Hmong Farming Operations

    PubMed Central

    Neitzel, R. L.; Krenz, J.; de Castro, A. B.

    2014-01-01

    Agricultural workers have a high risk of occupational injuries, illnesses, and fatalities. However, there are very few standardized tools available to assess safety and health in agricultural operations. Additionally, there are a number of groups of agricultural workers, including Hmong refugees and immigrants, for whom virtually no information on safety and health conditions is available. This study developed an observation-based methodology for systematically evaluating occupational health and safety hazards in agriculture, and pilot-tested it on several small-scale Hmong farming operations. Each observation assessed a range of safety and health hazards (e.g., musculoskeletal hazards, dust and pollen, noise, and mechanical hazards), as well as factors such as type of work area, presence of personal protective equipment, and weather conditions. Thirty-six observations were collected on nine farms. The most common hazards observed were bending at the back and lifting <50 pounds. Use of sharp tools without adequate guarding mechanisms, awkward postures, repetitive hand motions, and lifting >50 pounds were also common. The farming activities observed involved almost no power equipment, and no pesticide or chemical handling was observed. The use of personal protective equipment was uncommon. The results of this assessment agreed well with a parallel study of perceived safety and health hazards among Hmong agricultural workers. This study suggests that small-scale Hmong farming operations involve a variety of hazards, and that occupational health interventions may be warranted in this community. The study also demonstrates the utility of standardized assessment tools and mixed-method approaches to hazard evaluation. PMID:24911689

  3. Delineation of karst terranes in complex environments: Application of modern developments in the wavelet theory and data mining

    NASA Astrophysics Data System (ADS)

    Alperovich, Leonid; Averbuch, Amir; Eppelbaum, Lev; Zheludev, Valery

    2013-04-01

    Karst areas occupy about 14% of the world's land. Karst terranes of different origins create difficult conditions for building, industrial activity, and tourism, and are a source of heightened danger for the environment. Mapping of karst (sinkhole) hazards will clearly be one of the most significant problems of engineering geophysics in the twenty-first century. Taking into account the complexity of geological media, unfavourable environments, and the well-known ambiguity of geophysical data analysis, examination by a single geophysical method may be insufficient. Wavelet methodology as a whole has a significant impact on cardinal problems of geophysical signal processing, such as denoising, enhancement, and discrimination of signals with closely related characteristics, and on the integrated analysis of different geophysical fields (satellite, airborne, surface, or underground observations). We developed a three-phase approach to the integrated geophysical localization of subsurface karst (the same approach could be used for subsequent monitoring of karst dynamics). The first phase consists of modeling, devoted to computing the various geophysical effects that characterize karst phenomena. The second phase is the development of signal processing approaches for analyzing profile or areal geophysical observations. Finally, the third phase integrates these methods to create a new method of combined interpretation of different geophysical data. Our combined geophysical analysis builds on modern developments in wavelet techniques for signal and image processing. The development of this integrated methodology of geophysical field examination will enable recognition of karst terranes even at a small signal-to-noise ratio in complex geological environments. For analyzing the geophysical data, we used a technique based on an algorithm that characterizes a geophysical image by a limited number of parameters. This set of parameters serves as a signature of the image and is utilized to discriminate images containing a karst cavity (K) from images not containing karst (N). The algorithm consists of the following main phases: (a) collection of the database, (b) characterization of geophysical images, and (c) dimensionality reduction. Each image is then characterized by the histogram of its coherency directions. As a result of these steps, we obtain two sets, K and N, of signature vectors for images from sections containing karst cavities and non-karst subsurface, respectively.
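
    The image-signature idea can be sketched with a simple proxy: characterize an image by a normalized histogram of its local gradient directions, then compare signatures of karst (K) and non-karst (N) training images. Using the gradient orientation as a stand-in for the coherency direction is an assumption made here for illustration.

      import numpy as np

      def direction_signature(image, bins=16):
          gy, gx = np.gradient(image.astype(float))
          angles = np.arctan2(gy, gx)             # radians in [-pi, pi]
          hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
          return hist / hist.sum()                # normalized signature vector

      rng = np.random.default_rng(1)
      img = rng.normal(size=(64, 64))             # placeholder geophysical image
      print("signature vector:", direction_signature(img).round(3))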

  4. A high-throughput virus-induced gene silencing protocol identifies genes involved in multi-stress tolerance

    PubMed Central

    2013-01-01

    Background Understanding the function of a particular gene under various stresses is important for engineering plants for broad-spectrum stress tolerance. Although virus-induced gene silencing (VIGS) has been used to characterize genes involved in abiotic stress tolerance, currently available gene silencing and stress imposition methodology at the whole plant level is not suitable for high-throughput functional analyses of genes. This demands a robust and reliable methodology for characterizing genes involved in abiotic and multi-stress tolerance. Results Our methodology employs VIGS-based gene silencing in leaf disks combined with simple stress imposition and effect quantification methodologies for easy and faster characterization of genes involved in abiotic and multi-stress tolerance. By subjecting leaf disks from gene-silenced plants to various abiotic stresses and inoculating silenced plants with various pathogens, we show the involvement of several genes for multi-stress tolerance. In addition, we demonstrate that VIGS can be used to characterize genes involved in thermotolerance. Our results also showed the functional relevance of NtEDS1 in abiotic stress, NbRBX1 and NbCTR1 in oxidative stress; NtRAR1 and NtNPR1 in salinity stress; NbSOS1 and NbHSP101 in biotic stress; and NtEDS1, NbETR1, NbWRKY2 and NbMYC2 in thermotolerance. Conclusions In addition to widening the application of VIGS, we developed a robust, easy and high-throughput methodology for functional characterization of genes involved in multi-stress tolerance. PMID:24289810

  5. 1-Propanol probing methodology: two-dimensional characterization of the effect of solute on H2O.

    PubMed

    Koga, Yoshikata

    2013-09-21

    The wording "hydrophobicity/hydrophilicity" has been used in a loose manner based on human experiences. We have devised a more quantitative way to redefine "hydrophobes" and "hydrophiles" in terms of the mole fraction dependence pattern of one of the third derivative quantities, the enthalpic interaction between solute molecules. We then devised a thermodynamic methodology to characterize the effect of a solute on H2O in terms of its hydrophobicity and/or hydrophilicity. We use a thermodynamic signature, the enthalpic interaction of 1-propanol, H, to monitor how the test solute modifies H2O. By this method, characterization is facilitated by two indices; one pertaining to its hydrophobicity and the other its hydrophilicity. Hence differences among amphiphiles are quantified in a two-dimensional manner. Furthermore, an individual ion can be characterized independent of a counter ion. By using this methodology, we have studied the effects on H2O of a number of solutes, and gained some important new insights. For example, such commonly used examples of hydrophobes in the literature as tetramethyl urea, trimethylamine-N-oxide, and tetramethylammonium salts are in fact surprisingly hydrophilic. Hence the conclusions about "hydrophobes" using these samples ought to be interpreted with caution. The effects of anions on H2O found by this methodology are in the same sequence of the Hofmeister ranking, which will no doubt aid a further investigation into this enigma in biochemistry. Thus, it is likely that this methodology could play an important role in the characterization of the effects of solutes in H2O, and a perspective view may be useful. Here, we describe the basis on which the methodology is developed and the methodology itself in m.ore detail than given in individual papers. We then summarize the results in two dimensional hydrophobicity/hydrophilicity maps.

  6. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.

  7. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage.
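
    The "risk matrix" selection step can be sketched as a simple lookup: each accident scenario is placed in a frequency/consequence cell, and cells above a tolerance line are retained as reference scenarios. The category labels and the threshold rule below are illustrative only, not the actual ARAMIS matrix.

      FREQ = ["very unlikely", "unlikely", "possible", "probable"]
      CONS = ["minor", "serious", "major", "catastrophic"]

      def is_reference_scenario(freq, cons, threshold=4):
          # Higher combined index = more critical; keep if at/above threshold.
          return FREQ.index(freq) + CONS.index(cons) >= threshold

      print(is_reference_scenario("possible", "major"))         # True
      print(is_reference_scenario("very unlikely", "serious"))  # False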

  8. 40 CFR 280.63 - Initial site characterization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment — Hazardous Substances, § 280.63 Initial site characterization: (a) Unless directed to do otherwise by the implementing agency, owners and operators must assemble information about the site and the nature of the...

  9. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  10. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
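
    The Bayesian hazard-curve output can be sketched as follows: epistemic uncertainty is represented by many sampled exceedance-probability curves, and percentile curves are reported at each intensity. The synthetic curves below are placeholders, not PyBetVH's internal computation.

      import numpy as np

      intensity = np.array([1, 5, 10, 50, 100])   # e.g. tephra load, kg/m2
      rng = np.random.default_rng(2)
      samples = rng.beta(a=2, b=20, size=(1000, intensity.size))
      samples.sort(axis=1)                        # force each sampled curve to
      samples = samples[:, ::-1]                  # be non-increasing in intensity

      for p in (10, 50, 90):
          curve = np.percentile(samples, p, axis=0)
          print(f"{p}th percentile hazard curve:", curve.round(4))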

  11. Map Your Hazards! - an Interdisciplinary, Place-Based Educational Approach to Assessing Natural Hazards, Social Vulnerability, Risk and Risk Perception.

    NASA Astrophysics Data System (ADS)

    Brand, B. D.; McMullin-Messier, P. A.; Schlegel, M. E.

    2014-12-01

    'Map your Hazards' is an educational module developed within the NSF Interdisciplinary Teaching about Earth for a Sustainable Future program (InTeGrate). The module engages students in place-based explorations of natural hazards, social vulnerability, and the perception of natural hazards and risk. Students integrate geoscience and social science methodologies to (1) identify and assess hazards, vulnerability and risk within their communities; (2) distribute, collect and evaluate survey data (designed by authors) on the knowledge, risk perception and preparedness within their social networks; and (3) deliver a PPT presentation to local stakeholders detailing their findings and recommendations for development of a prepared, resilient community. 'Map your Hazards' underwent four rigorous assessments by a team of geoscience educators and external review before being piloted in our classrooms. The module was piloted in a 300-level 'Volcanoes and Society' course at Boise State University, a 300-level 'Environmental Sociology' course at Central Washington University, and a 100-level 'Natural Disasters and Environmental Geology' course at the College of Western Idaho. In all courses students reported a fascination with learning about the hazards around them and identifying the high risk areas in their communities. They were also surprised at the low level of knowledge, inaccurate risk perception and lack of preparedness of their social networks. This successful approach to engaging students in an interdisciplinary, place-based learning environment also has the broad implications of raising awareness of natural hazards (survey participants are provided links to local hazard and preparedness information). The data and preparedness suggestions can be shared with local emergency managers, who are encouraged to attend the student's final presentations. All module materials are published at serc.carleton.edu/integrate/ and are appropriate to a wide range of classrooms.

  12. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunami generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of >0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba), and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok, and north Papua, and 0.1-1% for north Sulawesi, Seram, and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
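
    For a time-independent forecast, the aggregated annual exceedance rate at a coastal point converts to an exceedance probability over a given exposure time under a Poisson assumption, as sketched below. The rates are placeholders, not the study's results.

      import numpy as np

      def exceedance_probability(annual_rate, years):
          return 1.0 - np.exp(-annual_rate * years)

      for height, rate in [(0.5, 1e-2), (3.0, 1e-3)]:   # m, events/yr
          print(f"P(height > {height} m in 100 yr) = "
                f"{exceedance_probability(rate, 100):.3f}")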

  13. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2018-07-01

    A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity, and isotropy, among others. This classification and integration of several properties allows a characterization process that systemically evaluates particle packings in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and macroscales. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted for use in the evaluation of particle systems. The methodology makes it easy to characterize media at the microscale (continuous geometries—steels, rock microstructures, etc.—and discrete geometries) and at the macroscale. A global, systemic, and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system allows researchers to have a single evaluation criterion based on the aim of their research. Examples of applications are shown.
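
    The fuzzy-logic aggregation idea can be sketched as follows: several packing properties are mapped to [0, 1] membership values and fused into one evaluation score. The property names, membership values, and the weighted-mean/minimum fusion rules are illustrative assumptions, not the authors' actual membership functions.

      properties = {"space_occupation": 0.82, "homogeneity": 0.74,
                    "connectivity": 0.91, "isotropy": 0.65}
      weights    = {"space_occupation": 0.3, "homogeneity": 0.3,
                    "connectivity": 0.2, "isotropy": 0.2}

      weighted = sum(weights[k] * v for k, v in properties.items())
      conservative = min(properties.values())   # strictest single criterion
      print(f"weighted score {weighted:.2f}, "
            f"conservative score {conservative:.2f}")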

  14. Safety Assessment of Multi Purpose Small Payload Rack (MSPR)

    NASA Astrophysics Data System (ADS)

    Mizutani, Yoshinobu; Takada, Satomi; Murata, Kosei; Ozawa, Daisaku; Kobayashi, Ryoji; Nakamura, Yasuhiro

    2010-09-01

    We report a summary of the preliminary safety assessment for the Multi Purpose Small Payload Rack (MSPR), one of the microgravity experiment facilities being developed for the second-phase utilization of the Japanese Experiment Module (JEM), to be launched on the H-II Transfer Vehicle (HTV) second flight in 2011. The MSPR is used for multi-purpose microgravity experiments, providing experimental spaces and work stations. The MSPR has three experimental spaces. First, there is a space called the Work Volume (WV), with a capacity of approximately 350 litres, in which multiple resources including electricity, communication, and moving-image functions can be used. Within this space, devices can be installed simply and promptly by Velcro and pins, with a high degree of flexibility. Second, there is the Small Experiment Area (SEA), with a capacity of approximately 70 litres, in which electricity, communication, and moving-image functions can also be used in the same way as in the WV. These spaces protect experiment devices and specimens from contingent loads by the crewmembers. Third, there is the Work Bench, with an area of 0.5 square meters, which can be used for maintenance, inspection, and data operations of installed devices; this bench can be stowed in the rack during a contingency. The Chamber for Combustion Experiment (CCE), which is planned to be installed in the WV, is a pressure-resistant experimental container that can be used to seal hazardous materials from combustion experiments. The CCE has a double-seal design in the chamber itself, which resists gas leakage under normal temperature and pressure; electricity, communication, and moving-image functions can be used in the same way as in the WV. The JAXA Phase 2 Safety Review Panel (SRP) was held in April 2010. For the safety analysis of the MSPR, hazards were identified based on the Fault Tree Analysis methodology, and these hazards were then classified into either eight ISS standard-type hazards or eight unique-type hazards requiring special controls, based on the ISS common safety assessment methodology. The safety evaluation results are reported in the Safety Assessment Report (SAR) 1). Regarding structural failure, unique hazards are evaluated considering not only the launch load but also loads by crewmembers and orbital loads. Regarding electrical shock, the electrical design up to secondary power is evaluated as a unique hazard from the viewpoint of electrical design suitable for high-voltage (32 VDC or more) circuits. Regarding rupture/leakage of the pressure system, hazards of the fuel supply line, the waste line for combustion gas, and the pressure system including the CCE are evaluated; contamination due to hazardous gas leakage from the CCE and external propagation of fire from the CCE are also evaluated. In this report, we show an overview of the results of the safety assessment and the future plan toward the critical-design-phase activity.

  15. The influence of image valence on visual attention and perception of risk in drivers.

    PubMed

    Jones, M P; Chapman, P; Bailey, K

    2014-12-01

    Currently there is little research into the relationship between emotion and driving in the context of advertising and distraction. The research that has examined this question also has methodological limitations that could themselves be affecting the results, rather than emotional processing (Trick et al., 2012). The current study investigated the relationship between image valence and risk perception, eye movements, and physiological reactions. Participants watched hazard perception clips onto which emotional images from the International Affective Picture System had been overlaid. They rated how hazardous or safe they felt, whilst eye movements, galvanic skin response, and heart rate were recorded. Results suggested that participants were more aware of potential hazards when a neutral image had been shown than after positively and negatively valenced images; that is, participants showed higher subjective ratings of risk, larger physiological responses, and marginally longer fixation durations when viewing a hazard after a neutral image, but this effect was attenuated after emotional images. It appears that emotional images reduce sensitivity to potential hazards, and we suggest that future studies could apply these findings to higher-fidelity paradigms such as driving simulators. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Near Earth Objects and Cascading Effects from the Policy Perspective: Implications from Problem and Solution Definition

    NASA Astrophysics Data System (ADS)

    Lindquist, Eric

    2016-04-01

    The characterization of near-Earth objects (NEOs) in regard to physical attributes and potential risk and impact factors presents a complex and complicated scientific and engineering challenge. The societal and policy risks and impacts are no less complex, yet are rarely considered in the same context as material properties or related factors. Further, NEO impacts are typically considered as discrete events, not as initial events in a dynamic cascading system. The objective of this contribution is to position the characterization of NEOs within the public policy process domain as a means to reflect on the science-policy nexus in regard to the risks and multi-hazard impacts associated with these hazards. This is accomplished through, first, a brief overview of the science-policy nexus, followed by a discussion of policy process frameworks, such as agenda setting, the multiple streams model, focusing events, and punctuated equilibrium, and their application and appropriateness to the problem of NEOs. How, for example, does NEO hazard and risk compare with other low-probability, high-consequence hazards in regard to public policy? Finally, we reflect on the implications of alternative NEO "solutions" and characterizations of the NEO "problem," and on the political and public acceptance of policy alternatives, as a way to link NEO science and policy in the context of the overall NH9.12 panel.

  17. Using SAFRAN Software to Assess Radiological Hazards from Dismantling of Tammuz-2 Reactor Core at Al-tuwaitha Nuclear Site

    NASA Astrophysics Data System (ADS)

    Abed Gatea, Mezher; Ahmed, Anwar A.; jundee kadhum, Saad; Ali, Hasan Mohammed; Hussein Muheisn, Abbas

    2018-05-01

    The Safety Assessment Framework (SAFRAN) software was applied here for radiological safety analysis, to verify that the dose acceptance criteria and safety goals are met with a high degree of confidence for dismantling of the Tammuz-2 reactor core at the Al-tuwaitha nuclear site. The activities of characterizing, dismantling, and packaging were practiced to manage the generated radioactive waste. Dose to the worker was considered an endpoint scenario, while dose to the public was neglected because the Tammuz-2 facility is located in a restricted zone and a 30 m berm surrounds the Al-tuwaitha site. The safety assessment for the dismantling-worker endpoint scenario was based on the maximum external dose at the component position level in the reactor pool plus the internal dose via airborne activity, while the characterizing- and packaging-worker endpoint scenarios were assessed via external dose only, because there was no evidence of airborne radioactivity hazards outside the reactor pool. The in-situ measurements confirmed that the reactor core components are radiologically activated, with Co-60 the dominant radioisotope. SAFRAN results showed that the maximum received doses for workers are 1.85, 0.64, and 1.3 mSv/y for the dismantling, characterizing, and packaging of reactor core components, respectively. Hence, the radiological hazards remain below the low-hazard level and within the acceptable annual dose for workers in the radiation field.

  18. Geological Investigation Program for the Site of a New Nuclear Power Plant in Hungary

    NASA Astrophysics Data System (ADS)

    Gerstenkorn, András; Trosits, Dalma; Chikán, Géza; János Katona, Tamás

    2015-04-01

    A comprehensive site evaluation program is being implemented for the new nuclear power plant to be constructed at the Paks site in Hungary, with the aim of confirming the acceptability of the site and defining the site-related design basis data. The most extensive part of this program investigates the geological-tectonic features of the site, with particular focus on the assessment of the capability of faults at and around the site, the characterization of the site seismic hazard, and the definition of the design basis earthquake. A brief description of the scope and methodology of the geological, seismological, geophysical, geotechnical, and hydrogeological investigations is given on the poster. The main focus of the presentation is the graded structure and extent of the geological investigations, which follow the needs and scale of the geological modeling, starting with the site and its vicinity and continuing at the near-regional and regional scales. The geological investigations include several boreholes down to the base rock, numerous boreholes exploring the Pannonian strata, and a large number of shallow boreholes for investigation of more recent development. The planning of the geological investigations is based on the 3D seismic survey performed around the site, complemented by a shallow-seismic survey at and in the vicinity of the site. The 3D geophysical imaging provides essential geodynamic information for assessing the capability of near-site faults, for the seismic hazard analysis, and for the hydrogeological modeling. The planned seismic survey gives a unique dataset for understanding the spatial relationship between individual fault segments. Planning of the research (trenching, etc.) for paleoseismic manifestations is also based on the 3D seismic survey. The seismic survey and other geophysical data (including space geodesy data) allow refinement of the understanding and the model of the tectonic evolution of the area and its geological events. As is known from earlier studies, seismic sources in the near-regional area are the dominant contributors to the site seismic hazard; therefore, a 3D geological model will be developed for the 50 km region around the site in order to consider different geological scenarios. Site-scale investigations are aimed at the characterization of local geotechnical and hydrogeological conditions. The geotechnical investigations provide data for the evaluation of site response, i.e. the free-field ground motion response spectra, assessment of the liquefaction hazard, and foundation design. An important element of the hydrogeological survey is numerical groundwater modeling, the aim of which is to summarize the hydrogeological data in a numerical system and to describe and simulate underground water flow and transport conditions.

  19. A Windshear Hazard Index

    NASA Technical Reports Server (NTRS)

    Proctor, Fred H.; Hinton, David A.; Bowles, Roland L.

    2000-01-01

    An aircraft exposed to hazardous low-level windshear may suffer a critical loss of airspeed and altitude, thus endangering its ability to remain airborne. In order to characterize this hazard, a nondimensional index was developed based on aerodynamic principles and an understanding of windshear phenomena. This paper reviews the development and application of the Bowles F-factor, which is now used by onboard sensors for the detection of hazardous windshear. It was developed and tested during NASA/FAA's airborne windshear program and is now required for FAA certification of onboard radar windshear detection systems. Reviewed in this paper are: 1) the definition of windshear and a description of atmospheric phenomena that may cause hazardous windshear; 2) the derivation and discussion of the F-factor; 3) the development of the F-factor hazard threshold; 4) its testing during field deployments; and 5) its use in accident reconstructions.
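
    For reference, the F-factor as it is commonly written in the windshear literature (reconstructed here from standard sources rather than quoted from this paper) is

      F = \frac{1}{g}\frac{dW_x}{dt} - \frac{w_h}{V}

    where W_x is the horizontal wind component along the flight path, w_h is the vertical wind component, g is gravitational acceleration, and V is the airspeed, with signs chosen so that positive F denotes performance-decreasing shear; averaged values on the order of 0.1 over a 1 km flight segment are typically treated as hazardous.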

  1. Housing as a Determinant of Tongan Children’s Health: Innovative Methodology Using Wearable Cameras

    PubMed Central

    Robinson, Andrew; Puloka, Viliami; Smith, Moira; Stanley, James; Signal, Louise

    2017-01-01

    Housing is a significant determinant of health, particularly in developing countries such as Tonga. Currently, very little is known about the quality of the housing in Tonga, as is the case in many developing countries, nor about the interaction between children and the home environment. This study aimed to identify the nature and extent of health risk factors and behaviours in Tongan houses from a child's perspective. An innovative methodology was used, Kids'Cam Tonga. Seventy-two Class 6 children (10- to 13-year-olds) were randomly selected from 12 randomly selected schools in Tongatapu, the main island. Each participating child wore a camera on a lanyard around the neck. The device automatically took wide-angled (136°) images from the child's perspective every seven seconds. The children were instructed to wear the camera all day from Friday morning to Sunday evening, inclusive. The analysis showed that the majority of Tongan children in the study live in houses that have structural deficiencies and hazards, including water damage (42%), mould (36%), and electrical (89%) and burn (28%) risk factors. The findings suggest that improvements to the housing stock may reduce the associated health burden and increase buildings' resilience to natural hazards. A collaborative approach between communities, community leaders, government, and non-governmental organisations (NGOs) is urgently needed. This research methodology may be of value to other developing countries. PMID:28976919

  2. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
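
    The core hazard- and incidence-based burden computation can be sketched as DALY = YLL + YLD, with years of life lost computed from deaths and residual life expectancy, and years lived with disability from incident cases, duration, and a disability weight. All numbers below are placeholders, not FERG estimates.

      def daly(deaths, life_expectancy_remaining,
               incident_cases, duration_years, disability_weight):
          yll = deaths * life_expectancy_remaining      # years of life lost
          yld = incident_cases * duration_years * disability_weight
          return yll + yld

      burden = daly(deaths=120, life_expectancy_remaining=35.0,
                    incident_cases=50_000, duration_years=0.02,
                    disability_weight=0.25)
      print(f"total burden: {burden:,.0f} DALYs")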

  3. Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems.

    PubMed

    Ouedraogo, Kiswendsida Abel; Beugin, Julie; El-Koursi, El-Miloudi; Clarhaut, Joffrey; Renaux, Dominique; Lisiecki, Frederic

    2018-02-02

    This article presents the development of an application guide based on feedback and comments from various railway actors on their practices for allocating SILs to railway safety-related functions. The initial generic methodology for SIL allocation has been updated for application to railway rolling-stock safety-related functions, in order to resolve the issues raised in applying the SIL concept. The methodology is intended for the various actors dealing with railway SIL allocation problems; its principles are summarized here, with a focus on the modifications and clarifications made to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in the CSM (common safety method) European regulation. It starts from quantitative safety requirements, in particular tolerable hazard rates (THR), and applies THR apportioning rules. On the one hand, the rules reflect classical logical combinations of the safety-related functions preventing hazard occurrence; on the other hand, to take technical conditions into account (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. The SIL allocation process, based on the apportioned and validated THR values, is finally illustrated with the example of "emergency brake" subsystems, and some specific SIL allocation rules are also defined and illustrated. © 2018 Society for Risk Analysis.
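
    The apportion-then-map logic lends itself to a short sketch. The code below assumes the classical rule that, when the hazard occurs if any one of n independent functions fails, the failure rates add and each function receives a share of the system THR; it also assumes the IEC 61508 high-demand/continuous-mode bands (failures per hour) for the THR-to-SIL mapping. The numeric values are illustrative, not taken from the article.

        def apportion_equal(thr_system: float, n: int) -> float:
            """Hazard occurs if any of n independent functions fails:
            rates add, so each function gets an equal share of the THR."""
            return thr_system / n

        def sil_from_thr(thr: float) -> int:
            """Map a tolerable hazard rate (per hour) to a SIL band."""
            bands = [(1e-9, 1e-8, 4), (1e-8, 1e-7, 3),
                     (1e-7, 1e-6, 2), (1e-6, 1e-5, 1)]
            for lo, hi, sil in bands:
                if lo <= thr < hi:
                    return sil
            raise ValueError("THR outside the SIL 1-4 bands")

        thr_brake = 1e-8                          # hypothetical system-level THR, /h
        thr_each = apportion_equal(thr_brake, 2)  # two functions, either failure is hazardous
        print(sil_from_thr(thr_each))             # -> 4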

  4. Study on the evaluation method for fault displacement based on characterized source model

    NASA Astrophysics Data System (ADS)

    Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.

    2016-12-01

    IAEA Specific Safety Guide (SSG) 9 states that probabilistic methods for evaluating fault displacement should be used if no sufficient basis exists to conclude, by deterministic methodology, that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX to SSG-9 on seismic hazard for nuclear facilities that shows the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, important nuclear facilities are required to be established on ground where fault displacement will not arise during future earthquakes. Given these requirements, evaluation methods for fault displacement need to be developed to enhance the safety of nuclear facilities. We are studying deterministic and probabilistic methods through tentative analyses of observed records, such as surface fault displacements and near-fault strong ground motions from inland crustal earthquakes in which fault displacement occurred. In this study, we introduce the concept of evaluation methods for fault displacement and then show parts of the tentative analysis results for the deterministic method, as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) that can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method that combines the particle method and the distinct element method. Finally, we suggest a deterministic method for evaluating fault displacement based on a characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.

  5. Preliminary Considerations for Classifying Hazards of Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Miner, Paul S.; Szatkowski, George N.; Ulrey, Michael L.; DeWalt, Michael P.; Spitzer, Cary R.

    2007-01-01

    The use of unmanned aircraft in national airspace has been characterized as the next great step forward in the evolution of civil aviation. To make routine and safe operation of these aircraft a reality, a number of technological and regulatory challenges must be overcome. This report discusses some of the regulatory challenges with respect to deriving safety and reliability requirements for unmanned aircraft. In particular, definitions of hazards and their classification are discussed and applied to a preliminary functional hazard assessment of a generic unmanned system.

  6. A systems-based food safety evaluation: an experimental approach.

    PubMed

    Higgins, Charles L; Hartfield, Barry S

    2004-11-01

    Food establishments are complex systems with inputs, subsystems, underlying forces that affect the system, outputs, and feedback. Building on past exploration of the hazard analysis critical control point concept and Ludwig von Bertalanffy's General Systems Theory, the National Park Service (NPS) is attempting to translate these ideas into a realistic field assessment of food service establishments and to use the information gathered by these methods in efforts to improve food safety. Over the course of the last two years, an experimental systems-based methodology has been drafted, developed, and tested by the NPS Public Health Program. That methodology is described in this paper.

  7. Examining the Association between Hazardous Waste Facilities and Rural "Brain Drain"

    ERIC Educational Resources Information Center

    Hunter, Lori M.; Sutton, Jeannette

    2004-01-01

    Rural communities are increasingly being faced with the prospect of accepting facilities characterized as "opportunity-threat," such as facilities that generate, treat, store, or otherwise dispose of hazardous wastes. Such facilities may offer economic gains through jobs and tax revenue, although they may also act as environmental "disamenities."…

  8. Further Evaluation of DNT Hazard Screening using Neural Networks from Rat Cortical Neurons on Multi-well Microelectrode Arrays

    EPA Science Inventory

    Thousands of chemicals have not been characterized for their DNT potential. Given the need for DNT hazard identification, efforts to develop screening assays for DNT potential are a high priority. Multi-well microelectrode arrays (MEA) measure the spontaneous activity of electr...

  9. Specific surface to evaluate the efficiencies of milling and pretreatment of wood for enzymatic saccharification

    Treesearch

    Junyong Zhu; G.S. Wang; X.J. Pan; Roland Gleisner

    2009-01-01

    Sieving methods have been used almost exclusively for feedstock size-reduction characterization in the biomass refining literature. This study demonstrates a methodology to properly characterize the specific surface of biomass substrates through two-dimensional measurement of each fiber of the substrate using a wet imaging technique. The methodology provides more...

  10. Remote sensing for site characterization

    USGS Publications Warehouse

    Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.; Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.

    2000-01-01

    This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods for revealing environmental-geological problems. A balanced ratio is maintained between explanations of the methodological/technical side and presentations of case studies. The comparison of case studies from North America and Germany shows how the respective territorial conditions lead to distinct methodological approaches.

  11. Airborne Turbulence Detection System Certification Tool Set

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2006-01-01

    A methodology and a corresponding set of simulation tools for testing and evaluating turbulence detection sensors are presented. The tool set is available to industry and the FAA for certification of radar-based airborne turbulence detection systems. It consists of simulated data sets representing convectively induced turbulence, an airborne radar simulation system, hazard tables to convert the radar observable to an aircraft load, documentation, a hazard metric "truth" algorithm, and criteria for scoring the predictions. Analysis indicates that flight test data support spatial buffers for scoring detections, and flight data and demonstrations with the tool set suggest the need for a magnitude buffer as well.
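
    The idea of scoring detections with a spatial buffer can be illustrated with a toy function: a prediction counts as a hit if a true event lies within a set distance along the flight track. The positions, units and buffer width below are assumptions for illustration, not the tool set's actual scoring criteria.

        # Buffer-tolerant scoring of turbulence detections along a flight track.
        def score(pred_km, truth_km, buffer_km=5.0):
            hits = sum(any(abs(p - t) <= buffer_km for t in truth_km) for p in pred_km)
            misses = sum(all(abs(t - p) > buffer_km for p in pred_km) for t in truth_km)
            false_alarms = len(pred_km) - hits
            return hits, misses, false_alarms

        # Two true events at 10 km and 52 km; three predictions.
        print(score([12.0, 30.0, 55.0], [10.0, 52.0]))  # -> (2, 0, 1)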

  12. Safety management and risk assessment in chemical laboratories.

    PubMed

    Marendaz, Jean-Luc; Friedrich, Kirstin; Meyer, Thierry

    2011-01-01

    The present paper highlights a new safety management program, MICE (Management, Information, Control and Emergency), which has been specifically adapted for the academic environment. The process starts with an exhaustive hazard inventory supported by a platform assembling specific hazards encountered in laboratories and their subsequent classification. A proof of concept is given by a series of implementations in the domain of chemistry targeting workplace health protection. The methodology is expressed through three examples to illustrate how the MICE program can be used to address safety concerns regarding chemicals, strong magnetic fields and nanoparticles in research laboratories. A comprehensive chemical management program is also depicted.

  13. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.

  14. Applications of polymeric smart materials to environmental problems.

    PubMed Central

    Gray, H N; Bergbreiter, D E

    1997-01-01

    New methods for the reduction and remediation of hazardous wastes like carcinogenic organic solvents, toxic materials, and nuclear contamination are vital to environmental health. Procedures for effective waste reduction, detection, and removal are important components of any such methods. Toward this end, polymeric smart materials are finding useful applications. Polymer-bound smart catalysts are useful in waste minimization, catalyst recovery, and catalyst reuse. Polymeric smart coatings have been developed that are capable of both detecting and removing hazardous nuclear contaminants. Such applications of smart materials involving catalysis chemistry, sensor chemistry, and chemistry relevant to decontamination methodology are especially applicable to environmental problems. PMID:9114277

  15. Delivering meat carcasses/cuts to craft-butcher shops: an investigation of work characteristics and manual handling hazards.

    PubMed

    Okunribido, Olanrewaju O; Gingell, Alison

    2014-11-01

    This study investigated the delivery scenarios of service drivers working in the retail meat industry. The methodology included analysis of accident reports and field investigations of deliveries at craft-butcher shop premises, including semi-structured interviews with managers and workers. The findings provide greater clarity about the hazards of this job and suggest, for peripatetic delivery activities, four main factors on which decisions about risk and good practice may be based: the composition of the orders; the characteristics of the delivery vehicle/truck; the handling method most often used; and the road/access conditions. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  16. Hazards/Failure Modes and Effects Analysis MK 1 MOD 0 LSO-HUD Console System.

    DTIC Science & Technology

    1980-03-24

    [OCR-damaged scan; only fragments of the table of contents are recoverable:] SCOPE AND METHODOLOGY OF ANALYSIS; FIGURE 1: H/FMEA/(SSA) WORK SHEET FORMAT; APPENDIX A: HAZARD/FAILURE MODES AND EFFECTS ANALYSIS (H/FMEA) WORK SHEETS; TABLE: SUBSYSTEM: UNIT 1 Heads-Up Display Console; UNIT 2 Auxiliary...

  17. Monitoring and characterizing natural hazards with satellite InSAR imagery

    USGS Publications Warehouse

    Lu, Zhong; Zhang, Jixian; Zhang, Yonghong; Dzurisin, Daniel

    2010-01-01

    Interferometric synthetic aperture radar (InSAR) provides an all-weather imaging capability for measuring ground-surface deformation and inferring changes in land surface characteristics. InSAR enables scientists to monitor and characterize hazards posed by volcanic, seismic, and hydrogeologic processes, by landslides and wildfires, and by human activities such as mining and fluid extraction or injection. Measuring how a volcano’s surface deforms before, during, and after eruptions provides essential information about magma dynamics and a basis for mitigating volcanic hazards. Measuring spatial and temporal patterns of surface deformation in seismically active regions is extraordinarily useful for understanding rupture dynamics and estimating seismic risks. Measuring how landslides develop and activate is a prerequisite to minimizing associated hazards. Mapping surface subsidence or uplift related to extraction or injection of fluids during exploitation of groundwater aquifers or petroleum reservoirs provides fundamental data on aquifer or reservoir properties and improves our ability to mitigate undesired consequences. Monitoring dynamic water-level changes in wetlands improves hydrological modeling predictions and the assessment of future flood impacts. In addition, InSAR imagery can provide near-real-time estimates of fire scar extents and fire severity for wildfire management and control. All-weather satellite radar imagery is critical for studying various natural processes and is playing an increasingly important role in understanding and forecasting natural hazards.

  18. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of launch. (6) Population density, location, susceptibility (health categories) and sheltering for all..., or for use in any real-time physics models used to ensure compliance with the toxic flight commit...

  19. Use of COTS Batteries on ISS and Shuttle: Payload Safety and Mission Success

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.

    2004-01-01

    Contents: Current program requirements; challenges with COTS batteries; manned-vehicle COTS methodology in use; list of typical flight COTS batteries; energy content and toxicity; hazards, failure modes and controls for different battery chemistries; JSC test details; list of incidents from the Consumer Product Safety Commission; conclusions and recommendations.

  20. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…

  1. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight commit... atmospheric physics on the transport and diffusion of toxic propellants released; (5) Meteorological...

  2. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight commit... atmospheric physics on the transport and diffusion of toxic propellants released; (5) Meteorological...

  3. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight commit... atmospheric physics on the transport and diffusion of toxic propellants released; (5) Meteorological...

  4. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight commit... atmospheric physics on the transport and diffusion of toxic propellants released; (5) Meteorological...

  5. A Biologically Informed Framework for the Analysis of the PPAR Signaling Pathway using a Bayesian Network

    EPA Science Inventory

    The US EPA’s ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...

  6. [Main factors of occupational risk for railway workers].

    PubMed

    Kaptsov, V A; Pankova, V B; Kutovoĭ, V S

    2001-01-01

    The paper shows the specific features of the development of occupational diseases associated with the classes of working conditions, by degree of hazard and risk. It provides scientifically founded evidence that indicators of the potential risk posed by working conditions should be included in the methodology for assessing occupational risk.

  7. Report of an exploratory study: Safety and liability considerations for photovoltaic modules/panels

    NASA Technical Reports Server (NTRS)

    Weinstein, A. S.; Meeker, D. G.

    1981-01-01

    An overview of legal issues as they apply to the design, manufacture and use of photovoltaic module/array devices is provided, and a methodology is suggested for use at the design stage of these products to minimize or eliminate perceived hazards. Questions are posed to stimulate consideration of this area.

  8. Explosion and/or fire risk assessment methodology: a common approach, structured for underground coalmine environments / Metoda szacowania ryzyka wybuchu i pożarów: podejście ogólne, dostosowane do środowiska kopalni podziemnej

    NASA Astrophysics Data System (ADS)

    Cioca, Ionel-Lucian; Moraru, Roland Iosif

    2012-10-01

    In order to meet statutory requirements concerning workers' health and safety, mine managers within the Valea Jiului coal basin in Romania must address the potential for underground fires and explosions and their impact on the workforce and the mine ventilation systems. Highlighting the need for a unified and systematic approach to these specific risks, the authors develop a general framework for fire/explosion risk assessment in gassy mines, based on quantifying the likelihood of occurrence and the gravity of the consequences of such undesired events, and employing the root-cause analysis method. It is emphasized that even a small fire should be regarded as a major hazard from the point of view of explosion initiation, should a combustible atmosphere arise. The methodology developed for the assessment of underground fire and explosion risks is based on known underground explosion hazards, fire engineering principles and fire test criteria for potentially combustible materials used in mines.

  9. Converting HAZUS capacity curves to seismic hazard-compatible building fragility functions: effect of hysteretic models

    USGS Publications Warehouse

    Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem

    2008-01-01

    A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In that methodology, the HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for the nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment-resisting-frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on the resulting fragility functions through sensitivity analysis. The findings improve capacity curves, and thereby fragility and/or vulnerability models, for generic types of structures.
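
    A multilinear backbone of the kind described here is easy to state in code: an elastic branch to yield, a hardening branch to the ultimate (capping) point, a negative-stiffness branch, and a residual plateau. The parameter values below are placeholders for illustration, not the S1L values proposed in the paper.

        import numpy as np

        def backbone(d, dy=0.02, ay=0.10, dc=0.10, ac=0.15, dr=0.25, ar=0.03):
            """Spectral acceleration (g) at spectral displacement d (m)."""
            d = np.asarray(d, dtype=float)
            elastic = d * ay / dy                                        # initial stiffness
            harden = ay + (d - dy) * (ac - ay) / (dc - dy)               # yield -> capping
            degrade = np.maximum(ac + (d - dc) * (ar - ac) / (dr - dc),  # negative stiffness,
                                 ar)                                     # residual plateau
            return np.where(d < dy, elastic, np.where(d < dc, harden, degrade))

        print(np.round(backbone(np.linspace(0.0, 0.4, 9)), 3))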

  10. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data limitations to date. Conclusions include a brief discussion of subsequent uncertainty and risk analysis methodologies, and a commentary on present and future applications of QRA in the management of the public health risks associated with the presence of pathogenic microorganisms in the food supply.
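
    The four phases chain naturally into a Monte Carlo calculation, which is how QRA is usually operationalized. The sketch below uses an exponential dose-response model, P(ill | dose) = 1 - exp(-r * dose); the distributions, the parameter r and the serving sizes are illustrative assumptions, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Exposure assessment: contamination (log10 CFU/g) and serving size (g).
        log_conc = rng.normal(loc=-1.0, scale=1.0, size=n)
        serving = rng.uniform(25.0, 100.0, size=n)
        dose = (10.0 ** log_conc) * serving        # ingested organisms per serving

        # Dose-response assessment: exponential model.
        r = 5e-4
        p_ill = 1.0 - np.exp(-r * dose)

        # Risk characterization: summarize per-serving risk across the simulated population.
        print(f"mean risk per serving: {p_ill.mean():.2e}")
        print(f"95th percentile:       {np.quantile(p_ill, 0.95):.2e}")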

  11. Methods for Characterizing Fine Particulate Matter Using Satellite Remote-Sensing Data and Ground Observations: Potential Use for Environmental Public Health Surveillance

    NASA Technical Reports Server (NTRS)

    Al-Hamdan, Mohammad Z.; Crosson, William L.; Limaye, Ashutosh S.; Rickman, Douglas L.; Quattrochi, Dale A.; Estes, Maurice G.; Qualters, Judith R.; Niskar, Amanda S.; Sinclair, Amber H.; Tolsma, Dennis D.

    2007-01-01

    This study describes and demonstrates different techniques for surfacing daily environmental hazards data on particulate matter with aerodynamic diameter less than or equal to 2.5 micrometers (PM2.5), for the purpose of integrating respiratory health and environmental data for the Centers for Disease Control and Prevention's (CDC's) pilot study of Health and Environment Linked for Information Exchange (HELIX)-Atlanta. It describes a methodology for estimating ground-level continuous PM2.5 concentrations using B-Spline and inverse distance weighting (IDW) surfacing techniques and leveraging National Aeronautics and Space Administration (NASA) Moderate Resolution Imaging Spectroradiometer (MODIS) data to complement Environmental Protection Agency (EPA) ground observation data. The study used measurements of ambient PM2.5 from the EPA database for the year 2003, as well as PM2.5 estimates derived from NASA's satellite data. The hazard data were processed to derive surrogate PM2.5 exposure estimates. The paper shows that merging MODIS remote sensing data with surface observations of PM2.5 not only provides a more complete daily representation of PM2.5 than either data set alone would allow, but also reduces the errors in the estimated PM2.5 surfaces. The results show that the daily IDW PM2.5 surfaces had smaller errors, with respect to observations, than the B-Spline surfaces in the year studied. However, the IDW mean annual composite surface had more numerical artifacts, which could be due to the interpolating nature of IDW, which assumes that maxima and minima can occur only at the observation points. Finally, the methods discussed in this paper improve temporal and spatial resolution and establish a foundation for environmental public health linkage and association studies, for which determining the concentrations of an environmental hazard such as PM2.5 with good accuracy is critical.
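
    Of the two surfacing techniques, IDW is the simpler to sketch, and its interpolating nature (exact at monitor locations, with extremes only at observation points) is visible directly in the formula. The monitor locations and values below are toy numbers for illustration, not HELIX-Atlanta data.

        import numpy as np

        def idw(xy_obs, z_obs, xy_grid, p=2.0, eps=1e-12):
            """Inverse-distance-weighted estimate at each grid point."""
            d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, eps) ** p        # eps keeps exact hits finite
            return (w @ z_obs) / w.sum(axis=1)

        obs_xy = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9]])  # monitor locations
        obs_z = np.array([12.0, 35.0, 20.0])                     # PM2.5, ug/m3
        gx, gy = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        print(idw(obs_xy, obs_z, grid).round(1).reshape(4, 4))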

  12. Landslide Hazard Assessment In Mountaneous Area of Uzbekistan

    NASA Astrophysics Data System (ADS)

    Nyazov, R. A.; Nurtaev, B. S.

    Because of population growth and the allocation of flat areas to agriculture, the mountain areas of Uzbekistan have been intensively developed in recent years, with an attendant increase in natural and technogenic processes. Landslides are the most dangerous phenomena: 7,240 of them occurred during the last 40 years, and more than 50% took place between 1991 and 2000. The situation is aggravated because these regions lie in zones where disastrous earthquakes with M > 7 occurred in the past and are expected in the future. The seismic gap that has continued in Uzbekistan for the last 15-20 years, together with the recent disastrous earthquakes in Afghanistan, Iran, Turkey, Greece, Taiwan and India, is a cause for concern. On the basis of long-term observations, criteria for landslide hazard assessment (suddenness, displacement interval, straight-line directivity, kind of destruction of residential buildings) are proposed. The methodology was developed at two geographic levels: local (town scale) and regional (region scale); detailed risk analysis is performed at the local scale and extrapolated to the regional scale. The engineering-geologic parameters for hazard estimation of landslides and mud flows are likewise divided into regional and local levels. Four degrees of danger of sliding processes are distinguished for compiling small-, medium- and large-scale maps. The Angren industrial area in the Tien Shan mountains is characterized by an initial seismic intensity of 8-9 (MSK scale). Here, human technological activity (open-cast mining) has initiated the formation of a large landslide that covers more than 8 square kilometers and corresponds to a volume of roughly 800 million cubic meters. In turn, the landslide can become the source of industrial emergencies. Using the Angren industrial mining region as an example, different scenarios are proposed for the safety of residents and transport, for regulating technologies of field improvement, and for the exploitation of mountain water reservoirs, in order to prevent dangerous geological processes.

  13. An exhaustive approach for identification of flood risk hotspots in data poor regions enforcing combined geomorphic and socio-economic indicators

    NASA Astrophysics Data System (ADS)

    Mohanty, M. P.; Karmakar, S.; Ghosh, S.

    2017-12-01

    Many countries across the globe suffer from floods. To monitor them, the scientific community uses various sophisticated algorithms and flood models. However, gaps remain in mapping flood risk efficiently. The limitations are: (i) scarcity of the extensive input data required for precise flood modeling; (ii) poor performance of models over large and complex terrains; (iii) high computational cost and time; and (iv) lack of expertise among civic bodies in handling model simulations. These factors make it necessary to adopt uncomplicated and inexpensive, yet precise, approaches to identify areas at different levels of flood risk. The present study addresses this issue by utilizing various easily available, low-cost data in a GIS environment for a large, flood-prone, data-poor region. A set of geomorphic indicators derived from a Digital Elevation Model (DEM) is analysed through linear binary classification and used to identify flood hazard. The performance of these indicators is then investigated using receiver operating characteristic (ROC) curves, whereas the calibration and validation of the derived flood maps are accomplished through comparison with the outputs of a dynamically coupled 1-D/2-D flood model. A high degree of similarity in flood inundation demonstrates the reliability of the proposed approach for identifying flood hazard. In parallel, an extensive list of socio-economic indicators is selected to represent flood vulnerability at a fine forward-sortation level using multivariate Data Envelopment Analysis (DEA). A set of bivariate flood risk maps is derived by combining the flood hazard and socio-economic vulnerability maps. Given the acute problem of floods in developing countries, the proposed methodology, characterized by low computational cost, modest data requirements and limited flood-modeling complexity, may help local authorities and planners derive effective flood management strategies.
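
    The ROC step can be sketched with a single geomorphic indicator. Here the indicator is assumed to behave like height above the nearest drainage, where lower values mean more flood-prone; the data are synthetic, standing in for DEM-derived values and a reference flood map.

        import numpy as np

        rng = np.random.default_rng(0)
        hand = np.concatenate([rng.gamma(2.0, 2.0, 500),     # flooded cells: low values
                               rng.gamma(4.0, 3.0, 1500)])   # dry cells: higher values
        flooded = np.concatenate([np.ones(500), np.zeros(1500)])

        order = np.argsort(hand)                  # ascending: most hazardous first
        tpr = np.cumsum(flooded[order]) / flooded.sum()
        fpr = np.cumsum(1 - flooded[order]) / (1 - flooded).sum()
        auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoidal AUC
        print(f"AUC = {auc:.2f}")                 # well above the 0.5 no-skill line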

  14. Methods for characterizing fine particulate matter using ground observations and remotely sensed data: potential use for environmental public health surveillance.

    PubMed

    Al-Hamdan, Mohammad Z; Crosson, William L; Limaye, Ashutosh S; Rickman, Douglas L; Quattrochi, Dale A; Estes, Maurice G; Qualters, Judith R; Sinclair, Amber H; Tolsma, Dennis D; Adeniyi, Kafayat A; Niskar, Amanda Sue

    2009-07-01

    This study describes and demonstrates different techniques for surface fitting daily environmental hazards data of particulate matter with aerodynamic diameter less than or equal to 2.5 microm (PM2.5) for the purpose of integrating respiratory health and environmental data for the Centers for Disease Control and Prevention (CDC) pilot study of Health and Environment Linked for Information Exchange (HELIX)-Atlanta. It presents a methodology for estimating daily spatial surfaces of ground-level PM2.5 concentrations using the B-Spline and inverse distance weighting (IDW) surface-fitting techniques, leveraging National Aeronautics and Space Administration (NASA) Moderate Resolution Imaging Spectrometer (MODIS) data to complement U.S. Environmental Protection Agency (EPA) ground observation data. The study used measurements of ambient PM2.5 from the EPA database for the year 2003 as well as PM2.5 estimates derived from NASA's satellite data. Hazard data have been processed to derive the surrogate PM2.5 exposure estimates. This paper shows that merging MODIS remote sensing data with surface observations of PM,2. not only provides a more complete daily representation of PM,2. than either dataset alone would allow, but it also reduces the errors in the PM2.5-estimated surfaces. The results of this study also show that although the IDW technique can introduce some numerical artifacts that could be due to its interpolating nature, which assumes that the maxima and minima can occur only at the observation points, the daily IDW PM2.5 surfaces had smaller errors in general, with respect to observations, than those of the B-Spline surfaces. Finally, the methods discussed in this paper establish a foundation for environmental public health linkage and association studies for which determining the concentrations of an environmental hazard such as PM2.5 with high accuracy is critical.

  15. Modular Exposure Disaggregation Methodologies for Catastrophe Modelling using GIS and Remotely-Sensed Data

    NASA Astrophysics Data System (ADS)

    Foulser-Piggott, R.; Saito, K.; Spence, R.

    2012-04-01

    Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.
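
    The core disaggregation step, common to the different data-availability variants described above, can be sketched as a dasymetric weighting: an aggregate building count is spread over grid cells in proportion to a remotely sensed weight layer. The weight layer and count below are illustrative assumptions.

        import numpy as np

        built_up = np.array([0.0, 0.2, 0.5, 0.1, 0.9, 0.3])  # e.g. built-up fraction per cell
        total_buildings = 1200                               # aggregate exposure record

        weights = built_up / built_up.sum()
        print((total_buildings * weights).round())           # estimated buildings per cell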

  16. Methodology for Elaborating Regional Susceptibility Maps of Slope Instability: the State of Guerrero (mexico) Case Study

    NASA Astrophysics Data System (ADS)

    González Huesca, A. E.; Ferrés, D.; Domínguez-M, L.

    2013-05-01

    Numerous cases of different types of slope instability occur every year in the mountain areas of México. These instabilities sometimes severely affect exposed communities, roads and infrastructure, causing deaths and serious material damage, mainly in the states of Puebla, Veracruz, Oaxaca, Guerrero and Chiapas, in the central and southern sectors of the country. The occurrence of slope instability results from a combination of climatic, geologic, hydrologic, geomorphologic and anthropogenic factors. The National Center for Disaster Prevention (CENAPRED) is developing several projects to offer civil protection authorities of the Mexican states methodologies for the hazard assessment of different natural phenomena at a regional level. In this framework, during the past two years a methodology was prepared to construct susceptibility maps for slope instability at regional (≤ 1:100 000) and national (≤ 1:1 000 000) levels. This research followed the criteria established by the International Association of Engineering Geology, the highest international authority on this topic. The state of Guerrero was taken as a pilot case for elaborating the susceptibility map for slope instability at a regional level. The major factors considered in the methodology to calculate susceptibility are: a) the slope of the surface, b) the geology and c) the land use, which were integrated using a Geographic Information System (GIS). The arithmetic sum and weighting factors used to obtain the final susceptibility map were based on average values calculated in individual studies of several cases of slope instability that occurred in the state in the past decade. For each case, the evaluation format proposed by CENAPRED in 2006 in the "Guía Básica para la elaboración de Atlas Estatales y Municipales de Peligros y Riesgos" for evaluating instabilities at a local level was applied. The resulting susceptibility map shows that the central and east-central sectors of the state of Guerrero have the highest values of susceptibility to slope instability. Future work will elaborate hazard maps of slope instability for the state of Guerrero by combining the susceptibility information with data on trigger factors, such as precipitation and seismicity, for different recurrence periods. The final goal is for this methodology to be applicable to other states of the country, in order to enrich and enhance their atlases of hazards and risks.
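
    The GIS integration step is a weighted arithmetic overlay, which can be sketched directly on reclassified rasters. The factor scores and the weights below are hypothetical, not the CENAPRED-calibrated values.

        import numpy as np

        slope_cls   = np.array([[1, 2, 4], [2, 3, 4], [1, 1, 2]])  # slope score, 1-4
        geology_cls = np.array([[2, 2, 3], [3, 3, 3], [1, 2, 2]])  # lithology score
        landuse_cls = np.array([[1, 3, 3], [2, 3, 4], [1, 2, 2]])  # land-use score

        w_slope, w_geol, w_land = 0.5, 0.3, 0.2                    # assumed weights
        index = w_slope * slope_cls + w_geol * geology_cls + w_land * landuse_cls

        # Bin the continuous index into four susceptibility degrees for mapping.
        print(np.digitize(index, bins=[1.5, 2.5, 3.5]) + 1)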

  17. Characterizing the nature of home care work and occupational hazards: a developmental intervention study.

    PubMed

    Markkanen, Pia; Quinn, Margaret; Galligan, Catherine; Sama, Susan; Brouillette, Natalie; Okyere, Daniel

    2014-04-01

    Home care (HC) aide is the fastest growing occupation, yet job hazards are under-studied. This study documents the context of HC aide work, characterizes occupational safety and health (OSH) hazards, and identifies preventive interventions using qualitative methods. We conducted 12 focus groups among aides and 26 in-depth interviews comprising 15 HC agency, union, and insurance company representatives as well as 11 HC recipients in Massachusetts. All focus groups and interviews were audio-recorded, transcribed, and coded with NVIVO software. Major OSH concerns were musculoskeletal disorders from client care tasks and verbal abuse. Performing tasks beyond specified job duties may be an OSH risk factor. HC aides' safety and clients' safety are closely linked. Client handling devices, client evaluation, care plan development, and training are key interventions for both aides' and clients' safety. Promoting OSH in HC is essential for maintaining a viable workforce. © 2013 Wiley Periodicals, Inc.

  18. New Tools for Investigating Chemical and Product Use

    EPA Science Inventory

    - The timely characterization of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge
    - High-throughput (HT) risk prioritization relies on hazard and exposure characterization
    - While advances have been made ...

  19. Use of Severity Grades to Characterize Histopathologic Changes

    EPA Science Inventory

    The severity grade is an important component of a histopathologic diagnosis in a nonclinical toxicity study that helps distinguish treatment-related effects from background findings and aids in determining adverse dose levels during hazard characterization. Severity grades should...

  20. A coupled hydrological-hydraulic flood inundation model calibrated using post-event measurements and integrated uncertainty analysis in a poorly gauged Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois

    2017-04-01

    Developing flood inundation maps of defined exceedance probabilities is required to provide information on flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods is uncertain and only partially captured by traditional rain-gauge networks. Flood inundation is modelled by coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are set as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and calibrated against 20 cross sections surveyed after the January 2013 flood event. An uncertainty analysis is performed to assess the results of the models. The coupled flood inundation model is then run with design storms, and flood inundation maps of defined exceedance probabilities are generated. The peak discharges estimated by the simulated RR model were in close agreement with results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure, or characterized by floods whose stage exceeds the gauge measurement level or the level defined by the rating curve.
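
    The coupling logic itself is simple: the calibrated RR model supplies a discharge hydrograph Q(t), which becomes the upstream boundary condition of the hydraulic model. As a stand-in for the hydraulic step (HEC-RAS in the study), the sketch below converts each discharge to a normal-depth stage with Manning's equation for a rectangular section; the geometry, slope and roughness are assumptions for illustration.

        import numpy as np
        from scipy.optimize import brentq

        B, S0, n_man = 20.0, 0.002, 0.04        # width (m), bed slope, Manning n

        def manning_q(h):
            """Discharge (m3/s) at flow depth h (m), rectangular section."""
            area, perim = B * h, B + 2.0 * h
            return (1.0 / n_man) * area * (area / perim) ** (2.0 / 3.0) * np.sqrt(S0)

        def stage(q):
            """Invert the rating curve: normal depth for discharge q."""
            return brentq(lambda h: manning_q(h) - q, 1e-4, 50.0)

        hydrograph = [5.0, 40.0, 120.0, 220.0, 160.0, 80.0, 30.0]  # hypothetical Q(t), m3/s
        print([round(stage(q), 2) for q in hydrograph])            # stages in metres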
