Sample records for integrated hazard analysis

  1. Integrated Safety Analysis Tiers

    NASA Technical Reports Server (NTRS)

    Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon

    2009-01-01

Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand the system at least one level beyond the interface, well enough to comprehend integrated hazards. This paper will discuss various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort will be used to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.

  2. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  3. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  4. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, the public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success or failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do into designing the hardware, writing the software and analyzing competitive proposals.
This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success, the IHA integrator. Discussions will be provided to describe both the mindset required as well as deleterious assumptions/behaviors to avoid when integrating within a large scale system.

  5. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  6. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-16

    Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.

  7. A Hybrid FPGA/Tilera Compute Element for Autonomous Hazard Detection and Navigation

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Werner, Robert A.; Carson, John M., III; Khanoyan, Garen; Stern, Ryan A.; Trawny, Nikolas

    2013-01-01

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  8. A hybrid FPGA/Tilera compute element for autonomous hazard detection and navigation

    NASA Astrophysics Data System (ADS)

    Villalpando, C. Y.; Werner, R. A.; Carson, J. M.; Khanoyan, G.; Stern, R. A.; Trawny, N.

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  9. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  10. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  12. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
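
The Monte Carlo element of a PTHA of this kind can be sketched in a few lines. The source model, recurrence rate, and runup scaling below are illustrative assumptions, not values from the paper; a real analysis would replace `sample_runup` with a numerical tsunami propagation model per source:

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: multiply uniforms until the product drops below exp(-lam)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def sample_runup(rng):
    # Toy source model: truncated Gutenberg-Richter magnitude (inverse-CDF
    # sampling) mapped to runup by an assumed scaling -- illustrative only.
    b, m_min, m_max = 1.0, 7.0, 9.0
    u = rng.random()
    m = m_min - (1.0 / b) * math.log10(1.0 - u * (1.0 - 10.0 ** (-b * (m_max - m_min))))
    return 0.5 * 10.0 ** (0.6 * (m - 7.0))  # runup in metres (assumed relation)

def exceedance_probability(threshold_m, rate_per_yr=0.1, horizon_yr=50,
                           n_sims=20000, seed=1):
    """Estimate P(max runup > threshold within the horizon) by simulating
    many synthetic catalogs -- the core quantity behind a hazard curve."""
    rng = random.Random(seed)
    lam = rate_per_yr * horizon_yr  # expected event count over the horizon
    hits = 0
    for _ in range(n_sims):
        runups = [sample_runup(rng) for _ in range(poisson_sample(lam, rng))]
        if runups and max(runups) > threshold_m:
            hits += 1
    return hits / n_sims
```

Evaluating `exceedance_probability` over a grid of thresholds traces out a hazard curve; with a sufficiently long runup catalog, the empirical curve described in the abstract plays the same role directly.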

  13. Integrated Safety Analysis Teams

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jonathan C.

    2008-01-01

    Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide it into understandable subsystems, which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experience and diversity of experience; safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure, as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a hazard of low consequence in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team that integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system, and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch vehicle case will also be discussed.

  14. Integrated Risk Research. Case of Study: Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.; Jaimes, M.

    2015-12-01

    This integrated risk research includes the analysis of all individual constituents of risk, such as hazard identification, hazard exposure, and vulnerability. We determined risk from natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37°N, 92.25°W). Due to its geographical and geological location, this community is continuously exposed mainly to earthquakes, landslides and floods. We developed integrated studies and analyses of seismic zonation and landslide and flood susceptibility using standard methodologies. Vulnerability was quantified from data collected in interviews with local families, considering five social variables: characteristics of housing construction, availability of basic public services, family economic conditions, existing community plans for disaster preparedness, and risk perception. The families surveyed were randomly selected in a statistically significant sample. Our results were spatially represented using a Geographical Information System (GIS). Structural vulnerability curves were generated for typical housing constructions. Our integrated risk analysis demonstrates that the community of Motozintla has a high level of structural and socio-economic risk from floods and earthquakes. More than half of the population does not know of any existing Civil Protection Plan, and residents perceive that they are at high risk from landslides and floods. Although the community is located in a high seismic risk zone, most of the local people believe that it cannot be impacted by a large earthquake. These natural and social conditions indicate that the community of Motozintla has a very high level of risk from natural hazards. This research will support local decision makers in developing an integrated, comprehensive natural hazards mitigation and prevention program.

  15. Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document involves definition of technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and the Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning to an Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).

  16. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Radiological hazards related to possessing or processing licensed material at its facility; (ii) Chemical hazards of licensed material and hazardous chemicals produced from licensed material; (iii) Facility... performed by a team with expertise in engineering and process operations. The team shall include at least...

  17. An Integrated GIS-Expert System Framework for Live Hazard Monitoring and Detection.

    PubMed

    McCarthy, James D; Graniero, Phil A; Rozic, Steven M

    2008-02-08

    In the context of hazard monitoring, using sensor web technology to monitor and detect hazardous conditions in near-real-time can result in large amounts of spatial data that can be used to drive analysis at an instrumented site. These data can be used for decision making and problem solving; however, as with any analysis problem, the success of analyzing hazard potential is governed by many factors such as: the quality of the sensor data used as input; the meaning that can be derived from those data; the reliability of the model used to describe the problem; the strength of the analysis methods; and the ability to effectively communicate the end results of the analysis. For decision makers to make use of sensor web data these issues must be dealt with to some degree. The work described in this paper addresses all of these areas by showing how raw sensor data can be automatically transformed into a representation which matches a predefined model of the problem context. This model can be understood by analysis software that leverages rule-based logic and inference techniques to reason with, and draw conclusions about, spatial data. These tools are integrated with a well known Geographic Information System (GIS) and existing geospatial and sensor web infrastructure standards, providing expert users with the tools needed to thoroughly explore a problem site and investigate hazards in any domain.

  18. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and to support design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of the severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
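
The path-finding step described above can be sketched with a small depth-first search over subsystem connectivity. The graph, its subsystem names, and the source/target pair are hypothetical, not taken from the paper:

```python
# Hypothetical subsystem connectivity: a directed edge means "can influence".
connections = {
    "power_bus": ["avionics", "pump_ctrl"],
    "pump_ctrl": ["coolant_loop"],
    "coolant_loop": ["avionics"],
    "avionics": [],
}

def propagation_paths(graph, source, target):
    """Enumerate simple (cycle-free) paths from a hazard source to a
    vulnerable target, i.e. the routes an influence could propagate along."""
    paths, stack = [], [(source, [source])]
    while stack:
        node, path = stack.pop()
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid revisiting nodes already on this path
                stack.append((nxt, path + [nxt]))
    return paths
```

Each returned path is a candidate hazard-vulnerability propagation route; severity and vulnerability ratings could then be accumulated along each path to obtain the cumulative measures the abstract mentions.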

  19. Towards an integrated approach to natural hazards risk assessment using GIS: with reference to bushfires.

    PubMed

    Chen, Keping; Blong, Russell; Jacobson, Carol

    2003-04-01

    This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.

  20. ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction

    NASA Astrophysics Data System (ADS)

    Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd

    2017-11-01

    The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, improving engine efficiency in comparison to a conventional carburetor-based engine. ETCS is regarded as a safety-critical component: an ETCS malfunction can cause an unintended acceleration or deceleration event, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment, an analysis defined by the ISO 26262 functional safety standard, was applied to a motorbike's ETCS to determine the required Automotive Safety Integrity Level (ASIL). Based on the analysis, the established ASIL can help to derive technical and functional safety measures for ETCS development.

  1. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    There are many technical methods for integrating various factors in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are taken as criteria, and the element units to which they apply are taken as alternatives. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, since simulation results vary with the flood scenario and topographical conditions. This ambiguity in the criteria can cause uncertainty in the flood hazard map. To account for it, fuzzy logic, which can handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the resulting integrated flood hazard map, and compared the map with existing flood risk maps. We also expect that applying the flood hazard mapping methodology suggested in this paper to the production of current flood risk maps would yield new maps that consider the priorities among hazard areas and carry more varied and important information than before.
    Keywords: flood hazard map; levee break analysis; 2D analysis; MCDM; Fuzzy TOPSIS

    Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
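
The crisp TOPSIS core that fuzzy TOPSIS extends can be sketched as follows. The three criteria mirror the paper's indices, but the weights and sample values in the usage below are invented for illustration; the fuzzy variant would replace the crisp entries with fuzzy numbers:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    matrix: rows = alternatives (element units), cols = criteria
            (e.g. max depth, max velocity, max travel time).
    benefit: True where a larger value means higher hazard."""
    n_crit = len(matrix[0])
    # vector-normalise each criterion column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)  # distance to the most hazardous profile
        d_neg = math.dist(row, anti)   # distance to the least hazardous profile
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

A score near 1 marks an element unit close to the "worst case" profile across all three indices, which is what the integrated hazard grades in the map express.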

  2. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values, and the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
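
The RPN scoring behind this kind of ranking is simple to reproduce. A minimal sketch, with invented severity/occurrence/detectability values (the paper's actual scores are not given in the abstract):

```python
def rpn(severity, occurrence, detectability):
    """Risk Priority Number on ordinal 1-10 scales; higher = more critical."""
    return severity * occurrence * detectability

# Hypothetical failure modes of a cell-handling process, illustrative scores only
failure_modes = {
    "loss_of_dose":        rpn(9, 3, 4),
    "tracking_error":      rpn(8, 3, 5),
    "label_transcription": rpn(6, 4, 3),
}

# Rank failure modes so mitigation effort targets the highest RPN first
ranked = sorted(failure_modes.items(), key=lambda kv: kv[1], reverse=True)
```

Note that RPN is a screening heuristic over ordinal scales, not a probability; the abstract's approach pairs it with monitoring and corrective actions for each critical control point.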

  3. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  4. An integrated knowledge system for the Space Shuttle hazardous gas detection system

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Bangasser, Carl; Fensky, Connie; Cegielski, Eric; Overbey, Glenn

    1993-01-01

    A computer-based integrated Knowledge-Based System, the Intelligent Hypertext Manual (IHM), was developed for the Space Shuttle Hazardous Gas Detection System (HGDS) at NASA Marshall Space Flight Center (MSFC). The IHM stores HGDS related knowledge and presents it in an interactive and intuitive manner. This manual is a combination of hypertext and an expert system which store experts' knowledge and experience in hazardous gas detection and analysis. The IHM's purpose is to provide HGDS personnel with the capabilities of: locating applicable documentation related to procedures, constraints, and previous fault histories; assisting in the training of personnel; enhancing the interpretation of real time data; and recognizing and identifying possible faults in the Space Shuttle sub-systems related to hazardous gas detection.

  5. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation (NGA) model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the NGA model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. 
Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
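    The Monte Carlo PSHA workflow described above can be sketched in a few lines: simulate many synthetic catalogues, map each event through an attenuation relation with aleatory scatter, and read the required ground motion off the distribution of catalogue maxima. All numerical parameters below (activity rate, Gutenberg-Richter b-value, attenuation coefficients) are illustrative assumptions, not the paper's values.

```python
# Minimal Monte Carlo PSHA sketch (all parameters are illustrative).
import math
import random

random.seed(1)

def simulate_max_pga(years, rate=0.2, b=1.0, m_min=5.0, m_max=7.5):
    """Simulate one catalogue; return the largest site PGA it produces.

    Events arrive as a Poisson process (rate events/yr); magnitudes follow
    a truncated Gutenberg-Richter law; a toy attenuation relation with
    lognormal aleatory scatter maps magnitude and distance to PGA.
    """
    t, pga_max = 0.0, 0.0
    beta = b * math.log(10)
    while True:
        t += random.expovariate(rate)          # Poisson inter-event time
        if t > years:
            return pga_max
        # truncated Gutenberg-Richter magnitude by inverse transform
        u = random.random()
        m = m_min - math.log(1 - u * (1 - math.exp(-beta * (m_max - m_min)))) / beta
        r = random.uniform(5.0, 100.0)         # epicentral distance, km
        # toy attenuation: ln PGA = a1 + a2*M - a3*ln R + aleatory scatter
        ln_pga = 2.0 + 1.2 * m - 1.1 * math.log(r) + random.gauss(0.0, 0.6)
        pga_max = max(pga_max, math.exp(ln_pga))

# The PGA with 10% probability of exceedance in 50 years is the 90th
# percentile of the distribution of simulated 50-year maxima.
maxima = sorted(simulate_max_pga(50) for _ in range(2000))
pga_475 = maxima[int(0.9 * len(maxima))]
print(f"PGA with 10% PoE in 50 yr: {pga_475:.0f} (toy units)")
```

    Epistemic uncertainty enters this scheme naturally: each simulated catalogue can draw its source model and attenuation relation at random from the candidate set, so the final percentile integrates across models.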

  6. 49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... understanding and analysis of the failure mechanisms or threats to integrity of each pipeline segment. (2) An... pipeline, information and data used for the information analysis; (13) results of the information analyses...

  7. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take data uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated in a program named ALICE. This program integrates mechanical stability analysis within GIS software while accounting for data uncertainty. The method proposes a quantitative classification of landslide hazard and offers a useful tool for saving time and improving efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps, as it is the only way to take into account influential factors in slope stability such as the heterogeneity of geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program, with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time. 
This communication, realized within the framework of Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".

  8. Integrated risk management and communication: case study of Canton Vaud (Switzerland)

    NASA Astrophysics Data System (ADS)

    Artigue, Veronica; Aye, Zar Chi; Gerber, Christian; Derron, Marc-Henri; Jaboyedoff, Michel

    2017-04-01

Canton Vaud's history is marked by events that remind us that any territory may have to cope with natural hazards, such as the devastating floods of the Baye and Veraye rivers in Montreux (1927), the overflowing of the Rhône after a dam failure (1935), the Pissot mud flow (1995) and avalanches in the Prealps (1999). All of these events caused significant damage, and sometimes fatalities, in the regions of Canton Vaud. In response to these risks, the Swiss Confederation and the local authorities of the Canton decided to implement an integrated management policy for natural risks. Producing natural hazard maps was the first step of the integrated management process. This work resulted in more than 10'000 maps and related documents for 94% of the municipalities of the Canton, covering 17% of its total surface. Given this significant amount of data, the main challenge is to communicate it effectively and to build an integrated risk management structure. To make the available information relevant for end users, the teams involved produced documents and tools to help all stakeholders understand these data. The first step of this process was a statistical and geographical analysis of the hazard maps to identify the areas most exposed to natural hazards, from which an atlas was created. Within this framework, several topics were then examined for each identified risk. The results show that 88 of the 318 municipalities in Canton Vaud have at least a high hazard level on their territory, 108 a moderate hazard level, 41 a low level and 8 a residual level. Only 73 of the 318 municipalities remain with a minimal or zero hazard level. Concerning the type of hazard considered, 16% of the building zones are exposed to floods, 18% to mud flow, 16% to deep landslides, 14% to spontaneous surface landslides, 6% to rockfall, 55% to rock collapses and less than 5% to avalanches. 
As national policy requires risk to be taken into account at the building scale, further analysis was carried out at that level. 1'154 buildings are exposed to a high hazard level, while 8'409, 21'130 and 14'980 buildings are exposed to a moderate, low and residual hazard level, respectively. This paper addresses the complexity of producing the hazard map products of Canton Vaud, particularly the statistical analysis and the difficulties encountered with data availability and quality at the building scale. The authors highlight the processes necessary to build robust communication among all stakeholders involved in risk management in a dynamic and changing area, through the example of Canton Vaud.
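    The municipality statistics above come from classifying each municipality by the highest hazard level present on its territory. A minimal sketch of that aggregation, with toy data (the municipality names and levels below are hypothetical, not the Canton Vaud dataset):

```python
# Sketch: classify municipalities by the highest hazard level found in any
# of their mapped zones (toy data, not the Canton Vaud inventory).
LEVELS = ["none", "residual", "low", "moderate", "high"]  # ascending order

# Hypothetical map extract: municipality -> hazard levels present in its zones
zones = {
    "Montreux": ["high", "moderate", "low"],
    "Lausanne": ["moderate", "low"],
    "Payerne":  ["residual"],
    "Avenches": ["none"],
}

def highest_level(levels):
    """Return the most severe level according to the LEVELS ordering."""
    return max(levels, key=LEVELS.index)

counts = {}
for muni, lv in zones.items():
    top = highest_level(lv)
    counts[top] = counts.get(top, 0) + 1

print(counts)  # {'high': 1, 'moderate': 1, 'residual': 1, 'none': 1}
```

    Running the same aggregation per building instead of per municipality gives the building-scale counts reported in the abstract.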

  9. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  10. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze the effects of secondary functions on other secondary functions through the use of channelization.

  11. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more in developing countries. The exposure of Tenaga Nasional Berhad (TNB) substations to flooding has likewise risen rapidly, because many existing substations are located in flood-prone areas. To understand the impact of floods on its substations, TNB has adopted non-structural mitigation by integrating flood hazard maps with its substation locations. Hydrological analysis is an important part of this work, providing the runoff input for the hydraulic modelling.

  12. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  13. 76 FR 28336 - Domestic Licensing of Source Material-Amendments/Integrated Safety Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... considered. The HF gas (and uranyl fluoride) is quickly produced from the chemical reaction that occurs when... worker's death was the inhalation of HF gas, which was produced from the chemical reaction of UF6 and..., would address both the radiological and chemical hazards from licensed material and hazardous chemicals...

  14. Hazard Function Estimation with Cause-of-Death Data Missing at Random.

    PubMed

    Wang, Qihua; Dinse, Gregg E; Liu, Chunling

    2012-04-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data.
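    A minimal sketch of the inverse probability weighted idea: a kernel-smoothed Nelson-Aalen-type hazard estimate in which each observed cause-of-death indicator is up-weighted by the inverse of the probability that the indicator was observed. The toy data, the constant observation probability `pi`, and the fixed bandwidth are all illustrative assumptions; the paper's estimators also cover regression-surrogate and imputation variants and a data-driven bandwidth.

```python
# Sketch: IPW kernel estimator of a cause-specific hazard when some
# cause-of-death indicators are missing at random (toy data).
import math

def epanechnikov(u):
    """Epanechnikov kernel, supported on |u| < 1."""
    return 0.75 * (1 - u * u) if abs(u) < 1 else 0.0

def ipw_hazard(t, times, cause, observed, pi, h):
    """Kernel-smoothed Nelson-Aalen increments with IPW-adjusted indicators.

    times    : death/censoring times T_i
    cause    : 1 if death from the cause of interest (0 otherwise/unknown)
    observed : 1 if the cause indicator was observed, 0 if missing
    pi       : probability that the cause indicator is observed
    h        : kernel bandwidth
    """
    num = 0.0
    for ti, di, xi in zip(times, cause, observed):
        at_risk = sum(1 for tj in times if tj >= ti)  # risk set size Y(T_i)
        w = xi * di / pi                              # IPW-adjusted indicator
        num += epanechnikov((t - ti) / h) * w / at_risk
    return num / h

times    = [1.2, 2.0, 2.5, 3.1, 4.0, 4.4, 5.2]
cause    = [1,   0,   1,   0,   1,   0,   0]
observed = [1,   1,   0,   1,   1,   1,   1]
print(round(ipw_hazard(3.0, times, cause, observed, pi=0.8, h=2.0), 4))  # 0.1299
```

    In practice `pi` would itself be estimated (e.g. as a function of the death time), which is what distinguishes the three estimators compared in the paper.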

  15. HAZARDS SUMMARY REPORT FOR A TWO WATT PROMETHIUM-147 FUELED THERMOELECTRIC GENERATOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1959-06-01

Discussions are included of the APU design, vehicle integration, Pm-147 properties, shielding requirements, hazards design criteria, statistical analysis for impact, and radiation protection. The use of Pm-147 makes possible the fabrication of an auxiliary power unit which has applications for low-power space missions of <10 watts (electrical). (B.O.G.)

  16. Hazard Function Estimation with Cause-of-Death Data Missing at Random

    PubMed Central

    Wang, Qihua; Dinse, Gregg E.; Liu, Chunling

    2010-01-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data. PMID:22267874

  17. EPOS Thematic Core Service Anthropogenic Hazards: Implementation Plan

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw; Grasso, Jean Robert; Schmittbuhl, Jean; Styles, Peter; Kwiatek, Grzegorz; Sterzel, Mariusz; Garcia, Alexander

    2015-04-01

    EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to integrate distributed research infrastructures (RI) to facilitate and stimulate research on anthropogenic hazards (AH) especially those associated with the exploration and exploitation of geo-resources. The innovative element is the uniqueness of the integrated RI which comprises two main deliverables: (1) Exceptional datasets, called "episodes", which comprehensively describe a geophysical process; induced or triggered by human technological activity, posing hazard for populations, infrastructure and the environment, (2) Problem-oriented, bespoke services uniquely designed for the discrimination and analysis of correlations between technology, geophysical response and resulting hazard. These objectives will be achieved through the Science-Industry Synergy (SIS) built by EPOS WG10, ensuring bi-directional information exchange, including unique and previously unavailable data furnished by industrial partners. The Episodes and services to be integrated have been selected using strict criteria during the EPOS PP. The data are related to a wide spectrum of inducing technologies, with seismic/aseismic deformation and production history as a minimum data set requirement and the quality of software services is confirmed and referenced in literature. Implementation of TCS AH is planned for four years and requires five major activities: (1) Strategic Activities and Governance: will define and establish the governance structure to ensure the long-term sustainability of these research infrastructures for data provision through EPOS. (2) Coordination and Interaction with the Community: will establish robust communication channels within the whole TCS AH community while supporting global EPOS communication strategy. (3) Interoperability with EPOS Integrated Core Service (ICS) and Testing Activities: will coordinate and ensure interoperability between the RIs and the ICS. 
Within this activity a functional e-research environment with access to High-Performance Computing will be built. A prototype for such an environment is already under construction and will become operational in mid-2015 (is-epos.eu). (4) Integration of AH Episodes: will address at least 20 global episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production, which will be integrated into the e-environment of TCS AH. All the multi-disciplinary heterogeneous data from these particular episodes will be transformed into unified structures to form integrated data sets articulated with the defined standards of ICS and other TCSs. (5) Implementation of services for analyzing Episodes: will deliver the protocols and methodologies for analysis of the seismic/deformation response to time-varying georesource exploitation technologies on long and short time scales, and the related time- and technology-dependent seismic hazard issues.

  18. Combining System Safety and Reliability to Ensure NASA CoNNeCT's Success

    NASA Technical Reports Server (NTRS)

    Havenhill, Maria; Fernandez, Rene; Zampino, Edward

    2012-01-01

    Hazard Analysis, Failure Modes and Effects Analysis (FMEA), the Limited-Life Items List (LLIL), and the Single Point Failure (SPF) List were applied by System Safety and Reliability engineers on NASA's Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT) Project. The integrated approach involving cross reviews of these reports by System Safety, Reliability, and Design engineers resulted in the mitigation of all identified hazards. The outcome was that the system met all the safety requirements it was required to meet.

  19. Integration of Aquifer Storage Transfer and Recovery and HACCP for Ensuring Drinking Water Quality

    NASA Astrophysics Data System (ADS)

    Lee, S. I.; Ji, H. W.

    2015-12-01

The integration of ASTR (Aquifer Storage Transfer and Recovery) and HACCP (Hazard Analysis and Critical Control Point) is being attempted to ensure drinking water quality in a delta area. ASTR is a water supply system in which surface water is injected into a well for storage and recovered from a different well; natural water treatment is achieved in the aquifer during the process. ASTR has advantages over surface reservoirs in that the water is protected from external contaminants and free from water loss by evaporation. HACCP, which originated in the food industry, can efficiently manage hazards and reduce risks when introduced to drinking water production. The study area is located in the Nakdong River Delta, South Korea, where water quality has deteriorated due to increased pollution loads from upstream cities and industrial complexes. ASTR equipped with a HACCP system is suggested as a means of heightening public trust in drinking water. After the drinking water supply system using ASTR was decomposed into ten processes, the principles of HACCP were applied. Hazardous event analysis was conducted for 114 hazardous events, and nine major hazardous events were identified based on likelihood and severity assessment. The potential risk of chemical hazards, as a function of amounts, travel distance and toxicity, was evaluated, and the result shows the relative threat a city poses to the drinking water supply facility. Next, critical control points were determined using decision tree analysis, and critical limits, the maximum and/or minimum values to which biological, chemical or physical parameters must be controlled, were established. Other procedures, such as monitoring and corrective actions, will also be presented.
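    The screening step that reduces 114 hazardous events to a handful of major ones can be sketched as a likelihood-severity risk matrix. The event names, scores, and threshold below are hypothetical illustrations of the technique, not the study's data.

```python
# Sketch: likelihood x severity screening of hazardous events, as in a
# HACCP hazardous event analysis (hypothetical events and scores).
events = [
    {"name": "pathogen intrusion at injection well", "likelihood": 4, "severity": 5},
    {"name": "pump seal chemical leak",              "likelihood": 2, "severity": 4},
    {"name": "turbidity spike after storm",          "likelihood": 5, "severity": 2},
    {"name": "pipe corrosion by-products",           "likelihood": 1, "severity": 3},
]

RISK_THRESHOLD = 10  # events scoring >= threshold are treated as major

for e in events:
    e["risk"] = e["likelihood"] * e["severity"]

major = [e["name"] for e in sorted(events, key=lambda e: -e["risk"])
         if e["risk"] >= RISK_THRESHOLD]
print(major)  # ['pathogen intrusion at injection well', 'turbidity spike after storm']
```

    The "major" events that survive this screen are the ones carried forward to the decision-tree step that determines critical control points.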

  20. Preharvest food safety.

    PubMed

    Childers, A B; Walsh, B

    1996-07-23

Preharvest food safety is essential for the protection of our food supply. The production and transport of livestock and poultry play an integral part in the safety of these food products. The goals of this safety assurance include freedom from pathogenic microorganisms, disease, and parasites, and from potentially harmful residues and physical hazards. Its functions should be based on hazard analysis and critical control points from producer to slaughter plant, with emphasis on prevention of identifiable hazards rather than on removal of contaminated products. The production goal is to minimize infection and ensure freedom from potentially harmful residues and physical hazards. The marketing goal is control of exposure to pathogens and stress. Both groups should have functional hazard analysis and critical control point management programs which include personnel training and certification of producers. These programs must cover production procedures, chemical usage, feeding, treatment practices, drug usage, assembly and transportation, and animal identification. Plans must use risk assessment principles, and the procedures must be defined. Other elements would include preslaughter certification, environmental protection, control of chemical hazards, live-animal drug-testing procedures, and identification of physical hazards.

  1. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  2. Assessing hazard risk, cost of adaptation and traditional land use activities in the context of permafrost thaw in communities in Yukon and the Northwest Territories, Canada

    NASA Astrophysics Data System (ADS)

    Benkert, B.; Perrin, A.; Calmels, F.

    2015-12-01

    Together with its partners, the Northern Climate ExChange (NCE, part of the Yukon Research Centre at Yukon College) has been mapping permafrost-related hazard risk in northern communities since 2010. By integrating geoscience and climate project data, we have developed a series of community-scale hazard risk maps. The maps depict hazard risk in stoplight colours for easy interpretation, and support community-based, future-focused adaptation planning. Communities, First Nations, consultants and local regulatory agencies have used the hazard risk maps to site small-scale infrastructure projects, guide land planning processes, and assess suitability of land development applications. However, we know that assessing risk is only one step in integrating the implications of permafrost degradation in societal responses to environmental change. To build on our permafrost hazard risk maps, we are integrating economic principles and traditional land use elements. To assess economic implications of adaptation to permafrost change, we are working with geotechnical engineers to identify adaptation options (e.g., modified building techniques, permafrost thaw mitigation approaches) that suit the risks captured by our existing hazard risk maps. We layer this with an economic analysis of the costs associated with identified adaptation options, providing end-users with a more comprehensive basis upon which to make decisions related to infrastructure. NCE researchers have also integrated traditional land use activities in assessments of permafrost thaw risk, in a project led by Jean Marie River First Nation in the Northwest Territories. Here, the implications of permafrost degradation on food security and land use priorities were assessed by layering key game and gathering areas on permafrost thaw vulnerability maps. 
Results indicated that close to one quarter of big and small game habitats, and close to twenty percent of key furbearer and gathering areas within the First Nation's traditional territory, are situated on highly thaw sensitive permafrost. These projects demonstrate how physical and socio-economic factors can be integrated in assessments of permafrost vulnerability to thaw, thus providing tangible, useable results that reflect community priorities and support local decision making.

  3. Integrating volcanic hazard data in a systematic approach to develop volcanic hazard maps in the Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Lindsay, Jan M.; Robertson, Richard E. A.

    2018-04-01

    We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. 
We hope that the documentation of our experience might be useful for other map makers to take into account when creating new or updating existing maps.
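    The core of the integration step described above, combining the scenario-based maps for individual phenomena into integrated hazard zones, can be sketched as a per-cell maximum over gridded hazard levels. The 3x3 grids and phenomenon names below are toy illustrations; the paper's actual zonation rules are more nuanced than a simple maximum.

```python
# Sketch: combine per-phenomenon hazard grids into an integrated zonation
# by taking the per-cell maximum hazard level (toy 3x3 grids; levels 0-3).
ballistics = [[2, 1, 0],
              [1, 1, 0],
              [0, 0, 0]]
pyroclastic = [[3, 2, 1],
               [1, 0, 0],
               [0, 0, 0]]
lahar = [[0, 0, 0],
         [2, 2, 1],
         [1, 1, 0]]

# zip rows across grids, then zip cells across rows, taking the maximum
integrated = [
    [max(vals) for vals in zip(*rows)]
    for rows in zip(ballistics, pyroclastic, lahar)
]
for row in integrated:
    print(row)
```

    Treating integration case by case, as the authors recommend, would replace the `max` with a phenomenon-aware rule (e.g. weighting fast-moving flows more heavily than ballistics at the same nominal level).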

  4. Safety issues of high-concentrated hydrogen peroxide production used as rocket propellant

    NASA Astrophysics Data System (ADS)

    Romantsova, O. V.; Ulybin, V. B.

    2015-04-01

The article addresses the production of high-concentration hydrogen peroxide using the Russian technology of isopropyl alcohol autoxidation. An analysis of the technology's fire/explosion hazards, and of the reasons for insufficient product quality, is presented, and a modified technology is described. Non-standard fire/explosion characteristics required for an integrated fire/explosion hazard rating of the modified hydrogen peroxide production process, based on the autoxidation of isopropyl alcohol, are defined.

  5. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all individual constituents of risk: baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Risk is therefore the estimated impact that a hazard event would have on people, services, facilities, structures, and assets in a community. In this work we assess the risk from natural hazards in the community of Motozintla in southern Mexico, in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological setting, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, and volcanic impact using standard methodologies. Social vulnerability was quantified from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was chosen to ensure a statistically significant sample, and the families interviewed were selected using simple random sampling with replacement. 
    With this procedure, each household was chosen randomly and entirely by chance, with the same probability of being chosen at any stage of the sampling process. To facilitate interpretation, all results were spatially analyzed using a Geographical Information System (GIS). Our results indicate that the community of Motozintla is highly exposed to floods, landslides, and earthquakes, and to a lesser extent to the impact of a volcanic eruption. The locality has a high level of structural vulnerability to the main identified hazards (floods and landslides). About 70% of the families have a daily income below 11 USD. Approximately 66% of the population does not know of any existing Civil Protection Plan. Another major observation is that community organization for disaster prevention is practically nonexistent. These natural and social conditions indicate that the community of Motozintla has a very high level of risk from natural hazards. This research will support decision makers in Mexico, and particularly in the state of Chiapas, in developing an integrated, comprehensive natural hazard mitigation and prevention program for this region.
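    The sampling scheme described above can be sketched as follows. The household register size, confidence level, and margin of error are illustrative assumptions; the study does not report these values:

```python
import math
import random

def cochran_sample_size(population, z=1.96, p=0.5, e=0.05):
    """Cochran's formula with a finite-population correction, a common
    way to size a survey for statistical significance."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)      # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def draw_households(household_ids, n, seed=0):
    """Simple random sampling WITH replacement: every household keeps the
    same probability of being chosen at every stage of the sampling."""
    rng = random.Random(seed)
    return [rng.choice(household_ids) for _ in range(n)]

households = list(range(1, 4001))   # hypothetical household register
n = cochran_sample_size(len(households))
sample = draw_households(households, n)
```

    With these placeholder values, a register of 4,000 households at a 95% confidence level and 5% margin of error yields a sample of 351 interviews.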

  6. Engineering risk reduction in satellite programs

    NASA Technical Reports Server (NTRS)

    Dean, E. S., Jr.

    1979-01-01

    Methods developed in planning and executing system safety engineering programs for Lockheed satellite integration contracts are presented. These procedures establish the applicable safety design criteria, document design compliance and assess the residual risks where non-compliant design is proposed, and provide for hazard analysis of system-level test, handling, and launch preparations. Operations hazard analysis identifies product-protection and product-liability hazards prior to the preparation of operational procedures and provides safety requirements for inclusion in them. The method developed for documenting all residual hazards for the attention of program management assures an acceptable minimum level of risk prior to program deployment. The results are significant for persons responsible for managing or engineering the deployment and production of complex, high-cost equipment who, under current product liability law and cost/time constraints, have a responsibility to minimize the possibility of an accident and should have documentation to provide a defense in a product liability suit.

  7. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  8. Assessing qualitative long-term volcanic hazards at Lanzarote Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, Laura; Martí, Joan; Bartolini, Stefania; Geyer, Adelina

    2017-07-01

    Conducting long-term hazard assessment in active volcanic areas is of primary importance for land-use planning and for defining emergency plans that can be applied in case of a crisis. Scenario hazard maps help mitigate the consequences of future eruptions by anticipating the events that may occur. Lanzarote is an active volcanic island that hosted the largest (>1.5 km3 DRE) and longest (six years) eruption on the Canary Islands in historical times (the last 600 years): the Timanfaya eruption (1730-1736). This eruption caused severe economic losses and forced local people to migrate. In spite of these facts, no comprehensive hazard assessment or hazard maps have been developed for the island. In this work, we present an integrated long-term volcanic hazard evaluation using a systematic methodology that includes spatial analysis and simulations of the most probable eruptive scenarios.

  9. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    PubMed

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  10. Assessing natural hazards in forestry for risk management: a review

    Treesearch

    Marc Hanewinkel; Susan Hummel; Axel Albrecht

    2011-01-01

    We address the problem of how to integrate risk assessment into forest management and therefore provide a comprehensive review of recent and past literature on risk analysis and modeling and, moreover, an evaluation and summary on these papers. We provide a general scheme on how to integrate concepts of risk into forest management decisions. After an overview of the...

  11. The SARVIEWS Project: Automated SAR Processing in Support of Operational Near Real-time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.

    2016-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. 
We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its robustness. We will also show the benefit of integrating SAR with data from other sensors to support volcano monitoring. For historic eruptions at Okmok and Augustine volcano, both located in the North Pacific, we will demonstrate that the addition of SAR can lead to a significant improvement in activity detection and eruption forecasting.
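    One common SAR change-detection technique consistent with the hazard products listed above is the log-ratio operator on co-registered amplitude images. The sketch below is a generic illustration, not SARVIEWS code; the pixel values and the 3 dB threshold are assumed:

```python
import math

def log_ratio_change(pre, post, threshold_db=3.0):
    """Pixel-wise log-ratio change detection between two co-registered
    SAR amplitude images (nested lists): flag pixels whose backscatter
    changed by more than `threshold_db` decibels."""
    return [[abs(10.0 * math.log10(b / a)) > threshold_db
             for a, b in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre, post)]

pre  = [[1.0, 1.0], [1.0, 1.0]]   # pre-event amplitudes (hypothetical)
post = [[1.0, 4.0], [0.4, 1.1]]   # post-event amplitudes
flags = log_ratio_change(pre, post)
```

    The log-ratio is preferred over a simple difference because SAR speckle is multiplicative; in an operational system the threshold would be tuned per sensor and scene.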

  12. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, for which GIS is an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared; by creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the candidate sites produced by the overlay, a landfill suitability index was developed representing the cumulative effect of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.
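    The suitability index behind such an MCDA can be sketched as a weighted linear combination. The criteria names, scores, and weights below are invented for illustration; they are not the 14 non-exclusionary criteria used in the paper:

```python
def suitability_index(scores, weights):
    """Weighted linear combination used in GIS-based MCDA: each candidate
    site gets sum(weight * score) over the non-exclusionary criteria,
    normalised by the total weight."""
    total_w = sum(weights.values())
    return {site: sum(weights[c] * s[c] for c in weights) / total_w
            for site, s in scores.items()}

# Invented scores (0-10) for three candidate sites on three criteria
scores = {
    "site_A": {"distance_to_water": 8, "slope": 6, "road_access": 9},
    "site_B": {"distance_to_water": 5, "slope": 9, "road_access": 4},
    "site_C": {"distance_to_water": 9, "slope": 8, "road_access": 7},
}
weights = {"distance_to_water": 0.5, "slope": 0.3, "road_access": 0.2}
ranked = sorted(suitability_index(scores, weights).items(),
                key=lambda kv: kv[1], reverse=True)
```

    In a real study the weights would come from expert elicitation (e.g. pairwise comparison) rather than being asserted directly.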

  13. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran.

    PubMed

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, for which GIS is an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared; by creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the candidate sites produced by the overlay, a landfill suitability index was developed representing the cumulative effect of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.

  14. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard side, events with unpredictable intensities may represent overload scenarios for existing protection structures in torrent catchments and bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined using empirical functions: relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and object-specific restoration values. This approach is not a physics-based, integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work targets extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are modelled experimentally, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively and with spatial distribution using a large set of force transducers. The experimental tests are carried out with artificial vertical and skewed plates, including openings for material intrusion. 
    Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. Several buildings are entirely reconstructed within the physical scale model at a scale of 1:30; they include the basement and first floor and thereby all relevant openings in the building envelopes. The results from the experimental modelling provide the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
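    Physical scale models like the 1:30 reconstruction described above are typically operated under Froude similarity. The sketch below shows the standard scale factors for converting model measurements to prototype values; the 2 N example force is hypothetical, not a measured value from the study:

```python
import math

def froude_scale(length_scale):
    """Froude similarity factors for a free-surface hydraulic model:
    prototype value = model value * factor (length_scale = 30 for a
    1:30 model)."""
    return {
        "length":    length_scale,
        "velocity":  math.sqrt(length_scale),   # v ~ L^0.5
        "time":      math.sqrt(length_scale),   # t ~ L^0.5
        "discharge": length_scale ** 2.5,       # Q ~ L^2.5
        "force":     length_scale ** 3,         # F ~ L^3
    }

f = froude_scale(30)
# A hypothetical 2 N force measured on the 1:30 model corresponds to
# 2 * 30**3 = 54,000 N at prototype scale.
prototype_force = 2 * f["force"]
```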

  15. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication, and antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard and recommends whether the proposed mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library for research and reuse, while standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized, field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative, and personal protective equipment controls in the field.

  16. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage in urban communities by damaging structures, obstructing lifeline routes and utilities, generating various environmental impacts, and possibly causing loss of life. Reliable hazard and risk maps are important for helping agencies efficiently allocate and manage limited resources in preparation for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in large-scale regional maps. The proposed methodology yields seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0, and 10 meters). These maps integrate a variety of data sources: recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear-wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding block analysis to determine the amount and associated probabilities of displacement for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes; such conditions carry a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
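    The per-pixel integration described above can be sketched, under strong simplifying assumptions, by pairing an empirical Newmark-displacement regression with a discretized hazard curve. The regression used here is Jibson's (2007) median model in terms of the critical-acceleration ratio, which may differ from the one the authors used; the PGA bins, annual rates, and critical acceleration are invented, and regression uncertainty is ignored:

```python
import math

def newmark_disp_cm(pga_g, ac_g):
    """Median Newmark displacement (cm) from one published regression
    (Jibson, 2007) in terms of the critical-acceleration ratio."""
    ratio = ac_g / pga_g
    if ratio >= 1.0:               # block never slides
        return 0.0
    return 10 ** (0.215 + math.log10((1 - ratio) ** 2.341 * ratio ** -1.438))

def annual_p_exceed(hazard_bins, ac_g, threshold_cm):
    """Sum the annual rates of all PGA bins whose median displacement
    exceeds the threshold, then convert the Poisson rate to an annual
    probability (a simplified, median-only version of the per-pixel
    integration)."""
    rate = sum(r for pga, r in hazard_bins
               if newmark_disp_cm(pga, ac_g) > threshold_cm)
    return 1.0 - math.exp(-rate)

# Hypothetical hazard curve: (PGA in g, annual occurrence rate of the bin)
bins = [(0.1, 0.02), (0.2, 0.01), (0.4, 0.002), (0.6, 0.0005)]
p_1cm = annual_p_exceed(bins, ac_g=0.15, threshold_cm=1.0)
```

    The full method would carry the regression's lognormal uncertainty through each bin rather than using the median alone.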

  17. New lakes in deglaciating high-mountain areas: Regional intercomparison of current and future risks from impact waves due to rock/ice avalanches in the Swiss Alps

    NASA Astrophysics Data System (ADS)

    Schaub, Y.; Huggel, C.; Serraino, M.; Haeberli, W.

    2012-04-01

    The changes in high-mountain environments are increasingly fast and complex. GIS-based models of the Swiss Alps show that numerous topographic overdeepenings are likely to appear on progressively exposed glacier beds; these are considered potential sites of future lake formation. In many cases these newly forming lakes will be situated in an over-steepened and destabilized high-mountain environment and are therefore prone to impact waves from landslides. The risk of glacier lake outburst floods, which endanger infrastructure, residential areas, and people further down-valley, is increasing with continued lake formation and glacier recession, and may persist for many decades if not centuries. Future-oriented hazard assessments have to be integrative and must deal with all possible process chains; reference studies and methodologies are still scarce, however. We present an approach to compare the risks posed by high-mountain lakes in the Swiss Alps with each other; existing lakes are included in the analysis alongside future ones. The presented risk assessment approach integrates the envisaged high-mountain hazard process chain with present and future socio-economic conditions. Applying the concept of integral risk management, the hazard and damage potentials have to be analyzed. The areas with the topographic potential for rock/ice avalanches to reach a lake were analyzed for their susceptibility to slope failure, including the factors slope inclination, permafrost occurrence, glacier recession, and bedrock lithology. Together with the analysis of the lakes (volume and runout path of potential outburst floods), this completed the hazard analysis of the process chain. High long-term hazard potentials in the Swiss Alps are to be expected, for instance, in the area of the Great Aletsch glacier. A methodology for the assessment of the damage potential was elaborated and will be presented. 
In order to estimate the location of the largest damage potentials, driving forces of different spatial development scenarios for the Swiss Alps will be implemented in a land allocation model for the Swiss Alps. By bringing together hazard, exposure and vulnerability analyses, a risk assessment for the entire Swiss Alps regarding lake-outburst floods triggered by impacts of rock/ice avalanches can be conducted for today, the middle of the century and even beyond.

  18. Ensemble of ground subsidence hazard maps using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Park, Inhye; Lee, Jiyeong; Saro, Lee

    2014-06-01

    Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR), and artificial neural network (ANN) models. The relationships were used as factor ratings in an overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
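    The fuzzy-ensemble step can be illustrated with the fuzzy gamma operator, one of the standard fuzzy overlay operators. Whether the authors used the gamma operator (and with what gamma value) is an assumption, as are the per-pixel index values below:

```python
def fuzzy_gamma(values, gamma=0.9):
    """Fuzzy gamma operator: (fuzzy algebraic sum)^gamma *
    (fuzzy algebraic product)^(1 - gamma), applied per pixel to
    hazard indexes rescaled to [0, 1]."""
    prod, comp = 1.0, 1.0
    for v in values:
        prod *= v
        comp *= 1.0 - v
    fuzzy_sum = 1.0 - comp
    return (fuzzy_sum ** gamma) * (prod ** (1.0 - gamma))

# Hypothetical per-pixel hazard indexes from the three models
pixel = {"FR": 0.7, "LR": 0.6, "ANN": 0.8}
ensemble = fuzzy_gamma(pixel.values())
```

    With gamma = 1 the operator reduces to the fuzzy algebraic sum (optimistic combination); with gamma = 0 it reduces to the product (pessimistic combination).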

  19. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    NASA Astrophysics Data System (ADS)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    2015-06-01

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
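    A toy version of combining morbidity and mortality into a single ranking score might look like the following. The weighting, endpoint names, and material values are illustrative only and do not reproduce the actual EZ Metric definition from the paper:

```python
def ez_metric(endpoints, mortality_weight=2.0):
    """Toy combined score: mortality weighted more heavily than
    sub-lethal (morbidity) endpoints. `endpoints` maps endpoint
    name -> fraction of embryos affected."""
    return sum((mortality_weight if name == "mortality" else 1.0) * frac
               for name, frac in endpoints.items())

# Hypothetical screening results for two gold nanoparticle variants
materials = {
    "AuNP_positively_charged": {"mortality": 0.40, "malformation": 0.30},
    "AuNP_neutral":            {"mortality": 0.05, "malformation": 0.10},
}
ranking = sorted(materials, key=lambda m: ez_metric(materials[m]), reverse=True)
```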

  20. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity, in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available to date are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information-gathering phase. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
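    The screening step for trimming the parameter space can be sketched with a simple one-at-a-time (OAT) design. The dust-production model, parameter names, and ranges below are invented placeholders, not the paper's physics:

```python
def oat_screening(model, baseline, ranges):
    """One-at-a-time screening: perturb each parameter to its low/high
    bound with the others held at baseline, record the output swing,
    and rank parameters by it. Small-swing parameters can be dropped
    before the more expensive sensitivity/uncertainty analyses."""
    swings = {}
    for p, (lo, hi) in ranges.items():
        out = [model(dict(baseline, **{p: v})) for v in (lo, hi)]
        swings[p] = abs(out[1] - out[0])
    return dict(sorted(swings.items(), key=lambda kv: -kv[1]))

# Invented toy dust model: produced mass ~ heat flux * area * erosion factor
model = lambda x: x["heat_flux"] * x["wall_area"] * x["erosion"]
baseline = {"heat_flux": 10.0, "wall_area": 600.0, "erosion": 1e-4}
ranges = {"heat_flux": (5.0, 20.0), "erosion": (5e-5, 4e-4)}
ranked = oat_screening(model, baseline, ranges)
```

    OAT misses parameter interactions, which is why it is only a pre-filter before global methods such as variance-based sensitivity analysis.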

  1. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also presents a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3-years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.

  2. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    NASA Astrophysics Data System (ADS)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms, and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and at continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks now enable flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution and representing a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison with flood risk analyses undertaken using pre-existing population datasets.
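    The hazard-exposure integration described above reduces, per grid cell, to masking the population grid with the modelled flood extent. The depth threshold and the tiny grids below are illustrative:

```python
def exposed_population(depth_grid, pop_grid, depth_threshold=0.0):
    """Overlay a modelled water-depth grid on a co-registered population
    grid and count the people in cells wetted above the threshold."""
    total = 0
    for drow, prow in zip(depth_grid, pop_grid):
        for d, p in zip(drow, prow):
            if d > depth_threshold:
                total += p
    return total

depth = [[0.0, 0.8], [1.5, 0.0]]   # metres; hypothetical 2x2 flood extent
pop   = [[120,  40], [ 75,  60]]   # people per cell
at_risk = exposed_population(depth, pop)
```

    The resolution mismatch discussed in the abstract matters precisely here: if the population grid is coarser than the depth grid, cells must be resampled or dasymetrically redistributed before this overlay is meaningful.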

  3. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. 
Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees
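The under-estimation analysis described above, comparing losses with and without displaced populations, can be sketched with a toy fatality model. District populations and fatality rates below are invented; this is not the authors' semi-empirical technique:

```python
# Illustrative sketch: fatalities per district approximated as population
# times an intensity-dependent fatality rate; the under-estimation from
# omitting refugees is the relative difference. All numbers are hypothetical.
districts = {
    # name: (resident population, integrated refugee population, fatality rate)
    "A": (500_000, 60_000, 0.002),
    "B": (300_000, 45_000, 0.004),
    "C": (200_000, 10_000, 0.001),
}

def fatalities(include_refugees: bool) -> float:
    total = 0.0
    for residents, refugees, rate in districts.values():
        pop = residents + (refugees if include_refugees else 0)
        total += pop * rate
    return total

base = fatalities(False)          # official statistics only
with_refugees = fatalities(True)  # refugee-augmented scenario
increase_pct = 100 * (with_refugees - base) / base
print(round(increase_pct, 1))
```

Running the low, median, and high refugee estimates through the same loop reproduces the kind of casualty range (e.g. 11-12% at the median) reported in the abstract.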

  4. Integrating Behavioral Health in Primary Care Using Lean Workflow Analysis: A Case Study.

    PubMed

    van Eeghen, Constance; Littenberg, Benjamin; Holman, Melissa D; Kessler, Rodger

    2016-01-01

    Primary care offices are integrating behavioral health (BH) clinicians into their practices. Implementing such a change is complex, difficult, and time consuming. Lean workflow analysis may be an efficient, effective, and acceptable method for use during integration. The objectives of this study were to observe BH integration into primary care and to measure its impact. This was a prospective, mixed-methods case study in a primary care practice that served 8,426 patients over a 17-month period, with 652 patients referred to BH services. Secondary measures included primary care visits resulting in BH referrals, referrals resulting in scheduled appointments, time from referral to the scheduled appointment, and time from the referral to the first visit. Providers and staff were surveyed on the Lean method. Referrals increased from 23 to 37 per 1000 visits (P < .001). Referrals resulted in more scheduled (60% to 74%; P < .001) and arrived visits (44% to 53%; P = .025). Time from referral to the first scheduled visit decreased (hazard ratio, 1.60; 95% confidence interval, 1.37-1.88) as did time to first arrived visit (hazard ratio, 1.36; 95% confidence interval, 1.14-1.62). Survey responses and comments were positive. This pilot integration of BH showed significant improvements in treatment initiation and other measures. Strengths of Lean analysis included workflow improvement, system perspective, and project success. Further evaluation is indicated. © Copyright 2016 by the American Board of Family Medicine.

  5. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh

This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multi-hazard risk in an integrated manner to support the operating nuclear fleet.

  6. Geomorphological hazards and environmental impact: Assessment and mapping

    NASA Astrophysics Data System (ADS)

    Panizza, Mario

    In five sections the author develops the methods for the integration of geomorphological concepts into Environmental Impact and Mapping. The first section introduces the concepts of Impact and Risk through the relationships between Geomorphological Environment and Anthropical Element. The second section proposes a methodology for the determination of Geomorphological Hazard and the identification of Geomorphological Risk. The third section synthesizes the procedure for the compilation of a Geomorphological Hazards Map. The fourth section outlines the concepts of Geomorphological Resource Assessment for the analysis of the Environmental Impact. The fifth section considers the contribution of geomorphological studies and mapping in the procedure for Environmental Impact Assessment.

  7. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    NASA Astrophysics Data System (ADS)

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few millimetres of ashfall, with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice.
Hazard maps were integral to science communication during the crisis, but there is limited international best practice information available on hazard maps as communication devices, as most volcanic hazard mapping literature is concerned with defining hazard zones. We propose that hazard maps are only as good as the communications framework and inter-agency relationships in which they are embedded, and we document in detail the crisis hazard map development process. We distinguish crisis hazard maps from background hazard maps and ashfall prediction maps, illustrating the complementary nature of these three distinct communication mechanisms. We highlight issues that arose and implications for the development of future maps.

  8. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. This gives rise to difficult decision problems for public-sector decision makers and calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be identified by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructure. A Geographical Information System coupled with a tool developed for risk analysis makes it possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipal level. The scheme is limited to the direct consequences of natural hazards.
Thus, we develop a system which integrates the procedures for a complete risk analysis into a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. Interest in ArcObjects usually emerges when users realize that programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to overview previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis tool at the municipality level. The tool is extensible and can be expanded with additional modules.
The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help technical management decide on protection measures because, in addition to visualisation, tools for spatial data analysis will be available. REFERENCES Bründl, M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern, 416 pp. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL), Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G. & Meschinet de Richemont, N. (eds): Risques naturels et aménagement en Europe, Colloque Arche de la Défense, 22-24 octobre 2002, Armand Colin, 108-120. Maggi, R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Project Nr. 405440, Final Scientific Report, Lugano.

  9. Transportation of Hazardous Evidentiary Material.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Douglas.

    2005-06-01

This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included. Short- and long-term recommendations are also provided. The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazard Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these Hazard Classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis.
Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, will not react chemically with the substance being shipped, and will otherwise maintain it as nearly as possible in its original condition. The recommendations provided are short-term solutions to the problems of shipping evidence, and have considered only currently commercially available containers. These containers may not be appropriate for all cases. Design, testing, and certification of new transportation containers would be necessary to provide a container appropriate for all cases. Table 1 provides a summary of the recommendations for each class of hazardous material.
Table 1: Summary of Recommendations
- Hazard Class 3, 4, 5, 8, or 9, small containers: 1-quart paint can with Armlock™ seal ring (LabelMaster®, $2.90 each); TC Hazardous Material Transport Container (currently in use)
- Hazard Class 3, 4, 5, 8, or 9, large containers: 55-gallon open- or closed-head steel drums (All-Pak, Inc., $58.28-$73.62 each); 95-gallon poly overpack (LabelMaster®, $194.50 each)
- Hazard Class 6, Division 6.1 Poisonous by Inhalation (PIH), small containers: 1-liter glass container with plastic coating (LabelMaster®, $3.35-$3.70 each); TC Hazardous Material Transport Container (currently in use)
- Hazard Class 6, Division 6.1 PIH, large containers: 20- to 55-gallon PIH overpacks (LabelMaster®, $142.50-$170.50 each); 65- to 95-gallon poly overpacks (LabelMaster®, $163.30-$194.50 each)
- Hazard Class 6, Division 6.2 Infectious Material, small containers: 1-liter transparent container (currently in use); Infectious Substance Shipper (Source Packaging of NE, Inc., $336.00 each)
- Hazard Class 6, Division 6.2 Infectious Material, large containers: none commercially available

  10. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. 
In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.
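The challenge the abstract raises, aggregating site losses whose ground motions are spatially correlated, can be sketched with a two-site Monte Carlo toy. The correlation value, loss distributions, and site medians below are assumptions for illustration, not the authors' model:

```python
import math
import random

# Minimal sketch: draw correlated standard normals for two sites, map each
# through a site-specific lognormal loss, and sum to get the portfolio loss.
# Correlation between sites fattens the tail of the aggregate distribution.
random.seed(42)
N = 50_000
RHO = 0.7  # assumed inter-site correlation of ground-motion residuals

losses = []
for _ in range(N):
    g1 = random.gauss(0.0, 1.0)
    g2 = random.gauss(0.0, 1.0)
    z1 = g1
    z2 = RHO * g1 + math.sqrt(1.0 - RHO**2) * g2   # correlated with z1
    site1 = 1.0e6 * math.exp(0.8 * z1)             # illustrative lognormal losses
    site2 = 2.0e6 * math.exp(0.8 * z2)
    losses.append(site1 + site2)

losses.sort()
mean_loss = sum(losses) / N
p99 = losses[int(0.99 * N)]   # 99th-percentile portfolio loss
print(mean_loss, p99)
```

Re-running with RHO = 0 leaves the mean essentially unchanged but lowers the 99th percentile, which is why ignoring spatial correlation understates tail risk in portfolio analysis.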

  11. Reliability, Safety and Error Recovery for Advanced Control Software

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2003-01-01

    For long-duration automated operation of regenerative life support systems in space environments, there is a need for advanced integration and control systems that are significantly more reliable and safe, and that support error recovery and minimization of operational failures. This presentation outlines some challenges of hazardous space environments and complex system interactions that can lead to system accidents. It discusses approaches to hazard analysis and error recovery for control software and challenges of supporting effective intervention by safety software and the crew.

  12. Proceedings on Combating the Unrestricted Warfare Threat: Integrating Strategy, Analysis, and Technology, 10-11 March 2008

    DTIC Science & Technology

    2008-03-01

    irregular struggle, and, finally, a protracted struggle that will last decades rather than years. how will this war Evolve? It is hazardous to...There is no downside to engagement. It is not an act of para- noia or pessimism to engage Americans in the very real hazards that confront us. It...making process about what to do next. Because it is an undisciplined process, they work through about 100 options when there are only two: duck or

  13. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
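A triggering-interaction network of the kind described above can be represented as a directed graph, with a cascade being the set of hazards reachable from an initial event. The hazard names and links below are illustrative, not taken from the paper's case studies:

```python
# Toy hazard interaction network: a directed edge means "can trigger".
triggers = {
    "earthquake": ["landslide", "tsunami"],
    "landslide": ["flood"],   # e.g. via river damming
    "tsunami": [],
    "flood": [],
    "storm": ["flood", "landslide"],
}

def cascade(source: str) -> set[str]:
    """Hazards reachable from `source` via triggering links (depth-first)."""
    seen: set[str] = set()
    stack = list(triggers.get(source, []))
    while stack:
        h = stack.pop()
        if h not in seen:
            seen.add(h)
            stack.extend(triggers.get(h, []))
    return seen

print(sorted(cascade("earthquake")))  # ['flood', 'landslide', 'tsunami']
```

A 'multi-layer single hazard' analysis would score each hazard in isolation; the reachability view makes explicit that an earthquake's footprint includes secondary flooding two links away.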

  14. An assessment of the crash fire hazard of liquid hydrogen fueled aircraft

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The crash fire hazards of liquid hydrogen fueled aircraft relative to those of mission equivalent aircraft fueled either with conventional fuel or with liquefied methane were evaluated. The aircraft evaluated were based on Lockheed Corporation design for 400 passenger, Mach 0.85, 5500 n. mile aircraft. Four crash scenarios were considered ranging from a minor incident causing some loss of fuel system integrity to a catastrophic crash. Major tasks included a review of hazardous properties of the alternate fuels and of historic crash fire data; a comparative hazard evluation for each of the three fuels under four crash scenarios a comprehensive review and analysis and an identification of areas further development work. The conclusion was that the crash fire hazards are not significantly different when compared in general for the three fuels, although some fuels showed minor advantages in one respect or another.

  15. Reliability studies of Integrated Modular Engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
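The binomial approximation mentioned above can be sketched for a k-of-n redundant engine cluster. The module counts and reliability values are illustrative, not from the study:

```python
from math import comb

# P(system works) for n identical, independent modules when at least k
# must succeed, each with per-module reliability p (binomial tail sum).
def system_reliability(n: int, k: int, p: float) -> float:
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_module = 0.99
non_redundant = system_reliability(4, 4, p_module)  # all 4 modules must work
redundant = system_reliability(5, 4, p_module)      # one spare module
print(round(non_redundant, 5), round(redundant, 5))
```

This captures the networked-versus-discrete comparison in miniature: adding one spare raises system reliability from about 0.961 to about 0.999 under these assumed numbers, at the cost of an extra module.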

  16. System safety in Stirling engine development

    NASA Technical Reports Server (NTRS)

    Bankaitis, H.

    1981-01-01

    The DOE/NASA Stirling Engine Project Office has required that contractors make safety considerations an integral part of all phases of the Stirling engine development program. As an integral part of each engine design subtask, analyses are evolved to determine possible modes of failure. The accepted system safety analysis techniques (Fault Tree, FMEA, Hazards Analysis, etc.) are applied in various degrees of extent at the system, subsystem and component levels. The primary objectives are to identify critical failure areas, to enable removal of susceptibility to such failures or their effects from the system and to minimize risk.

  17. Reliability studies of integrated modular engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  18. Reliability studies of integrated modular engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  19. Reliability studies of Integrated Modular Engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  20. Multimedia-Modeling Integration Development Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelton, Mitchell A.; Hoopes, Bonnie L.

    2002-09-02

There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR) while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.

  1. Hazard Detection Software for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes of 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with diameter .15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement.
This simulation has also been used to determine the effect of viewing on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
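Slope-and-roughness hazard detection of the kind the record describes can be sketched on a small elevation map. The thresholds (0.3 m, 5°) come from the abstract; the grid spacing, windowed statistics, and data are simplifications invented here, not the ALHAT implementation:

```python
import math

GRID = 1.0  # metres between elevation samples (assumed)

elev = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.1, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0],   # a 0.5 m rock -> roughness hazard
    [0.0, 0.0, 0.0, 0.0],
]

def hazards(elev, slope_limit_deg=5.0, rough_limit=0.3):
    """Flag interior cells whose local slope or roughness exceeds limits."""
    flagged = []
    for r in range(1, len(elev) - 1):
        for c in range(1, len(elev[0]) - 1):
            # Local slope from central differences
            dzdx = (elev[r][c + 1] - elev[r][c - 1]) / (2 * GRID)
            dzdy = (elev[r + 1][c] - elev[r - 1][c]) / (2 * GRID)
            slope = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
            # Roughness as max deviation from the 3x3 neighbourhood mean
            window = [elev[i][j] for i in (r - 1, r, r + 1)
                      for j in (c - 1, c, c + 1)]
            rough = max(abs(z - sum(window) / 9) for z in window)
            if slope > slope_limit_deg or rough > rough_limit:
                flagged.append((r, c))
    return flagged

print(hazards(elev))
```

A safe-site search would then look for a lander-sized disc of cells containing no flagged hazards; the production algorithm fits local planes rather than using simple central differences.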

  2. Potential flood hazard assessment by integration of ALOS PALSAR and ASTER GDEM: a case study for the Hoa Chau commune, Hoa Vang district, in central Vietnam

    NASA Astrophysics Data System (ADS)

    Huong, Do Thi Viet; Nagasawa, Ryota

    2014-01-01

    The potential flood hazard was assessed for the Hoa Chau commune in central Vietnam in order to identify high flood hazard zones for the decision makers who will execute future rural planning. A new approach for deriving the potential flood hazard based on the integration of inundation and flow direction maps is described. Areas inundated in the historical flood event of 2007 were extracted from Advanced Land Observing Satellite (ALOS) Phased Array type L-band Synthetic Aperture Radar (PALSAR) images, while flow direction characteristics were derived from the ASTER GDEM to extract depressed surfaces. Past flood experience and the flow direction were then integrated to analyze and rank the potential flood hazard zones. The land use/cover map extracted from LANDSAT TM and flood depth point records from field surveys were used to validate the susceptible inundated areas extracted from ALOS PALSAR and the ranking of the potential flood hazard. The estimation of potential flood hazard areas revealed that 17.43% and 17.36% of Hoa Chau had high and medium potential flood hazards, respectively. The flow direction and ALOS PALSAR data were effectively integrated for determining the potential flood hazard when hydrological and meteorological data were inadequate and remote sensing images taken during flood times were not available or were insufficient.
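    The flow-direction step used to extract depressed surfaces can be illustrated with the standard D8 rule, under which a cell with no strictly lower 8-neighbour has nowhere to drain. This is a generic sketch on a synthetic grid, not the authors' processing chain:

```python
def d8_sinks(dem):
    """Return interior cells with no strictly lower 8-neighbour: under the
    D8 flow-direction rule these cells cannot drain and mark potential
    ponding (depressed) areas."""
    rows, cols = len(dem), len(dem[0])
    sinks = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            nbrs = [dem[r + i][c + j]
                    for i in (-1, 0, 1) for j in (-1, 0, 1)
                    if (i, j) != (0, 0)]
            if min(nbrs) >= dem[r][c]:   # no downhill neighbour
                sinks.append((r, c))
    return sinks

# A tilted plane with one artificial pit: only the pit cannot drain.
dem = [[float(r + c) for c in range(5)] for r in range(5)]
dem[2][2] = -1.0
sinks = d8_sinks(dem)
```

On a real DEM the same test would be applied after filling spurious single-cell pits, but the idea of flagging cells that the flow-direction field cannot route water out of is the same.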

  3. A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids.

    PubMed

    Yost, Erin E; Stanek, John; Burgoon, Lyle D

    2017-01-01

    Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA's analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n=37) or cancer-specific toxicity values (n=10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n=31; Pennsylvania, n=18; and North Dakota, n=20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. 
This approach is intended as a preliminary analysis, and represents one possible method for integrating data to explore potential public health impacts. Published by Elsevier B.V.
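    The weighted scoring at the heart of such an MCDA can be sketched as a min-max-normalized weighted sum over the criteria. The chemicals, criteria, and weights below are invented for illustration and are not values from the EPA study:

```python
def mcda_rank(chemicals, weights):
    """Rank alternatives by a weighted sum of min-max-normalized
    criterion scores (a generic multi-criteria decision analysis)."""
    names = list(chemicals)
    scores = {n: 0.0 for n in names}
    for crit, weight in weights.items():
        vals = [chemicals[n][crit] for n in names]
        lo, hi = min(vals), max(vals)
        for n in names:
            # Normalize each criterion to [0, 1] before weighting.
            norm = (chemicals[n][crit] - lo) / (hi - lo) if hi > lo else 0.0
            scores[n] += weight * norm
    return sorted(names, key=lambda n: scores[n], reverse=True)

# Hypothetical chemicals scored on toxicity, use frequency, and mobility.
chems = {
    "A": {"toxicity": 9.0, "frequency": 0.2, "mobility": 0.5},
    "B": {"toxicity": 4.0, "frequency": 0.9, "mobility": 0.9},
    "C": {"toxicity": 1.0, "frequency": 0.1, "mobility": 0.1},
}
w = {"toxicity": 0.5, "frequency": 0.3, "mobility": 0.2}
ranking = mcda_rank(chems, w)
```

Here the frequently used, mobile chemical "B" outranks the more toxic but rarely used "A", which is the kind of trade-off the integrated ranking is designed to surface.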

  4. Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.

    PubMed

    Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia

    2017-04-01

    Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information systems, and remote sensing. The northern part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated against the maximum inundation extent extracted from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. The evaluation results, including the mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naïve Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
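    The statistics-based entropy weighting mentioned above can be sketched as follows: indices whose values vary more across grid cells carry more information and receive larger weights. This is a textbook rendering of the entropy weight method on synthetic data, not the study's implementation:

```python
import math

def entropy_weights(matrix):
    """Entropy weights for columns (criteria) of a samples-by-criteria
    matrix of positive values: weight_j is proportional to 1 - e_j,
    where e_j is the normalized Shannon entropy of column j."""
    n = len(matrix)            # samples (e.g., grid cells)
    m = len(matrix[0])         # criteria (e.g., environmental indices)
    k = 1.0 / math.log(n)
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        entropy = -k * sum(v * math.log(v) for v in p if v > 0)
        divergence.append(1.0 - entropy)
    s = sum(divergence)
    return [d / s for d in divergence]

# Criterion 0 is constant (no information); criterion 1 varies strongly.
data = [[1.0, 0.1],
        [1.0, 0.5],
        [1.0, 2.4]]
w = entropy_weights(data)
```

The constant index gets (numerically) zero weight and the varying one gets all of it, which is the intended behavior before the weights feed the WNB model.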

  5. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g., in Switzerland or South Tyrol-Bolzano) and software packages (e.g., ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. However, these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of a specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, modeling approaches, as well as inconsistencies that arise when combining all these different aspects. Based on this concept, a flexible software package will be established, with ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  6. A CFD Analysis of Hydrogen Leakage During On-Pad Purge in the ORION/ARES I Shared Volume

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Edwards, Daryl A.

    2011-01-01

    A common open volume is created by the stacking of the Orion vehicle onto the Ares I Upper Stage. Both vehicles contribute to the gas, fluid, and thermal environment of this so-called Shared Volume. One of these environments is related to hazardous hydrogen gas. While both vehicles use inert purge gas to mitigate any hazardous gas buildup, there are concerns that hydrogen gas may still accumulate and that the Ares I Hazardous Gas Detection System will not be sufficient for monitoring the integrated volume. This Computational Fluid Dynamics (CFD) analysis has been performed to examine these topics. Results of the analysis conclude that the Ares I Hazardous Gas Detection System will be able to sample the vent effluent containing the highest hydrogen concentrations. A second conclusion is that hydrogen does not accumulate under the Orion Service Module (SM) avionics ring, as diffusion and purge flow mixing sufficiently dilute the hydrogen to safe concentrations. Finally, the hydrogen concentration within the Orion SM engine nozzle may slightly exceed the 1 percent volume fraction when the entire worst-case maximum full leak is directed vertically into the engine nozzle.

  7. 'Worst case' methodology for the initial assessment of societal risk from proposed major accident installations.

    PubMed

    Carter, D A; Hirst, I L

    2000-01-07

    This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis that will be necessary to enable HSE to give appropriate advice.

  8. Safety Analysis of Soybean Processing for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Hentges, Dawn L.

    1999-01-01

    Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long-duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long-duration space missions. Identification of the food safety hazards and critical control points (CCPs) associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, CCPs were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and constraints of the closed ALSSIT.

  9. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (such as conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors and evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for conflict probes or conflict prediction over various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; and (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for the test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.

  10. European Natural Disaster Coordination and Information System for Aviation (EUNADICS-AV)

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Hirtl, Marcus; Arnold, Delia; Katzler-Fuchs, Susanne; Pappalardo, Gelsomina; Mona, Lucia; Sofiev, Mikhail; de Leeuw, Gerrit; Theys, Nicolas; Brenot, Hugues; Plu, Matthieu; Rockitansky, Carl-Herbert; Eschbacher, Kurt; Apituley, Arnoud; Som de Cerff, Wim

    2017-04-01

    Commercial aviation is one of the key infrastructures of our modern world. Even short interruptions can cause economic damage in the billion-euro range. As evident from the past, aviation is vulnerable to natural hazards. Safe flight operations, air traffic management and air traffic control are a shared responsibility of EUROCONTROL, national authorities, airlines and pilots. All stakeholders have one common goal, namely to warrant and maintain the safety of flight crews and passengers. Currently, however, there is a significant gap in the Europe-wide availability of real-time hazard measurement and monitoring information for airborne hazards describing "what, where, how much" in three dimensions, combined with a near-real-time European data analysis and assimilation system. This gap creates circumstances in which various stakeholders in the system may base their decisions on different data and information. The H2020 project EUNADICS-AV ("European Natural Disaster Coordination and Information System for Aviation"), started in October 2016, intends to close this gap in data and information availability, enabling all stakeholders in the aviation system to obtain fast, coherent and consistent information. The project intends to combine and harmonize data from satellite earth observation, ground-based and airborne platforms, and to integrate them into state-of-the-art data assimilation and analysis systems. Besides operational data sources, data from the research community are integrated as well. Hazards considered in the project include volcanic eruptions, nuclear accidents and events, and forest fires. The availability of consistent and coherent data analysis fields based on all available measurements will greatly enhance our capability to respond to disasters effectively and efficiently, minimizing system downtime and thus economic damage while maintaining the safety of millions of passengers.

  11. The Nature of Natural Hazards Communication (Invited)

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.

    2013-12-01

    Some of the many issues of interest to natural hazards professionals include the analysis of proactive approaches to the governance of risk from natural hazards and approaches to broaden the scope of public policies related to the management of risks from natural hazards, including emergency and environmental management, community development and spatial planning related to natural hazards. In this talk we will present the results of a scientific review, analysis and synthesis that emphasize some new trends in the communication of natural hazards theories and practices within an up-to-the-minute context of new environmental and climate change issues, new technologies, and a new focus on resiliency. The presentation is divided into sections that focus on natural hazards communication in terms of education, risk management, public discourse, engaging the public, theoretical perspectives, and new media. It includes results of case studies and best practices. It delves into natural hazards communication theories, including diffusion, argumentation, and constructivism, to name a few. The presentation will provide information about: (1) a manual of natural hazards communication for scientists, policymakers, and the media; (2) an up-to-the-minute context of environmental hazards, new technologies and the political landscape; (3) work by natural hazards scientists for geoscientists working with social scientists and communication principles; (4) work underpinned by key natural hazards communication theories and interspersed with pragmatic solutions; (5) work that crosses traditional natural hazards boundaries: international, interdisciplinary, theoretical/applied. We will further explore how spatial planning can contribute to risk governance by influencing the occupation of natural hazard-prone areas, and review the central role of emergency management in risk policy.
The goal of this presentation is to contribute to the augmentation of the conceptual framework of risk governance and increase the awareness of practitioners and decision-makers to the need to adopt proactive policies, leading to a more integrated, participative, and adaptive governance that can respond more efficiently to the increasing uncertainty resulting from escalating natural hazards risk exposure.

  12. HACCP: Integrating Science and Management through ASTM Standards

    EPA Science Inventory

    From a technical perspective, hazard analysis-critical control point (HACCP) evaluation may be considered a risk management tool suited to a wide range of applications. As one outcome of a symposium convened by American Society for Testing and Materials (ASTM) in August, 2005, th...

  13. Why the Eurocontrol Safety Regulation Commission Policy on Safety Nets and Risk Assessment is Wrong

    NASA Astrophysics Data System (ADS)

    Brooker, Peter

    2004-05-01

    Current Eurocontrol Safety Regulation Commission (SRC) policy says that the Air Traffic Management (ATM) system (including safety minima) must be demonstrated through risk assessments to meet the Target Level of Safety (TLS) without needing to take safety nets (such as Short Term Conflict Alert) into account. This policy is wrong. The policy is invalid because it does not build rationally and consistently from ATM's firm foundations of TLS and hazard analysis. The policy is bad because it would tend to retard safety improvements. Safety net policy must rest on a clear and rational treatment of integrated ATM system safety defences. A new safety net policy, appropriate to safe ATM system improvements, is needed, which recognizes that safety nets are an integrated part of ATM system defences. The effects of safety nets in reducing deaths from mid-air collisions should be fully included in hazard analysis and safety audits in the context of the TLS for total system design.

  14. Integration and Validation of Avian Radars (IVAR)

    DTIC Science & Technology

    2011-08-01

    ... hazards of electromagnetic radiation to fuel; HERO: hazards of electromagnetic radiation to ordnance; HERP: hazards of electromagnetic radiation to personnel ... Radiation hazards to humans, fuels, and ordnance can be easily managed. Demonstration of how operation of radars can meet hazards of electromagnetic radiation to personnel (HERP), hazards of electromagnetic radiation to ...

  15. Physically-based extreme flood frequency with stochastic storm transposition and paleoflood data on large watersheds

    NASA Astrophysics Data System (ADS)

    England, John F.; Julien, Pierre Y.; Velleux, Mark L.

    2014-03-01

    Traditionally, deterministic flood procedures such as the Probable Maximum Flood have been used for critical infrastructure design. Some Federal agencies now use hydrologic risk analysis to assess potential impacts of extreme events on existing structures such as large dams. Extreme flood hazard estimates and distributions are needed for these efforts, with very low annual exceedance probabilities (⩽10⁻⁴, i.e., return periods >10,000 years). An integrated data-modeling hydrologic hazard framework for physically-based extreme flood hazard estimation is presented. Key elements include: (1) a physically-based runoff model (TREX) coupled with a stochastic storm transposition technique; (2) hydrometeorological information from radar and an extreme storm catalog; and (3) streamflow and paleoflood data for independently testing and refining runoff model predictions at internal locations. This new approach requires full integration of collaborative work in hydrometeorology, flood hydrology and paleoflood hydrology. An application on the 12,000 km2 Arkansas River watershed in Colorado demonstrates that the size and location of extreme storms are critical factors in the analysis of basin-average rainfall frequency and flood peak distributions. Runoff model results are substantially improved by the availability and use of paleoflood nonexceedance data spanning the past 1000 years at critical watershed locations.

  16. Integrating socio-economic and infrastructural dimension to reveal hazard vulnerability of coastal districts

    NASA Astrophysics Data System (ADS)

    Mazumdar, Jublee; Paul, Saikat

    2015-04-01

    Losses of life and property due to natural hazards have intensified in the past decade, motivating a shift in disaster management away from simple post-event resettlement and rehabilitation. The degree of exposure to hazard for a homogeneous population is not entirely dependent on nearness to the source of a hazard event. Socio-economic factors and infrastructural capability play an important role in determining the vulnerability of a place. This study investigates the vulnerability of the eastern coastal states of India to tropical cyclones. The record of the past hundred years shows that the physical vulnerability of the eastern coastal states is four times that of the western coastal states in terms of frequency and intensity of tropical cyclones. These physical factors play an imperative role in determining the vulnerability of the eastern coast; however, socio-economic and infrastructural factors amplify the risk of exposure substantially. Inclusion of these indicators provides better insight into the preparedness and resilience of settlements to hazard events. In this regard, the present study is an effort to develop an Integrated Vulnerability Model (IVM) based on socio-economic and infrastructural factors for the districts of the eastern coastal states of India. A method is proposed for quantifying the socio-economic and infrastructural vulnerability to tropical cyclones in these districts. The variables included in the study are extracted from the Census of India, 2011 at the district-level administrative unit. In the analysis, a large number of variables are reduced to a smaller number of factors using principal component analysis, representing the socio-economic and infrastructure vulnerability to tropical cyclones. Subsequently, the factor scores in the Socio-economic Vulnerability Index (SeVI) and Infrastructure Vulnerability Index (InVI) are standardized from 0 to 1, indicating a range from low to high vulnerability. The factor scores are then mapped for spatial analysis. Using the SeVI and InVI, the study identifies highly vulnerable districts that are likely to face significant challenges in coping with tropical cyclones and that require strategies addressing the various aspects of socio-economic and infrastructural vulnerability. Moreover, this model can be used not only for multi-level governance but also integrated with real-time weather forecasts to identify predicted areas of vulnerability.
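    The 0-to-1 standardization of factor scores into indices such as the SeVI and InVI can be sketched as a min-max scaling of summed factor scores. The districts and scores below are hypothetical, and the PCA step is assumed to have been run already:

```python
def vulnerability_index(factor_scores):
    """Combine per-district factor scores into a 0-1 index by summing the
    scores and min-max scaling the totals, mirroring the standardization
    described in the abstract."""
    totals = {d: sum(scores) for d, scores in factor_scores.items()}
    lo, hi = min(totals.values()), max(totals.values())
    return {d: (t - lo) / (hi - lo) for d, t in totals.items()}

# Hypothetical PCA factor scores for three districts (not census data).
scores = {
    "District-1": [1.2, 0.4],
    "District-2": [-0.3, 0.1],
    "District-3": [0.6, -0.2],
}
index = vulnerability_index(scores)
```

The most exposed district maps to 1, the least exposed to 0, and everything else falls in between, so indices built from different factor sets (socio-economic versus infrastructural) become directly comparable.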

  17. A life cycle hazard assessment (LCHA) framework to address fire hazards at the wildland-urban interface

    NASA Astrophysics Data System (ADS)

    Lindquist, Eric; Pierce, Jen; Wuerzer, Thomas; Glenn, Nancy; Dialani, Jijay; Gibble, Katie; Frazier, Tim; Strand, Eva

    2015-04-01

    The stages of planning for and responding to natural hazards, such as wildfires and related events, are often conducted as discrete (and often disconnected) efforts. Disaster response often takes precedence, exhausting agency and stakeholder resources, and the planning stages are conducted by different agencies or entities with different and often competing agendas and jurisdictions. The result is that evaluation after a disaster can be minimal or even non-existent, as resources are expended and interest moves on to the next event. Natural disasters and hazards, however, have a tendency to cascade and multiply: wildfires affect the vulnerability of hillslopes, for example, which may result in landslides, flooding and debris flows long after the initial event has occurred. Connecting decisions across multiple events and time scales is ignored, yet these connections could lead to better policy making at all stages of disaster risk reduction. Considering this situation, we present an adapted life cycle analysis (LCA) approach to examine fire-related hazards at the wildland-urban interface in the American West. The LCHA focuses on the temporal integration of: 1) the 'pre-fire' set of physical conditions (e.g., fuel loads) and human conditions (e.g., hazard awareness), 2) the 'fire event', focusing on computational analysis of the communication patterns and responsibility for response to the event, and 3) the 'post-event' analysis of the landscape's susceptibility to fire-related debris flows. The approach of the LCHA follows other models used by governmental agencies to prepare for disasters through 1) preparation and prevention, 2) response and 3) recovery. As an overlay are the diverse agencies and policies associated with these stages and their respective resource and management decisions over time. LCAs have evolved from a business-centric consideration of the environmental impact of a specific product over the product's life. This approach proceeds through several phases to arrive at an assessment of the impact of the product on the environment over time, and is being considered beyond the business and logistics communities in such areas as biodiversity and ecosystem impacts. From our perspective, we consider wildfire as the "product" and want to understand how it impacts the environment (spatially, temporally, and across the bio-physical and social domains). Through the development of this LCHA we adapt the LCA approach with a focus on the inputs (from fire and pre-fire efforts) and outputs (from post-fire conditions) and on how they evolve and are responded to by the responsible agencies and stakeholders. A Life Cycle Hazard Assessment (LCHA) approach extends and integrates the understanding of hazards over much longer periods of time than previously considered. The LCHA also provides an integrated platform for the necessary interdisciplinary approach to understanding decisions and environmental change across the life cycle of the fire event. This presentation will discuss our theoretical and empirical framework for developing a longitudinal LCHA and contribute to the overall goals of the NH7.1 session.

  18. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
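    The two-stage estimate described here (a regression-based rate for route-dependent variables, adjusted by a fuzzy modifier for route-independent ones) can be sketched as follows. The coefficients, variables, and membership function are invented for illustration, not fitted values from the DPS database:

```python
import math

def base_frequency(route, beta):
    """Negative-binomial regression mean: accidents per year as the
    exponential of a linear predictor over route-dependent variables."""
    return math.exp(sum(beta[k] * route[k] for k in beta))

def fuzzy_modifier(weather_severity):
    """Toy membership-based adjustment for one route-independent variable:
    severity 0 leaves the base rate alone, severity 1 doubles it."""
    mu = max(0.0, min(1.0, weather_severity))   # clip to [0, 1]
    return 1.0 + mu

# Hypothetical fitted coefficients and route attributes.
beta = {"intercept": -3.0, "log_traffic": 0.5, "curves_per_km": 0.2}
route = {"intercept": 1.0, "log_traffic": 8.0, "curves_per_km": 1.5}
freq = base_frequency(route, beta) * fuzzy_modifier(0.4)
```

The product of the two terms is the adjusted accident frequency that would then feed the overall transportation risk analysis.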

  19. Efficacy of an integrated hospital-primary care program for heart failure: a population-based analysis of 56,742 patients.

    PubMed

    Comín-Colet, Josep; Verdú-Rotellar, José María; Vela, Emili; Clèries, Montse; Bustins, Montserrat; Mendoza, Lola; Badosa, Neus; Cladellas, Mercè; Ferré, Sofía; Bruguera, Jordi

    2014-04-01

    The efficacy of heart failure programs has been demonstrated in clinical trials, but their applicability in the real-world practice setting is more controversial. This study evaluates the feasibility and efficacy of an integrated hospital-primary care program for the management of patients with heart failure in an integrated health area covering a population of 309,345. For the analysis, we included all patients consecutively admitted with heart failure as the principal diagnosis who had been discharged alive from all of the hospitals in Catalonia, Spain, from 2005 to 2011, the period when the program was implemented, and compared mortality and readmissions among patients exposed to the program with the rates in the patients of all the remaining integrated health areas of the Servei Català de la Salut (Catalan Health Service). We included 56,742 patients in the study. There were 181,204 hospital admissions and 30,712 deaths during the study period. In the adjusted analyses, when compared to the 54,659 patients from the other health areas, the 2083 patients exposed to the program had a lower risk of death (hazard ratio=0.92 [95% confidence interval, 0.86-0.97]; P=.005), a lower risk of clinically-related readmission (hazard ratio=0.71 [95% confidence interval, 0.66-0.76]; P<.001), and a lower risk of readmission for heart failure (hazard ratio=0.86 [95% confidence interval, 0.80-0.94]; P<.001). The positive impact on the morbidity and mortality rates was more marked once the program had become well established. The implementation of multidisciplinary heart failure management programs that integrate the hospital and the community is feasible and is associated with a significant reduction in patient morbidity and mortality. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.

  20. From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Klose, M.; Damm, B.

    2014-12-01

    The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. 
A fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, which is why analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, the combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
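The ex-ante (risk analysis) side described in the record above ultimately reduces landslide scenarios to an expected economic cost. A minimal sketch of that arithmetic, with entirely hypothetical scenario probabilities and cost figures (not values from the study):

```python
# Expected annual loss (EAL) from a set of landslide scenarios.
# All probabilities and costs below are illustrative assumptions.

def expected_annual_loss(scenarios):
    """Sum of annual occurrence probability x direct cost over all scenarios."""
    return sum(p * cost for p, cost in scenarios)

# Hypothetical scenarios: (annual probability, direct cost in EUR)
scenarios = [
    (0.20, 15_000),   # frequent shallow slide damaging a road section
    (0.05, 120_000),  # debris flow reaching buildings
    (0.01, 900_000),  # rare deep-seated rotational slide
]
eal = expected_annual_loss(scenarios)
print(f"expected annual loss: {eal:,.0f} EUR")
```

An ex-post cost survey would supply the cost entries from damage records, while a hazard model supplies the probabilities; linking the two is exactly the combination the contribution argues is still missing in practice.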

  1. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. 
Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
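The triggering relationships and cascades described above can be represented compactly as a directed interaction structure. The sketch below (hazard names and interaction entries are illustrative assumptions, not the paper's matrices) enumerates simple triggering chains from one source hazard:

```python
# Hypothetical hazard interaction structure: INTERACTIONS[a] is the set of
# hazards that hazard a can trigger. Entries are illustrative assumptions.
INTERACTIONS = {
    "earthquake": {"landslide", "wildfire"},
    "landslide": {"flood"},        # e.g. landslide dam breach
    "flood": {"landslide"},        # bank erosion destabilising slopes
    "wildfire": {"landslide"},     # burned slopes prone to debris flows
}

def cascades_from(source, depth=2):
    """Enumerate triggering chains (cascades) starting at `source`, up to
    `depth` links, without revisiting a hazard within a single chain."""
    chains = []
    def walk(chain):
        if len(chain) - 1 == depth:
            return
        for nxt in sorted(INTERACTIONS.get(chain[-1], ())):
            if nxt not in chain:
                chains.append(chain + [nxt])
                walk(chain + [nxt])
    walk([source])
    return chains

for chain in cascades_from("earthquake"):
    print(" -> ".join(chain))
```

A hazard interaction matrix is the adjacency-matrix view of the same structure; the hazard/process flow diagrams proposed by the authors correspond to drawing these chains as a graph.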

  2. An integrated risk assessment of the environmental hazards influence on Kazakhstan section (from Khorgas and Dostyk dry ports to Aktau seaport) of the international transport corridor

    NASA Astrophysics Data System (ADS)

    Akiyanova, F. Zh; Arykbayeva, Z. K.; Atalikhova, A. M.; Dauilbayev, B. A.; Zinabdin, N. B.; Kubeyev, A. B.; Tkach, K. A.

    2018-01-01

The article outlines research results on assessing the risk of natural hazard impacts on the functioning of the Kazakhstan section of the international transport corridor (from the Khorgas and Dostyk dry ports to the seaport of Aktau). Based on a component-by-stage analysis of physical and geographical conditions using a qualimetric approach, areas with different levels of natural disaster risk were identified. To minimize the risk of exposure to natural hazards, a set of environmental recommendations has been developed.

  3. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of the existing literature varies across a wide range of problems. Some studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example seismically induced flooding or seismically induced fires. Other multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes such enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk, as well as of uncertainties in the risk estimates, within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events and hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk.
Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
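The Bayesian-updating ingredient of the framework described above can be illustrated with the simplest conjugate case: refining a component failure probability as new evidence arrives. A hedged sketch (the prior parameters and test counts are invented for illustration, not taken from the dissertation):

```python
# Conjugate Bayesian updating of a component failure probability.
# Prior belief and evidence counts below are illustrative assumptions.

def update_beta(alpha, beta, failures, trials):
    """Conjugate update of a Beta(alpha, beta) prior on failure probability
    given `failures` observed in `trials` Bernoulli demands."""
    return alpha + failures, beta + (trials - failures)

def beta_mean(alpha, beta):
    """Posterior mean failure probability."""
    return alpha / (alpha + beta)

# Vague prior belief: roughly 1 failure per 10 demands
a, b = 1.0, 9.0
print(f"prior mean failure prob:     {beta_mean(a, b):.3f}")

# New evidence from simulations or experiments: 2 failures in 50 demands
a, b = update_beta(a, b, failures=2, trials=50)
print(f"posterior mean failure prob: {beta_mean(a, b):.3f}")
```

In the dissertation's setting this updating is embedded in a Bayesian network, so evidence on one event propagates through statistical dependencies to correlated events rather than updating each failure probability in isolation.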

  4. Integrating anthropogenic hazard data to facilitate research related to the exploitation of geo-resources

    NASA Astrophysics Data System (ADS)

    Kwiatek, Grzegorz; Blanke, Aglaja; Olszewska, Dorota; Orlecka-Sikora, Beata; Lasocki, Stanisław; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean-Robert; Schaming, Marc; Bigarre, Pascal; Kinscher, Jannes-Lennart; Saccorotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz

    2017-04-01

The Thematic Core Service "Anthropogenic Hazards" (TCS AH) integrates data and provides various data services in the form of a complete e-research infrastructure for advanced analysis and geophysical modelling of anthropogenic hazards due to georesource exploitation. TCS AH is based on the prototype built in the framework of the IS-EPOS project POIG.02.03.00-14-090/13-00 (https://tcs.ah-epos.eu/) and is currently being further developed within the EPOS Implementation Phase (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The TCS AH aims to have a measurable impact on innovative research and development by providing a comprehensive, wide-scale, and high-quality research infrastructure available to the scientific community, industrial partners, and the public. One of the main deliverables of TCS AH is access to numerous induced seismicity datasets called "episodes". An episode is defined as a comprehensive set of data describing a geophysical process induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure, and the environment. An episode is a time-correlated, standardized collection of geophysical, technological, and other relevant geodata forming complete documentation of a seismogenic process. In addition to the 6 episodes already implemented during the previous phase of integration and the 3 episodes integrated within the SHEER project, at least 18 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining, and geothermal energy production are currently being integrated into the TCS AH. The heterogeneous multi-disciplinary data from different episodes are subjected to an extensive quality control (QC) procedure composed of five steps, involving the collaborative work of the data providers, the quality control team, and the IT team, supervised by the quality control manager with the aid of a Redmine platform.
The first three steps of QC are performed at the local data center and include (1) transfer of episode data to the local data center, (2) data standardization and validation of formats, and (3) metadata preparation according to the TCS AH metadata scheme. The final two steps are performed at the level of the TCS AH website and include (4) contextual analysis of data quality, after which the episode appears in the TCS AH maintenance area, and finally (5) publication of the episode on the TCS AH website.

  5. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables.
An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar sliding, wedge sliding, and toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas; the approach also reduces processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where larger population groups are exposed to mass movements from steep slopes.
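A stochastic kinematic test of the kind described above can be sketched in a few lines. The example below mimics a per-cell analysis for planar sliding only, drawing discontinuity orientation and friction angle from assumed normal distributions as stand-ins for the best-fit PDFs mentioned in the record; all parameter values are illustrative.

```python
# Monte Carlo kinematic feasibility test for planar sliding (sketch).
# Distribution parameters below are assumptions, not the study's fitted PDFs.
import random

def planar_sliding_feasible(slope_dip, slope_dir, disc_dip, disc_dir,
                            friction, tol=20.0):
    """A discontinuity permits planar sliding if it daylights in the slope
    (dips less steeply than the slope face), dips more steeply than the
    friction angle, and dips roughly in the slope direction (+/- tol)."""
    daylights = disc_dip < slope_dip
    exceeds_friction = disc_dip > friction
    aligned = abs((disc_dir - slope_dir + 180) % 360 - 180) <= tol
    return daylights and exceeds_friction and aligned

def failure_probability(slope_dip, slope_dir, n=10_000, seed=42):
    """Fraction of sampled (orientation, friction) draws that permit sliding."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        disc_dip = rng.gauss(55, 8)     # assumed set: mean dip 55 deg, sd 8
        disc_dir = rng.gauss(120, 10)   # assumed dip direction 120 deg, sd 10
        friction = rng.gauss(30, 3)     # assumed friction angle 30 deg, sd 3
        if planar_sliding_feasible(slope_dip, slope_dir, disc_dip, disc_dir,
                                   friction):
            hits += 1
    return hits / n

print(f"P(planar sliding), 70/120 slope cell: {failure_probability(70, 120):.2f}")
print(f"P(planar sliding), 20/120 slope cell: {failure_probability(20, 120):.2f}")
```

In the GIS implementation this test runs per DEM cell, with slope dip and aspect taken from the ALS-DEM, and cells with high feasibility become rockfall source areas for the run-out model.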

  6. Food safety issues associated with products from aquaculture. Report of a Joint FAO/NACA/WHO Study Group.

    PubMed

    1999-01-01

The past decade has seen rapid expansion in aquaculture production. In the fisheries sector, as in animal production, farming is replacing hunting as the primary food production strategy. In the future, farmed fish will be an even more important source of protein foods than they are today, and the safety for human consumption of products from aquaculture is of public health significance. This is the report of a Study Group that considered food safety issues associated with farmed finfish and crustaceans. The principal conclusion was that an integrated approach--involving close collaboration between the aquaculture, agriculture, food safety, health and education sectors--is needed to identify and control hazards associated with products from aquaculture. Food safety assurance should be included in fish farm management and form an integral part of the farm-to-table food safety continuum. Where appropriate, measures should be based on Hazard Analysis and Critical Control Point (HACCP) methods; however, difficulties in applying HACCP principles to small-scale farming systems were recognized. Food safety hazards associated with products from aquaculture differ according to region, habitat and environmental conditions, as well as methods of production and management. Lack of awareness of hazards can hinder risk assessment and the application of risk management strategies to aquaculture production, and education is therefore needed. Chemical and biological hazards that should be taken into account in public health policies concerning products from aquaculture are discussed in this report, which should be of use to policy-makers and public health officials. The report will also assist fish farmers to identify hazards and develop appropriate hazard-control strategies.

  7. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

Part of the problem with assessing landslide hazards is understanding the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that were transformed or created intentionally by human activity, sometimes centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their risk reduction efforts.
While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to landslides. Along with a large number of small but costly landslide events and widespread insidious damage, the interplay of these societal trends determines landslide hazard and risk in Germany and elsewhere in Central Europe (e.g., Houlihan, 1994; Klose et al., 2015). The case studies presented here help to better understand human-environment interactions in the hazard context. Although there has been substantial progress in assessing landslide hazards, integrated approaches with an interdisciplinary focus are still exceptional. The scope of historical datasets available for hazard assessments, however, covers the whole range of natural and social systems interacting with hazards, their influences on overall system vulnerability, and the feedbacks, time lags, and couplings among these systems. In combination with methods from the natural and social sciences, systems analysis supports hazard assessments across disciplinary boundaries, allowing a broader look at landslide hazards than is usually taken. References Houlihan, B., 1994. Europe's ageing infrastructure: Politics, finance and the environment. Utilities Policy 4, 243-252. Liu, J., Dietz, T., Carpenter, S.R., Alberti, M., Folke, C., Moran, E., Pell, A.N., Deadman, P., Kratz, T., Lubchenco, J., Ostrom, E., Ouyang, Z., Provencher, W., Redman, C.L., Schneider, S.H., Taylor, W.W., 2007. Complexity of Coupled Human and Natural Systems. Science 317, 1513-1516. Klose, M., Damm, B., Maurischat, P., 2015. Landslide impacts in Germany: A historical and socioeconomic perspective. Landslides, doi:10.1007/s10346-015-0643-9.

  8. Linking local vulnerability to climatic hazard damage assessment for integrated river basin management

    NASA Astrophysics Data System (ADS)

    Hung, Hung-Chih; Liu, Yi-Chung; Chien, Sung-Ying

    2015-04-01

1. Background Major portions of Asia are expected to see increased exposure and vulnerability to climate change and weather extremes due to rapid urbanization and overdevelopment in hazard-prone areas. To prepare for and confront the potential impacts of climate change and related hazard risk, many countries have implemented programs of integrated river basin management. This has created a pressing challenge for policy-makers in many developing countries: building effective mechanisms to assess how vulnerability is distributed over river basins, and understanding how local vulnerability links to climate-related hazard damages and risks. However, these questions have received relatively little attention. This study aims to examine whether geographic localities characterized by high vulnerability experience significantly more damage from extreme weather events at the river basin level, and to explain what vulnerability factors influence these damages or losses. 2. Methods and data An indicator-based assessment framework is constructed with the goal of identifying composite indicators (including exposure, biophysical, socioeconomic, land-use, and adaptive capacity factors) that could serve as proxies for attributes of local vulnerability. This framework is applied by combining geographic information system (GIS) techniques with multicriteria decision analysis (MCDA) to evaluate and map integrated vulnerability to climatic hazards across river basins. Furthermore, to explain the relationship between vulnerability factors and disaster damages, we develop a disaster damage model (DDM) based on existing disaster impact theory. We then combine a zero-inflated Poisson regression model with a Tobit regression analysis to identify and examine how disaster impacts and vulnerability factors connect to typhoon disaster damages and losses.
To illustrate the proposed methodology, the study collects data on the vulnerability attributes of the Kaoping, Tsengwen, and Taimali River basins in southern Taiwan, and on the disaster impacts and damages in these river basins due to Typhoon Morakot in 2009. The data were provided by the National Science and Technology Center for Disaster Reduction, Taiwan, and collected from the National Land Use Investigation, official census statistics, and questionnaire surveys. 3. Results We use MCDA to create a composite vulnerability index, which is incorporated into a GIS analysis to present the results of the integrated vulnerability assessment throughout the river basins. The vulnerability assessment indicates that the most vulnerable areas are almost all situated in the middle and upper reaches of the river basins. Examination of the DDM shows that vulnerability factors play a critical role in determining disaster damages. Findings also show that the losses and casualties caused by Typhoon Morakot increase with elevation, urban and agricultural development, and proximity to rivers, and decrease with levels of income and adaptive capacity. Finally, we propose adaptive options for minimizing vulnerability and risk, as well as for integrated river basin governance.
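The zero-inflated Poisson component of the damage model mentioned above handles an excess of zero-damage observations explicitly: damage counts are a mixture of a point mass at zero and an ordinary Poisson process. A minimal sketch of the ZIP probability mass function (the mixing proportion and rate below are hypothetical, not the study's estimates):

```python
# Zero-Inflated Poisson (ZIP) probability mass function (sketch).
# pi and lam values used below are illustrative assumptions.
import math

def zip_pmf(k, pi, lam):
    """P(Y = k) under ZIP: with probability pi the count is structurally
    zero; otherwise it follows a Poisson(lam) distribution."""
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson

# Hypothetical fit: 40% of localities are structurally damage-free (pi = 0.4),
# the rest record on average 2 damage reports (lam = 2.0).
print(f"P(0 damage reports) = {zip_pmf(0, 0.4, 2.0):.3f}")
print(f"P(3 damage reports) = {zip_pmf(3, 0.4, 2.0):.3f}")
```

Fitting pi and lam as regressions on the vulnerability indicators is what lets the model separate "cannot be damaged" localities from "exposed but lucky" ones.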

  9. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model and its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats.
Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
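The piecewise-constant hazard function used in the application above can be sketched directly: survival to time t is the exponential of minus the accumulated hazard over the intervals. The breakpoints and rates below are illustrative, chosen only to echo the finding that hazard is highest before day 5:

```python
# Survival under a piecewise-constant hazard (sketch).
# Breakpoints and daily hazard rates are illustrative assumptions.
import math

def survival(t, breaks, rates):
    """S(t) = exp(-cumulative hazard), with the hazard constant within each
    interval. `breaks` are interval start times (first must be 0.0) and
    `rates` the matching hazard rates."""
    cum = 0.0
    for i, start in enumerate(breaks):
        end = breaks[i + 1] if i + 1 < len(breaks) else float("inf")
        if t <= start:
            break
        cum += rates[i] * (min(t, end) - start)
    return math.exp(-cum)

# Assumed daily hazards: high before day 5, lower afterwards
breaks = [0.0, 5.0]
rates = [0.10, 0.02]
print(f"S(5)  = {survival(5, breaks, rates):.3f}")
print(f"S(20) = {survival(20, breaks, rates):.3f}")
```

The integrated model places priors on the interval rates and updates them from both known-fate and unknown-fate records; the survival function itself keeps this simple form.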

  10. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model and its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats.
Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.

  11. Integrating population dynamics into mapping human exposure to seismic hazard

    NASA Astrophysics Data System (ADS)

    Freire, S.; Aubrecht, C.

    2012-11-01

    Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially-explicit four-class-composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
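The four-class composite mapping described above amounts to cross-classifying a cell's time-specific population density with its seismic intensity level. A minimal sketch with assumed thresholds (the paper's actual class breaks are not reproduced here):

```python
# Composite exposure classification (sketch).
# Density/intensity thresholds below are illustrative assumptions.

def exposure_class(pop_density, intensity):
    """Return one of four composite classes from population density
    (people per km^2) and macroseismic intensity."""
    high_pop = pop_density >= 5000
    high_int = intensity >= 7   # e.g. intensity VII or above
    if high_pop and high_int:
        return "high pop / high intensity"
    if high_pop:
        return "high pop / low intensity"
    if high_int:
        return "low pop / high intensity"
    return "low pop / low intensity"

# Same cell, daytime vs nighttime population (hypothetical dasymetric estimates)
print(exposure_class(12_000, 8))  # daytime: office district fills the cell
print(exposure_class(900, 8))     # nighttime: only residents remain
```

Running the classification separately on day- and nighttime density surfaces is what reveals the shifting exposure the record describes: the same cell can change class over the daily cycle while the hazard layer stays fixed.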

  12. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the available methods, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as fishbone diagrams and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide the data templates necessary for documentation and for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. This article shows how HACCP can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  13. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

In the present work, we discuss the use of integrated remote sensing and geographic information system (GIS) techniques for evaluating seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for the evaluation of seismic hazard were identified, and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low, and very low. The earthquake potential map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Using earthquake data for Syria, peak ground acceleration (PGA) data were introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a and b values) and the concepts of local probability and recurrence time. Application of the proposed technique in the Syrian region indicates that this method provides a good estimate of seismic hazard compared to maps developed with traditional deterministic (DSHA) and probabilistic (PSHA) seismic hazard techniques. For the first time, numerous parameters derived from remote sensing and GIS have been used in the preparation of a seismic hazard map, which is found to be very realistic.
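The Gutenberg-Richter b-value that feeds a hazard model like the one above is commonly estimated by maximum likelihood (Aki's formula). A hedged sketch with a fabricated magnitude sample; note that for binned catalogues the minimum magnitude is usually shifted by half a bin width, which is omitted here:

```python
# Maximum-likelihood Gutenberg-Richter b-value (Aki's estimator, sketch).
# The magnitude catalogue below is fabricated for illustration.
import math

def b_value(mags, m_min):
    """b = log10(e) / (mean(M) - m_min), using only events with M >= m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

catalogue = [3.1, 3.4, 3.2, 4.0, 3.6, 3.3, 4.4, 3.8, 3.5, 3.2]
print(f"b-value: {b_value(catalogue, m_min=3.0):.2f}")
```

Together with the a-value (overall activity rate), the b-value fixes the magnitude-frequency relation log10 N = a - bM from which recurrence times in the hazard map are derived.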

  14. Integrating GIS with AHP and Fuzzy Logic to generate hand, foot and mouth disease hazard zonation (HFMD-HZ) model in Thailand

    NASA Astrophysics Data System (ADS)

    Samphutthanon, R.; Tripathi, N. K.; Ninsawat, S.; Duboz, R.

    2014-12-01

The main objective of this research was the development of an HFMD hazard zonation (HFMD-HZ) model by applying AHP and Fuzzy AHP (FAHP) methodologies to weight each spatial factor, such as disease incidence and socio-economic and physical factors. The outputs of AHP and FAHP were input into a Geographic Information System (GIS) process for spatial analysis. Fourteen criteria were selected as important factors: disease incidence over the 10 years from 2003 to 2012, population density, road density, land use, and physical features. The results showed a consistency ratio (CR) for the main criteria of 0.075427 under AHP and 0.092436 under FAHP; as both remained below the threshold of 0.1, the CR values were acceptable. After linking to actual geospatial data (2013 disease incidence) through GIS spatial analysis for validation, the results of the FAHP approach were found to match more accurately than those of the AHP approach. The zones with the highest hazard of HFMD outbreaks were located in two main areas: Muang Chiang Mai district and its suburbs, and Muang Chiang Rai district and its vicinity. The resulting hazard maps may be useful for organizing HFMD protection plans.
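The consistency check reported above (CR below 0.1) follows Saaty's standard AHP procedure. A minimal sketch, using the approximate column-normalisation eigenvector method and a hypothetical 3×3 pairwise comparison matrix (the study's actual 14-criterion matrix is not reproduced in the abstract):

```python
# Saaty random indices for matrix sizes 1..10 (index = n - 1).
RI = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]

def ahp_weights_and_cr(matrix):
    """Return priority weights and consistency ratio (CR) for a
    reciprocal pairwise comparison matrix."""
    n = len(matrix)
    # Approximate priority weights: normalised column averages.
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    weights = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
               for i in range(n)]
    # lambda_max estimated by comparing A*w with w.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)          # consistency index
    return weights, ci / RI[n - 1]    # CR = CI / RI

# Hypothetical 3x3 reciprocal comparison matrix for three criteria.
A = [[1.0,   3.0, 5.0],
     [1/3., 1.0, 2.0],
     [1/5., 1/2., 1.0]]
w, cr = ahp_weights_and_cr(A)
```

For this matrix the CR comes out well below 0.1, so the judgments would be accepted, mirroring the acceptance criterion used in the record.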

  15. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish

The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and the outermost surface chemistry of the nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and to broaden the sharing of that data and knowledge to support predictive modeling. Finally, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low-dose exposures.
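The hazard-ranking step can be illustrated with a toy combined morbidity/mortality score. The weights and material names below are hypothetical; the published EZ Metric weighting is not specified in this record:

```python
def ez_style_score(mortality_frac, morbidity_frac, w_mort=1.0, w_morb=0.5):
    """Toy combined score: weighted sum of the mortality and sub-lethal
    morbidity fractions observed across exposed embryos. Weights are
    illustrative placeholders, not the published EZ Metric definition."""
    return w_mort * mortality_frac + w_morb * morbidity_frac

# Hypothetical screening results: name -> (mortality, morbidity) fractions.
materials = {
    "Au-NP/positive-surface": (0.60, 0.30),
    "Au-NP/neutral-surface":  (0.05, 0.10),
    "SiO2-NP":                (0.10, 0.40),
}

# Rank materials from most to least hazardous by the combined score.
ranked = sorted(materials,
                key=lambda m: ez_style_score(*materials[m]),
                reverse=True)
```

Such a ranked list is the kind of input a clustering analysis would consume, with materials grouped by similar response profiles rather than by size or composition alone.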

  16. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; ...

    2015-06-04

The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and the outermost surface chemistry of the nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and to broaden the sharing of that data and knowledge to support predictive modeling. Finally, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low-dose exposures.

  17. Risk assessment of oil spills along the Mediterranean coast: A sensitivity analysis of the choice of hazard quantification.

    PubMed

    Al Shami, A; Harik, G; Alameddine, I; Bruschi, D; Garcia, D Astiaso; El-Fadel, M

    2017-01-01

Oil pollution in the Mediterranean represents a serious threat to the coastal environment. Quantifying the risks associated with a potential spill is often based on results generated from oil spill models. In this study, MEDSLIK-II, an EU-funded and endorsed oil spill model, is used to assess potential oil spill scenarios at four pilot areas located along the northern, eastern, and southern Mediterranean shoreline, providing a wide range of spill conditions and coastal geomorphological characteristics. Oil spill risk at the four pilot areas was quantified as a function of three oil pollution metrics: the susceptibility of each beach segment to oiling, the average volume of oiling expected in the event of beaching, and the average oil beaching time. The results show that while the three pollution metrics tend to agree in their hazard characterization when the shoreline morphology is simple, considerable differences in the quantification of the associated hazard are possible under complex coastal morphologies. These differences proved to greatly alter the evaluation of environmental risks. An integrative hazard index encompassing the three simulated pollution metrics is therefore proposed; it can be applied universally across the Mediterranean basin by integrating it with the unified oil spill risk assessment tool developed by the Regional Marine Pollution Emergency Response Centre for the Mediterranean (REMPEC). Copyright © 2016 Elsevier B.V. All rights reserved.
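One plausible form of such an integrative index, assuming equal weights and min-max normalisation of the three metrics (the record does not specify the exact formulation), is a simple average per beach segment, with beaching time inverted so that earlier beaching means higher hazard:

```python
def minmax(values):
    """Scale a list of values to the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def integrative_index(susceptibility, volume, beaching_time):
    """Combine the three simulated metrics per beach segment into one
    0-1 index. Equal weighting is an assumption for illustration."""
    s, v = minmax(susceptibility), minmax(volume)
    t = [1.0 - x for x in minmax(beaching_time)]  # earlier = worse
    return [(a + b + c) / 3.0 for a, b, c in zip(s, v, t)]

# Hypothetical values for four beach segments.
index = integrative_index(
    susceptibility=[0.9, 0.2, 0.5, 0.7],    # probability of oiling
    volume=[120.0, 10.0, 60.0, 90.0],       # m^3 expected on beaching
    beaching_time=[6.0, 48.0, 24.0, 12.0],  # hours until beaching
)
```

The segment that is most susceptible, receives the most oil, and is reached first scores highest, which is the behaviour an integrative index needs to reconcile the three metrics under complex morphologies.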

  18. Total Risk Integrated Methodology (TRIM) - TRIM.Risk

    EPA Pesticide Factsheets

TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with dose-response or hazard assessment information, and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.

  19. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    NASA Astrophysics Data System (ADS)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation, and are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damage, etc.). At the moment, several local and national databases and platforms provide and publish data on different types of geo-hazards, as well as web-based risk maps and decision support systems. The European Commission also implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, however, no platform for landslides or geo-hazards exists that integrates the user into the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Programme ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure comprises four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data and other geospatial data on geo-hazards, as well as descriptions and protocols for data processing and analysis. An interface extending data integration to external sources (e.g. 
Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services form the service tier. Sub-services include map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyse landslide delineations from different sources, and (4) an infrastructure service to identify affected infrastructure. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on current web technology standards makes use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, towards a spatial data infrastructure that can support targeted mapping and monitoring of geo-hazards in a global context.

  20. From leaves to landscape: A multiscale approach to assess fire hazard in wildland-urban interface areas.

    PubMed

    Ghermandi, Luciana; Beletzky, Natacha A; de Torres Curth, Mónica I; Oddi, Facundo J

    2016-12-01

The overlapping zone between urbanization and wildland vegetation, known as the wildland-urban interface (WUI), is often at high risk of wildfire. Human activities increase the likelihood of wildfires, which can have disastrous consequences for property and land use and can pose a serious threat to lives. Fire hazard assessments depend strongly on the spatial scale of analysis. We assessed the fire hazard in a WUI area of a Patagonian city by working at three scales: landscape, community and species. Because fire is a complex phenomenon, we used a large number of variables that correlate a priori with fire hazard. We analyzed environmental variables together with fuel load and leaf flammability variables, and integrated all the information into a fire hazard map with four hazard categories. The Nothofagus dombeyi forest had the highest fire hazard, while grasslands had the lowest. Our work highlights the vulnerability of the wildland-urban interface to fire in this region, and the suggested methodology could be applied in other wildland-urban interface areas. Particularly in high-hazard areas, our work could support spatial delimitation policies, urban planning and the development of plans for the protection of human lives and assets. Copyright © 2016 Elsevier Ltd. All rights reserved.
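A weighted-overlay sketch of how heterogeneous layers can be integrated into a four-category hazard map; the variables, weights and class breaks below are illustrative, not those derived in the study:

```python
def composite_score(fuel_load, flammability, slope, weights=(0.5, 0.3, 0.2)):
    """Weighted overlay of normalised (0-1) layers. The variable set
    and weights are hypothetical placeholders for illustration."""
    return sum(w * v for w, v in zip(weights, (fuel_load, flammability, slope)))

def hazard_class(score, breaks=(0.25, 0.5, 0.75)):
    """Map a 0-1 composite score onto four classes, mirroring the
    four-category map in the record (thresholds are assumed)."""
    labels = ("low", "moderate", "high", "very high")
    for label, cut in zip(labels, breaks):
        if score <= cut:
            return label
    return labels[-1]

# Hypothetical normalised layer values for two vegetation types.
forest = composite_score(0.9, 0.8, 0.6)     # dense forest: high fuel load
grassland = composite_score(0.2, 0.3, 0.1)  # grassland: low fuel load
```

With these placeholder inputs the forest cell falls in the top class and the grassland cell in the bottom one, the same ordering the study reports for N. dombeyi forest versus grassland.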

  1. Coastal vulnerability: climate change and natural hazards perspectives

    NASA Astrophysics Data System (ADS)

    Romieu, E.; Vinchon, C.

    2009-04-01

Introduction: Studying coastal zones as a territorial concept (integrated coastal zone management) is an essential issue for managers, as they have to consider many different topics (natural hazards, resource management, tourism, climate change…). The recent approach in terms of "coastal vulnerability" studies (since the 1990s) is the main tool used nowadays to help them evaluate the impacts of natural hazards on coastal zones, especially in the context of climate change. This communication aims to highlight the difficulties in integrating this concept into risk analysis as it is usually practiced in the natural hazard sciences. 1) Coastal vulnerability as a recent issue. The concept of coastal vulnerability appears mainly in the work of the Intergovernmental Panel on Climate Change from 1992 onwards (IPCC, 2001), where it is presented as essential for climate change adaptation. The concept has been defined by a common methodology which proposes the assessment of seven indicators with regard to a sea-level rise of 1 m by 2100: people affected, people at risk, capital value at loss, land at loss, wetland at loss, potential adaptation costs, and people at risk assuming this adaptation. Many national assessments have been implemented (Nicholls et al., 1995) and a global assessment was proposed for three indicators (Nicholls et al., 1999). The DINAS-Coast project reused this methodology to produce the DIVA tool for coastal managers (Vafeidis et al., 2004). Besides these, many other methodologies for national or regional coastal vulnerability assessments have been developed (reviewed by UNFCCC, 2008). The use of aggregated vulnerability indicators (including geomorphology, hydrodynamics, climate change…) is widespread: the USGS coastal vulnerability index is used worldwide and was complemented by a social vulnerability index (Boruff et al., 2005). These index-based methods propose a vulnerability mapping which visualises indicators of erosion, submersion and/or socio-economic sensitivity in coastal zones. 
This concept is a valuable tool for policy makers, helping them manage their actions and take climate change into account (McFadden et al., 2006). However, in those approaches vulnerability is the output itself (cost of effective impacts, geomorphological impacts…) and is not integrated into a risk analysis. Furthermore, those studies emerged from a climatic perspective, which leads to considering climate change as a hazard or pressure, whereas risk studies commonly consider hazards such as erosion and flooding, with climate change modifying the drivers of the hazard. 2) The natural hazard and socio-economic perspectives. In order to reduce the impacts of natural hazards, decision makers need a complete risk assessment (probability of losses). Past studies on natural risks (landslide, earthquake…) highlighted the pertinence of defining risk as a combination of (1) hazard occurrence and intensity, (2) exposure, and (3) vulnerability of assets and population to this hazard (e.g. Douglas, 2007; Sarewitz et al., 2003). Following the Klinke and Renn risk assessment framework, the high uncertainties associated with coastal risks under climatic and anthropic change highlight the importance of working on the concept of "vulnerability" (Klinke and Renn, 2002). Past studies on vulnerability assessment showed a frequently mentioned gap between "impact-based" and "human-based" points of view, which is now a major issue for the natural risk sciences. Many research efforts in FP7 projects such as MOVE and ENSURE focus on integrating the different dimensions of vulnerability (Turner et al., 2003; Birkmann, 2006). Coastal risk studies highlight another issue of concern: as detailed above, the term "vulnerability" is used in the coastal context quite differently from its use in natural risk studies. 
The interaction of social, economic and physical sciences is considered within two French research projects (Vulsaco, Miseeva), in order to identify the vulnerability of a system to flooding or erosion (i.e. the characteristics that create potential harm) and integrate it into a risk assessment. Global change is considered through modifications of hazard, anthropogenic pressure and exposure, in order to point out possible changes in vulnerability. 3) Learning from both perspectives. Coastal vulnerability in its "end in itself" and climate change dimension is a widespread tool for decision makers, but it can be inadequate when vulnerability is a component of risk. This is mainly due to the consideration of climate change as a "hazard", so that coastal vulnerability is seen as the possible adverse impacts of climate change. As a matter of fact, this concept is clearly well received by managers, who feel deeply concerned by climate change. However, coastal risk managers would gain from considering climate change as a driver able to modify existing hazards rather than as the pressure itself. Using this concept could open new perspectives of coastal risk mitigation for decision makers (social vulnerability, risk perception…), learning from other disciplines and sciences thanks to research projects such as MOVE (FP7). Acknowledgements: The authors would like to thank the BRGM coastal team for rich discussions and fruitful collaborations in coastal vulnerability studies, especially Déborah Idier for animating the Vulsaco project and Manuel Garcin for his work on tsunamis in Sri Lanka. They are also grateful to the MISEEVA and MOVE teams, which are doing great trans-disciplinary work. References: Birkmann, J., 2006. Measuring Vulnerability to Natural Hazards: Towards Disaster Resilient Societies. United Nations University Press. Boruff, B. J., Emrich, C., Cutter, S. L., 2005. Erosion hazard vulnerability of US coastal counties. Journal of Coastal Research 
21, 932-942. Douglas, J., 2007. Physical vulnerability modelling in natural hazard risk assessment. Natural Hazards and Earth System Sciences 7, 283-288. IPCC, 2001. Climate Change 2001: Synthesis Report. A contribution of Working Groups I, II and III to the Third Assessment Report of the Intergovernmental Panel on Climate Change. Klinke, A., Renn, O., 2002. A new approach to risk evaluation and management: risk-based, precaution-based and discourse-based strategies. Risk Analysis 22, 1071-1094. McFadden, L., Nicholls, R. J., Penning-Rowsell, E. (Eds.), 2006. Managing Coastal Vulnerability. Elsevier Science. Nicholls, R. J., Hoozemans, F. M. J., Marchand, M., 1999. Increasing flood risk and wetland losses due to global sea-level rise: regional and global analyses. Global Environmental Change Part A: Human and Policy Dimensions 9, S69-S87. Nicholls, R. J., Leatherman, S. P., Volonte, C. R., 1995. Impacts and responses to sea-level rise: qualitative and quantitative assessments. Potential impacts of accelerated sea-level rise on developing countries. Journal of Coastal Research, Special Issue 14, 26-43. Sarewitz, D., Pielke, R., Keykhah, M., 2003. Vulnerability and risk: some thoughts from a political and policy perspective. Risk Analysis 23, 805-810. Turner, B. L., II, Kasperson, R. E., Matson, P. A., McCarthy, J. J., Corell, R. W., Christensen, L., Eckley, N., Kasperson, J. X., Luers, A., Martello, M. L., Polsky, C., Pulsipher, A., Schiller, A., 2003. A framework for vulnerability analysis in sustainability science. Proceedings of the National Academy of Sciences USA 100, 8074-8079. UNFCCC, 2008. Compendium on Methods and Tools to Evaluate Impacts of, and Vulnerability and Adaptation to, Climate Change. Vafeidis, A., Nicholls, R., McFadden, L., 2004. Developing a database for global vulnerability analysis of coastal zones: the DINAS-COAST project and the DIVA tool.
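The USGS coastal vulnerability index mentioned in this record is commonly computed as the square root of the product of the variable rankings divided by their number; a minimal sketch with hypothetical rankings:

```python
import math

def coastal_vulnerability_index(rankings):
    """USGS-style CVI: sqrt(product of variable rankings / number of
    variables), each ranking on a 1 (very low) to 5 (very high) scale."""
    product = 1.0
    for r in rankings:
        product *= r
    return math.sqrt(product / len(rankings))

# Six classic CVI variables, with hypothetical 1-5 rankings:
# geomorphology, shoreline change rate, coastal slope,
# relative sea-level rise, mean wave height, mean tide range.
cvi = coastal_vulnerability_index([4, 3, 5, 4, 2, 3])
```

Because the index multiplies rankings, a single very-high-ranked variable pulls the score up sharply; resulting values are usually binned into relative classes (low to very high) for mapping rather than read as absolute risk.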

  2. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to these events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of integrated natural and technological risk assessment and mapping at different levels [1, 2]. At the country level, the most hazardous natural processes which may result in fatalities, injuries and economic loss in the Russian Federation are considered: earthquakes, landslides, mud flows, floods, storms and avalanches. A special GIS environment for the country's territory was developed, which includes information about hazard levels and recurrence, an impact database for the last 20 years, and models for estimating damage and casualties caused by these hazards. Federal maps of individual and collective seismic risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire, explosion and chemically hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations for scenario earthquakes, taking into account accidents triggered by strong events at critical facilities, including fire and chemically hazardous facilities and oil pipeline routes located in earthquake-prone areas. The estimates of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. 
The results also allow the development of effective emergency response plans that take possible scenario events into account. Given the size of the oil pipeline systems located in highly active seismic zones, the results of the seismic risk computation are used by TRANSNEFT JSC.
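The risk definition used in such assessments (hazard occurrence and intensity combined with exposure and vulnerability) can be sketched as an expected-loss computation summed over hazards; all numbers below are hypothetical placeholders:

```python
def collective_risk(annual_event_prob, exposed_population, vulnerability):
    """Expected annual casualties for one hazard at one site:
    risk = hazard probability x exposure x vulnerability."""
    return annual_event_prob * exposed_population * vulnerability

def multi_hazard_risk(site_hazards):
    """Sum the contributions of independent hazards at a site."""
    return sum(collective_risk(p, e, v) for p, e, v in site_hazards)

# Hypothetical site: (annual probability, exposed people, casualty fraction).
hazards = [
    (0.01, 50_000, 0.02),   # earthquake
    (0.10,  5_000, 0.001),  # flood
    (0.02,  2_000, 0.005),  # landslide
]
expected_annual_casualties = multi_hazard_risk(hazards)
```

Dividing the same product by the exposed population gives an individual-risk figure, the quantity mapped in the federal individual seismic risk maps the record describes.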

  3. Integration of different data gap filling techniques to facilitate assessment of polychlorinated biphenyls: A proof of principle case study (ASCCT meeting)

    EPA Science Inventory

    Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...

  4. System hazards in managing laboratory test requests and results in primary care: medical protection database analysis and conceptual model.

    PubMed

    Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John

    2015-11-27

To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients, and to integrate these data with other published evidence sources to inform the design of a systems-based conceptual model of related hazards. A retrospective database analysis of general practices in the UK and Ireland: 778 practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Outcome measures were the proportion of practices with system risks; categorisation of identified hazards; the most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. CRSA visits were undertaken at 778 UK and Ireland general practices, and a range of system hazards was recorded across the laboratory test ordering and results management systems in 647 practices (83.2%). A total of 45 discrete hazard categories were identified, with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was an inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the 'post-analytical test stage' (n=702, 43.8%), followed closely by 'communication outcomes issues' (n=628, 39.1%). Based on arguably the largest dataset currently available on the subject, our findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. 2002 Hyperspectral Analysis of Hazardous Waste Sites on the Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gladden, J.B.

    2003-08-28

Hazardous waste site inspection is a labor-intensive, time-consuming job, performed primarily on the ground using visual inspection and instrumentation. It is an expensive process to continually monitor hazardous waste and/or landfill sites to determine whether they are maintaining their integrity. In certain instances, it may be possible to monitor aspects of the hazardous waste sites and landfills remotely. The use of multispectral data has been suggested for mapping clays and iron oxides associated with contaminated groundwater, vegetation stress, and methane gas emissions (which require longer-wavelength detectors). The Savannah River Site (SRS) near Aiken, S.C., is a United States Department of Energy facility operated by the Westinghouse Savannah River Company. For decades the SRS was responsible for developing weapons-grade plutonium and other materials for the nation's nuclear defense, and hazardous waste was generated in the process. Waste storage site inspection is a particularly important issue at the SRS because there are over 100 hazardous waste sites scattered throughout the 300-square-mile complex, making it difficult to continually monitor all of the facilities. The goal is to use remote sensing technology to identify surface anomalies on the hazardous waste sites as early as possible so that remedial work can take place rapidly to maintain the integrity of the storage sites. The anomalous areas are then targeted for intensive in situ human examination and measurement. During the 1990s, many of the hazardous waste sites were capped with protective layers of polyethylene sheeting and soil, and planted with bahia grass and/or centipede grass. This research investigated hyperspectral remote sensing technology to determine whether it can be used to accurately measure and monitor possible indicators of change on vegetated hazardous waste sites. 
Specifically, it evaluated the usefulness of hyperspectral remote sensing for assessing the condition of vegetation on the clay caps of the Mixed Waste Management Facility (MWMF). This report first describes the principles of hyperspectral remote sensing; the in situ measurement and hyperspectral remote sensing methods used to analyze hazardous waste sites on the Savannah River Site are then presented.

  6. Near surface geophysics techniques and geomorphological approach to reconstruct the hazard cave map in historical and urban areas

    NASA Astrophysics Data System (ADS)

    Lazzari, M.; Loperte, A.; Perrone, A.

    2010-03-01

This work, carried out with an integrated methodological approach, focuses on the use of near-surface geophysics techniques, such as ground penetrating radar and electrical resistivity tomography (ERT), together with geomorphological analysis, in order to reconstruct cave distribution and geometry in an urban context and, in particular, in historical centres. The interaction during recent centuries between human activity (cave excavation, the birth and growth of an urban area) and the characteristics of the natural environment has progressively increased the hazard and vulnerability levels of several sites. The reconstruction of a detailed cave distribution map is the first step in defining the anthropic and geomorphological hazard in urban areas, a fundamental basis for planning and risk assessment.

  7. Crowdsourcing engagement and applications for communities within crisis events

    NASA Astrophysics Data System (ADS)

    Frigerio, Simone; Schenato, Luca; Bossi, Giulia; Mantovani, Matteo; Crema, Stefano; Cavalli, Marco; Marcato, Gianluca; Pasuto, Alessandro

    2017-04-01

Civil protection is a changing field within natural hazard management, with responsibilities shifting from central government to local authorities. The competence of volunteers and the awareness and involvement of local inhabitants are key to prevention and preparedness. Citizens and volunteers become the first actors of civil protection, through context-specific strategies of surveillance and territorial surveys. The crowd-mapping technology includes a mobile solution tested with trained communities as a form of participation in disaster response. The platform also includes a user-friendly dashboard for data gathering and analysis in multi-hazard settings, tested in pilot case studies. Usability and gradual innovation of the platform are continuously ensured by a cloud dataset and bug-fixing controls. The first module focuses on flood processes, gathering data from the local, trained population for awareness and long-term preparedness. The second module integrates field surveys from several volunteer rescue squads, combining geolocations and comparing datasets collected during pre-emergency steps in urban case studies. The results include an easy-to-use data interface for crisis management and tested in-crisis support combined with personal awareness, continuously updated and customized. The development provides a version for Android 4.0 onward; the web application combines a cloud architecture with a relational database and web services, integrated with SDK cloud notification. The wireframes plan two access points, a Citizens Kit and a Volunteers Kit, synchronized with a common dashboard. The follow-up includes the integration of the mobile solutions with sensors for dynamic updates and data export for GIS analysis. The location-based services use location data to monitor parameters and control features within natural hazards. The aim is a human sensor network, integrating sensor measurements with external observations as the baseline for future modelling. 
Point data such as humidity, temperature and pressure are geolocated in real time. Human sensors enable a massive crowdsourcing approach, and user-friendly dashboards provide solid control of data management to support resilience and the quality of risk assessment.

  8. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).

  9. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a purpose-developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
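The core combination step, crossing a fragility curve with a discretized hazard curve to obtain an annual failure rate, can be sketched as follows; the PGA bins, exceedance rates, and lognormal fragility parameters are illustrative assumptions, not the paper's values:

```python
from math import erf, log, sqrt

# Discretized hazard curve: annual rate of exceedance vs. peak ground
# acceleration in g (illustrative values, not a real PSHA output)
pga = [0.1, 0.2, 0.3, 0.4, 0.5]
rate_exceed = [1e-2, 4e-3, 1.5e-3, 6e-4, 2e-4]

# Occurrence rate of each PGA bin = difference of successive exceedance rates
rate_bin = [r - (rate_exceed[i + 1] if i + 1 < len(rate_exceed) else 0.0)
            for i, r in enumerate(rate_exceed)]

def fragility(im, median=0.45, beta=0.5):
    # Lognormal fragility: P(loss of containment | PGA = im)
    return 0.5 * (1.0 + erf(log(im / median) / (beta * sqrt(2.0))))

# Annual rate of seismically induced loss of containment
annual_failure_rate = sum(r * fragility(im) for im, r in zip(pga, rate_bin))
```

The resulting rate would then feed the consequence analysis step described in the abstract.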

  10. Large Scale System Safety Integration for Human Rated Space Vehicles

    NASA Astrophysics Data System (ADS)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of lost lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess the risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, once one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. 
The integrator must find common ground among all parties to achieve concurrence on these non-compliant conditions. Another area of challenge lies in determining the credibility of a proposed hazard. For example, NASA's definition of a credible hazard is accurate but does not provide specific guidance about contractors declaring a hazard "not credible" and ceasing work on that item. Unfortunately, this has the side effect of taking valuable resources from high-risk areas and using them to investigate whether these extremely low risk items have the potential to become worse than they appear. In order to deal with these types of issues, there must exist the concept of a "Safe State", and it must be used as a building block to help address many of the technical and social challenges in working safety and risk management. This "Safe State" must serve as the foundation for building the cultural modifications needed to assure that safety issues are properly identified, heard, and dispositioned by space program management. As the space program and the countries involved in it move forward in the development of human-rated spacecraft, they must learn from the recent Columbia accident and establish a new or modified basis for safety risk decisions. Those involved must also become more cognizant of the diversity in safety approaches and agree on how to deal with them. Most of all, those involved must never forget that while the System Safety duty may be difficult, their efforts help to preserve the lives of space crews and their families.

  11. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015, a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9,000 people and severely damaged a 10,000 km2 region in Nepal and neighboring countries. Several thousand landslides were triggered during the events, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes the landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in terms of qualitative classes (low, medium, high), through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us recognize fractures and active deformations, and allowed us to reconstruct a correct chronicle of landslide events, in order to assign each landslide event to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016). This analysis allowed us to recognize the most important controlling factors. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible for field activity. 
Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano Bicocca that covers most of the area affected by coseismic landslides in Nepal (Valagussa et al., 2016). As a result, a 1:10,000 hazard map was produced. About 47% of the area is classified at high hazard, almost 19% at medium hazard and 34% at low hazard. In addition, the hazard map reports 262 polygons of active coseismic or postseismic landslides.
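The matrix approach described above reduces to a lookup over frequency and magnitude classes. The combinations below are an illustrative sketch, not the class boundaries actually used in the Rasuwa survey:

```python
# Qualitative hazard matrix: (frequency class, magnitude class) -> hazard
# class. Entries are illustrative assumptions.
HAZARD_MATRIX = {
    ("low", "low"): "low",
    ("low", "medium"): "low",
    ("low", "high"): "medium",
    ("medium", "low"): "low",
    ("medium", "medium"): "medium",
    ("medium", "high"): "high",
    ("high", "low"): "medium",
    ("high", "medium"): "high",
    ("high", "high"): "high",
}

def hazard_class(frequency: str, magnitude: str) -> str:
    """Combine qualitative frequency and magnitude into a hazard class."""
    return HAZARD_MATRIX[(frequency, magnitude)]
```

In a field-based workflow each mapped unit gets its two class labels from observation and interviews, and the matrix assigns the hazard level mechanically.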

  12. Comparison of a traditional with a new approach based on Graph Theory to vulnerability assessment to natural hazards: a study case on a socio-economic complex system

    NASA Astrophysics Data System (ADS)

    Arosio, Marcello; Martina, Mario L. V.

    2017-04-01

    The emergent behaviour of the contemporary complex, socio-technical and interconnected society makes collective risk greater than the sum of its parts, and this requires a holistic, systematic and integrated approach. Although there have been major improvements in recent years, there are still some limitations in terms of a holistic approach able to include the emergent value hidden in the connections between exposed elements and the interactions between the different spheres of multi-hazard, vulnerability, exposure and resilience. To deal with these challenges it is necessary to consider the connections between the exposed elements (e.g. populations, schools, hospitals, etc.) and to quantify the relative importance of the elements and their interconnections (e.g. the need of injured people to reach a hospital or of children to reach a school). In a system (e.g. a road, hospital or ecological network) or in a system of systems (e.g. a socio-technical urban service), there are critical elements that, beyond their intrinsic vulnerability, can be characterized by greater or lower vulnerability because of their physical, geographical, cyber or logical connections. To this aim, we propose in this study a comparative analysis between the traditional reductionist approach and a new holistic approach to vulnerability assessment for natural hazards. The analysis considers a case study of a socio-economic complex system through an innovative approach based on the properties of a graph G=(N,L). A graph consists of two sets N (nodes) and L (links): the nodes represent the single exposed elements (physical, social, environmental, etc.) subject to a hazard, while the links (or connections) represent the interactions between the elements. The final goal is to illustrate an application of this innovative approach to integrated collective vulnerability assessment.
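A minimal sketch of the graph formulation G=(N,L), with invented exposed elements and dependency weights, contrasts a node's collective importance (the sum of its connection weights) with a purely element-by-element view:

```python
# Nodes are exposed elements; weighted links encode dependency strength
# between them (all names and weights are illustrative assumptions).
nodes = ["hospital", "school", "power_station", "residential_area"]
links = {
    ("residential_area", "hospital"): 0.9,   # injured people need the hospital
    ("residential_area", "school"): 0.6,     # children need the school
    ("hospital", "power_station"): 0.8,      # hospital depends on power
    ("school", "power_station"): 0.4,
}

def collective_importance(node):
    # Sum of the weights of all links touching the node; a reductionist
    # assessment would instead score each element in isolation.
    return sum(w for (a, b), w in links.items() if node in (a, b))

ranking = sorted(nodes, key=collective_importance, reverse=True)
```

Even in this toy graph the ranking differs from what intrinsic vulnerability alone would suggest: the hospital scores highest because of what depends on it and what it depends on.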

  13. The risk concept and its application in natural hazard risk management in Switzerland

    NASA Astrophysics Data System (ADS)

    Bründl, M.; Romang, H. E.; Bischof, N.; Rheinberger, C. M.

    2009-05-01

    Over the last ten years, a risk-based approach to manage natural hazards - termed the risk concept - has been introduced to the management of natural hazards in Switzerland. Large natural hazard events, new political initiatives and limited financial resources have led to the development and introduction of new planning instruments and software tools that should support natural hazard engineers and planners to effectively and efficiently deal with natural hazards. Our experience with these new instruments suggests an improved integration of the risk concept into the community of natural hazard engineers and planners. Important factors for the acceptance of these new instruments are the integration of end-users during the development process, the knowledge exchange between science, developers and end-users as well as training and education courses for users. Further improvements require the maintenance of this knowledge exchange and a mindful adaptation of the instruments to case-specific circumstances.

  14. Application of seismic interpretation in the development of Jerneh Field, Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusoff, Z.

    1994-07-01

    Development of the Jerneh gas field has been significantly aided by the use of 3-D and site-survey seismic interpretations. The two aspects that have been of particular importance are identification of sea-floor and near-surface safety hazards for safe platform installation/development drilling, and mapping of reservoirs/hydrocarbons within gas-productive sands of the Miocene groups B, D, and E. Choice of platform location as well as casing design require detailed analysis of sea-floor and near-surface safety hazards. At Jerneh, sea-floor pockmarks, near-surface high amplitudes, distributary channels, and minor faults were recognized as potential operational safety hazards. The integration of conventional 3-D and site-survey seismic data enabled comprehensive understanding of the occurrence and distribution of potential hazards to platform installation and development well drilling. Three-dimensional seismic interpretation has been instrumental not only in the field structural definition but also in recognition of reservoir trends and hydrocarbon distribution. Additional gas reservoirs were identified by their DHI characteristics and subsequently confirmed by development wells. The innovative use of seismic attribute mapping techniques has been very important in defining both fluid and reservoir distribution in groups B and D. Integration of 3-D seismic data and well-log interpretations has helped in optimal field development, including the planning of well locations and drilling sequence.

  15. Advanced Environmental Monitoring and Control Program: Strategic Plan

    NASA Technical Reports Server (NTRS)

    Schmidt, Gregory

    1996-01-01

    Human missions in space, from short-duration shuttle missions lasting no more than several days to the medium- to long-duration missions planned for the International Space Station, face a number of hazards that must be understood and mitigated for the mission to be carried out safely. Among these are hazards posed by the internal environment of the spacecraft itself: outgassing of toxic vapors from plastics and other items, failures or off-nominal operation of spacecraft environmental control systems, and accidental exposure to hazardous compounds used in experiments all present potential hazards that, while individually small, may accumulate and pose a danger to crew health. The first step toward mitigating these hazards is understanding the internal environment of the spacecraft and the compounds contained within it. Future spacecraft will have integrated networks of redundant sensors which will not only inform the crew of hazards, but will pinpoint the problem location and, through analysis by intelligent systems, recommend and even implement a course of action to stop the problem. This strategic plan details strategies to determine NASA's requirements for environmental monitoring and control systems for future spacecraft, and goals and objectives for a program to answer these needs.

  16. 75 FR 15485 - Pipeline Safety: Workshop on Guidelines for Integrity Assessment of Cased Pipe

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID...: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of workshop. SUMMARY... ``Guidelines for Integrity Assessment of Cased Pipe in Gas Transmission Pipelines'' and related Frequently...

  17. 75 FR 5244 - Pipeline Safety: Integrity Management Program for Gas Distribution Pipelines; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...

  18. 77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...

  19. Hazard Maps in the Classroom.

    ERIC Educational Resources Information Center

    Cross, John A.

    1988-01-01

    Emphasizes the use of geophysical hazard maps and illustrates how they can be used in the classroom from kindergarten to college level. Depicts ways that hazard maps of floods, landslides, earthquakes, volcanoes, and multi-hazards can be integrated into classroom instruction. Tells how maps may be obtained. (SLM)

  20. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making in areas such as prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for the coastal erosion hazard assessment, which is based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability, meanwhile, is also composed of three main components for its evaluation: i) exposure, ii) fragility and iii) resilience, which in turn are evaluated across six dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, wave exposure of the coast and occurrence, among others. Quantitative data on coastal retreat were obtained through DSAS (Digital Shoreline Analysis System), an ArcGIS extension, as well as from digital elevation models of the beach and 6 beach profiles strategically located along the coast, obtained with GNSS technology. Sediment samples collected from these beaches and mean wave height and direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were evaluated for these populations, together with population density for exposure, which was analyzed through a multi-array method including variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, with a respective indicator of resilience. 
The hazard analysis results indicate that Vistahermosa is under very high threat, while Downtown and Pianguita are under medium hazard. These two sectors have the highest population density and the largest hotel and services infrastructure; Shangay, meanwhile, was scored as low hazard because wave action has no direct impact on it. The vulnerability analysis suggests that the sector of Shangay has very high vulnerability because it lacks basic services and has low levels of schooling, while Downtown, Vistahermosa and Pianguita show average vulnerability. Additionally, it was determined that in recent years erosion rates in the Vistahermosa sector reached up to -xx m yr-1, while in other sectors the regression of the coastline can be associated with local tidal peaks that occur during April and October; other months of the year typically show recovery and stability processes.
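A DSAS-style shoreline-change statistic such as the End Point Rate can be computed directly from two dated shoreline positions along a transect. The positions and dates below are illustrative, not La Bocana measurements:

```python
from datetime import date

def end_point_rate(d_old, t_old, d_new, t_new):
    # End Point Rate (EPR): change in shoreline position (metres,
    # measured seaward from a fixed baseline) divided by elapsed years.
    # A negative rate indicates shoreline retreat.
    years = (t_new - t_old).days / 365.25
    return (d_new - d_old) / years

# Illustrative transect: shoreline 152 m from the baseline in 2005,
# 131 m in 2014 -> about -2.3 m/yr of retreat
epr = end_point_rate(152.0, date(2005, 3, 1), 131.0, date(2014, 3, 1))
```

EPR is the simplest of the DSAS statistics; with more than two shoreline dates per transect, a linear regression rate is generally preferred.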

  1. Fuels planning: science synthesis and integration; social issues fact sheet 08: The "Golden Rule" and other lessons on communicating about hazards

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Other fact sheets identified considerations for communicating about hazards, talked about the importance of working locally, and discussed the seven laws of effective hazard communication. This fact sheet introduces the "Golden Rule" of hazard communication and shares some final lessons from hazard educators.

  2. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    [Report documentation fragment] The approach selects an orthogonal polynomial basis matched to the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed ones). Uncertain source parameters and windfields drive the simulations; uncertainty quantification by polynomial chaos quadrature, combined with data integration, completes the DDDAS loop.
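The quadrature idea named in the record, evaluating the model at Gauss-Hermite nodes to propagate a normally distributed input, can be sketched with a hard-coded three-point rule; the toy response function is an assumption, not the ash-transport model itself:

```python
from math import sqrt

# Three-point Gauss-Hermite rule rescaled to a standard normal input:
# nodes 0 and +/-sqrt(3) with weights 2/3, 1/6, 1/6 (exact for
# polynomial integrands of degree <= 5).
NODES = [-sqrt(3.0), 0.0, sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def response(xi):
    # Stand-in for one simulation run at input value xi
    return 2.0 * xi + xi ** 2

# Quadrature estimate of the mean output; exact answer is
# E[2X + X^2] = 1 for X ~ N(0, 1)
mean_estimate = sum(w * response(x) for x, w in zip(NODES, WEIGHTS))
```

In polynomial chaos quadrature each node corresponds to one full simulation, so the node count directly sets the computational budget.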

  3. Human System Integration: Regulatory Analysis

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document was intended as an input to the Access 5 Policy Integrated Product Team. Using a Human System Integration (HSI) perspective, a regulatory analysis of the FARs (specifically Part 91), the Airman's Information Manual (AIM) and the FAA Controllers Handbook (7110.65) was conducted as part of a front-end approach needed to derive HSI requirements for Unmanned Aircraft Systems (UAS) operations in the National Airspace System above FL430. The review of the above aviation reference materials yielded eighty-four functions determined to be necessary or highly desirable for flight within the Air Traffic Management System. They include categories for Flight, Communications, Navigation, Surveillance, and Hazard Avoidance.

  4. Integrated hazard assessment of Cirenmaco glacial lake in Zhangzangbo valley, Central Himalayas

    NASA Astrophysics Data System (ADS)

    Wang, Weicai; Gao, Yang; Iribarren Anacona, Pablo; Lei, Yanbin; Xiang, Yang; Zhang, Guoqing; Li, Shenghai; Lu, Anxin

    2018-04-01

    Glacial lake outburst floods (GLOFs) have recently become one of the primary natural hazards in the Himalayas. There is therefore an urgent need to assess GLOF hazards in the region. Cirenmaco, a moraine-dammed lake located in the upstream portion of Zhangzangbo valley, Central Himalayas, has received public attention after its damaging 1981 outburst flood. Here, by combining remote sensing methods, bathymetric survey and 2D hydraulic modeling, we assessed the hazard posed by Cirenmaco in its current state. Inter-annual variation of Cirenmaco lake area indicates a rapid lake expansion from 0.10 ± 0.08 km2 in 1988 to 0.39 ± 0.04 km2 in 2013. The bathymetric survey shows the maximum water depth of the lake in 2012 was 115 ± 2 m, and the lake volume was calculated to be 1.8 × 10^7 m3. Field geomorphic analysis shows that Cirenmaco glacial lake is prone to GLOFs, as mass movements and ice and snow avalanches can impact the lake, and melting of the dead ice in the moraine can lower the dam level. The HEC-RAS 2D model was then used to simulate moraine dam failure of the Cirenmaco and assess GLOF impacts downstream. Reconstruction of the Cirenmaco 1981 GLOF shows that HEC-RAS can produce reasonable flood extent and water depth, thus demonstrating its ability to effectively model complex GLOFs. The GLOF modeling results presented here can be used as a basis for the implementation of disaster prevention and mitigation measures. As a case study, this work shows how different methods can be integrated into GLOF hazard assessment.
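The bathymetry-to-volume step reduces to summing depth times cell area over a survey grid; the tiny grid below is an illustrative sketch, not the Cirenmaco sounding data:

```python
# Lake volume from a gridded bathymetric survey: each cell contributes
# depth x cell area (grid spacing and depths are invented examples).
cell_size_m = 20.0
depth_grid_m = [
    [10.0, 35.0, 12.0],
    [40.0, 115.0, 42.0],
    [11.0, 38.0, 9.0],
]

cell_area = cell_size_m ** 2
volume_m3 = sum(d * cell_area for row in depth_grid_m for d in row)
```

A real survey interpolates scattered soundings onto a fine grid first, but the volume integral itself is exactly this sum.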

  5. Nowcast model for hazardous material spill prevention and response, San Francisco Bay, California

    USGS Publications Warehouse

    Cheng, Ralph T.; Wilmot, Wayne L.; Galt, Jerry A.

    1997-01-01

    The National Oceanic and Atmospheric Administration (NOAA) installed the Physical Oceanographic Real-time System (PORTS) in San Francisco Bay, California, to provide real-time observations of tides, tidal currents, and meteorological conditions to, among other purposes, guide hazardous material spill prevention and response. Integrated with nowcast modeling techniques and with dissemination of the real-time data and nowcasting results through the Internet on the World Wide Web, the emerging technologies used in PORTS for real-time data collection form a nowcast modeling system. Users can download tides and tidal current distributions in San Francisco Bay for their specific applications and/or for further analysis.

  6. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, with 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in North-East Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We proposed five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site and (v) hazard mapping for a region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
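The Cornell-style calculation rests on a Gutenberg-Richter recurrence rate fed into a Poisson exceedance probability. A minimal sketch follows; the a and b values are illustrative, not the Constantine estimates:

```python
from math import exp

# Gutenberg-Richter recurrence: log10 N(m) = a - b*m gives the annual
# rate of events with magnitude >= m (a, b are illustrative values)
a, b = 3.2, 0.9

def annual_rate(m):
    return 10.0 ** (a - b * m)

def prob_exceedance(m, years):
    # Poisson occurrence assumption used in the Cornell approach:
    # P(at least one event with M >= m in t years) = 1 - exp(-lambda*t)
    lam = annual_rate(m)
    return 1.0 - exp(-lam * years)

p50 = prob_exceedance(6.0, 50)  # exposure window of 50 years
```

A full PSHA additionally integrates over source geometry and a ground-motion attenuation relation, per steps (i) through (v) above, but this recurrence-to-probability step is its statistical core.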

  7. Spaceflight Ground Support Equipment Reliability & System Safety Data

    NASA Technical Reports Server (NTRS)

    Fernandez, Rene; Riddlebaugh, Jeffrey; Brinkman, John; Wilkinson, Myron

    2012-01-01

    Presented were the reliability analysis, consisting primarily of Failure Modes and Effects Analysis (FMEA), and the system safety analysis, consisting of Preliminary Hazards Analysis (PHA), performed to ensure that the CoNNeCT (Communications, Navigation, and Networking re-Configurable Testbed) Flight System was safely and reliably operated during its Assembly, Integration and Test (AI&T) phase. A tailored approach to the NASA Ground Support Equipment (GSE) standard, NASA-STD-5005C, involving the application of the appropriate requirements, S&MA discipline expertise, and a configuration management system (to retain a record of the analysis and documentation), was presented. Presented were system block diagrams of selected GSE and the corresponding FMEAs, as well as the PHAs. Also discussed were specific examples of the FMEAs and PHAs being used during the AI&T phase to drive modifications to the GSE (via "redlining" of test procedures and the placement of warning stickers to protect the flight hardware) before it was interfaced to the Flight System. These modifications were necessary because failure modes and hazards were identified during the analysis that had not been properly mitigated. Strict configuration management was applied to changes in the GSE (whether due to upgrades or expired calibrations) by revisiting the FMEAs and PHAs to reflect the latest system block diagrams and Bill of Material. The CoNNeCT flight system has been successfully assembled, integrated, tested, and shipped to the launch site without incident, demonstrating that the steps taken to safeguard the flight system when it was interfaced to the various GSE were successful.
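A common FMEA screening step consistent with the analysis described is Risk Priority Number ranking. The failure modes and scores below are hypothetical, not taken from the CoNNeCT GSE analysis:

```python
# RPN = severity x occurrence x detection, each scored on a 1-10 scale;
# higher RPN means the failure mode deserves mitigation attention first.
failure_modes = [
    {"mode": "power supply overvoltage", "sev": 9, "occ": 3, "det": 4},
    {"mode": "connector mis-mate", "sev": 7, "occ": 5, "det": 2},
    {"mode": "expired calibration", "sev": 5, "occ": 6, "det": 6},
]

for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

worst_first = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
```

Note that RPN can rank a moderate-severity, hard-to-detect mode above a severe but well-detected one, which is one reason FMEA results are reviewed rather than applied mechanically.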

  8. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.
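Among the safety metrics listed above, steady-state availability has a simple closed form; the MTBF/MTTR figures below are illustrative, not taken from the LMI analysis:

```python
def availability(mtbf_h, mttr_h):
    # Steady-state availability = MTBF / (MTBF + MTTR), i.e. the
    # long-run fraction of time the system is operational.
    return mtbf_h / (mtbf_h + mttr_h)

# Hypothetical avionics unit: 5000 h between failures, 2 h to repair
a = availability(5000.0, 2.0)
```

Reliability and availability figures of this kind are the inputs the integrated safety analysis combines into a resultant hazard estimate.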

  9. Potential postwildfire debris-flow hazards—A prewildfire evaluation for the Jemez Mountains, north-central New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Haas, Jessica R.

    2016-08-11

    Wildfire can substantially increase the probability of debris flows, a potentially hazardous and destructive form of mass wasting, in landscapes that have otherwise been stable throughout recent history. Although the exact location, extent, and severity of wildfire or subsequent rainfall intensity and duration cannot be known, probabilities of fire and debris‑flow occurrence for given locations can be estimated with geospatial analysis and modeling. The purpose of this report is to provide information on which watersheds might constitute the most serious potential debris-flow hazards in the event of a large-scale wildfire and subsequent rainfall in the Jemez Mountains. Potential probabilities and estimated volumes of postwildfire debris flows in both the unburned and previously burned areas of the Jemez Mountains and surrounding areas were estimated using empirical debris-flow models developed by the U.S. Geological Survey in combination with fire behavior and burn probability models developed by the U.S. Forest Service.Of the 4,998 subbasins modeled for this study, computed debris-flow probabilities in 671 subbasins were greater than 80 percent in response to the 100-year recurrence interval, 30-minute duration rainfall event. These subbasins ranged in size from 0.01 to 6.57 square kilometers (km2), with an average area of 0.29 km2, and were mostly steep, upstream tributaries to larger channels in the area. Modeled debris-flow volumes in 465 subbasins were greater than 10,000 cubic meters (m3), and 14 of those subbasins had modeled debris‑flow volumes greater than 100,000 m3.The rankings of integrated relative debris-flow hazard indexes for each subbasin were generated by multiplying the individual subbasin values for debris-flow volume, debris‑flow probability, and average burn probability. 
The subbasins with integrated hazard index values in the top 2 percent typically are large, upland tributaries to canyons and channels primarily in the Upper Rio Grande and Rio Grande-Santa Fe watershed areas. No subbasins in this group have basin areas less than 1.0 km2. Many of these areas already had significant mass-wasting episodes following the Las Conchas Fire in 2011. Other subbasins with integrated hazard index values in the top 2 percent are scattered throughout the Jemez River watershed area, including some subbasins in the interior of the Valles Caldera. Only a few subbasins in the top integrated hazard index group are in the Rio Chama watershed area. This prewildfire assessment approach is valuable to resource managers because the analysis of the debris-flow threat is made before a wildfire occurs, which facilitates prewildfire management, planning, and mitigation. In north-central New Mexico, widespread watershed restoration efforts are being undertaken to safeguard vital watersheds against the threat of catastrophic wildfire. This study was designed to help select ideal locations for restoration efforts that could have the best return on investment.
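The integrated index described, the product of modeled volume, debris-flow probability, and average burn probability per subbasin, can be sketched as a simple ranking; the subbasin names and values are invented for illustration, not the report's data:

```python
# Integrated relative hazard index = volume x P(debris flow) x P(burn),
# computed per subbasin and used to rank (illustrative values only).
subbasins = {
    "SB-101": {"volume_m3": 120_000.0, "p_debris": 0.85, "p_burn": 0.30},
    "SB-214": {"volume_m3": 8_000.0, "p_debris": 0.95, "p_burn": 0.60},
    "SB-377": {"volume_m3": 45_000.0, "p_debris": 0.40, "p_burn": 0.10},
}

index = {name: v["volume_m3"] * v["p_debris"] * v["p_burn"]
         for name, v in subbasins.items()}
ranked = sorted(index, key=index.get, reverse=True)
```

Because the three factors multiply, a subbasin with a very high debris-flow probability but small modeled volume can still rank below a large-volume basin, which matches the report's finding that the top index group contains only larger tributaries.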

  10. An integrated decision model for the application of airborne sensors for improved response to accidental and terrorist chemical vapor releases

    NASA Astrophysics Data System (ADS)

    Kapitan, Loginn

    This research created a new model that provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture that produced a modular structure for integrating both new and existing components into a logical procedure for assessing the application of airborne sensor systems to chemical vapor hazards. The resulting integrated process model includes an internal aggregation model that allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model fills a current gap in capability, allowing improved planning, training, and exercises for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. The research process also generated insights into the current response structure and how current airborne capability may be used most effectively. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training, and preparedness exercising hold the prospect of effective application of airborne assets for improved response to large-scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist releases of hazardous industrial chemicals. 
With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.

  11. Integrated approach for coastal hazards and risks in Sri Lanka

    NASA Astrophysics Data System (ADS)

    Garcin, M.; Desprats, J. F.; Fontaine, M.; Pedreros, R.; Attanayake, N.; Fernando, S.; Siriwardana, C. H. E. R.; de Silva, U.; Poisson, B.

    2008-06-01

    The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean underscored the importance of understanding and accounting for coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (some 30,000 dead, 1 million people left homeless, and 70% of the fishing fleet destroyed). Following this tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. This project aims to define, at a pilot site, a methodology for multiple coastal hazards assessment that might be useful for the post-tsunami reconstruction and for development planning. This methodology could be applied to the whole coastline of Sri Lanka. The multi-hazard approach deals with very different coastal processes in terms of dynamics as well as return period. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study; although less dramatic than the tsunami, these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka. This GIS integrates the physical and demographic data available in Sri Lanka that are useful for assessing coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during monsoon periods as well as of the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated with field data acquired during the project. The results obtained from the models can be further integrated into the GIS, contributing to its enrichment and helping to better assess and mitigate these risks. 
The coastal-hazards-and-risks GIS coupled with modelling thus appears to be a very useful tool that can constitute the skeleton of a coastal zone management system. Decision makers will be able to make informed choices with regard to hazards during reconstruction and urban planning projects.

  12. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  13. Architecting the Safety Assessment of Large-scale Systems Integration

    DTIC Science & Technology

    2009-12-01

    Electromagnetic Radiation to Ordnance (HERO); Hazards of Electromagnetic Radiation to Fuel (HERF). The main reason that this particular safety study... radiation, high voltage electric shocks and explosives safety. 1. Radiation Hazards (RADHAZ) RADHAZ describes the hazards of electromagnetic radiation... OP3565/NAVAIR 16-1-529 [19 and 20], these hazards are segregated as follows: Hazards of Electromagnetic

  14. A review of multi-risk methodologies for natural hazards: Consequences and challenges for a climate change impact assessment.

    PubMed

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio

    2016-03-01

    This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to synthesize information about multiple climate impacts, and the spatial analysis and ranking of risks, including their visualization and communication to end-users. To address these issues, climate impact assessors should develop cross-sectorial collaborations across different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process.

  15. The evolution of a health hazard assessment database management system for military weapons, equipment, and materiel.

    PubMed

    Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C

    2002-04-01

    During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.

  16. Aircraft Loss of Control: Problem Analysis for the Development and Validation of Technology Solutions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Newman, Richard L.; Crider, Dennis A.; Klyde, David H.; Foster, John V.; Groff, Loren

    2016-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes. LOC can result from a wide spectrum of precursors (or hazards), often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and the validation process must provide a means of assessing system effectiveness and coverage of these hazards. This paper provides a detailed description of a methodology for analyzing LOC as a dynamics and control problem for the purpose of developing effective technology solutions. The paper includes a definition of LOC based on several recent publications, a detailed description of a refined LOC accident analysis process that is illustrated via selected example cases, and a description of planned follow-on activities for identifying future potential LOC risks and the development of LOC test scenarios. Some preliminary considerations for LOC of Unmanned Aircraft Systems (UAS) and for their safe integration into the National Airspace System (NAS) are also discussed.

  17. Regional landslide susceptibility assessment using multi-stage remote sensing data along the coastal range highway in northeastern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Fang; Huang, Wei-Kai; Chang, Yu-Lin; Chi, Shu-Yeong; Liao, Wu-Chang

    2018-01-01

    Typhoons Megi (2010) and Saola (2012) brought torrential rainfall that triggered regional landslides and flooding hazards along Provincial Highway No. 9 in northeastern Taiwan. To reduce property loss and save lives, this study combines multi-hazard susceptibility assessment with environmental geology maps, a rock mass rating (RMR) system, remote sensing analysis, and micro-topography interpretation to develop an integrated landslide hazard assessment approach that reflects the intrinsic state of the slopeland from the past toward the future. First, the degree of hazard indicated by historical landslides was used to identify past landslide regions. Secondly, geo-mechanical classification of rock outcroppings was performed by in-situ investigation along the vulnerable road sections. Finally, a high-resolution digital elevation model extracted from airborne LiDAR, together with multi-temporal remote sensing images, was analyzed to identify possible catastrophic landslide hotspots in the near future. The results of the analysis showed that 37% of the road sections in the study area were highly susceptible to landslide hazards. The spatial distribution of the road sections revealed that those characterized by high susceptibility were located near the boundaries of fault zones and in areas of lithologic dissimilarity. Headward erosion of gullies and concave-shaped topographic features had an adverse effect and were the dominant factors triggering landslides. Hazard statistics indicate that regional landslide reactivation along this coastal highway is closely related to past landslide regions. Field validation demonstrated that an accuracy of 91% could be achieved in forecasting geohazards following intense rainfall events and typhoons.

  18. Map Your Hazards! - an Interdisciplinary, Place-Based Educational Approach to Assessing Natural Hazards, Social Vulnerability, Risk and Risk Perception.

    NASA Astrophysics Data System (ADS)

    Brand, B. D.; McMullin-Messier, P. A.; Schlegel, M. E.

    2014-12-01

    'Map your Hazards' is an educational module developed within the NSF Interdisciplinary Teaching about Earth for a Sustainable Future program (InTeGrate). The module engages students in place-based explorations of natural hazards, social vulnerability, and the perception of natural hazards and risk. Students integrate geoscience and social science methodologies to (1) identify and assess hazards, vulnerability and risk within their communities; (2) distribute, collect and evaluate survey data (designed by the authors) on the knowledge, risk perception and preparedness within their social networks; and (3) deliver a PPT presentation to local stakeholders detailing their findings and recommendations for the development of a prepared, resilient community. 'Map your Hazards' underwent four rigorous assessments by a team of geoscience educators and external review before being piloted in our classrooms. The module was piloted in a 300-level 'Volcanoes and Society' course at Boise State University, a 300-level 'Environmental Sociology' course at Central Washington University, and a 100-level 'Natural Disasters and Environmental Geology' course at the College of Western Idaho. In all courses students reported a fascination with learning about the hazards around them and identifying the high-risk areas in their communities. They were also surprised at the low level of knowledge, inaccurate risk perception and lack of preparedness of their social networks. This successful approach to engaging students in an interdisciplinary, place-based learning environment also has the broader benefit of raising awareness of natural hazards (survey participants are provided links to local hazard and preparedness information). The data and preparedness suggestions can be shared with local emergency managers, who are encouraged to attend the students' final presentations. All module materials are published at serc.carleton.edu/integrate/ and are appropriate to a wide range of classrooms.

  19. Integrated Geo Hazard Management System in Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can reduce environmental health and cause huge economic losses, especially in mountainous areas. To mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services can provide stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operations needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and can be controlled remotely by commands to collect data using a "cloud" computing system. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud" system. The system will later be used as part of development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.

  20. Safer Schools: Achieving a Healthy Learning Environment through Integrated Pest Management.

    ERIC Educational Resources Information Center

    2003

    Integrated pest management (IPM) is a program of prevention, monitoring, and control that offers the opportunity to eliminate or drastically reduce hazardous pesticide use. IPM is intended to establish a program that uses cultural, mechanical, biological, and other non-toxic practices, and only introduces least-hazardous chemicals as a last…

  1. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth's system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are interspersed with residential areas. The Marmara region, located in northwestern Turkey, has suffered from natural hazards (earthquakes, floods etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as they provide more efficient and reliable analysis and evaluation of data, as well as convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers. Additionally, satellite data are used to understand changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all the different types of digital data to users. Proper data collection, processing and analysis are critical to evaluating and identifying hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of metropolitan cities.

  2. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as severe damage to natural resources. The local geology, steep slopes, and high-intensity rainfall, together with unplanned human activities, cause many landslides in this region. The study area attracts tourists throughout the year, so preventive measures must be considered. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping, as they enable the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method is used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road, and NDVI. These factor layers were extracted from the related spatial data. The factors were evaluated, and individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and ratings assigned by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV. 
About 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in zone I. The resulting hazard zone map and the land-use/land-cover map were then overlaid to check hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
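    As a rough illustration of the weighted-overlay step described above, the index for a single map cell can be sketched as LHZI = sum of (factor weight x class rating); all factor names, weights, ratings, and zone thresholds below are invented for illustration and are not the study's values.

```python
# Hypothetical sketch of an AHP-based weighted overlay for one map cell.
# Weights (summing to 1) would come from AHP pairwise comparisons; ratings
# are the class scores of the cell in each factor layer. Values are invented.
factor_weights = {"slope": 0.30, "rainfall": 0.25, "ndvi": 0.20,
                  "lulc": 0.15, "drainage_density": 0.10}
cell_ratings = {"slope": 4, "rainfall": 3, "ndvi": 3,
                "lulc": 2, "drainage_density": 1}

# Landslide Hazard Zone Index for this cell
lhzi = sum(w * cell_ratings[f] for f, w in factor_weights.items())

# Classify into four hazard zones (thresholds illustrative only)
zone = next(z for z, lo in [("IV", 3.5), ("III", 2.5), ("II", 1.5), ("I", 0)]
            if lhzi >= lo)
print(lhzi, zone)  # 2.95 III
```

    Applying this per cell across the study area yields the cumulative map that is then split into zones I-IV.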

  3. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, an archipelago of seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with large amounts of data that, as in the case of the Canary Islands, are sometimes not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000 scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information for long-term volcanic hazard analysis. 
HADA will provide information of sufficient quality to map volcanic hazards and to run more reliable volcanic hazard models; in addition, it aims to become a sharing system that improves communication between researchers, reduces redundant work, and serves as the reference for geological research in the Canary Islands.

  4. The Gars Programme And The Integrated Global Observing Strategy For Geohazards

    NASA Astrophysics Data System (ADS)

    Marsh, S.; Paganini, M.; Missotten, R.; Palazzo, F.

    UNESCO and the IUGS have funded the Geological Applications of Remote Sensing Programme (GARS) since 1984. Its aim is to assess the value and utility of remotely sensed data for geoscience, while at the same time building capacity in developing countries. It has run projects in Africa on geological mapping, in Latin America on landslide hazards and in Asia on volcanic hazards. It is a main sponsor of the Integrated Global Observing Strategy (IGOS) for Geohazards. The societal impact of geological and related geophysical hazards is enormous. Every year volcanoes, earthquakes, landslides and subsidence claim thousands of lives, injure thousands more, devastate homes and destroy livelihoods. Damaged infrastructure and insurance premiums increase these costs. As population increases, more people live in hazardous areas and the impact grows. The World Summit on Sustainable Development recognised that systematic, joint international observations under initiatives like the Integrated Global Observing Strategy form the basis for an integrated approach to hazard mitigation and preparedness. In this context, the IGOS Partners developed this geohazards theme. Its goal is to integrate disparate, multidisciplinary, applied research into global, operational systems by filling gaps in organisation, observation and knowledge. It has four strategic objectives: building global capacity to mitigate geohazards; improving mapping, monitoring and forecasting, based on satellite and ground-based observations; increasing preparedness, using integrated geohazards information products and improved geohazards models; and promoting global take-up of local best practice in geohazards management. Gaps remain between what is known and the knowledge required to answer citizens' questions; between what is observed and what must be observed to provide the necessary information for hazard mitigation; and between current data integration and the integration needed to make useful geohazard information products. 
An action plan is proposed that is designed to close these gaps. Priority actions are to: begin networking within the geohazards community; improve topographic data provision using existing observations and secure continuity of C- and L-Band radar interferometry with the space agencies; assess the potential for existing data to be integrated into geohazard products and services; evaluate ways to improve databases with their managing agencies; and initiate research that increases geohazards knowledge. This paper presents the strategy and describes the action plan that will implement it over the next decade, as a key part of the GARS Programme.

  5. Community clusters of tsunami vulnerability in the US Pacific Northwest

    USGS Publications Warehouse

    Wood, Nathan J.; Jones, Jeanne M.; Spielman, Seth; Schmidtlein, Mathew C.

    2015-01-01

    Many coastal communities throughout the world are threatened by local (or near-field) tsunamis that could inundate low-lying areas in a matter of minutes after generation. Although the hazard and sustainability literature often frames vulnerability conceptually as a multidimensional issue involving exposure, sensitivity, and resilience to a hazard, assessments often focus on one element or do not recognize the hazard context. We introduce an analytical framework for describing variations in population vulnerability to tsunami hazards that integrates (i) geospatial approaches to identify the number and characteristics of people in hazard zones, (ii) anisotropic path distance models to estimate evacuation travel times to safety, and (iii) cluster analysis to classify communities with similar vulnerability. We demonstrate this approach by classifying 49 incorporated cities, 7 tribal reservations, and 17 counties from northern California to northern Washington that are directly threatened by tsunami waves associated with a Cascadia subduction zone earthquake. Results suggest three primary community groups: (i) relatively low numbers of exposed populations with varied demographic sensitivities, (ii) high numbers of exposed populations but sufficient time to evacuate before wave arrival, and (iii) moderate numbers of exposed populations but insufficient time to evacuate. Results can be used to enhance general hazard-awareness efforts with targeted interventions, such as education and outreach tailored to local demographics, evacuation training, and/or vertical evacuation refuges.
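    The cluster-analysis step that groups communities with similar vulnerability can be sketched with a minimal k-means over two illustrative measures (a normalized exposed-population score and an evacuation-time deficit, i.e. travel time minus wave arrival time); the community names, values, measures, and k = 3 below are assumptions for illustration, not the study's data or variables.

```python
# Minimal k-means sketch (stdlib only) of grouping communities with similar
# vulnerability profiles. Each community is (exposure score, time deficit in
# minutes); a positive deficit means evacuation takes longer than wave arrival.
import math

communities = {
    "Town A": (0.1, -20.0),   # few exposed, ample time to evacuate
    "Town B": (0.9, -15.0),   # many exposed, still enough time
    "Town C": (0.5, 10.0),    # moderate exposure, insufficient time
    "Town D": (0.2, -18.0),
    "Town E": (0.8, -12.0),
    "Town F": (0.6, 12.0),
}

def kmeans(points, k, iters=20):
    # Deterministic init: first k points as starting centers
    centers = points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[i].append(p)
        # Recompute each center as the mean of its group
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(list(communities.values()), k=3)
print([len(g) for g in groups])  # [2, 2, 2]
```

    A production analysis would normalize and weight many more demographic variables, but the grouping logic is the same: communities land in the cluster whose vulnerability profile they most resemble.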

  6. Passive imaging based multi-cue hazard detection for spacecraft safe landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Cheng, Yang; Madison, Richard

    2006-01-01

    Accurate assessment of potentially damaging ground hazards during the spacecraft EDL (Entry, Descent and Landing) phase is crucial to ensure a high probability of safe landing. A lander that encounters a large rock, falls off a cliff, or tips over on a steep slope can sustain mission-ending damage. Guided entry is expected to shrink landing ellipses from 100-300 km to ~10 km radius for the second-generation landers as early as 2009. Regardless of size and location, however, landing ellipses will almost always contain hazards such as craters, discontinuities, steep slopes, and large rocks. It is estimated that an MSL (Mars Science Laboratory)-sized lander should detect and avoid craters 16-150 m in diameter and vertical drops similar to the edges of 16 m or 3.75 m diameter craters, for high- and low-altitude HDA (Hazard Detection and Avoidance) respectively. It should also be able to detect slopes 20° or steeper, and rocks 0.75 m or taller. In this paper we present a passive imaging based, multi-cue hazard detection and avoidance (HDA) system suitable for Martian and other lander missions. This is the first passively imaged HDA system that seamlessly integrates multiple algorithms (crater detection, slope estimation, rock detection and texture analysis) and multiple cues (crater morphology, rock distribution) to detect these hazards in real time.
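    The detection criteria quoted above (craters 16-150 m in diameter, slopes of at least 20 degrees, rocks at least 0.75 m tall) can be expressed as a simple screening check; the function and feature values below are illustrative only and are not part of the paper's HDA system.

```python
# Illustrative hazard-screening check using the MSL-sized-lander criteria
# quoted in the abstract. Feature values are invented.
def is_hazard(feature):
    kind, size = feature
    if kind == "crater":
        return 16.0 <= size <= 150.0   # diameter, m
    if kind == "slope":
        return size >= 20.0            # degrees
    if kind == "rock":
        return size >= 0.75            # height, m
    return False

features = [("crater", 40.0), ("rock", 0.5), ("slope", 25.0)]
print([is_hazard(f) for f in features])  # [True, False, True]
```

    The actual system infers these quantities from passive imagery (crater detection, slope estimation, rock detection, texture analysis) before any such thresholding can be applied.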

  7. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytical Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index and peak ground acceleration. Weight and rank values are determined during the AHP and assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived using the GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as a high hazard zone compared with other surrounding places. The produced microzonation map is envisaged as a first-level assessment of site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in future land-use planning and environmental hazard management.

  8. Community clusters of tsunami vulnerability in the US Pacific Northwest.

    PubMed

    Wood, Nathan J; Jones, Jeanne; Spielman, Seth; Schmidtlein, Mathew C

    2015-04-28

    Many coastal communities throughout the world are threatened by local (or near-field) tsunamis that could inundate low-lying areas in a matter of minutes after generation. Although the hazard and sustainability literature often frames vulnerability conceptually as a multidimensional issue involving exposure, sensitivity, and resilience to a hazard, assessments often focus on one element or do not recognize the hazard context. We introduce an analytical framework for describing variations in population vulnerability to tsunami hazards that integrates (i) geospatial approaches to identify the number and characteristics of people in hazard zones, (ii) anisotropic path distance models to estimate evacuation travel times to safety, and (iii) cluster analysis to classify communities with similar vulnerability. We demonstrate this approach by classifying 49 incorporated cities, 7 tribal reservations, and 17 counties from northern California to northern Washington that are directly threatened by tsunami waves associated with a Cascadia subduction zone earthquake. Results suggest three primary community groups: (i) relatively low numbers of exposed populations with varied demographic sensitivities, (ii) high numbers of exposed populations but sufficient time to evacuate before wave arrival, and (iii) moderate numbers of exposed populations but insufficient time to evacuate. Results can be used to enhance general hazard-awareness efforts with targeted interventions, such as education and outreach tailored to local demographics, evacuation training, and/or vertical evacuation refuges.
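    The cluster-analysis step described above can be sketched with a toy k-means over two vulnerability attributes. The communities and values below are invented (the study classified 73 communities using richer demographic and evacuation-time data).

```python
import numpy as np

# Minimal k-means sketch of grouping communities with similar vulnerability.
# Columns: exposed population (thousands), surplus evacuation time (minutes,
# negative = cannot reach safety before wave arrival). Values are invented.
def kmeans(X, k, iters=100):
    X = np.asarray(X, dtype=float)
    centers = X[:k].copy()              # simple deterministic initialisation
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

X = [[0.5, 20], [0.7, 18],    # few exposed, time to spare
     [8.0, 15], [9.0, 12],    # many exposed, still enough time
     [4.0, -5], [5.0, -8]]    # moderate exposure, insufficient time
labels, _ = kmeans(X, 3)
print(labels.tolist())  # → [0, 0, 1, 1, 2, 2]
```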

  9. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  10. Guidance Manual for Integrating Hazardous Material Control and Management into System Acquisition Programs

    DTIC Science & Technology

    1993-04-01

    …Ensure that system safety, health hazards, and environmental … use of hazardous materials is controlled in a manner which protects human health and the environment at the least cost. Hazardous Material Control and Management…

  11. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
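    The exceedance probability that a PTHA hazard curve reports is commonly computed from a Poisson occurrence model. A minimal sketch, with an illustrative rate (the abstract does not give numerical values):

```python
import math

# Poisson hazard-curve sketch: probability that a tsunami intensity threshold
# (e.g. a run-up height) is exceeded at least once during an exposure time T,
# given the mean annual rate of exceedance for that threshold.
def prob_exceedance(annual_rate, exposure_years):
    return 1.0 - math.exp(-annual_rate * exposure_years)

# e.g. a run-up threshold exceeded on average once every 500 years,
# evaluated over a 50-year exposure time:
p = prob_exceedance(1 / 500, 50)
print(round(p, 4))  # ≈ 0.0952
```

    Repeating this over a range of intensity thresholds at a target site traces out the hazard curve the abstract refers to.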

  12. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  13. Geological hazard zonation in a marble exploitation area (Apuan Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Francioni, M.; Salvini, R.; Riccucci, S.

    2011-12-01

    The present paper describes the hazard mapping of an exploitation area in the Apuan Alps marble district (Italy), carried out by integrating various survey and analysis methodologies. The research, supported by the Massa and Carrara Local Sanitary Agency responsible for workplace health and safety activities, aimed to reduce the high rock fall hazard caused by potentially unstable blocks located on slopes overhanging the marble quarries. The study of rock faces is based on knowledge of both the structural setting and the physical-mechanical properties of the intact material and its discontinuities. In this work the main difficulty in obtaining this information was the inaccessibility of the slope overhanging the area (up to 500 meters high). For this reason, the structural and geological-engineering surveys were integrated with results from digital photogrammetry carried out on terrestrial stereoscopic photos acquired from an aerostatic balloon and a helicopter. In this way it was possible to derive the geometrical characteristics of joints (such as dip, dip direction, spacing and persistence), block volumes and slope morphology even in inaccessible areas. This information, combined with data from the geological-engineering survey, was used to perform the stability analysis of the slope. Subsequently, using the topographic map at the scale of 1:2,000, the Digital Terrain Model (DTM) of the slopes and several topographic profiles along them were produced. Assuming a good correspondence between travelling paths and the maximum down-slope angle, probable rock fall trajectories along the slope were calculated on the DTM by means of a GIS procedure that uses the ArcHydro module of Esri ArcMap software. When performing such 2D numerical modelling of rock falls, the lateral dispersion of trajectories is hampered by the a priori choice of the travelling path. Such a choice is largely subjective and can lead to errors. Thus, rock fall hazard zonation needs spatially distributed analyses, including reliable modelling of lateral dispersion. In this research Conefall, a freeware Quanterra code that estimates potential run-out areas by means of the so-called cone method, was used to compute the spatial distribution of rock fall frequency, velocities and kinetic energies. In this way, a modelling approach based on local morphologies was employed to assess the accuracy of the 2D analysis performed on profiles created a priori along the maximum down-slope angle. The final results on slope stability and run-out allowed the creation of a rock fall hazard map and the recommendation of the most suitable protection works to mitigate the hazard at the most risky sites.
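    The cone method used by Conefall reduces, in its simplest form, to a geometric reachability test: a terrain cell lies inside the potential run-out area if the line from the source down to the cell is steeper than the energy-line (cone) angle. A hedged sketch, with an illustrative 35° angle and invented coordinates:

```python
import math

# Simplified cone-method test: a cell is reachable by a falling block if the
# angle of the source-to-cell line (below horizontal) exceeds the cone angle.
# The 35-degree cone angle and the coordinates are illustrative only.
def reachable(source_xyz, cell_xyz, cone_angle_deg=35.0):
    sx, sy, sz = source_xyz
    cx, cy, cz = cell_xyz
    horiz = math.hypot(cx - sx, cy - sy)   # horizontal distance
    drop = sz - cz                         # vertical drop to the cell
    if drop <= 0:
        return False                       # cell is not below the source
    return math.degrees(math.atan2(drop, horiz)) >= cone_angle_deg

src = (0.0, 0.0, 500.0)
print(reachable(src, (100.0, 0.0, 380.0)))  # 120 m drop over 100 m → True
print(reachable(src, (400.0, 0.0, 380.0)))  # 120 m drop over 400 m → False
```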

  14. Locating Errors Through Networked Surveillance: A Multimethod Approach to Peer Assessment, Hazard Identification, and Prioritization of Patient Safety Efforts in Cardiac Surgery.

    PubMed

    Thompson, David A; Marsteller, Jill A; Pronovost, Peter J; Gurses, Ayse; Lubomski, Lisa H; Goeschel, Christine A; Gosbee, John W; Wahr, Joyce; Martinez, Elizabeth A

    2015-09-01

    The objectives were to develop a scientifically sound and feasible peer-to-peer assessment model that allows health-care organizations to evaluate patient safety in cardiovascular operating rooms and to establish safety priorities for improvement. The Locating Errors Through Networked Surveillance study was conducted to identify hazards in cardiac surgical care. The study was conducted by a multidisciplinary team drawn from organizational sociology, organizational psychology, applied social psychology, clinical medicine, human factors engineering, and health services research. We used a transdisciplinary approach, which integrated the theories, concepts, and methods of each discipline, to develop comprehensive research methods. Multiple data collection methods were involved: a focused literature review of cardiac surgery-related adverse events, a retrospective analysis of cardiovascular events from a national database in the United Kingdom, and prospective peer assessment at 5 sites, involving survey assessments, structured interviews, direct observations, and contextual inquiries. A nominal group methodology, in which a single group works to solve problems and make decisions, was used to review the data and develop a list of the top-priority hazards. The top 6 priority hazard themes were as follows: safety culture, teamwork and communication, infection prevention, transitions of care, failure to adhere to practices or policies, and operating room layout and equipment. We integrated the theories and methods of a diverse group of researchers to identify a broad range of hazards and good clinical practices within the cardiovascular surgical operating room. Our findings were the basis for a plan to prioritize improvements in cardiac surgical care. These study methods allowed for the comprehensive assessment of a high-risk clinical setting and may translate to other clinical settings.

  15. [Design of a Hazard Analysis and Critical Control Points (HACCP) plan to assure the safety of a bologna product produced by a meat processing plant].

    PubMed

    Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar

    2004-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a systematic, integral program used to identify and estimate the hazards (microbiological, chemical and physical) and the risks generated during the primary production, processing, storage, distribution, sale and consumption of foods. Establishing an HACCP program has several advantages: it emphasizes prevention over detection, lowers costs, minimizes the risk of manufacturing faulty products, gives management greater confidence, and strengthens national and international competitiveness, among others. The present work proposes the design of an HACCP program to guarantee the safety of the Special-Type Bologna elaborated by a meat products industry, through the determination of hazards (microbiological, chemical or physical), the identification of critical control points (CCP), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology was based on the application of the seven basic principles established by the Codex Alimentarius, resulting in the design of this program. Given that meat products have recently been linked with pathogens such as E. coli O157:H7 and Listeria monocytogenes, these were considered as microbiological hazards in establishing the HACCP plan, whose application will guarantee a safe product.

  16. An information diffusion technique to assess integrated hazard risks.

    PubMed

    Huang, Chongfu; Huang, Yundong

    2018-02-01

    An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of disaster loss. Because assessing an integrated probability risk from a small sample is difficult, weighting methods and copulas are usually employed to avoid this obstacle. To resolve the problem directly, in this paper we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. An integrated risk can then be assessed directly from a small sample. A case of integrated risk caused by flood and earthquake shows how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
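    The core idea of information diffusion can be sketched in one dimension: each observation spreads its unit of information over a grid of monitoring points via a Gaussian kernel, yielding a usable probability distribution from a handful of data points. This is a hedged sketch of the one-dimensional estimator only, not the paper's joint-distribution and vulnerability-surface construction; sample values and bandwidth are invented.

```python
import numpy as np

# One-dimensional normal information diffusion: a small-sample alternative
# to a histogram. Each observation carries one unit of information, diffused
# over the grid with a Gaussian kernel of bandwidth h.
def diffuse(sample, grid, h):
    sample = np.asarray(sample, float)[:, None]
    grid = np.asarray(grid, float)[None, :]
    q = np.exp(-((sample - grid) ** 2) / (2 * h ** 2))  # diffusion matrix
    q = q / q.sum(axis=1, keepdims=True)  # each observation sums to 1 unit
    p = q.sum(axis=0)
    return p / p.sum()                    # normalized probability distribution

sample = [2.1, 2.9, 3.4, 4.8]             # e.g. four observed loss magnitudes
grid = np.linspace(0, 8, 17)              # monitoring points
p = diffuse(sample, grid, h=0.8)
print(round(float(p.sum()), 6))  # 1.0
```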

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kienhuis, Anne S., E-mail: anne.kienhuis@rivm.nl; RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen; Netherlands Toxicogenomics Centre

    Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, to progress from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic in vivo and in vitro studies for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim to extrapolate animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to quantitative dose-response analysis needed for hazard characterization, in hepatic systems toxicology studies, generation of toxicogenomic data of multiple doses/concentrations and time points is required. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for induction of the apical effect and whether perturbations are sufficient for the induction of downstream events, eventually causing toxicity.

  18. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in risk analysis and assessment. Risk analysis has to consider both the hazard occurrence and the consequences. The consequences can be direct and indirect. The former include the costs of repairing the roadway, damage to vehicles and potential fatalities, while the latter refer to costs related to the diversion of vehicles, the excess distance travelled, time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at small (for example, national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for assessing the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the particular characteristics of the roadway, such as traffic, number of lanes, and velocity). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
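    Once every hazard at a roadway section is expressed as an annual probability of occurrence, as the abstract describes, a standard way to combine them (assuming the hazards are independent, which the paper may refine) is through the complement of the no-event probability. The probabilities below are invented:

```python
# Sketch of combining multiple hazards expressed on a common probability
# scale: the annual chance of at least one event at a roadway section,
# assuming independence. Values are illustrative, not from the study.
def combined_annual_probability(probs):
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# rockfall, debris flow, retaining-wall failure (invented annual probabilities)
p = combined_annual_probability([0.02, 0.01, 0.005])
print(round(p, 4))  # ≈ 0.0347
```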

  19. A new web-based course: dealing with glaciers and permafrost hazards

    NASA Astrophysics Data System (ADS)

    Oswald, S.; Kaeaeb, A.; Haeberli, W.

    2003-04-01

    The intensive human use of high mountains increasingly intersects with the hazard zones of these environments. Because of the complexity of the processes and impacts involved, dealing with such risks requires a broad education in many sub-domains of the earth sciences and the socio-economic field. Inter- and trans-disciplinary training and education of professionals is therefore essential. The goal of the Swiss Virtual Campus project "Dealing with Natural Hazards" is thus to provide a course program covering the basics of dealing with natural hazards, including technical, environmental and social aspects. In the field of natural hazards and risk management, education at the Swiss universities is mostly structured in narrow sectors. Using the advantages of the internet, the Virtual Campus provides teachers and students with an interdisciplinary discussion platform on the integral approach to, and handling of, natural hazards. The course content is organised in 5 modules: 1 basic knowledge and tools, 2 hydrological / meteorological hazards, 3 geological hazards, 4 vulnerability of property and of socio-economic systems and 5 integral natural risk management. To ensure national and international access, the courses are designed in English and published on the internet. Within the scope of this project we are developing lessons in the subject area of natural hazards related to glaciers and permafrost: ice avalanches, glacier floods, glacier length variations and permafrost. 
The content is divided into chapters, which are consistent over the entire module: (1) processes: characterisation of the different processes, (2) triggering: initiating events, (3) data acquisition, mapping and monitoring: appropriate methods, (4) estimation models: application of the adequate model, (5) combinations and interactions: interrelation and impacts of different hazards, (6) long-term effects: global change effects, (7) integral hazard recognition and assessment: integral proceedings, (8) measures: appropriate protection measures, (9) examples: different cases from throughout the world. It is our goal to design the e-lessons in an interactive way, to utilise the benefits of computer-based learning. The course will replace the classical "ex-cathedra" way of teaching with problem-based learning. After working out the basics individually, the students shall have the opportunity to discuss and apply their acquired knowledge by working through case studies. It is also planned to use the course for capacity building in developing countries.

  20. Toxics Release Inventory Chemical Hazard Information Profiles (TRI-CHIP) Dataset

    EPA Pesticide Factsheets

    The Toxics Release Inventory (TRI) Chemical Hazard Information Profiles (TRI-CHIP) dataset contains hazard information about the chemicals reported in TRI. Users can use this XML-format dataset to create their own databases and hazard analyses of TRI chemicals. The hazard information is compiled from a series of authoritative sources including the Integrated Risk Information System (IRIS). The dataset is provided as a downloadable .zip file that when extracted provides XML files and schemas for the hazard information tables.

  1. A comparative evaluation of five hazard screening tools.

    PubMed

    Panko, J M; Hitchcock, K; Fung, M; Spencer, P J; Kingsbury, T; Mason, A M

    2017-01-01

    An increasing number of hazard assessment tools and approaches are being used in the marketplace as a means to differentiate products and ingredients with lower versus higher hazards, or to certify what some call greener chemical ingredients in consumer products. Some leading retailers have established policies for product manufacturers and their suppliers to disclose chemical ingredients and their related hazard characteristics, often specifying which tools to use. To date, no data exist that show a tool's reliability in providing consistent, credible screening-level hazard scores that can inform greener product selection. We conducted a small pilot study to understand and compare the hazard scoring of several hazard screening tools and to determine whether hazard and toxicity profiles for chemicals differ. Seven chemicals were selected that represent both natural and man-made chemistries as well as a range of toxicological activity. We conducted the assessments according to each tool provider's guidelines, which included factors such as endpoints, weighting preferences, sources of information, and treatment of data gaps. The results indicate that the tools varied in the level of discrimination seen in the scores for these 7 chemicals and that classifications of the same chemical varied widely between tools, ranging from little or no hazard or toxicity to very high hazard or toxicity. The results also highlight the need for transparency in describing the basis for a tool's hazard scores and suggest possible enhancements. Based on this pilot study, tools should not be generalized to fit all situations because their evaluations are context-specific. Before choosing a tool or approach, it is critical that the assessment rationale be clearly defined and match the selected tool or approach. Integr Environ Assess Manag 2017;13:139-154. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.

  2. Architecture Design for the Space Situational Awareness System in the Preparedness Plan for Space Hazards of Republic of Korea

    NASA Astrophysics Data System (ADS)

    Choi, E.; Cho, S.; Shin, S.; Park, J.; Kim, J.; Kim, D.

    The threat posed by asteroids and comets has become an important issue. The Jinju meteorite, discovered in March 2014, heightened public interest in the fall of natural space objects. Furthermore, the growing quantity of space debris is a serious threat to satellites and other spacecraft, which risk being damaged or even destroyed. In May 2014, Korea established the preparedness plan for space hazards under the space development promotion act, which was amended to allow action with respect to hazards from space. This plan is composed of three main items: system, technology and infrastructure. The system item includes the establishment and management of a national space hazards headquarters for risk situations. The Korea Astronomy and Space Science Institute (KASI) was designated as the space environment monitoring agency under the Ministry of Science, ICT and Future Planning (MSIP). The technology item covers the development of a space situational awareness system that can monitor and detect space objects. For infrastructure, research and development of core technologies, such as software tools, applications and data systems, will be promoted to improve space hazard preparedness capabilities. This paper presents the architectural design for building the space situational awareness system. A trade-off study of the space situational awareness system for the Korean situation was performed, and its results shaped the proposed architectural design. The baseline architecture is composed of an Integrated Analysis System and a Space Objects Monitoring System. The Integrated Analysis System collects status data from the Space Objects Monitoring System and analyzes space risk information through data processing. For the Space Objects Monitoring System, all-sky surveillance cameras, array radar and meteoroid surveillance sensor networks were considered. 
This system addresses not only the threat of large artificial satellites and natural space objects such as asteroids falling to Earth but also the prediction of potential collisions between space objects. In particular, the array radar aims to accurately track space objects. By analyzing the performance of the radar system and sensor networks, several feasible approaches for such a space objects monitoring system are presented in this paper.

  3. Integrated flood hazard assessment based on spatial ordered weighted averaging method considering spatial heterogeneity of risk preference.

    PubMed

    Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian

    2017-12-01

    Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood-prone areas is of great importance for watershed management and for reducing potential losses of life and property. In this study, a framework of multi-criteria analysis (MCA) incorporating a geographic information system (GIS), the fuzzy analytic hierarchy process (AHP) and the spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. Factors associated with the geographical, hydrological and flood-resistance characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated through the fuzzy AHP method. The OWA method was used to analyze the effects of different risk attitudes of the decision maker on the assessment result. The spatial OWA method, with spatially variable risk preference, was implemented in the GIS environment to integrate the criteria. The advantage of the proposed method is that it considers spatial heterogeneity when assigning risk preference in the decision-making process. The methodology was applied to the area comprising Hanyang, Caidian and Hannan in Wuhan, China, where flood events occur frequently. The resulting flood hazard distribution shows a tendency toward high risk in populated and developed areas, especially the northeastern part of Hanyang, which has suffered frequent floods historically. The result indicates where enhancement projects should be carried out first when resources are limited. Finally, the sensitivity of the criteria weights was analyzed to measure the stability of the results with respect to variations in the weights. The flood hazard assessment method presented in this paper is adaptable to hazard assessment of similar basins, which is of great significance for establishing countermeasures to mitigate losses of life and property. Copyright © 2017 Elsevier B.V. All rights reserved.
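    The OWA aggregation step described above can be sketched in a few lines: criterion scores are first scaled by importance weights, then reordered (largest first) and combined with order weights that encode the decision maker's risk attitude. All scores and weights below are invented; the study derived importance weights by fuzzy AHP and varied the order weights spatially.

```python
import numpy as np

# Minimal OWA sketch. Order weights loaded toward the largest (worst) values
# express a pessimistic/risk-averse attitude; toward the smallest, optimism.
def owa(scores, criterion_weights, order_weights):
    v = np.asarray(scores, float) * np.asarray(criterion_weights, float)
    v_sorted = np.sort(v)[::-1]                        # descending order
    return float(np.dot(np.asarray(order_weights, float), v_sorted))

scores = [0.9, 0.4, 0.6]        # normalized hazard scores for one map cell
cw = [0.5, 0.3, 0.2]            # criterion importance weights (sum to 1)
pessimistic = owa(scores, cw, [0.6, 0.3, 0.1])  # emphasize worst criterion
optimistic = owa(scores, cw, [0.1, 0.3, 0.6])   # emphasize best criterion
print(round(pessimistic, 3), round(optimistic, 3))  # 0.318 0.153
```

    Sweeping the order weights between these extremes, per cell, is what lets a spatially variable risk preference reshape the hazard map.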

  4. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, much better knowledge can be gained of the impact and vulnerability of potential earthquake hazards. This application also demonstrates the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies, and also to support seismology research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on users' workflows and to retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services and user communities will be implemented based on typical use cases. 
In the future, extension of the earthquake wave propagation work to tsunami mitigation will be feasible once the user community support is in place.

  5. Patterns of Risk Using an Integrated Spatial Multi-Hazard Model (PRISM Model)

    EPA Science Inventory

    Multi-hazard risk assessment has long centered on small scale needs, whereby a single community or group of communities’ exposures are assessed to determine potential mitigation strategies. While this approach has advanced the understanding of hazard interactions, it is li...

  6. Application of probabilistic risk assessment: Evaluating remedial alternatives at the Portland Harbor Superfund Site, Portland, Oregon, USA.

    PubMed

    Ruffle, Betsy; Henderson, James; Murphy-Hagan, Clare; Kirkwood, Gemma; Wolf, Frederick; Edwards, Deborah A

    2018-01-01

A probabilistic risk assessment (PRA) was performed to evaluate the range of potential baseline and postremedy health risks to fish consumers at the Portland Harbor Superfund Site (the "Site"). The analysis focused on risks of consuming fish resident to the Site containing polychlorinated biphenyls (PCBs), given that this exposure scenario and contaminant are the primary basis for the US Environmental Protection Agency's (USEPA's) selected remedy per the January 2017 Record of Decision (ROD). The PRA used probability distributions fit to the same data sets used in the deterministic baseline human health risk assessment (BHHRA), as well as recent sediment and fish tissue data, to evaluate the range and likelihood of current baseline cancer risks and noncancer hazards for anglers. Areas of elevated PCBs in sediment were identified on the basis of a geospatial evaluation of the surface sediment data, and the ranges of risks and hazards associated with pre- and postremedy conditions were calculated. The analysis showed that less active remediation (targeted to areas with the highest concentrations) compared to the remedial alternative selected by USEPA in the ROD can achieve USEPA's interim risk management benchmarks (cancer risk of 10⁻⁴ and noncancer hazard index [HI] of 10) immediately postremediation for the vast majority of subsistence anglers that consume smallmouth bass (SMB) fillet tissue. In addition, the same targeted remedy achieves USEPA's long-term benchmarks (10⁻⁵ and HI of 1) for the majority of recreational anglers. Additional sediment remediation would result in negligible additional risk reduction due to the influence of background. The PRA approach applied here provides a simple but adaptive framework for analysis of risks and remedial options focused on variability in exposures.
It can be updated and refined with new data to evaluate and reduce uncertainty, improve understanding of the Site and target populations, and foster informed remedial decision making. Integr Environ Assess Manag 2018;14:63-78. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
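The core of such a PRA can be sketched as a Monte Carlo loop over exposure variables. The distributions, parameters, and the simplified risk equation below are hypothetical placeholders for illustration, not the Site data or USEPA's actual exposure model:

```python
import random

def mc_cancer_risk(n=100_000, seed=42):
    """Illustrative Monte Carlo of fish-consumption cancer risk.
    Simplified risk = conc * intake * slope_factor / body_weight;
    all distribution parameters are assumed, not measured values."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        c = rng.lognormvariate(-1.0, 0.8)   # fish tissue PCB conc. (mg/kg), assumed
        ir = rng.lognormvariate(-3.5, 0.6)  # fish ingestion rate (kg/day), assumed
        sf = 2.0                            # cancer slope factor (mg/kg-day)^-1, assumed
        bw = rng.gauss(70.0, 10.0)          # body weight (kg), assumed
        risks.append(c * ir * sf / max(bw, 30.0))
    risks.sort()
    return risks[int(0.95 * n)]             # 95th-percentile risk

print(f"95th percentile risk: {mc_cancer_risk():.1e}")
```

Running the sampler at different percentiles gives the range-and-likelihood view of risk that distinguishes a PRA from a single deterministic point estimate.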

  7. A metric-based assessment of flood risk and vulnerability of rural communities in the Lower Shire Valley, Malawi

    NASA Astrophysics Data System (ADS)

    Adeloye, A. J.; Mwale, F. D.; Dulanya, Z.

    2015-06-01

In response to the increasing frequency and economic damages of natural disasters globally, disaster risk management has evolved to incorporate risk assessments that are multi-dimensional, integrated and metric-based. This is to support knowledge-based decision making and hence sustainable risk reduction. In Malawi and most of Sub-Saharan Africa (SSA), however, flood risk studies remain focussed on understanding causation, impacts, perceptions and coping and adaptation measures. Using the IPCC Framework, this study has quantified and profiled the flood risk of rural subsistence communities in the Lower Shire Valley, Malawi. Flood risk was obtained by integrating hazard and vulnerability. Flood hazard was characterised in terms of flood depth and inundation area obtained through hydraulic modelling in the valley with LISFLOOD-FP, while vulnerability was indexed through analysis of exposure, susceptibility and capacity linked to social, economic, environmental and physical perspectives. Data on these were collected through structured interviews of the communities. The implementation of the entire analysis within GIS enabled the visualisation of spatial variability in flood risk in the valley. The results show predominantly medium levels of hazardousness, vulnerability and risk. The vulnerability is dominated by a high to very high susceptibility. Economic and physical capacities tend to be predominantly low, but social capacity is significantly high, resulting in overall medium levels of capacity-induced vulnerability. Exposure manifests as medium. The vulnerability and risk showed marginal spatial variability. The paper concludes with recommendations on how these outcomes could inform policy interventions in the Valley.
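The hazard-vulnerability integration described above can be illustrated with a toy per-cell computation. The equal weighting of the three vulnerability components and the sample values are assumptions for illustration, not the study's actual index:

```python
def vulnerability(exposure, susceptibility, capacity):
    """IPCC-style composite vulnerability index.
    Inputs normalised to [0, 1]; higher capacity reduces vulnerability.
    Equal weights are an assumption made for this sketch."""
    return (exposure + susceptibility + (1.0 - capacity)) / 3.0

def flood_risk(hazard, vuln):
    """Risk as the product of hazard and vulnerability indices."""
    return hazard * vuln

# hypothetical grid cells with normalised indicator scores
cells = [
    {"hazard": 0.6, "exposure": 0.5, "susceptibility": 0.8, "capacity": 0.7},
    {"hazard": 0.3, "exposure": 0.4, "susceptibility": 0.5, "capacity": 0.2},
]
for cell in cells:
    v = vulnerability(cell["exposure"], cell["susceptibility"], cell["capacity"])
    print(round(flood_risk(cell["hazard"], v), 3))
```

In a GIS, the same arithmetic is applied per raster cell, which is what makes the spatial variability of risk directly mappable.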

  8. Analysis of Compound Water Hazard in Coastal Urbanized Areas under the Future Climate

    NASA Astrophysics Data System (ADS)

    Shibuo, Y.; Taniguchi, K.; Sanuki, H.; Yoshimura, K.; Lee, S.; Tajima, Y.; Koike, T.; Furumai, H.; Sato, S.

    2017-12-01

Several studies indicate increased frequency and magnitude of heavy rainfall as well as sea level rise under the future climate, which implies that coastal low-lying urbanized areas may face an increased risk of flooding. In such areas, where river discharge, tidal fluctuation, and city drainage networks together influence urban inundation, it is necessary to consider their potential interference to understand the effect of compound water hazards. For instance, pump stations cannot pump out storm water when the river water level is high, while the river water level rises when it receives pumped water from cities. Further downstream, where tidal fluctuation regulates the water levels in the river, the tide also affects the functionality of pump stations and possible inundation from rivers. In this study, we estimate the compound water hazard in the coastal low-lying urbanized areas of the Tsurumi river basin under the future climate. We developed a seamlessly integrated river, sewerage, and coastal hydraulic model that can simulate river water levels, water flow in the sewerage network, and inundation from the rivers and/or the coast, to address the potential interference issue. As forcing, the pseudo global warming method, which applies changes in GCM anomalies to re-analysis data, is employed to produce ensemble typhoons that drive the seamlessly integrated model. The results show that heavy rainfalls caused by the observed typhoon generally become stronger under the pseudo global warming condition. They also suggest that the coastal low-lying areas become extensively inundated if the onset of river flooding coincides with the storm surge.

  9. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    NASA Astrophysics Data System (ADS)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. 
In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.
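The fault tree side of such a qualitative analysis can be sketched as a minimal cut-set evaluator over AND/OR gates. The gate structure and event names below are invented for illustration, not taken from the LFTR design:

```python
def cut_sets(node):
    """Return the minimal cut sets of a fault tree.
    A node is either a basic-event string or a tuple
    ('AND' | 'OR', [children])."""
    if isinstance(node, str):
        return [{node}]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":
        # any child's cut set fails the gate on its own
        combined = [s for cs in child_sets for s in cs]
    else:
        # AND: every child must fail -> cross-product union
        combined = [set()]
        for cs in child_sets:
            combined = [a | b for a in combined for b in cs]
    # keep only minimal sets (drop any proper superset)
    return [s for s in combined if not any(t < s for t in combined)]

tree = ("OR", [("AND", ["pump_fails", "valve_stuck"]), "loss_of_coolant"])
print(sorted(sorted(s) for s in cut_sets(tree)))
# -> [['loss_of_coolant'], ['pump_fails', 'valve_stuck']]
```

Each minimal cut set corresponds to one qualitative pathway from initiating events to the top event, which is exactly the logical classification the thesis performs before any frequencies are attached.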

  10. Unmanned Aerial Vehicle (UAV) associated DTM quality evaluation and hazard assessment

    NASA Astrophysics Data System (ADS)

    Huang, Mei-Jen; Chen, Shao-Der; Chao, Yu-Jui; Chiang, Yi-Lin; Chang, Kuo-Jen

    2014-05-01

Owing to high seismicity and high annual rainfall, numerous landslides are triggered in Taiwan every year, with severe impacts across the island. For catastrophic landslides, key information, including the extent of the landslide, volume estimation and the subsequent evolution, is important when analyzing the triggering mechanism and performing hazard assessment and mitigation. Morphological analysis therefore gives a general overview of a landslide and is considered one of the most fundamental sources of information. We try to integrate several technologies, especially the Unmanned Aerial Vehicle (UAV) and multi-spectral camera, to decipher the consequences, the potential hazard, and the social impact. In recent years, remote sensing technology has improved rapidly, providing a wide range of imagery and essential, precious information. Benefiting from advances in informatics, remote sensing and electronic technologies, Unmanned Aerial Vehicle (UAV) photogrammetry has improved significantly. This study tries to integrate several methods, including: 1) remote-sensing images gathered by Unmanned Aerial Vehicle (UAV) and aerial photos taken in different periods; 2) in-situ geologic field investigation; 3) differential GPS, RTK GPS and ground LiDAR in-situ geoinformatics measurements; 4) DTMs constructed before and after the landslide, as well as for subsequent periods, using UAV and aerial photos; 5) the discrete element method, applied to understand the geomaterial composing the slope failure and to predict earthquake-induced and rainfall-induced landslide displacement. First of all, we evaluate the Digital Terrain Model (DTM) derived from Microdrones MD4-1000 UAV airphotos. The ground resolution of the DSM point cloud can be as high as 10 cm. With 4 ground control points integrated within an area of 56 hectares, and compared with the LiDAR DSM and field RTK-GPS surveying, the mean error is as low as 6 cm with a standard deviation of 17 cm.
The quality of the UAV DSM can be as good as LiDAR data and is ready for other applications. The dataset provides not only geoinformatics and GIS layers for the hazards, but also essential geomorphologic information for other studies, and for hazard mitigation and planning as well.

  11. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  12. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  13. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  14. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  15. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the integration and interactive visualization of spatial information through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by means of a simplified cost-benefit analysis within the context of different protection projects. The application integrates the different layers necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as the 30-, 100- and 300-year return periods specified in Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify the input information and parameters to create different risk scenarios.
Based on the resultant risk scenarios, users can propose and visualize (preliminary) risk reduction measures before the actual design and dimensioning of such protective measures in the area. After designing measures, users can re-calculate risk by updating the hazard intensity and object layers; this is achieved by interactive manual editing of shape (vector) layers in the web-GIS interface. A cost-benefit analysis tool is also integrated within the application to support the decision-making process for the selection of protection measures. Finally, the resultant risk information (vector layers and data) can be exported as shapefiles and Excel sheets. A prototype application has been realized using open-source geospatial software and technologies. The Boundless framework, with its client-side SDK environment, is applied for rapid prototyping. Free and open-source components such as the PostGIS spatial database, GeoServer and GeoWebCache, GeoExt and OpenLayers are used for the development of the platform. The prototype is demonstrated with a case study area located in Les Diablerets, Switzerland. This research work is carried out within a project funded by the Canton of Vaud, Switzerland. References: Bründl, M., Romang, H. E., Bischof, N., and Rheinberger, C. M.: The risk concept and its application in natural hazard risk management in Switzerland, Nat. Hazards Earth Syst. Sci., 9, 801-813, 2009. DGE: Valdorisk - Direction Générale de l'Environnement, www.vd.ch, accessed 9 January 2016, 2016. OFEV: EconoMe - Office fédéral de l'environnement, www.econome.admin.ch, accessed 9 January 2016, 2016.
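The risk arithmetic behind this kind of tool can be sketched as collective annual risk summed over return-period scenarios, with the cost-benefit step comparing risk reduction to the annualised cost of the measure. All figures below are invented for illustration, not EconoMe or Valdorisk values:

```python
def annual_risk(scenarios):
    """Collective annual risk: sum over scenarios of
    frequency (1 / return period, per year) x expected damage (CHF)."""
    return sum(damage / return_period for return_period, damage in scenarios)

# hypothetical (return period [yr], expected damage [CHF]) pairs
before = [(30, 2e6), (100, 8e6), (300, 2e7)]   # without protection measures
after  = [(30, 2e5), (100, 3e6), (300, 1.2e7)]  # with protection measures

risk_reduction = annual_risk(before) - annual_risk(after)
annual_cost = 90_000.0  # hypothetical annualised cost of the measure
print(round(risk_reduction / annual_cost, 2))  # benefit-cost ratio
```

A ratio above 1 indicates that the measure prevents more annual damage than it costs, which is the prioritization criterion the abstract describes.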

  16. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review, to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review, to understand the risk at a high level from operating modes other than at-power; and risk insights, to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all-hazards/all-modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D and conclusions about the PRISM design.

  17. Canister Storage Building (CSB) Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POWERS, T.B.

    2000-03-16

This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first is a complete list of the hazard analysis team members involved over the two-year-long process. The second is a subset of the first and consists of those team members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards.
Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and the environment.

  18. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  19. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system. 
PMID:24467813
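The limited overlap between methods that this study reports can be quantified with a simple Jaccard index over normalised hazard descriptions. The hazard lists below are invented examples, not the study's actual workshop outputs:

```python
def overlap(a, b):
    """Jaccard overlap between two hazard lists:
    |intersection| / |union| after trivial normalisation."""
    sa = {x.strip().lower() for x in a}
    sb = {x.strip().lower() for x in b}
    return len(sa & sb) / len(sa | sb)

# hypothetical outputs from two prospective hazard analysis workshops
swift = ["dosing error", "missed INR test", "IT outage"]
hfmea = ["Dosing error", "wrong patient record", "IT outage"]
print(round(overlap(swift, hfmea), 2))  # -> 0.5
```

In practice, matching free-text hazard descriptions requires human judgement or fuzzier matching than exact string comparison; the point of the sketch is only that a low overlap score supports the paper's conclusion that no single method yields a comprehensive hazard list.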

  20. Multi scale modelling of landslide hazard and risk assessment in data scarce area - a case study on Dhalai District, Tripura, India

    NASA Astrophysics Data System (ADS)

    Ghosh, Kapil; De, Sunil Kumar

    2017-04-01

Successful landslide management plans and policy depend on in-depth knowledge of the hazard and the associated risk. The present research therefore presents an integrated approach involving geospatial technologies for landslide hazard and risk assessment at different scales (site-specific to regional level). The landslide hazard map at the regional scale (district level) is prepared using a weight-rating based method. To analyze landslide manifestation in the Dhalai district of Tripura, different causative factor maps (lithology, road buffer, slope, relative relief, rainfall, fault buffer, landuse/landcover and drainage density) are derived. The analysis revealed that geological structure and human interference have more influence on landslide occurrence than the other factors considered. The landslide susceptibility zonation map shows that about 1.64% and 16.68% of the total study area fall under the very high and high susceptibility zones respectively. The landslide risk assessment at the district level is generated by integrating hazard scoring and resource damage potential scoring (fuzzy membership values) maps. The values of the landslide risk matrix vary within the range of 0.001 to 0.18, and the risk assessment map shows that only 0.45% (10.80 km²) of the district is under the very high risk zone, whereas about 50% of the pixels of the existing road section are under very high to high landslide risk. The major part (94.06%) of the district is under the very low to low risk zone. Landslide hazard and risk assessment at the site-specific level has been carried out through intensive field investigation, which found that the Ambassa landslide is located within the 150 m buffer zone of a fault line. Variation of geo-electrical resistivity (2.2 Ωm to 31.4 Ωm) indicates the complex geological character of this area.
Based on the geo-technical results, which help to identify the degree of risk to the existing resources, it is appropriate to implement management measures such as construction of sub-surface drainage, extension of retaining walls, and cutting/filling of slopes in a scientific manner. Keywords: landslide, hazard, risk, fuzzy set theory

  1. Ground subsidence information as a valuable layer in GIS analysis

    NASA Astrophysics Data System (ADS)

    Murdzek, Radosław; Malik, Hubert; Leśniak, Andrzej

    2018-04-01

Geographic information systems (GIS) are among the technologies widely used to improve the functioning of local governments. GIS tools make it possible to integrate spatial data resources, analyse and process them, and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. These applications draw on a great deal of spatial information, but rarely on information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB mining area into GIS analysis. Monitoring of this phenomenon can be carried out using the radar differential interferometry (DInSAR) method.
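The DInSAR measurement underlying such a subsidence layer follows from the interferometric phase: line-of-sight displacement d = -(λ / 4π) · Δφ, so one full 2π fringe corresponds to half a wavelength of motion. A minimal sketch, assuming a C-band wavelength of about 5.6 cm:

```python
import math

def los_displacement_mm(dphi_rad, wavelength_m=0.056):
    """Convert differential interferometric phase (rad) to
    line-of-sight displacement (mm): d = -(lambda / (4*pi)) * dphi.
    The C-band wavelength (~5.6 cm) is an assumption of this sketch."""
    return -(wavelength_m / (4.0 * math.pi)) * dphi_rad * 1000.0

# one full fringe (2*pi) equals half a wavelength of subsidence
print(round(los_displacement_mm(2.0 * math.pi), 1))  # -> -28.0
```

Rasterising these per-pixel displacements is what turns an interferogram into the subsidence layer that can then be overlaid with other GIS data.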

  2. Structured Analysis for the Logistic Support Analysis (LSA) Task and the Integrated Logistic Support (ILS) Element. Structured Analysis - ILS Assessment Element E11. Design Influence

    DTIC Science & Technology

    1989-01-01

format size of this report, the full identifying entry may well be forcibly shortened, thereby introducing the possibility of misunderstanding. Therefore...OF MATERIEL" 3d. "AR 570-9, "MANPOWER AND EQUIPMENT CONTROL - HOST NATION SUPPORT" 2. AR 700-9, "POLICIES OF THE ARMY LOGISTIC SYSTEM" 3. AR 700-82...PERSONNEL 4. TRAINING 5. SYSTEM SAFETY 6. HEALTH HAZARDS. THE ASSESSMENT OF MANPRINT INFLUENCE ON DESIGNS IS ADDRESSED IN SIX (6) SPECIFIC AREAS IN

  3. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  4. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 02: fire hazard

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

Fire hazard reflects the potential fire behavior and magnitude of effects as a function of fuel conditions. This fact sheet discusses crown fuels, surface fuels, and ground fuels and their contribution and involvement in wildland fire. Other publications in this series...

  5. Bit of History and Some Lessons Learned in Using NASA Remote Sensing Data in Public Health Applications

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Estes, Sue

    2011-01-01

The NASA Applied Sciences Program's public health initiative began in 2004 to illustrate the potential benefits of using remote sensing in public health applications. Objectives/Purpose: The CDC initiated a study with NASA through the National Center for Environmental Health (NCEH) to establish a pilot effort to use remote sensing data as part of its Environmental Public Health Tracking Network (EPHTN). As a consequence, the NCEH and NASA developed a project called HELIX-Atlanta (Health and Environment Linkage for Information Exchange) to demonstrate a process for developing a local environmental public health tracking and surveillance network that integrates non-infectious health and environment systems for the Atlanta metropolitan area. Methods: As an ongoing, systematic integration, analysis and interpretation of data, an EPHTN focuses on: 1 -- environmental hazards; 2 -- human exposure to environmental hazards; and 3 -- health effects potentially related to exposure to environmental hazards. To satisfy the definition of a surveillance system, the data must be disseminated to plan, implement, and evaluate environmental public health action. Results: A close working relationship developed with NCEH in which information was exchanged to assist in the development of an EPHTN that incorporated NASA remote sensing data into a surveillance network for disseminating public health tracking information to users. This project's success provided NASA with the opportunity to work with other public health entities such as the University of Mississippi Medical Center, the University of New Mexico and the University of Arizona. Conclusions: HELIX-Atlanta became a functioning part of the national EPHTN for tracking environmental hazards and exposure, particularly as related to air quality over Atlanta.
Learning Objectives: 1 -- remote sensing data can be integral to an EPHTN; 2 -- public tracking objectives can be enhanced through remote sensing data; 3 -- NASA's involvement in public health applications can have wider benefits in the future.

  6. Rapid test for the detection of hazardous microbiological material

    NASA Astrophysics Data System (ADS)

    Mordmueller, Mario; Bohling, Christian; John, Andreas; Schade, Wolfgang

    2009-09-01

After the anthrax attacks committed all over the world since 2001, the fast detection and determination of biological samples has attracted interest. A very promising method for a rapid test is Laser-Induced Breakdown Spectroscopy (LIBS). LIBS is an optical method that uses time-resolved or time-integrated spectral analysis of optical plasma emission after pulsed laser excitation. Even though LIBS is well established for the determination of metals and other inorganic materials, the analysis of microbiological organisms is difficult due to their very similar stoichiometric composition. To distinguish such similar LIBS spectra, computer-assisted chemometrics is a very useful approach. In this paper we report on first results of developing a compact and fully automated rapid test for the detection of hazardous microbiological material. Experiments have been carried out with two setups: a bulky one composed of standard laboratory components, and a compact one consisting of miniaturized industrial components. Both setups work at an excitation wavelength of λ = 1064 nm (Nd:YAG). Data analysis is done by Principal Component Analysis (PCA) with a subsequent neural network for fully automated sample identification.
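The PCA stage of such a pipeline can be sketched in a few lines. The following toy is not the authors' code: the three-channel "spectra" are invented, the leading principal component is found by power iteration, and the neural-network identification stage is omitted. It only illustrates how chemometrics can separate stoichiometrically similar samples by their PCA scores.

```python
# Illustrative PCA sketch in plain Python (not the authors' pipeline).

def mean_center(rows):
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    return [[r[j] - means[j] for j in range(d)] for r in rows]

def covariance(rows):
    n, d = len(rows), len(rows[0])
    return [[sum(r[i] * r[j] for r in rows) / (n - 1) for j in range(d)]
            for i in range(d)]

def first_pc(cov, iters=200):
    # power iteration: repeatedly apply the covariance matrix and normalize
    d = len(cov)
    v = [1.0] + [0.0] * (d - 1)
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# hypothetical intensities at three emission lines for two sample groups
species_a = [[5.0, 1.0, 0.2], [5.2, 0.9, 0.3], [4.9, 1.1, 0.1]]
species_b = [[0.3, 1.0, 5.1], [0.2, 1.2, 4.8], [0.4, 0.8, 5.0]]
centered = mean_center(species_a + species_b)
pc = first_pc(covariance(centered))
scores = [sum(r[j] * pc[j] for j in range(len(pc))) for r in centered]
# samples of the two groups fall on opposite sides of the PC axis
```

In a real system the scores (of several components, not just the first) would feed the classifier rather than being thresholded directly.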

  7. Integration of landslide susceptibility products in the environmental plans

    NASA Astrophysics Data System (ADS)

    Fiorucci, Federica; Reichenbach, Paola; Rossi, Mauro; Cardinali, Mauro; Guzzetti, Fausto

    2015-04-01

Landslides are one of the most destructive natural hazards, causing damage to urban areas worldwide. Knowing where a landslide could occur is essential for the strategic management of the territory and for good urban planning. In this context, landslide susceptibility zoning (LSZ) is crucial to provide information on the degree to which an area can be affected by future slope movements. Although landslide susceptibility maps have been prepared extensively during the last decades, there are few examples of their application in environmental plans (EP). In this work we present a proposal for the integration of the landslide inventory map with the following landslide susceptibility products: (i) the landslide susceptibility zonation, (ii) the associated error map, and (iii) the susceptibility uncertainty map. Moreover, we propose to incorporate detailed morphological studies for the evaluation of the landslide risk associated with local parceling plans. The integration of all this information is crucial for the management of landslide risk in urban expansion forecasts. Municipal, provincial, and regional administrations are often unable to support the costs of landslide risk evaluation for extensive areas and should instead concentrate their financial resources on the specific hazardous and unsafe situations identified by integrating the landslide susceptibility products. Zonation and detailed morphological analysis should be performed taking into account existing laws and regulations, and could become a starting point for discussing new regulations for landslide risk management.

  8. IRIS Toxicological Review of Ethylene Glycol Mono-Butyl ...

    EPA Pesticide Factsheets

EPA has conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of ethylene glycol monobutyl ether that will appear on the Integrated Risk Information System (IRIS) database.

  9. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping, and historical records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of an event. Visual data are increasing rapidly as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake, and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context for the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  10. MEditerranean Supersite Volcanoes (MED-SUV) project: from objectives to results

    NASA Astrophysics Data System (ADS)

    Puglisi, Giuseppe; Spampinato, Letizia

    2017-04-01

The MEditerranean Supersite Volcanoes (MED-SUV) project was a three-year FP7 project aimed at improving the assessment of volcanic hazards at two of the most active European volcanic areas: Campi Flegrei/Vesuvius and Mt. Etna. More than 3 million people are exposed to potential hazards in the two areas, and the geographic location of the volcanoes extends the potential impact to an even wider region. MED-SUV worked on (1) the optimisation and integration of existing and new monitoring systems, (2) the understanding of volcanic processes, and (3) the relationship between the scientific and end-user communities. MED-SUV fully exploited the unique multidisciplinary long-term in-situ datasets available for these volcanoes and integrated them with Earth observations. Technological developments and implemented algorithms allowed better constraint of pre-, syn- and post-eruptive phases. The wide range of styles and intensities of the volcanic phenomena observed at the targeted volcanoes, archetypes of 'closed' and 'open' conduit systems, studied through the long-term multidisciplinary datasets, exceptionally advanced the understanding of a variety of geo-hazards. Dedicated experiments and studies were carried out to advance the understanding of the volcanoes' internal structure and processes, and to recognise signals related to impending unrest/eruptive phases. The quantitative hazard assessment benefitted from the outcomes of these studies and from their integration with cutting-edge monitoring approaches, leading to step changes in hazard awareness and preparedness and leveraging the close relationship between scientists, SMEs, and end-users. 
Among the MED-SUV achievements, we can list (i) the implementation of a data policy compliant with the GEO Open Data Principles for ruling the exploitation and shared use of the project outcomes; (ii) the creation of the MED-SUV e-infrastructure as a test bed for designing an interoperable infrastructure to manage different data sources, applying the data policy, and envisaging sustainability strategies after the project in a coherent national and international framework; (iii) the improvement of SAR capability in detecting and monitoring ground deformation; (iv) the development, implementation, and testing of prototypes and software for measuring and retrieving more accurate and novel parameters; (v) the integration of satellite and in-situ data; and (vi) novel methods of data analysis increasing the knowledge of volcanic process dynamics and improving alert systems. The project has fostered the assessment of short-term volcanic hazard in the Italian Supersites and the exploitation of the information provided by the monitoring. The main breakthroughs in hazard assessment were the fine-tuning of the Bayesian approach for the probabilistic evaluation of the occurrence of eruptive events at Campi Flegrei and their effects in the area, and its preliminary application to assess the occurrence of flank eruptions and the effects of volcanic plume fallout at Mt. Etna. MED-SUV also worked on the communication between scientists and decision makers by evaluating how informative scientific outcomes (e.g. hazard maps) are for this purpose. Dissemination of the outcomes aimed at spreading new volcanological knowledge among the scientific community as well as among decision-making bodies and the public, and at allowing the end-user community access to the two Italian Supersites' data through a properly implemented e-infrastructure.

  11. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.J. Garrett

    2005-02-17

The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  12. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    NASA Astrophysics Data System (ADS)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

Measuring and predicting ground-motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong-motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong-motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power, and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong-motion seismology.
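Two of the strong-motion parameters listed above can be computed directly from an acceleration record. The following is a minimal sketch, not SIGMA's implementation: it assumes a uniformly sampled accelerogram in m/s², and the decaying sinusoid used as input is synthetic.

```python
# PGA and Arias intensity from a synthetic accelerogram (illustrative only).
import math

def pga(acc):
    # peak ground acceleration: maximum absolute value of the record
    return max(abs(a) for a in acc)

def arias_intensity(acc, dt, g=9.81):
    # Ia = pi/(2g) * integral of a(t)^2 dt, via the trapezoidal rule
    sq = [a * a for a in acc]
    integral = sum((sq[i] + sq[i + 1]) * dt / 2.0 for i in range(len(sq) - 1))
    return math.pi / (2.0 * g) * integral

# synthetic accelerogram (m/s^2): a 5 Hz decaying sinusoid sampled at 100 Hz
dt = 0.01
acc = [2.0 * math.exp(-0.5 * i * dt) * math.sin(2.0 * math.pi * 5.0 * i * dt)
       for i in range(1000)]
pga_val = pga(acc)
ia = arias_intensity(acc, dt)
```

The other listed parameters (CAV, significant duration, response spectra) follow the same pattern of integrating or searching the record, with more bookkeeping.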

  13. Integrating Economic Strengthening and Family Coaching to Reduce Work-Related Health Hazards Among Children of Poor Households: Burkina Faso.

    PubMed

    Karimli, Leyla; Rost, Lucia; Ismayilova, Leyla

    2018-01-01

    This is the first randomized controlled trial in Burkina Faso testing the effect of economic strengthening alone and in combination with family coaching on child's hazardous work and work-related health outcomes. The study also tests the association between different forms of hazardous work and child's health outcomes. A total of 360 households from 12 villages participated in the study. Villages were randomly assigned to three study arms: economic intervention alone, economic intervention integrated with family coaching, and control. In each household, one female caregiver and one child aged 10-15 years were interviewed. Data were collected at baseline, 12 months, and 24 months. We ran multilevel mixed-effects models that account for both within-individual correlation over time and clustering of subjects within villages. Compared with the control group, at 24 months, children in the integrated arm experienced significant reduction in exposure to hazardous work and some forms of hazards and abuse. Results for children in the economic strengthening-only arm were more modest. In most cases, child's health was significantly associated not with specific forms of work per se, but with child's exposure to hazards and abuse while doing this form of work. We found no significant effect of intervention on child's work-related health. Economic strengthening combined with family coaching on child protection issues, rather than implemented alone, may be more effective in reducing child's exposure to hazardous work. Additional research is needed to understand gender differences and causal links between different forms of child work and health hazards. Copyright © 2017. Published by Elsevier Inc.

  14. Association between Exposure to p,p′-DDT and Its Metabolite p,p′-DDE with Obesity: Integrated Systematic Review and Meta-Analysis

    PubMed Central

    Cano-Sancho, German; Salmon, Andrew G.

    2017-01-01

    Background: The prevalence of obesity is increasing in all countries, becoming a substantial public health concern worldwide. Increasing evidence has associated obesity with persistent pollutants such as the pesticide DDT and its metabolite p,p′-DDE. Objectives: Our objective was to systematically review the literature on the association between exposure to the pesticide DDT and its metabolites and obesity to develop hazard identification conclusions. Methods: We applied a systematic review-based strategy to identify and integrate evidence from epidemiological, in vivo, and in vitro studies. The evidence from prospective epidemiological studies was quantitatively synthesized by meta-analysis. We rated the body of evidence and integrated the streams of evidence to systematically develop hazard identification conclusions. Results: We identified seven epidemiological studies reporting prospective associations between exposure to p,p′-DDE and adiposity assessed by body mass index (BMI) z-score. The results from the meta-analysis revealed positive associations between exposure to p,p′-DDE and BMI z-score (β=0.13 BMI z-score (95% CI: 0.01, 0.25) per log increase of p,p′-DDE). Two studies constituted the primary in vivo evidence. Both studies reported positive associations between exposure to p,p′-DDT and increased adiposity in rodents. We identified 19 in vivo studies and 7 in vitro studies that supported the biological plausibility of the obesogenic effects of p,p′-DDT and p,p′-DDE. Conclusions: We classified p,p′-DDT and p,p′-DDE as “presumed” to be obesogenic for humans, based on a moderate level of primary human evidence, a moderate level of primary in vivo evidence, and a moderate level of supporting evidence from in vivo and in vitro studies. https://doi.org/10.1289/EHP527 PMID:28934091
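The quantitative synthesis step in this record is inverse-variance pooling of the prospective studies. The sketch below is a generic fixed-effect pooling with made-up study values; the paper's β = 0.13 was computed by the authors from their actual seven studies.

```python
# Fixed-effect inverse-variance meta-analysis sketch (invented inputs).
import math

def pooled_effect(betas, ses):
    # weight each study by 1/SE^2, pool, and form a 95% CI
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, (beta - 1.96 * se, beta + 1.96 * se)

# hypothetical per-study estimates (BMI z-score per log increase of exposure)
betas = [0.10, 0.20, 0.05, 0.18]
ses = [0.08, 0.10, 0.12, 0.09]
beta, ci = pooled_effect(betas, ses)
```

A random-effects model would additionally estimate between-study heterogeneity and inflate the weights' denominators accordingly.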

  15. Association between Exposure to p,p'-DDT and Its Metabolite p,p'-DDE with Obesity: Integrated Systematic Review and Meta-Analysis.

    PubMed

    Cano-Sancho, German; Salmon, Andrew G; La Merrill, Michele A

    2017-09-18

The prevalence of obesity is increasing in all countries, becoming a substantial public health concern worldwide. Increasing evidence has associated obesity with persistent pollutants such as the pesticide DDT and its metabolite p,p'-DDE. Our objective was to systematically review the literature on the association between exposure to the pesticide DDT and its metabolites and obesity to develop hazard identification conclusions. We applied a systematic review-based strategy to identify and integrate evidence from epidemiological, in vivo, and in vitro studies. The evidence from prospective epidemiological studies was quantitatively synthesized by meta-analysis. We rated the body of evidence and integrated the streams of evidence to systematically develop hazard identification conclusions. We identified seven epidemiological studies reporting prospective associations between exposure to p,p'-DDE and adiposity assessed by body mass index (BMI) z-score. The results from the meta-analysis revealed positive associations between exposure to p,p'-DDE and BMI z-score (β=0.13 BMI z-score (95% CI: 0.01, 0.25) per log increase of p,p'-DDE). Two studies constituted the primary in vivo evidence. Both studies reported positive associations between exposure to p,p'-DDT and increased adiposity in rodents. We identified 19 in vivo studies and 7 in vitro studies that supported the biological plausibility of the obesogenic effects of p,p'-DDT and p,p'-DDE. We classified p,p'-DDT and p,p'-DDE as "presumed" to be obesogenic for humans, based on a moderate level of primary human evidence, a moderate level of primary in vivo evidence, and a moderate level of supporting evidence from in vivo and in vitro studies. https://doi.org/10.1289/EHP527.

  16. Community clusters of tsunami vulnerability in the US Pacific Northwest

    PubMed Central

    Wood, Nathan J.; Jones, Jeanne; Spielman, Seth; Schmidtlein, Mathew C.

    2015-01-01

    Many coastal communities throughout the world are threatened by local (or near-field) tsunamis that could inundate low-lying areas in a matter of minutes after generation. Although the hazard and sustainability literature often frames vulnerability conceptually as a multidimensional issue involving exposure, sensitivity, and resilience to a hazard, assessments often focus on one element or do not recognize the hazard context. We introduce an analytical framework for describing variations in population vulnerability to tsunami hazards that integrates (i) geospatial approaches to identify the number and characteristics of people in hazard zones, (ii) anisotropic path distance models to estimate evacuation travel times to safety, and (iii) cluster analysis to classify communities with similar vulnerability. We demonstrate this approach by classifying 49 incorporated cities, 7 tribal reservations, and 17 counties from northern California to northern Washington that are directly threatened by tsunami waves associated with a Cascadia subduction zone earthquake. Results suggest three primary community groups: (i) relatively low numbers of exposed populations with varied demographic sensitivities, (ii) high numbers of exposed populations but sufficient time to evacuate before wave arrival, and (iii) moderate numbers of exposed populations but insufficient time to evacuate. Results can be used to enhance general hazard-awareness efforts with targeted interventions, such as education and outreach tailored to local demographics, evacuation training, and/or vertical evacuation refuges. PMID:25870283
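The cluster-analysis step of the framework can be sketched with a tiny k-means implementation. The community attributes below (exposed population in thousands, evacuation travel time in minutes) are invented, and the actual study used richer demographic and travel-time variables; this only shows how communities with similar vulnerability profiles group together.

```python
# Minimal k-means sketch for grouping communities by vulnerability (toy data).

def kmeans(points, centroids, iters=50):
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to the nearest centroid (squared Euclidean distance)
        labels = [min(range(len(centroids)),
                      key=lambda k: sum((p - c) ** 2
                                        for p, c in zip(pt, centroids[k])))
                  for pt in points]
        # recompute each centroid as the mean of its members
        for k in range(len(centroids)):
            members = [pt for pt, lb in zip(points, labels) if lb == k]
            if members:
                centroids[k] = [sum(x) / len(members) for x in zip(*members)]
    return labels

communities = [
    [1, 10], [2, 12],      # low exposure, quick evacuation
    [40, 15], [42, 14],    # high exposure, sufficient time to evacuate
    [20, 60], [22, 55],    # moderate exposure, insufficient time
]
labels = kmeans(communities, centroids=[[1, 10], [40, 15], [20, 60]])
```

In practice features would be standardized first, and the number of clusters chosen by inspecting within-cluster variance rather than fixed in advance.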

  17. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.

The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  18. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE PAGES

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.; ...

    2017-08-25

The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  19. Hazard Analysis Guidelines for Transit Projects

    DOT National Transportation Integrated Search

    2000-01-01

    These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...

  20. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
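A PSHA hazard curve reports, for each ground-motion level, the mean annual rate at which it is exceeded; under the usual Poisson assumption this converts to an exceedance probability over a design life. The sketch below uses only the conventional 2%-in-50-years example, not any INL-specific result.

```python
# Converting a hazard-curve annual rate to a design-life probability (sketch).
import math

def prob_exceedance(annual_rate, years):
    # Poisson assumption: P(at least one exceedance in t years) = 1 - exp(-rate*t)
    return 1.0 - math.exp(-annual_rate * years)

# the conventional "2% in 50 years" design level corresponds to a mean
# annual exceedance rate of roughly 1/2475
rate = -math.log(1.0 - 0.02) / 50.0
p50 = prob_exceedance(rate, 50)
```

This conversion is why hazard levels are quoted interchangeably as return periods (e.g. 2,475 years) or probabilities over an exposure time.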

  1. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  2. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  3. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 01: forest structure and fire hazard overview

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Many managers and policymakers guided by the National Environmental Policy Act process want to understand the scientific principles on which they can base fuel treatments for reducing the size and severity of wildfires. These Forest Structure and Fire Hazard fact sheets discuss how to estimate fire hazard, how to visualize fuel treatments, and how the role of...

  4. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological, and membrane-based unit operations. The software-predicted results corroborate the experimental findings very well, as indicated by an overall correlation coefficient on the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization, and operation of an advanced hybrid treatment plant for hazardous wastewater.
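A correlation coefficient of the kind quoted above is obtained by comparing the predicted and measured series. A small sketch (in Python rather than the software's Visual Basic, and with invented effluent values):

```python
# Pearson correlation between predicted and measured series (toy data).

def pearson_r(x, y):
    # sample Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# invented predicted vs. measured effluent concentrations (mg/L)
predicted = [12.1, 9.8, 7.4, 5.2, 3.9, 2.5]
measured = [12.4, 9.5, 7.6, 5.0, 4.1, 2.4]
r = pearson_r(predicted, measured)
```

A value near 0.99 indicates the model tracks the measurements closely, though a residual plot is still needed to spot systematic bias.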

  5. 40 CFR 265.340 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...

  6. 40 CFR 265.340 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...

  7. 40 CFR 264.340 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) through (b)(4) of this section... hazardous waste in part 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I... chapter solely because it is reactive (Hazard Code R) for characteristics other than those listed in § 261...

  8. 40 CFR 265.340 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...

  9. 78 FR 5861 - National Plan of Integrated Airport Systems: Clarification of Wildlife Hazard Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... Property Act for airport purposes (``Subject Airports''), to conduct Wildlife Hazard Site Visits (WHSVs) or... of land under the Surplus Property Act for airport purposes to identify and mitigate wildlife hazards.... These airports are typically smaller and have less air traffic, more piston-powered aircraft, and...

  10. Fuels planning: science synthesis and integration; social issues fact sheet 04: Three critical topics to cover when talking about hazards

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    The amount of science applicable to the management of wildfire hazards is increasing daily. In addition, the attitudes of landowners and policymakers about fire and fuels management are changing. This fact sheet discusses three critical keys to communicating about wildfire hazards.

  11. Pursuing the Delta -- Maximizing Opportunities to Integrate Sustainability in the Funding Processes

    DTIC Science & Technology

    2011-03-03

that may contain safety and health hazards. This is not an all-inclusive list: a. Fire protection issues b. Toxic fumes (i.e., engine exhaust...hazards shall be reported as part of the SAR. A.6 Hazardous Materials. The contractor shall not use cadmium, hexavalent chromium, or other

  12. Identifying Hazards

    EPA Pesticide Factsheets

    The federal government has established a system of labeling hazardous materials to help identify the type of material and threat posed. Summaries of information on over 300 chemicals are maintained in the Envirofacts Master Chemical Integrator.

  13. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  14. Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case

    NASA Astrophysics Data System (ADS)

    Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro

    2015-04-01

    The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web-services-based infrastructure for environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS and GMES. Existing web services have been customized to provide functionalities supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements with their specific vulnerability in the pilot area. The landslide pilot and related scenario are focused on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines, best practices and production of thematic maps. The LRA has been implemented in the Liguria region, Italy, in two catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event, during which over 600 landslides were triggered in the selected pilot area.
Most of the landslides affected the diffuse system of anthropogenic terraces, directly destroying walls and transporting a large amount of loose sediment along slopes and channels as an induced consequence of the event. A spatial analysis detected ca. 400 critical points along the road network, with an average length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved by the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park, to the benefit of local authorities and population. In addition, the results of the application will be used for: i) updating the land planning process in order to improve the resilience of local communities; ii) implementing cost-benefit analysis aimed at the definition of guidelines for sustainable landslide risk mitigation strategies; iii) suggesting a general road map for the implementation of a local adaptation plan.

  15. Gemitis : an integrated and participative risk reduction strategy for the sustainable development of cities

    NASA Astrophysics Data System (ADS)

    Masure, P.

    2003-04-01

    The GEMITIS method has been implemented since 1995 as a global and integrated risk reduction strategy for improving the effectiveness of seismic risk assessment in urban areas, including the generation of crisis scenarios and mid- to long-term seismic impact assessment. GEMITIS required us to provide more precise definitions of notions in common use by natural-hazard specialists, such as elements at risk and vulnerability. Until then, only the physical and human elements had been considered, and analysis of their vulnerability referred to their fragility in the face of aggression by nature. We have completed this approach by also characterizing the social and cultural vulnerability of a city and its inhabitants and, with a wider scope, the functional vulnerability of the "urban system". This functional vulnerability depends upon the relations between the system elements (weak links in chains, functional relays, and defense systems) and upon the city's relations with the outside world (interdependence). Though well developed in methods for evaluating industrial risk (fault-tree analysis, event-tree analysis, multiple defense barriers, etc.), this aspect had until now been ignored by the "hard-science" specialists working on natural hazards. Based on the implementation of an Urban System Exposure methodology, we were able to identify specific human, institutional, or functional vulnerability factors for each urban system, which until then had been very little discussed by risk-analysis and civil-protection specialists. In addition, we have defined the new concept of "main stakes" of the urban system, ranked by order of social value (or collective utility). Obviously, vital or strategic assets must be more resistant or better protected against natural hazards than assets of secondary importance. The ranking of the exposed elements of a city in terms of "main stakes" provides a very useful guide for adapting vulnerability studies and for orienting preventive actions.
For this, GEMITIS is based on a systemic approach to the city and on value analysis of its exposed elements. It facilitates collective expertise for the definition of a preventive action plan based on the participation of the main urban actors (crisis preparedness, construction, land use, etc.).

  16. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning raw time series; the variations in time series analysis and post-processing are driven by different users:
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground-water studies
    - ARIA Time Series/Coseismic Data Products: focused on hazard monitoring and response
    The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. (Zhen Liu speaks on InSAR time series analysis in a companion talk.)

  17. Alert generation and cockpit presentation for an integrated microburst alerting system

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Hansman, R. John, Jr.

    1991-01-01

    Alert generation and cockpit presentation issues for low-level wind shear (microburst) alerts are investigated. Alert generation issues center on the development of a hazard criterion which allows integration of both ground-based and airborne wind shear detection systems to form an accurate picture of the aviation hazard posed by a particular wind shear situation. A methodology for testing hazard criteria through flight simulation has been developed and used to examine the effectiveness and feasibility of several possible criteria. Also, an experiment to evaluate candidate graphical cockpit displays for microburst alerts using a piloted simulator has been designed.
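
    The abstract does not name the hazard criterion ultimately adopted, but a widely cited candidate in the wind-shear literature is the "F-factor", which measures how quickly a microburst erodes an aircraft's energy state. A minimal sketch (the 0.1 threshold and the example numbers are illustrative, not taken from the paper):

```python
def f_factor(wind_shear_rate: float, vertical_wind: float,
             airspeed: float, g: float = 9.81) -> float:
    """F-factor hazard index: F = (dWx/dt)/g - w/V.

    wind_shear_rate: rate of change of the horizontal wind along the
        flight path (m/s^2), positive for an increasing tailwind.
    vertical_wind: vertical wind component w (m/s), negative in a downdraft.
    airspeed: true airspeed V (m/s).
    Positive F drains the aircraft's specific energy; sustained values
    above roughly 0.1 are commonly treated as hazardous.
    """
    return wind_shear_rate / g - vertical_wind / airspeed

# A 3 m/s^2 tailwind increase plus a 5 m/s downdraft at 75 m/s airspeed:
f = f_factor(3.0, -5.0, 75.0)  # both terms add to the hazard here
```

    Combining ground-based and airborne sensors then amounts to estimating the two terms of F from whichever measurement source observes them best.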

  18. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  19. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  20. EPA Facility Registry Service (FRS): RCRA

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of hazardous waste facilities that link to the Resource Conservation and Recovery Act Information System (RCRAInfo). EPA's comprehensive information system in support of the Resource Conservation and Recovery Act (RCRA) of 1976 and the Hazardous and Solid Waste Amendments (HSWA) of 1984, RCRAInfo tracks many types of information about generators, transporters, treaters, storers, and disposers of hazardous waste. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to RCRAInfo hazardous waste facilities once the RCRAInfo data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs

  1. Natural-technological risk assessment and management

    NASA Astrophysics Data System (ADS)

    Burova, Valentina; Frolova, Nina

    2016-04-01

    EM-DAT statistics on human impact and economic damage in the first semester of 2015 are the highest since 2011: 41% of disasters were floods, responsible for 39% of economic damage, and 7% of events were earthquakes, responsible for 59% of the total death toll. This suggests that disaster risk assessment and management still need to be improved and remain the principal issue in related national and international programs. The paper investigates risk assessment and management practice in the Russian Federation at different levels. A method is proposed to identify territories characterized by integrated natural-technological hazard. Maps of the Russian Federation zoned according to the integrated natural-technological hazard level are presented, as well as the procedure for updating the integrated hazard level taking into account the activity of individual processes. Special attention is paid to databases on the consequences of past natural and technological events, which are used to verify current hazard estimates. Examples of natural-technological risk zoning for the whole country and for selected regions are presented. Different output risk indices, both social and economic, are estimated taking into account the requirements of end-users. In order to increase the safety of the population of the Russian Federation, trans-boundary hazards are also taken into account.
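
    The zoning method itself is not detailed in the abstract. As an illustration only, an integrated natural-technological hazard index for a territory might combine normalized per-process hazard levels so that one severe process cannot be averaged away; the weights and the combination rule below are hypothetical:

```python
def integrated_hazard(levels: dict, weights: dict) -> float:
    """Combine normalized per-process hazard levels (0..1) into one index.

    Takes the weighted mean across processes, then pulls the result
    halfway toward the single worst process so a severe hazard is not
    diluted by several mild ones.
    """
    wmean = sum(weights[k] * levels[k] for k in levels) / sum(weights.values())
    worst = max(levels.values())
    return max(wmean, 0.5 * (wmean + worst))

# Hypothetical territory: strong seismic hazard, moderate industrial risk.
territory = {"earthquake": 0.8, "flood": 0.3, "industrial": 0.5}
weights = {"earthquake": 2.0, "flood": 1.0, "industrial": 1.0}
idx = integrated_hazard(territory, weights)  # 0.7 for these inputs
```

    Updating the index for the current activity of a single process then reduces to replacing that process's level and recomputing.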

  2. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    PubMed

    Pratschke, Jonathan; Haase, Trutz; Comber, Harry; Sharp, Linda; de Camargo Cancela, Marianna; Johnson, Howard

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare. The authors begin by summarising debates on causal inference, mediated effects and statistical models, showing that these three strands of research have powerful synergies. They review a range of approaches which seek to extend existing survival models to obtain valid estimates of mediation effects. They then argue for an alternative strategy, which involves integrating survival outcomes within Structural Equation Models via the discrete-time survival model. This approach can provide an integrated framework for studying mediation effects in relation to survival outcomes, an issue of great relevance in applied health research. The authors provide an example of how these techniques can be used to explore whether the social class position of patients has a significant indirect effect on the hazard of death from colon cancer. The results suggest that the indirect effects of social class on survival are substantial and negative (-0.23 overall). In addition to the substantial direct effect of this variable (-0.60), its indirect effects account for more than one quarter of the total effect. The two main pathways for this indirect effect, via emergency admission (-0.12) and via hospital caseload (-0.10), are of similar size. The discrete-time survival model provides an attractive way of integrating time-to-event data within the field of Structural Equation Modelling.
The authors demonstrate the efficacy of this approach in identifying complex causal pathways that mediate the effects of a socio-economic baseline covariate on the hazard of death from colon cancer. The results show that this approach has the potential to shed light on a class of research questions which is of particular relevance in health research.
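
    The discrete-time survival model referred to above works by expanding each subject's follow-up into person-period records, to which a logistic model (and hence an SEM measurement structure) can be fitted. A minimal sketch of that expansion step, assuming a simple (id, survival time, event indicator) input format:

```python
def person_period(records):
    """Expand (id, time, event) survival records into person-period rows
    for a discrete-time (logistic) survival model.

    Each subject contributes one row per interval survived; the binary
    outcome is 1 only in the interval where the event occurs, and is 0
    in every interval for a censored subject.
    """
    rows = []
    for sid, time, event in records:
        for t in range(1, time + 1):
            died = 1 if (event == 1 and t == time) else 0
            rows.append({"id": sid, "interval": t, "event": died})
    return rows

# Subject 1 dies in interval 3; subject 2 is censored after interval 2.
rows = person_period([(1, 3, 1), (2, 2, 0)])
```

    Covariates (such as social class and the mediators) are then merged onto these rows, and the interval-specific baseline hazard enters as dummy variables or a smooth function of `interval`.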

  3. Loss of Control Prevention and Recovery: Onboard Guidance, Control, and Systems Technologies

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2012-01-01

    Loss of control (LOC) is one of the largest contributors to fatal aircraft accidents worldwide. LOC accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. These LOC hazards include vehicle impairment conditions, external disturbances, vehicle upset conditions, and inappropriate crew actions or responses. Hence, there is no single intervention strategy to prevent these accidents. NASA previously defined a comprehensive research and technology development approach for reducing LOC accidents and an associated integrated system concept. Onboard technologies for improved situation awareness, guidance, and control for LOC prevention and recovery are needed as part of this approach. Such systems should include: LOC hazards effects detection and mitigation; upset detection, prevention and recovery; and mitigation of combined hazards. NASA is conducting research in each of these areas. This paper provides an overview of this research, including the near-term LOC focus and associated analysis, as well as a preliminary flight system architecture.

  4. Using In Vitro High-Throughput Screening Data for Predicting Benzo[k]Fluoranthene Human Health Hazards.

    PubMed

    Burgoon, Lyle D; Druwe, Ingrid L; Painter, Kyle; Yost, Erin E

    2017-02-01

    Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high-throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline-based meta-regression can be used to integrate data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk-specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta-regression, may allow risk assessors to identify points of departure and risk-specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods. © 2016 Society for Risk Analysis.
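
    The authors' meta-regression uses bootstrap natural splines. As a simplified sketch of the same idea, one can resample assay replicates, average them, and read the concentration at a target response off the pooled curve; linear interpolation stands in for the spline fit here, and all data below are synthetic, not from the benzo[k]fluoranthene assays:

```python
import random

def conc_at_response(concs, resps, target):
    """Linearly interpolate the concentration where a monotone
    concentration-response curve first reaches `target`."""
    for (c0, r0), (c1, r1) in zip(zip(concs, resps),
                                  zip(concs[1:], resps[1:])):
        if r0 <= target <= r1:
            return c0 + (c1 - c0) * (target - r0) / (r1 - r0)
    raise ValueError("target outside observed response range")

def bootstrap_pod(replicates, concs, target, n_boot=2000, seed=0):
    """Percentile bootstrap over assay replicates: resample replicate
    curves with replacement, average them, and take the bootstrap
    median of the concentration at `target` as the point of departure."""
    rng = random.Random(seed)
    pods = []
    for _ in range(n_boot):
        sample = [rng.choice(replicates) for _ in replicates]
        mean_resp = [sum(r[i] for r in sample) / len(sample)
                     for i in range(len(concs))]
        pods.append(conc_at_response(concs, mean_resp, target))
    pods.sort()
    return pods[len(pods) // 2]

# Synthetic assay: three replicate response curves over four concentrations.
concs = [0.1, 0.3, 1.0, 3.0]
replicates = [[0.0, 0.1, 0.5, 0.9],
              [0.0, 0.2, 0.6, 1.0],
              [0.1, 0.1, 0.4, 0.8]]
pod = bootstrap_pod(replicates, concs, target=0.3)
```

    The sorted `pods` list also yields percentile confidence bounds, and risk-specific concentrations follow by setting `target` to the response level associated with the chosen risk.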

  5. DECATASTROPHIZE - Use of SDSS and MCDA to prepare for disasters or plan for multiple hazards

    NASA Astrophysics Data System (ADS)

    Damalas, Andreas; Mettas, Christodoulos; Evagorou, Evagoras; Hadjimitsis, Diofantos

    2017-04-01

    This project presents effective early warning and alert systems intended to save lives and protect citizens, property and the environment from natural and man-made disasters. Civil protection can benefit from well-developed analysis tools for managing the resources available at all levels of an organization. Geo-Spatial Early-warning Decision Support Systems (GE-DSS), combined with integrated Geographic Information System (GIS) solutions and multi-criteria decision analysis (MCDA), fuse textual and geographic information into one view. DECAT's purpose is to use GE-DSS to prepare for, assess and respond rapidly and sustainably to multiple natural and man-made hazards, disasters and environmental emergencies. This is achieved by combining existing models and systems into one distributed, integrated multi-platform framework, DECAT. The project is expected to create better prerequisites for, and improve, preparedness, as well as to enhance the awareness of civil protection, natural hazard and marine pollution professionals and volunteers. It supports the efforts of the participating states to protect citizens, the environment and property from natural and man-made disasters. The project also highlights the importance of exchanging information and experience as a means of improving the operations of all parties involved in civil protection (private and public professionals and volunteers). DECATASTROPHIZE aims to support EU countries and European Neighbourhood Policy countries that do not participate in the Union Civil Protection Mechanism in their disaster preparedness; enhancing their cooperation within the Mechanism is also of high importance.

  6. Evaluation of the role of risk perception in stakeholder engagement to prevent lead exposure in an urban setting.

    PubMed

    Harclerode, Melissa A; Lal, Pankaj; Vedwan, Neeraj; Wolde, Bernabas; Miller, Michael E

    2016-12-15

    Stakeholder engagement is a vital sustainable remediation practice for obtaining useful feedback and identifying societal needs. Evaluating and integrating the risk perception of stakeholders into remediation and outreach efforts allows for greater insight, increases the likelihood of success and, ultimately, benefits the community by protecting its members from environmental hazards. In this study, we identified risk perception factors that influenced residents' level of concern for mitigating their exposure to elevated concentrations of lead in household paint and historic fill material. Risk perception factors were assessed by an in-person survey conducted in public green spaces. The analysis of survey participants' responses indicated that their perceived risk from lead exposure was mostly influenced by the presence of hazardous materials in close proximity to their residence, the ability to address pollution, and awareness, interest, and individual accountability in mitigating environmental risks. Responses also revealed that residents considered the risk of lead and soil pollution as less menacing than the presence of more immediate and perceptible risks posed by factors such as air and water pollution. In addition, the community seemed to exhibit "optimism bias" and did not identify itself as being at high risk from immediate hazards, including lead exposure. This lack of concern over lead exposure created a significant obstacle to community participation in state-led education and outreach programs. By integrating risk perception analysis and increasing stakeholder engagement, we can bring more attention to this issue, educate the public about the threat of lead pollution, and efficiently use financial resources to implement a more sustainable solution. Copyright © 2016. Published by Elsevier Ltd.

  7. Integrated detoxification methodology of hazardous phenolic wastewaters in environmentally based trickle-bed reactors: Experimental investigation and CFD simulation.

    PubMed

    Lopes, Rodrigo J G; Almeida, Teresa S A; Quinta-Ferreira, Rosa M

    2011-05-15

    Centralized environmental regulations require the use of efficient detoxification technologies for the secure disposal of hazardous wastewaters. Guided by federal directives, existing plants need reengineering activities and careful analysis to improve their overall effectiveness and to become environmentally friendly. Here, we illustrate the application of an integrated methodology which encompasses the experimental investigation of catalytic wet air oxidation and CFD simulation of trickle-bed reactors. Because trickle-bed reactor behaviour is governed by the flow environment coupled with chemical kinetics, the CFD model was first validated, after optimization of the principal numerical solution parameters, against experimental data taken from a trickle-bed pilot plant specifically designed for the catalytic wet oxidation of phenolic wastewaters. Second, several experimental and computational runs were carried out under unsteady-state operation to evaluate the dynamic performance, addressing the TOC concentration and temperature profiles. CFD computations of total organic carbon conversion were found to agree better with experimental data at lower temperatures. Finally, the comparison of test data with simulation results demonstrated that this integrated framework was able to describe the mineralization of organic matter in trickle beds, and the validated consequence model can be exploited to promote cleaner remediation technologies for contaminated waters. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Integrated risk assessment and screening analysis of drinking water safety of a conventional water supply system.

    PubMed

    Sun, F; Chen, J; Tong, Q; Zeng, S

    2007-01-01

    Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, there is a necessity to first conduct a strategic screening analysis at a national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems in implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the concerned hazards in drinking water, from which the vulnerability of a conventional water supply system is characterized.
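
    A violation probability of the kind estimated here can be sketched by Monte Carlo simulation: sample the parameter's concentration distribution at a given point in the supply system and count exceedances of the standard. The lognormal turbidity distribution and the 1.0 NTU standard below are hypothetical, chosen only to make the sketch concrete:

```python
import random

def violation_probability(simulate_conc, standard, n=10000, seed=1):
    """Monte Carlo estimate of the probability that a water-quality
    parameter exceeds its standard: draw n concentrations from the
    supplied sampler and return the exceedance fraction."""
    rng = random.Random(seed)
    exceed = sum(1 for _ in range(n) if simulate_conc(rng) > standard)
    return exceed / n

# Hypothetical: lognormally distributed turbidity, standard of 1.0 NTU.
p = violation_probability(lambda rng: rng.lognormvariate(-0.5, 0.5), 1.0)
```

    Repeating the calculation for each parameter (permanganate index, ammonia nitrogen, residual chlorine, trihalomethanes) and each location yields the check matrix described in the abstract.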

  9. Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks.

    PubMed

    Thacker, Scott; Kelly, Scott; Pant, Raghav; Hall, Jim W

    2018-01-01

    Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the "do nothing" case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial. © 2017 Society for Risk Analysis.
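
    Intersecting assets with probabilistic hazard maps yields damages at several return periods; expected annual damage (EAD) then follows by integrating damage against annual exceedance probability. A minimal sketch with hypothetical flood damages (the figures are illustrative, not from the England and Wales study):

```python
def expected_annual_damage(return_periods, damages):
    """Expected annual damage as the area under the loss-exceedance
    curve: integrate damage against annual exceedance probability
    (1 / return period) by the trapezoidal rule."""
    probs = [1.0 / rp for rp in return_periods]  # e.g. 1/10, 1/100, ...
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(zip(probs, damages),
                                  zip(probs[1:], damages[1:])):
        ead += 0.5 * (d0 + d1) * (p0 - p1)
    return ead

# Hypothetical damages (in currency millions) for 10-, 100- and
# 1000-year floods at one substation:
ead = expected_annual_damage([10, 100, 1000], [5.0, 50.0, 120.0])
```

    The benefit of an adaptation measure is then the reduction in EAD relative to the "do nothing" case, discounted over the appraisal period to support the business case.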

  10. Reducing pesticide risks to farming communities: cotton farmer field schools in Mali

    PubMed Central

    Settle, William; Soumaré, Mohamed; Sarr, Makhfousse; Garba, Mohamed Hama; Poisot, Anne-Sophie

    2014-01-01

    We provide results from a study of two separate sectors within the cotton-growing region of southern Mali. In one sector, farmers have engaged in a farmer field school (FFS) training programme since 2003; the other has not. One goal of the training was the adoption of alternatives to the use of hazardous insecticides, through integrated pest management (IPM) methods. Over an 8-year period, analysis showed that, with roughly 20% of the 4324 cotton-growing farm households having undergone training, hazardous insecticide use for the entire sector fell by 92.5% compared with earlier figures and with the second (control) sector. Yields for cotton in both sectors were highly variable over time, but no evidence was found for changes in yield owing to shifts in pest management practices. Evidence is presented for a likely diffusion of new practices having taken place, from FFS participants to non-participants. We discuss strengths and weaknesses of the FFS approach in general, and highlight the need for improved baseline surveys and impact analyses to be integrated into FFS projects. PMID:24535387

  11. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥5.8), including analysis tools. - Data from three one-year OBS deployments (Atlantic, Ionian and Ligurian Sea) in the standard SEED format, thus creating the core integrated database for ocean-, sea- and land-based seismological observatories.
Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal, developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories, and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with the relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010 - 2014) is also funded by the EC.

  12. Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives

    NASA Astrophysics Data System (ADS)

    Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.

    2011-12-01

    Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, both at regional and at local scale is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated taking into account the different levels of uncertainty. The research towards an integrated methodology able to seamlessly describe seismic risk at different spatial scales is challenging, but discloses new application perspectives, particularly in those countries which face significant seismic hazard but lack the resources for a standard assessment. Central Asian countries in particular, which exhibit some of the highest seismic hazard in the world, are experiencing steady demographic growth, often accompanied by informal settlement and urban sprawl. A reliable evaluation of how these factors affect seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability, is of particular importance in order to undertake proper mitigation actions and to react promptly and efficiently to a catastrophic event. New strategies are needed to cope efficiently with systematic lack of information and uncertainties. An original approach is presented to assess seismic risk based on the integration of information coming from remote sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation, in a multi-scale approach. Panoramic imaging is also considered as a valuable ground-based visual data collection technique, suitable both for manual and automatic analysis.
A full-probabilistic framework based on Bayesian networks is proposed to integrate the available information, taking into account both aleatory and epistemic uncertainties. An improved risk model for Bishkek, the capital of the Kyrgyz Republic, has been developed following this approach and tested against different earthquake scenarios. Preliminary results will be presented and discussed.
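The Bayesian-network style of integration described above can be illustrated with a toy discrete network. The node structure, state names, and every probability below are hypothetical illustrations, not values from the Bishkek model:

```python
# Minimal sketch: a two-layer discrete Bayesian network combining ground-shaking
# intensity and building vulnerability into a probability of damage.
# All numbers are invented for illustration.

# Prior over shaking intensity at a site (would come from the hazard model)
p_shaking = {"low": 0.7, "moderate": 0.25, "high": 0.05}

# Prior over vulnerability class (would come from remote sensing / in situ survey)
p_vuln = {"robust": 0.4, "fragile": 0.6}

# Conditional probability of damage given shaking and vulnerability
p_damage = {
    ("low", "robust"): 0.01, ("low", "fragile"): 0.05,
    ("moderate", "robust"): 0.10, ("moderate", "fragile"): 0.35,
    ("high", "robust"): 0.40, ("high", "fragile"): 0.80,
}

# Marginalize out the parents: P(damage) = sum_{s,v} P(damage|s,v) P(s) P(v)
p_total = sum(p_damage[(s, v)] * ps * pv
              for s, ps in p_shaking.items()
              for v, pv in p_vuln.items())
print(round(p_total, 4))
```

In a real application the priors would be replaced by posteriors conditioned on observations (satellite imagery, in situ measurements), which is where the framework's handling of epistemic uncertainty enters.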

  13. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

Parametric survival models are increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster-robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach that substantially improves the estimation process compared with using numerical integration alone. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
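The monotonicity restriction mentioned above is easy to see in the Weibull case, whose hazard h(t) = (k/λ)(t/λ)^(k−1) can only increase or decrease monotonically in t; the spline-based framework relaxes exactly this. A minimal sketch, with illustrative shape and scale values:

```python
import math

def weibull_hazard(t, shape, scale):
    # h(t) = (k/lam) * (t/lam)**(k-1): monotone in t for any fixed shape k
    return (shape / scale) * (t / scale) ** (shape - 1)

def weibull_cum_hazard(t, shape, scale):
    # H(t) = (t/lam)**k, so the survival function is S(t) = exp(-H(t))
    return (t / scale) ** shape

k, lam = 1.5, 10.0  # hypothetical shape and scale
hs = [weibull_hazard(t, k, lam) for t in (1, 2, 5, 10)]
# For k > 1 the hazard can only increase: the restriction splines remove
assert all(a < b for a, b in zip(hs, hs[1:]))
surv_10 = math.exp(-weibull_cum_hazard(10.0, k, lam))
print(round(surv_10, 4))
```

Modelling log h(t) with restricted cubic splines instead lets the hazard rise and fall within follow-up while, as the abstract notes, remaining analytically integrable beyond the boundary knots.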

  14. Developing a safe on-orbit cryogenic depot

    NASA Technical Reports Server (NTRS)

    Bahr, Nicholas J.

    1992-01-01

New U.S. space initiatives will require technology to realize planned programs such as piloted lunar and Mars missions. Key to the optimal execution of such missions are high-performance orbit transfer vehicles and propellant storage facilities. Large amounts of liquid hydrogen and oxygen demand a uniquely designed on-orbit cryogenic propellant depot. Because of the inherent dangers in propellant storage and handling, a comprehensive system safety program must be established. This paper shows how the myriad complex hazards demonstrate the need for an integrated safety effort applied from program conception through operational use. Even though the cryogenic depot is still in the conceptual stage, many of the hazards have been identified, including fatigue due to heavy thermal loading from environmental and operating temperature extremes; micrometeoroid and/or depot ancillary equipment impact (an important concern given the large surface area needed to house the large quantities of propellant); docking and maintenance hazards; and hazards associated with extended extravehicular activity. Safety analysis techniques are presented for each program phase, specific system safety implementation steps are listed, and enhanced risk assessment is demonstrated through the incorporation of these methods.

  15. Heat wave hazard classification and risk assessment using artificial intelligence fuzzy logic.

    PubMed

    Keramitsoglou, Iphigenia; Kiranoudis, Chris T; Maiheu, Bino; De Ridder, Koen; Daglis, Ioannis A; Manunta, Paolo; Paganini, Marc

    2013-10-01

The average summer temperatures as well as the frequency and intensity of hot days and heat waves are expected to increase due to climate change. Motivated by this, we propose a methodology to evaluate the monthly heat wave hazard and risk and its spatial distribution within large cities. A simple urban climate model with assimilated satellite-derived land surface temperature images was used to generate a historic database of urban air temperature fields. Heat wave hazard was then estimated from the analysis of these hourly air temperatures, distributed on a 1-km grid over Athens, Greece, by identifying the areas that are more likely to suffer higher temperatures in the case of a heat wave event. The innovation lies in the artificial intelligence fuzzy logic model used to classify heat waves from mild to extreme by taking into consideration their duration, intensity, and time of occurrence. The monthly hazard was subsequently estimated as the cumulative effect of the individual heat waves that occurred at each grid cell during a month. Finally, monthly heat wave risk maps were produced by integrating geospatial information on the population's vulnerability to heat waves, calculated from socio-economic variables.
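Fuzzy classification of this kind can be sketched with triangular membership functions over a severity score. The grade names follow the abstract, but the membership shapes, breakpoints, and the single combined score are invented for illustration; the actual model combines duration, intensity, and timing:

```python
def tri(x, a, b, c):
    # Triangular fuzzy membership function: 0 outside [a, c], peak 1 at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical membership functions over a normalized heat-wave severity score
grades = {
    "mild":    lambda s: tri(s, 0.0, 0.2, 0.45),
    "severe":  lambda s: tri(s, 0.3, 0.55, 0.8),
    "extreme": lambda s: tri(s, 0.65, 0.9, 1.01),
}

def classify(score):
    # Defuzzify by taking the grade with the largest membership
    memberships = {g: f(score) for g, f in grades.items()}
    return max(memberships, key=memberships.get), memberships

label, m = classify(0.5)
print(label)
```

Overlapping memberships are the point of the fuzzy approach: a borderline event contributes partially to two grades instead of being forced across a crisp threshold.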

  16. IRIS Toxicological Review of Methanol (Non-Cancer) ...

    EPA Pesticide Factsheets

EPA is conducting a peer review and public comment period on the scientific basis supporting the human health hazard and dose-response assessment of methanol (non-cancer) that, when finalized, will appear in the Integrated Risk Information System (IRIS) database.

  17. Integrating SAR with Optical and Thermal Remote Sensing for Operational Near Real-Time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.

    2013-12-01

Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), now rely heavily on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above-mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows rapid and seamless access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced.
These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams, and models to create near-real-time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near-real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the Okmok and Augustine volcanoes, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.

  18. Potential postwildfire debris-flow hazards: a prewildfire evaluation for the Sandia and Manzano Mountains and surrounding areas, central New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Haas, Jessica R.; Miller, Lara W.; Scott, Joe H.; Thompson, Matthew P.

    2014-01-01

Wildfire can drastically increase the probability of debris flows, a potentially hazardous and destructive form of mass wasting, in landscapes that have otherwise been stable throughout recent history. Although there is no way to know the exact location, extent, and severity of wildfire, or the subsequent rainfall intensity and duration before it happens, probabilities of fire and debris-flow occurrence for different locations can be estimated with geospatial analysis and modeling efforts. The purpose of this report is to provide information on which watersheds might constitute the most serious potential debris-flow hazards in the event of a large-scale wildfire and subsequent rainfall in the Sandia and Manzano Mountains. Potential probabilities and estimated volumes of postwildfire debris flows in the unburned Sandia and Manzano Mountains and surrounding areas were estimated using empirical debris-flow models developed by the U.S. Geological Survey in combination with fire behavior and burn probability models developed by the U.S. Department of Agriculture Forest Service. The locations of the greatest debris-flow hazards correlate with the areas of steepest slopes and simulated crown-fire behavior. The four subbasins with the highest computed debris-flow probabilities (greater than 98 percent) were all in the Manzano Mountains, two flowing east and two flowing west. Estimated volumes in sixteen subbasins were greater than 50,000 cubic meters, and most of these were in the central Manzanos and on the west-facing slopes of the Sandias. Five subbasins on the west-facing slopes of the Sandia Mountains, four of which have downstream reaches that lead into the outskirts of the City of Albuquerque, are among the subbasins in the 98th percentile of integrated relative debris-flow hazard rankings. The bulk of the remaining subbasins in the 98th percentile of integrated relative debris-flow hazard rankings are located along the highest and steepest slopes of the Manzano Mountains.
One of these subbasins is several miles upstream from the community of Tajique and another is several miles upstream from the community of Manzano, both on the eastern slopes of the Manzano Mountains. This prewildfire assessment approach is valuable to resource managers because the analysis of the debris-flow threat is made before a wildfire occurs, which facilitates prewildfire management, planning, and mitigation. In northern New Mexico, widespread watershed restoration efforts are being carried out to safeguard vital watersheds against the threat of catastrophic wildfire. This study was initiated to help select locations where restoration efforts could have the best return on investment.

  19. The spatiotemporal characteristics of environmental hazards caused by offshore oil and gas operations in the Gulf of Mexico.

    PubMed

    Meng, Qingmin

    2016-09-15

Marine ecosystems are home to a host of species, ranging from tiny planktonic organisms, fishes, and birds to large mammals such as whales, manatees, and seals. However, human activities such as offshore oil and gas operations increasingly threaten marine and coastal ecosystems, and there has been little exploration of the spatial and temporal risks of offshore oil operations. Using the Gulf of Mexico, one of the world's most intensively developed offshore oil and gas regions, as the study area, we propose a spatiotemporal approach that integrates spatial statistics and geostatistics in a geographic information system environment to provide insight for environmental management and decision making by oil and gas operators, coastal communities, local governments, and the federal government. We use records from 1995 to 2015 of twelve types of hazards caused by offshore oil and gas operations and analyze them spatially in five-year periods. The spatial clusters of these hazards are analyzed and mapped using Getis-Ord Gi and local Moran's I statistics. We then design a spatial correlation coefficient matrix for multivariate spatial correlation, defined as the ratio of the cross variogram of two types of hazards to the product of the variograms of the two hazards, providing a first understanding of the degrees of spatial correlation among the twelve types of hazards. To the best of our knowledge, this is the first application of spatiotemporal analysis methods to environmental hazards caused by offshore oil and gas operations; the proposed methods can be applied to other regions for the management and monitoring of such hazards. Copyright © 2016 Elsevier B.V. All rights reserved.
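The variogram-based correlation idea can be sketched on a toy 1-D transect. Note one assumption: the abstract describes a ratio to the product of the two variograms, while the conventional codispersion coefficient normalizes by the square root of that product so the value stays in [−1, 1]; the sketch below uses the conventional form, and the hazard-count data are invented:

```python
def semivariogram(z, h):
    # Classical empirical semivariogram at lag h: mean squared increment / 2
    pairs = [(z[i + h] - z[i]) ** 2 for i in range(len(z) - h)]
    return sum(pairs) / (2 * len(pairs))

def cross_variogram(x, y, h):
    # Empirical cross variogram: mean product of increments / 2
    pairs = [(x[i + h] - x[i]) * (y[i + h] - y[i]) for i in range(len(x) - h)]
    return sum(pairs) / (2 * len(pairs))

def codispersion(x, y, h):
    # Codispersion coefficient: cross variogram normalized by the square root
    # of the product of the two semivariograms (bounded in [-1, 1])
    return cross_variogram(x, y, h) / (semivariogram(x, h) * semivariogram(y, h)) ** 0.5

# Hypothetical counts of two hazard types along a transect of lease blocks;
# "fires" is "spills" shifted by a constant, so they co-vary perfectly
spills = [2, 3, 5, 4, 6, 8, 7, 9]
fires  = [1, 2, 4, 3, 5, 7, 6, 8]
print(round(codispersion(spills, fires, 1), 3))
```

Values near 1 indicate that the two hazard types cluster in the same places at that lag; values near 0 indicate spatially unrelated hazards.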

  20. Vulnerability and risk of deltaic social-ecological systems exposed to multiple hazards.

    PubMed

    Hagenlocher, Michael; Renaud, Fabrice G; Haas, Susanne; Sebesvari, Zita

    2018-08-01

Coastal river deltas are hotspots of global change impacts. Sustainable delta futures are increasingly threatened due to rising hazard exposure combined with high vulnerabilities of deltaic social-ecological systems. While the need for integrated multi-hazard approaches has been clearly articulated, studies on vulnerability and risk in deltas either focus on local case studies or single hazards and do not apply a social-ecological systems perspective. As a result, vulnerabilities and risks in areas with strong social and ecological coupling, such as coastal deltas, are not fully understood, and the identification of risk reduction and adaptation strategies is often based on incomplete assumptions. To overcome these limitations, we propose an innovative modular indicator library-based approach for the assessment of multi-hazard risk of social-ecological systems across and within coastal deltas globally, and apply it to the Amazon, Ganges-Brahmaputra-Meghna (GBM), and Mekong deltas. Results show that multi-hazard risk is highest in the GBM delta and lowest in the Amazon delta. The analysis reveals major differences between social and environmental vulnerability across the three deltas, notably in the Mekong and GBM deltas, where environmental vulnerability is significantly higher than social vulnerability. Hotspots and drivers of risk vary spatially, thus calling for spatially targeted risk reduction and adaptation strategies within the deltas. Ecosystems have been identified as both an important element at risk and an entry point for risk reduction and adaptation strategies. Copyright © 2018. Published by Elsevier B.V.

  1. Computer Analysis of Electromagnetic Field Exposure Hazard for Space Station Astronauts during Extravehicular Activity

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Kelley, James S.; Panneton, Robert B.; Arndt, G. Dickey

    1995-01-01

In order to estimate the RF radiation hazards to astronauts and electronics equipment from various Space Station transmitters, the electric fields around the various Space Station antennas are computed using rigorous computational electromagnetics (CEM) techniques. The Method of Moments (MoM) was applied to the UHF and S-band low-gain antennas. The Aperture Integration (AI) method and the Geometrical Theory of Diffraction (GTD) method were used to compute the electric field intensities for the S- and Ku-band high-gain antennas. As a result of this study, the regions in which the electric fields exceed the specified exposure levels for the Extravehicular Mobility Unit (EMU) electronics equipment and Extravehicular Activity (EVA) astronauts are identified for the various Space Station transmitters.
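The rigorous MoM/AI/GTD computations above are needed because astronauts can be in the near field and shadowed by structure, where simple formulas fail. As a much cruder first-order check, the far-field power density S = P·G/(4πr²) can bound the standoff distance for an exposure limit; the transmitter power, gain, and limit below are hypothetical, not Space Station values:

```python
import math

def power_density(p_tx_watts, gain_linear, r_meters):
    # Far-field power density of an antenna: S = P*G / (4*pi*r^2)  [W/m^2]
    return p_tx_watts * gain_linear / (4 * math.pi * r_meters ** 2)

def hazard_radius(p_tx_watts, gain_linear, limit_w_m2):
    # Distance inside which S exceeds the exposure limit (far-field only)
    return math.sqrt(p_tx_watts * gain_linear / (4 * math.pi * limit_w_m2))

# Hypothetical transmitter: 20 W into a 30 dBi antenna, 10 W/m^2 exposure limit
gain = 10 ** (30 / 10)  # dBi -> linear
r = hazard_radius(20.0, gain, 10.0)
print(round(r, 2))
```

This estimate assumes boresight gain and no multipath; the CEM methods in the study replace it with full field maps around the actual geometry.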

  2. Summaries of BFRL fire research in-house projects and grants, 1993

    NASA Astrophysics Data System (ADS)

    Jason, Nora H.

    1993-09-01

The report describes the fire research projects performed in the Building and Fire Research Laboratory (BFRL) and under its extramural grants program during fiscal year 1993. The BFRL Fire Research Program has directed its efforts under three program thrusts, so the in-house priority projects, grants, and externally funded efforts form an integrated, focused ensemble. The publication is organized along those lines: fire risk and hazard prediction (carbon monoxide prediction, turbulent combustion, soot, engineering analysis, fire hazard assessment, and large fires); fire safety of products and materials (materials combustion, furniture flammability, and wall and ceiling fires); and advanced technologies for fire sensing and control (fire detection and fire suppression). For the convenience of the reader, an alphabetical listing of all grants is contained in Part 2.0.

  3. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. 
We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
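The hazard curves at the heart of a PSHA platform like CyberShake can be sketched in a few lines: the annual exceedance rate of a ground-motion level is the rate-weighted sum of per-rupture exceedance probabilities. The rupture rates, exceedance probabilities, and levels below are invented toy values, not CyberShake outputs:

```python
import math

# Ground-motion levels of interest, e.g. spectral acceleration in g (hypothetical)
levels = [0.1, 0.2, 0.4]

# Each rupture: (annual occurrence rate,
#                P(ground motion exceeds each level | rupture occurs))
ruptures = [
    (0.01,  [0.9, 0.5, 0.1]),   # frequent event, moderate shaking
    (0.001, [1.0, 0.9, 0.6]),   # rare event, strong shaking
]

def hazard_curve(ruptures):
    # Annual exceedance rate at each level: sum_i rate_i * P(A > a | rupture_i)
    return [sum(rate * p_exc[i] for rate, p_exc in ruptures)
            for i in range(len(levels))]

curve = hazard_curve(ruptures)
# Poissonian conversion to probability of exceedance in 50 years
poe50 = [1 - math.exp(-lam * 50) for lam in curve]
print([round(lam, 5) for lam in curve])
```

CyberShake's distinguishing step is computing the exceedance probabilities from physics-based 3D waveform simulations (the ~500,000 seismograms per site) rather than from empirical ground-motion models.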

  4. Recent changes in flood damage in the United States from observations and ACME model

    NASA Astrophysics Data System (ADS)

    Leng, G.; Leung, L. R.

    2017-12-01

Despite efforts to mitigate flood hazards in flood-prone areas, survey- and report-based flood databases show that flood damage has increased and emerged as one of the most costly disasters in the United States since the 1990s. Understanding the mechanisms driving the changes in flood damage is therefore critical for reducing flood risk. In this study, we first conduct a comprehensive analysis of the changing characteristics of flood damage at the local, state, and national levels. Results show a significant increasing trend in the number of flood hazards, causing economic losses of up to $7 billion per year. The ratio of flood events with tangible economic cost to total flood events exhibited a non-significant increasing trend before 2007 followed by a significant decrease, indicating a changing vulnerability to floods. The analysis also reveals distinct spatial and temporal patterns in the threshold intensity of flood hazards with tangible economic cost. To understand the mechanism behind the increasing flood damage, we develop a flood damage economic model coupled with the integrated hydrological modeling system of ACME, which features a river routing model with an inundation parameterization and a water use and regulation model. The model is evaluated against historical records across the country. Several numerical experiments are then designed to explore the mechanisms behind the recent changes in flood damage from the perspectives of flood hazard, exposure, and vulnerability, which together constitute flood damage. The role of human activities such as reservoir operations and water use in modifying regional floods is also explored using the new tool, with the goal of improving the understanding and modeling of vulnerability to flood hazards.

  5. A New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. 
The resulting new seismic hazard maps can be used for seismic risk analysis and management.
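The tapered Gutenberg-Richter (TGR) distribution used for the source zones is a power law in seismic moment with an exponential corner taper. A minimal sketch, using the Hanks-Kanamori magnitude-moment conversion; the zone rate, β, and corner magnitude below are hypothetical, not values from the China model:

```python
import math

def moment_from_mw(mw):
    # Seismic moment in N*m from moment magnitude (Hanks & Kanamori relation)
    return 10 ** (1.5 * mw + 9.05)

def tgr_rate(mw, rate_t, mw_t, beta, mw_corner):
    # Tapered Gutenberg-Richter cumulative rate of events with moment >= M:
    # N(M) = N_t * (M_t / M)**beta * exp((M_t - M) / M_c)
    m = moment_from_mw(mw)
    mt = moment_from_mw(mw_t)
    mc = moment_from_mw(mw_corner)
    return rate_t * (mt / m) ** beta * math.exp((mt - m) / mc)

# Hypothetical source zone: 1.0 events/yr above Mw 5, beta = 0.65, corner Mw 8
r6 = tgr_rate(6.0, 1.0, 5.0, 0.65, 8.0)
r8 = tgr_rate(8.0, 1.0, 5.0, 0.65, 8.0)
assert r8 < r6  # the exponential taper suppresses rates near the corner
print(round(r6, 4))
```

The corner moment is what the geodetic strain-rate model constrains independently: the integral of M·dN fixes the total moment rate the zone can release.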

  6. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey.

    PubMed

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers (elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance) was used to create a spatial model that explained 55 % of the extent and depth of the flooding during Hurricane Sandy. When a ponding layer was added to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flooding. The study concludes that fairly accurate maps can be created with readily available information and that a great deal about risk of inundation at the property level can be inferred from flood hazard maps. It further concludes that, although local communities are encouraged to prepare for disasters, the existing Federal emergency management framework gives them very little incentive to do so.
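A Boolean overlay of this kind simply flags cells where all hazard criteria coincide. The sketch below uses invented raster values and thresholds (the paper's actual thresholds are not given in the abstract):

```python
# Four co-registered raster layers on a tiny 2x2 grid (hypothetical values)
elevation = [[1.2, 3.5], [0.8, 2.0]]   # metres above datum (e.g. from LiDAR)
slope     = [[1.0, 4.0], [0.5, 2.5]]   # degrees (also LiDAR-derived)
d_stream  = [[30, 120], [10, 80]]      # metres to nearest stream
d_basin   = [[40, 200], [15, 60]]      # metres to nearest catch basin

def flood_prone(r, c):
    # Boolean overlay: a cell is flagged only where low elevation, gentle
    # slope, and proximity to both drainage features all coincide
    return (elevation[r][c] < 2.0 and slope[r][c] < 3.0
            and d_stream[r][c] < 100 and d_basin[r][c] < 100)

hazard = [[flood_prone(r, c) for c in range(2)] for r in range(2)]
print(hazard)
```

The ponding refinement mentioned in the abstract would post-process this mask by filling closed depressions and propagating overflow into adjacent cells.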

  7. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey

    NASA Astrophysics Data System (ADS)

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers (elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance) was used to create a spatial model that explained 55 % of the extent and depth of the flooding during Hurricane Sandy. When a ponding layer was added to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flooding. The study concludes that fairly accurate maps can be created with readily available information and that a great deal about risk of inundation at the property level can be inferred from flood hazard maps. It further concludes that, although local communities are encouraged to prepare for disasters, the existing Federal emergency management framework gives them very little incentive to do so.

  8. Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes

    NASA Astrophysics Data System (ADS)

    Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.

    2012-07-01

    Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
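The slope-stability variable at the core of this methodology, the factor of safety, is commonly computed with the infinite-slope limit-equilibrium formula; the study's coupled numerical modelling is more elaborate, and the soil parameters below are hypothetical:

```python
import math

def factor_of_safety(c_eff, phi_deg, gamma, depth, beta_deg, pore_pressure):
    # Infinite-slope limit equilibrium:
    # FS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi'))
    #      / (gamma*z*sin(beta)*cos(beta))
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c_eff + (gamma * depth * math.cos(beta) ** 2
                         - pore_pressure) * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical embankment: c' = 5 kPa, phi' = 30 deg, gamma = 19 kN/m^3,
# failure surface at 2 m depth, 25 deg slope angle
fs_dry = factor_of_safety(5.0, 30.0, 19.0, 2.0, 25.0, 0.0)
fs_wet = factor_of_safety(5.0, 30.0, 19.0, 2.0, 25.0, 15.0)  # u = 15 kPa
assert fs_wet < fs_dry  # pore pressure from a wet-climate scenario lowers FS
print(round(fs_dry, 2))
```

FS > 1 indicates stability; running the calculation across remotely sensed slope geometry and simulated climate (pore-pressure) scenarios is essentially what the regression-based hazard model summarizes.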

  9. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

Biomass gasification stations face many hazard factors, so it is necessary to assess their hazards. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, calculating the connection degree (CD) requires hazard grades and their corresponding thresholds, which are not always available. For hazard assessment using ESPA, a novel calculation algorithm for the CD is therefore worked out for cases where hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of the ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
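The CD-to-distance idea can be sketched with the standard SPA connection degree μ = a + b·i + c·j (a: identity, b: discrepancy, c: contrary, with a + b + c = 1). The distance-to-ideal formula and all station values below are illustrative assumptions, not the paper's exact ESPA algorithm:

```python
import math

def distance_to_ideal(a, b, c):
    # Connection degree mu = a + b*i + c*j with a + b + c = 1.
    # Euclidean distance of (a, b, c) from the ideal state (1, 0, 0):
    # smaller distance = closer to ideal = lower hazard (illustrative choice).
    assert abs(a + b + c - 1.0) < 1e-9
    return math.sqrt((1 - a) ** 2 + b ** 2 + c ** 2)

# Hypothetical connection degrees (a, b, c) for three gasification stations
stations = {"S1": (0.7, 0.2, 0.1), "S2": (0.5, 0.3, 0.2), "S3": (0.8, 0.15, 0.05)}

# Rank stations from lowest to highest hazard by distance to the ideal
ranking = sorted(stations, key=lambda s: distance_to_ideal(*stations[s]))
print(ranking)
```

The ranking-by-distance step is what makes the method usable when no grade thresholds exist: only the relative positions of the samples matter.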

  10. Download TRIM.Risk

    EPA Pesticide Factsheets

    TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.

  11. Performance Analysis: Work Control Events Identified January - August 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, C E; Freeman, J W; Kerr, C E

    2011-01-14

This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were included in the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses," indicating that LLNL is now reporting fewer such occurrences than in the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system, with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system.
By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began, and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in 39 cause mentions across the 24 events.
The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards when their understanding was in fact incomplete. The second most frequent cause was unclear, incomplete, or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events attributed to incomplete understanding of the work and hazards involved errors in the first two ISMS functions: define the work and analyze the hazards. That is, these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leaves hazards uncontrolled, and the causes are then manifested in events when the work is conducted. Operating safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.

  12. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  13. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  14. Using Integrated Earth and Social Science Data for Disaster Risk Assessment

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; Yetman, G.

    2016-12-01

    Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominate. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied based on traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and human and environmental systems respond, to hazard events, as in the case of the Fukushima nuclear disaster that followed from the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support the use case of disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure. 
The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.

  15. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors, or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  16. Definition and GIS-based characterization of an integral risk index applied to a chemical/petrochemical area.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2006-08-01

A risk map of the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain) was designed following a two-stage procedure. The first step was the creation of a ranking system (Hazard Index) for a number of different inorganic and organic pollutants: heavy metals, polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs), by applying self-organizing maps (SOM) to persistence, bioaccumulation and toxicity properties of the chemicals. PCBs appeared to be the most hazardous compounds, while the light PAHs showed the minimum values. Subsequently, an Integral Risk Index was developed taking into account the Hazard Index and the concentrations of all pollutants in soil samples collected in the assessed area of Tarragona. Finally, a risk map was elaborated by representing the spatial distribution of the Integral Risk Index with a geographic information system (GIS). The results of the present study indicate that an integral risk map can be useful in decision-making processes concerning environmental pollutants.
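The abstract does not give the Integral Risk Index formula. A minimal sketch, assuming a weighted-sum form in which each pollutant's soil concentration is weighted by its Hazard Index score (the scores and concentrations below are hypothetical), illustrates the two-stage idea:

```python
# Hypothetical hazard scores (standing in for the SOM-based Hazard Index ranking)
hazard_index = {"PCBs": 0.9, "PCDD/Fs": 0.8, "heavy_metals": 0.6, "light_PAHs": 0.2}

def integral_risk(concentrations, hazard_index):
    """Weighted sum of normalised soil concentrations, weighted by per-pollutant hazard.

    concentrations: pollutant -> concentration normalised to a reference level.
    The weighted-sum form is an illustrative assumption, not the paper's formula.
    """
    return sum(hazard_index[p] * c for p, c in concentrations.items())

site_a = {"PCBs": 0.5, "PCDD/Fs": 0.2, "heavy_metals": 1.0, "light_PAHs": 2.0}
site_b = {"PCBs": 2.0, "PCDD/Fs": 0.1, "heavy_metals": 0.3, "light_PAHs": 0.5}

# Site B scores higher despite a lower total concentration because PCBs carry
# the largest hazard weight.
risk_a = integral_risk(site_a, hazard_index)
risk_b = integral_risk(site_b, hazard_index)
```

Mapping such per-sample index values over the sampling grid is what produces the GIS risk surface described above.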

  17. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
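The risk integral described above couples a ground motion hazard curve with a structural fragility. A sketch of the pre-mainshock form, with a hypothetical hazard curve and a lognormal fragility (both invented for illustration; the post-mainshock adaptations for aftershock rates and prior damage are not modelled here):

```python
import math

def lognormal_cdf(x, median, beta):
    """Lognormal fragility: probability of collapse given ground motion intensity x."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_collapse_rate(hazard_curve, median, beta):
    """Risk integral: convolve the fragility with the hazard curve.

    hazard_curve: list of (im, annual_exceedance_rate) pairs, im increasing.
    Returns the mean annual rate of collapse.
    """
    rate = 0.0
    for (im0, lam0), (im1, lam1) in zip(hazard_curve, hazard_curve[1:]):
        im_mid = 0.5 * (im0 + im1)
        rate += lognormal_cdf(im_mid, median, beta) * (lam0 - lam1)  # |d lambda| over the bin
    return rate

# Hypothetical hazard curve: spectral acceleration (g) vs annual exceedance rate
hazard = [(0.1, 1e-1), (0.2, 3e-2), (0.4, 8e-3), (0.8, 1.5e-3), (1.6, 2e-4)]
rate = annual_collapse_rate(hazard, median=1.0, beta=0.5)
```

Mainshock damage would be represented by lowering `median` (a more fragile structure), and aftershock hazard by a time-decaying `hazard_curve`.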

  18. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remain a major challenge because of lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and flood in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
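Annual risk is obtained from the risk curves by integration. A minimal sketch, with hypothetical minimum and maximum loss-exceedance curves standing in for the uncertainty bounds described above:

```python
def average_annual_loss(risk_curve):
    """Area under a loss-exceedance curve (trapezoidal rule).

    risk_curve: list of (annual_exceedance_probability, loss) pairs,
    with probability decreasing as loss increases.
    """
    total = 0.0
    for (p0, l0), (p1, l1) in zip(risk_curve, risk_curve[1:]):
        total += 0.5 * (l0 + l1) * (p0 - p1)
    return total

# Hypothetical minimum and maximum risk curves (probability, loss in euros)
curve_min = [(0.1, 0.0), (0.01, 1.0e6), (0.001, 4.0e6)]
curve_max = [(0.1, 0.5e6), (0.01, 3.0e6), (0.001, 9.0e6)]

aal_min = average_annual_loss(curve_min)
aal_max = average_annual_loss(curve_max)
```

Repeating the calculation per hazard type (debris flow, river flood, flash flood) allows the annualised risks to be compared on a common monetary scale.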

  19. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of events with significant consequences.

  20. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  1. Primary human polarized small intestinal epithelial barriers respond differently to a hazardous and an innocuous protein.

    PubMed

    Eaton, A D; Zimmermann, C; Delaney, B; Hurley, B P

    2017-08-01

An experimental platform employing human-derived intestinal epithelial cell (IEC) line monolayers grown on permeable Transwell® filters was previously investigated to differentiate between hazardous and innocuous proteins. This approach was effective at distinguishing these types of proteins, and perturbation of monolayer integrity, particularly transepithelial electrical resistance (TEER), was the most sensitive indicator. In the current report, in vitro indicators of monolayer integrity, cytotoxicity, and inflammation were evaluated using primary (non-transformed) human polarized small intestinal epithelial barriers cultured on Transwell® filters to compare effects of a hazardous protein (Clostridium difficile Toxin A [ToxA]) and an innocuous protein (bovine serum albumin [BSA]). ToxA exerted a reproducible decrease on barrier integrity at doses comparable to those producing effects observed from cell line-derived IEC monolayers, with TEER being the most sensitive indicator. In contrast, BSA, tested at concentrations substantially higher than ToxA, did not cause changes in any of the tested variables. These results demonstrate a similarity in response to certain proteins between cell line-derived polarized IEC models and a primary human polarized small intestinal epithelial barrier model, thereby reinforcing the potential usefulness of cell line-derived polarized IECs as a valid experimental platform to differentiate between hazardous and non-hazardous proteins. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. J-SHIS - an integrated system for knowing seismic hazard information in Japan

    NASA Astrophysics Data System (ADS)

    Azuma, H.; Fujiwara, H.; Kawai, S.; Hao, K. X.; Morikawa, N.

    2015-12-01

An integrated system, the Japan Seismic Hazard Information Station (J-SHIS), was established in 2005 for issuing and exchanging information on the National Seismic Hazard Maps for Japan, which are based on seismic hazard assessment (SHA). A simplified smartphone app, also named J-SHIS, is widely used in Japan; it is built on the integrated system at http://www.j-shis.bosai.go.jp/map/?lang=en and makes hazard information available on a cellphone, a tablet, or a PC. For a given spot, comprehensive SHA map information can be easily obtained: 1) the probability of exceeding a given intensity (JMA = 5-, 5+, 6-, 6+) within 30 years; 2) a site amplification factor, varying within 0.5~3.0 with an expectation of 1, based on surface geology map information; 3) the depth of the seismic basement down to ~3,000 m, based on deep borehole and geological structure data; 4) scenario earthquake maps: choosing an active fault gives the average case over different modeling parameters, and choosing a specific case gives the shaking intensity map with a color scale. "Seismic Hazard Karte" is another app, based on the website http://www.j-shis.bosai.go.jp/labs/karte/, that provides further hazard detail: (1) for every 250 m x 250 m mesh, professional SHA information is provided nationwide; (2) comprehensive SHA information is delivered as five ranks across eight items; (3) a site amplification factor with an average index is given; (4) deep geologic structure modeling is provided with borehole profiling; (5) the SHA probability is assessed within 30 and/or 50 years for the given site; (6) seismic hazard curves are given for earthquake sources from inland active faults, subduction zones, undetermined sources, and their combination; (7) JMA seismic intensities are assessed for long-term average periods of 500 to ~100,000 years. The J-SHIS app can be downloaded freely from http://www.j-shis.bosai.go.jp/app-jshis.

  3. Knowledge to Action - Understanding Natural Hazards-Induced Power Outage Scenarios for Actionable Disaster Responses

    NASA Astrophysics Data System (ADS)

    Kar, B.; Robinson, C.; Koch, D. B.; Omitaomu, O.

    2017-12-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 identified the following four priorities to prevent and reduce disaster risks: i) understanding disaster risk; ii) strengthening governance to manage disaster risk; iii) investing in disaster risk reduction for resilience and; iv) enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation and reconstruction. While forecasting and decision making tools are in place to predict and understand future impacts of natural hazards, the knowledge to action approach that currently exists fails to provide updated information needed by decision makers to undertake response and recovery efforts following a hazard event. For instance, during a tropical storm event advisories are released every two to three hours, but manual analysis of geospatial data to determine potential impacts of the event tends to be time-consuming and a post-event process. Researchers at Oak Ridge National Laboratory have developed a Spatial Decision Support System that enables real-time analysis of storm impact based on updated advisory. A prototype of the tool that focuses on determining projected power outage areas and projected duration of outages demonstrates the feasibility of integrating science with decision making for emergency management personnel to act in real time to protect communities and reduce risk.

  4. Community exposure to potential climate-driven changes to coastal-inundation hazards for six communities in Essex County, Massachusetts

    USGS Publications Warehouse

    Abdollahian, Nina; Ratliff, Jamie L.; Wood, Nathan J.

    2016-11-09

Introduction: Understanding if and how community exposure to coastal hazards may change over time is crucial information for coastal managers tasked with developing climate adaptation plans. This report summarizes estimates of population and asset exposure to coastal-inundation hazards associated with sea-level-rise and storm scenarios in six coastal communities of the Great Marsh region of Essex County, Massachusetts. This U.S. Geological Survey (USGS) analysis was conducted in collaboration with National Wildlife Federation (NWF) representatives, who are working with local stakeholders to develop local climate adaptation plans for the Towns of Salisbury, Newbury, Rowley, Ipswich, and Essex and the City of Newburyport (hereafter referred to as communities). Community exposure was characterized by integrating various community indicators (land cover and land use, population, economic assets, critical facilities, and infrastructure) with coastal-hazard zones that estimate inundation extents and water depth for three time periods. Estimates of community exposure are based on the presence of people, businesses, and assets in hazard zones that are calculated from geospatial datasets using geographic-information-system (GIS) tools. Results are based on current distributions of people and assets in hazard zones and do not take into account projections of human population, asset, or land-use changes over time. Results are not loss estimates based on engineering analysis or field surveys for any particular facility and do not take into account aspects of individual and household preparedness before an extreme event, adaptive capacity of a community during an event, or long-term resilience of individuals and communities after an event.
Potential losses would match reported inventories only if all residents, business owners, public managers, and elected officials were unaware of what to do if warned of an imminent threat, failed to take protective measures during an extreme event, or failed to implement any long-term strategies to mitigate potential impacts. This analysis is intended to serve as a foundation for additional risk-related studies, plans, and mitigation efforts that are tailored to local needs. After a summary of the geospatial methods used in the analysis, results are organized by community so that local officials can easily use them in their local adaptation planning efforts.
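The exposure estimates above rest on a simple GIS overlay: count the people and assets whose cells fall inside a hazard zone. A toy sketch with hypothetical co-registered raster cells (the depths and populations are invented):

```python
# Hypothetical co-registered raster cells: (inundation depth in m, residents in cell)
cells = [
    (0.0, 10), (0.2, 40), (1.5, 25),
    (0.0, 60), (0.0, 30), (0.8, 15),
    (0.0, 80), (0.0, 20), (0.0, 5),
]

# Exposure = residents in cells the scenario inundates (depth > 0)
exposed = sum(pop for depth, pop in cells if depth > 0.0)
total = sum(pop for _, pop in cells)
share = exposed / total
```

Repeating this overlay for each sea-level-rise and storm scenario, and for each indicator (businesses, critical facilities, infrastructure), yields the per-community exposure tables the report describes.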

  5. Economic and Ethical Consequences of Natural Hazards in Alpine Valleys (EE-Con)

    NASA Astrophysics Data System (ADS)

    Ortner, Florian; Brantl, Dirk; Meyer, Lukas; Steininger, Karl; Sass, Oliver

    2015-04-01

    The Alps and their population are particularly vulnerable to geomorphological and hydrological hazards and this problem might be amplified by ongoing climate change. Natural disasters cause severe monetary damage which often leads to the difficult question whether it socially pays to protect settlements at high costs or whether alternatively settlement areas should better be abandoned. By investigations in the Johnsbachtal and the Kleinsölktal (Styria), the interdisciplinary project "Economic and Ethical Consequences of Natural Hazards in Alpine Valleys" (EE-Con), funded by the Austrian Academy of Sciences, seeks to answer the following questions: (1) Are natural hazards and associated damages in fact increasing, and is this due to meteorological triggers, to anthropogenic factors or to internal process dynamics? (2) What is the perception and knowledge of local people, how is risk and risk prevention communicated? (3) What is the respective cost ratio between protection infrastructure, soft measures of adaptation and other options (e.g. reduction of settlement area)? (4) What legitimate claims to compensation do people have, how far does societal responsibility go and where does individual responsibility start if parts of the settlement area had to be abandoned? These questions will be tackled in an interdisciplinary cooperation between geography, economics and normative theory (philosophy). EE-Con will follow broadly the path of risk analysis and risk assessment, focusing on the temporal dimension (past - present - future) with the aim to unravel the history of natural hazards in the areas and to analyse the economic values involved. In the following, natural hazard scenarios for the future (2050 and 2100) will be developed considering the economic consequences. 
Besides this, the project deals with local knowledge, risk perception and risk communication, which will be investigated via group interviews and stakeholder workshops and be integrated into a human-ecological model. Therefore, local people and stakeholders are going to be involved in a transdisciplinary approach from the start of the project. The geographic and economic information will then be used to find proper weighing mechanisms answering the normative questions mentioned above. The questions of enduring the respective costs and of responsibility for protecting the respective entitlements will be investigated from an ethical, a legal, and an economic viewpoint. The overall outcome of the results should provide an integrative view on the economic and ethical consequences of natural hazards in alpine valleys.

  6. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully Integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Robertson, Edward A.; Ruthishauser, David K.

    2013-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. 
This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.

  7. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Rutishauser, David; Epp, Chirold; Robertson, Edward

    2013-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. 
This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.

  8. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
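    The core PTHA computation described above, aggregating Poissonian source recurrence rates into amplitude exceedance probabilities and return-period amplitudes, can be sketched as follows. The source catalog, rates, and amplitudes here are invented for illustration and are not the actual Seaside, Oregon inputs.

```python
import math

# Hypothetical source catalog: (annual recurrence rate, modeled maximum
# tsunami amplitude in meters at the site). Illustrative numbers only,
# not the actual Seaside, Oregon source set.
sources = [
    (1 / 50.0, 1.5),    # frequent, small far-field event
    (1 / 100.0, 3.5),   # Alaska-Aleutian-type far-field event
    (1 / 500.0, 10.5),  # Cascadia-type local event
]

def exceedance_rate(amplitude):
    """Annual rate of tsunamis whose modeled amplitude meets or exceeds
    `amplitude`, summed over independent sources."""
    return sum(rate for rate, amp in sources if amp >= amplitude)

def annual_exceedance_prob(amplitude):
    """Poisson conversion: P(at least one exceedance in a year)."""
    return 1.0 - math.exp(-exceedance_rate(amplitude))

def amplitude_for_return_period(years, step=0.1, max_amp=20.0):
    """Largest amplitude whose exceedance rate still meets 1/years
    (e.g. 1/100 for the '100-year' tsunami)."""
    target = 1.0 / years
    best, a = 0.0, step
    while a <= max_amp:
        if exceedance_rate(a) >= target:
            best = a
        a = round(a + step, 6)
    return best
```

    With these invented inputs, the 100-year amplitude is controlled by the far-field sources and the 500-year amplitude by the rare local source, mirroring the Alaska-Aleutian versus Cascadia contrast in the abstract.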

  9. Cost Benefit Analysis of Integrated COTS Energy- Related Technologies for Army’s Force Provider Module

    DTIC Science & Technology

    2009-09-01

    Year Defense Plan (FYDP), on which the Department of Defense operates, subsequently needs... Richard G. Lugar, U.S. Senator for Indiana, "U.S...mature thin-film technologies exist such as Amorphous Silicon (a-Si), Cadmium Telluride (CdTe), and Copper Indium Gallium (di)Selenide (CIGS), all...cheaper processing, lower material costs, and is free of the environmental and health hazard issues of cadmium. Amorphous silicon coupled with

  10. Hazard assessment for small torrent catchments - lessons learned

    NASA Astrophysics Data System (ADS)

    Eisl, Julia; Huebl, Johannes

    2013-04-01

    The documentation of extreme events, as part of the integral risk management cycle, is an important basis for the analysis and assessment of natural hazards. In July 2011 a flood event occurred in the Wölzer valley in the province of Styria, Austria. For this event at the "Wölzerbach" a detailed event documentation was carried out, gathering data on rainfall, runoff, and sediment transport as well as information on damaged objects, infrastructure, and crops from various sources. The flood was triggered by heavy rainfall in two tributaries of the Wölzer river. Though both a rain gage and a discharge gaging station exist for the Wölzer river, the torrents affected by the high-intensity rainfall are ungaged. For these ungaged torrent catchments the common methods for hazard assessment were evaluated. The back-calculation of the rainfall event was done using a new approach for precipitation analysis. In torrent catchments, small-scale, high-intensity rainfall events are chiefly responsible for extreme events. Austria's weather surveillance radar is operated by the air traffic service "AustroControl". The usually available dataset is pre-interpreted and shows divergences, especially for high-intensity rainfall; for this study the raw radar data were therefore requested and analysed. Furthermore, the event was back-calculated with different rainfall-runoff, hydraulic, and sediment transport models to obtain calibration parameters for future use in hazard assessment for this region. Since woody debris frequently causes problems, different scenarios were simulated. The calibrated and plausible results from the runoff models were compared with the empirical approaches used in practice.
    For the planning of mitigation measures at the Schöttl torrent, one of the affected tributaries of the Wölzer river, a physical scale model was used in addition to the insights of the event analysis to design a check dam for sediment retention. Since the transport capacity of the lower reaches is limited, a balance had to be found between protection on the one hand and sediment connectivity to the Wölzer river on the other. The lessons learned kicked off discussions on future hazard assessment, especially concerning the use of rainfall data and design precipitation values for small torrent catchments. The comparison with empirical values also showed the need for differentiated concepts for hazard analysis. Therefore, recommendations for the use of spatial rainfall reduction factors as well as for the demarcation of hazard maps using different event scenarios are proposed.

  11. Developing strategies for maintaining tank car integrity during train accidents

    DOT National Transportation Integrated Search

    2007-09-11

    Accidents that lead to rupture of tank cars carrying hazardous materials can cause serious public safety hazards and substantial economic losses. The desirability of improved tank car designs that are better equipped to keep the commodity con...

  12. 46 CFR 111.105-5 - System integrity.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...

  13. 46 CFR 111.105-5 - System integrity.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...

  14. 46 CFR 111.105-5 - System integrity.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...

  15. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...

  16. Integrated risk reduction framework to improve railway hazardous materials transportation safety.

    PubMed

    Liu, Xiang; Saat, M Rapik; Barkan, Christopher P L

    2013-09-15

    Rail transportation plays a critical role to safely and efficiently transport hazardous materials. A number of strategies have been implemented or are being developed to reduce the risk of hazardous materials release from train accidents. Each of these risk reduction strategies has its safety benefit and corresponding implementation cost. However, the cost effectiveness of the integration of different risk reduction strategies is not well understood. Meanwhile, there has been growing interest in the U.S. rail industry and government to best allocate resources for improving hazardous materials transportation safety. This paper presents an optimization model that considers the combination of two types of risk reduction strategies, broken rail prevention and tank car safety design enhancement. A Pareto-optimality technique is used to maximize risk reduction at a given level of investment. The framework presented in this paper can be adapted to address a broader set of risk reduction strategies and is intended to assist decision makers for local, regional and system-wide risk management of rail hazardous materials transportation. Copyright © 2013 Elsevier B.V. All rights reserved.
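    The Pareto-optimality idea in the abstract, maximizing risk reduction at each level of investment across combinations of broken-rail prevention and tank-car design options, can be sketched by enumeration. The option sets, costs, and risk-reduction values below are hypothetical (risk reduction is in arbitrary integer units), not the paper's data.

```python
from itertools import product

# Hypothetical option sets: (cost in $M, risk reduction in arbitrary
# integer units). Illustrative numbers only.
broken_rail_options = [(0, 0), (10, 8), (25, 14)]  # track inspection levels
tank_car_options = [(0, 0), (15, 10), (40, 19)]    # design upgrades

def pareto_frontier(options_a, options_b):
    """Combinations of one option from each strategy that are not
    dominated by a cheaper, at-least-as-effective alternative."""
    combos = [(ca + cb, ra + rb)
              for (ca, ra), (cb, rb) in product(options_a, options_b)]
    frontier = []
    # Sort by cost, breaking ties by higher benefit, then keep only
    # combinations that strictly improve on everything cheaper.
    for cost, benefit in sorted(combos, key=lambda c: (c[0], -c[1])):
        if not frontier or benefit > frontier[-1][1]:
            frontier.append((cost, benefit))
    return frontier
```

    Each frontier point answers the resource-allocation question directly: for a given budget, it names the mixed strategy with the greatest achievable risk reduction.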

  17. Integrated pest management policies in America's schools: is federal legislation needed?

    PubMed

    Taylor, Andrea Kidd; Esdaille, Kyle

    2010-01-01

    America's school children are at risk of developing asthma and other respiratory illnesses as a result of exposure to hazardous pesticides. Integrated pest management (IPM) policies are being implemented in states and school districts across the country; however, the content and regulation of these policies vary. Standardizing such policies through a federal IPM law is the only way to ensure that children in America's schools are adequately protected from exposure to the hazardous pesticides used to control pests.

  18. Scale is not an Issue: Opportunities for Collaboration among Geoscientists in Latin America and the Caribbean

    NASA Astrophysics Data System (ADS)

    Carby, B. E.

    2015-12-01

    Latin American and Caribbean (LAC) countries face multiple hazards such as earthquakes, volcanoes, accelerated erosion, landslides, drought, flooding, windstorms, and the effects of climate variability and change. World Bank (2005) data indicate that seventeen of the top thirty-five countries with relatively high mortality risk from three or more hazards are located in LAC; El Salvador has the second-highest share of its population at risk (77.7%); and seven of the top ten countries for population exposure to multiple hazards are in LAC. All LAC countries have half or more of GDP exposed to at least one hazard. The report underscores the need for better data and information on hazards and disasters to inform disaster risk reduction (DRR) and supports the view that reduction of disaster risk is essential for achieving Sustainable Development (SD). This suggests that DRR must be integrated into countries' development planning. However, the Global Assessment Report notes that globally there has been little progress in mainstreaming DRR in national development (UNISDR 2009). Without this, countries will not realise development goals. DRR efforts in LAC require an integrated approach, including societal input in deciding priority DRR research themes and interdisciplinary, multi-hazard research informing DRR policy and practice. Jiminez (2015), in a study of countries across LAC, reports that efforts are being made to link research to national planning through the inclusion of policy makers in some university-led research projects. Research by the author in Jamaica reveals that the public sector has started to apply research on hazards to inform DRR policy, programmes and plans. As most research is done by universities, there is collaboration between the public sector and academia.
    Despite differences in scale among countries across the region, similarities in exposure to multiple hazards and potential hazard impacts suggest that collaboration among researchers in LAC could be beneficial. It is proposed here that this collaboration should go beyond the scientific community to include the sharing of experiences in linking DRR research to national development needs, the inclusion of policy makers in research design and implementation, and the integration of research results in policy and programme development.

  19. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  20. 77 FR 31815 - Hazardous Materials Regulations: Combustible Liquids

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    ... are: Safety (hazard communication and packaging integrity); International commerce (frustration/delay... exempt seasonal workers from the Federal Motor Carrier Safety Administration's Commercial Driver's...: Anyone is able to search the electronic form of any written communications and comments received into any...

  1. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
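    The correspondence the abstract draws between the life table and the hazard function can be made concrete: the interval death probability q_x plays the role of the force of mortality, and multiplying the complements (1 - q_x) back together recovers the survivor column exactly. The abridged life-table values below are invented for illustration.

```python
# Hypothetical abridged life table: number alive at the start of each
# interval (l_x). Illustrative values only.
lx = [1000, 980, 950, 900, 820, 700]

def discrete_hazards(lx):
    """Interval hazard q_x = deaths in interval / alive at start,
    the life-table analogue of the force of mortality."""
    return [(lx[i] - lx[i + 1]) / lx[i] for i in range(len(lx) - 1)]

def survival_from_hazards(hazards):
    """Survivor function rebuilt as the product of (1 - q_x),
    mirroring S(t) = exp(-cumulative hazard) in continuous time."""
    s, out = 1.0, [1.0]
    for q in hazards:
        s *= 1.0 - q
        out.append(s)
    return out
```

    The round trip (life table to hazards and back to survival) reproduces l_x / l_0, which is the sense in which the force of mortality and the hazard rate are interchangeable.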

  2. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...

  3. ARISTOTLE (All Risk Integrated System TOwards The hoListic Early-warning)

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto; Wotawa, Gerhard; Arnold-Arias, Delia

    2017-04-01

    The Emergency Response Coordination Centre (ERCC) is the EU coordination office for humanitarian aid and civil protection operations of DG ECHO (EU Humanitarian Aid and Civil Protection). The ERCC needs rapid, authoritative multi-hazard scientific expertise and analysis on a 24/7 basis since, when a disaster strikes, every minute counts for saving lives, and an immediate, coordinated, and pre-planned response is essential. The EU is committed to providing disaster response in a timely and efficient manner and to ensuring that European assistance meets the real needs of the affected population, whether in Europe or beyond. The ARISTOTLE consortium was awarded the European Commission's DG ECHO "Pilot project in the area of Early Warning System for natural disasters" (OJ 2015 S/154-283349). The tender articulates the needs and expectations of DG ECHO in respect of the provision of multi-hazard advice to the Emergency Response & Coordination Centre in Brussels. Specifically, the tender aims to fill the gap in knowledge that exists in the: • first 3 hours immediately after an event that has the potential to require a country to call on international help • provision of longer-term advice following an emergency • provision of advice when a potentially hazardous event is starting to form; this will usually be restricted to severe weather and flooding events and, when possible, to volcanic events. The ARISTOTLE Consortium was awarded the tender and the project effectively started on February 1st, 2016, for a duration of 2 years. ARISTOTLE (aristotle.ingv.it) is a multi-hazard partnership created by combining expertise from a total of 5 hazard groups [4 main hazard groups plus a sub-hazard: Severe Weather, Floods, Volcanoes (only for the ash and gas hazards deriving from eruptions), Earthquakes, and the related Tsunamis as a sub-hazard, given their peculiarities and potentially huge impact].
    Each Hazard Group brings together experts from the particular hazard domain to deliver a 'collective analysis' which is then fed into the partnership's multi-hazard discussions. The hazards are very different and have very diverse timelines of phenomenological occurrence (Figure 1). The ARISTOTLE consortium includes 15 partner institutions (11 from EU countries, 2 from non-EU countries, and 2 European organizations) operating in the meteorological and geophysical domains. The project coordination is shared between INGV and ZAMG for the geophysical and meteorological communities, respectively. The primary target of the tender project is the prototyping and implementation of a scalable system (in terms of number of partners and hazards) capable of providing the ERCC with the "desiderata" above. To this end, the activities of the project have focused on the establishment of a multi-hazard operational board (MHOB) that is assigned the 24/7 operational duty regulated by a "Standard Operating Protocol". The presentation will illustrate the different modes of operation envisaged and the status of, and solutions found by, the project consortium in responding to the ERCC requirements.

  4. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies for predicting hurricane hazards are investigated. In particular, the Weather Research and Forecasting (WRF) model, with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events and conduct statistical analysis. The estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied.
An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
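    In its simplest form, the Poisson-random-measure view of windborne debris mentioned above reduces to a thinned Poisson count of damaging impacts. The intensity, wall area, and penetration probability below are hypothetical, not values from the dissertation.

```python
import math

# Hypothetical inputs: debris impacts on a wall arrive as a Poisson
# count with mean (intensity * wall area); each impact independently
# penetrates with probability p. Illustrative numbers only.
intensity = 0.02   # impacts per m^2 of wall during the storm
wall_area = 30.0   # m^2
p_penetrate = 0.15

def expected_damaging_impacts(intensity, area, p):
    """Mean of the thinned process: retaining each impact with
    probability p leaves a Poisson count with mean intensity*area*p."""
    return intensity * area * p

def prob_envelope_breach(intensity, area, p):
    """P(at least one penetrating impact) under the thinned model."""
    return 1.0 - math.exp(-expected_damaging_impacts(intensity, area, p))
```

    The dissertation's coupling of debris and pressure damage over a storm's duration is far richer than this; the sketch only shows why the Poisson framework makes the breach probability tractable in closed form.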

  5. Coupling effect analysis between landslides, river channel changes and sediment budgets - extreme climate events in Laishe River, southern Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Kuo-Jen; Huang, Mei-Jen; Tseng, Chih-Ming

    2016-04-01

    Owing to Taiwan's high seismicity and high annual rainfall, numerous landslides are triggered every year and severe impacts affect the island. For catastrophic landslides, key information, including the extent of the landslide, volume estimation, and the subsequent evolution, is important when analyzing the triggering mechanism and conducting hazard assessment and mitigation. Thus, morphological analysis gives a general overview of a landslide and is considered one of the most fundamental types of information. Typhoon Morakot brought extreme, prolonged rainfall to Taiwan in August 2009 and caused severe disasters. In this study we integrate several technologies, in particular an Unmanned Aerial Vehicle (UAV) and a multi-spectral camera, to decipher the consequences, the potential hazard, and the social impact. In recent years, remote sensing technology has improved rapidly, providing a wide range of essential and precise image information. This study integrates several methods: 1) remote-sensing images gathered by Unmanned Aerial Vehicle (UAV) and aerial photos taken in different periods; 2) in-situ geologic field investigation; 3) differential GPS and RTK GPS geomatic measurements. These methods allow the construction of DTMs before and after the landslide, as well as for the subsequent periods, using aerial photos and UAV-derived images. The datasets permit analysis of the morphological changes. In the past, the study of sediment budgets usually relied on field investigation, but owing to inconvenient transportation, topographical barriers, or remote locations, such surveys sometimes could hardly be completed. In recent years, the rapid development of remote sensing technology has improved image resolution and quality significantly; remote sensing can provide a wide range of image data and thus essential and precious information.
    The purpose of this study is to investigate the phenomenon of river migration and to evaluate the amount of migration along the Laishe River by analyzing 3D DEMs from before and after Typhoon Morakot. The DEMs are built using aerial images taken by a digital mapping camera (DMC) and by an airborne digital scanner 40 (ADS40) before and after the typhoon event. Recently, this research has integrated Unmanned Aerial Vehicle (UAV) and oblique photogrammetric technologies for image acquisition with 5-10 cm GSD photos. This approach permits the construction of a true 3D model so as to decipher ground information more realistically. 10-20 cm DSMs and DEMs, together with field GPS, were compiled to decipher the morphologic changes. All this information, especially the true 3D model, provides detailed ground information that can be used to evaluate the landslide triggering mechanism and river channel evolution. The goal of this study is to integrate the UAS system, to decipher the sliding process and morphologic changes of large landslide areas as well as sediment transport and budgets, and to investigate the phenomenon of river migration. The results of this study provide not only geomatics and GIS datasets of the hazards but also essential geomorphologic information for other studies and for hazard mitigation and planning.
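    DEM differencing of the kind described, subtracting the pre-event surface from the post-event surface cell by cell to split net change into erosion and deposition, can be sketched with a toy grid. The elevations and cell size below are invented for illustration, not values from the Laishe River surveys.

```python
# Hypothetical 3x3 DEM grids (elevations in m, uniform 5 m spacing)
# before and after the typhoon event. Illustrative values only.
cell_area = 25.0  # m^2 per cell
dem_before = [[10.0, 10.2, 10.1], [9.8, 9.9, 10.0], [9.5, 9.7, 9.9]]
dem_after = [[10.0, 9.8, 9.9], [9.9, 9.4, 9.8], [9.6, 9.7, 9.2]]

def sediment_budget(before, after, area):
    """DEM-of-difference volumes: positive cells count as deposition,
    negative cells as erosion; returns (deposition, erosion, net)."""
    deposition = erosion = 0.0
    for row_b, row_a in zip(before, after):
        for zb, za in zip(row_b, row_a):
            dv = (za - zb) * area
            if dv >= 0:
                deposition += dv
            else:
                erosion -= dv
    return deposition, erosion, deposition - erosion
```

    On real data the same differencing is done on co-registered raster grids, and the net term is what feeds the sediment budget for the reach.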

  6. Towards sets of hazardous waste indicators. Essential tools for modern industrial management.

    PubMed

    Peterson, Peter J; Granados, Asa

    2002-01-01

    Decision-makers require useful tools, such as indicators, to help them make environmentally sound decisions leading to effective management of hazardous wastes. Four hazardous waste indicators are being tested for such a purpose by several countries within the Sustainable Development Indicator Programme of the United Nations Commission for Sustainable Development. However, these indicators only address the 'down-stream' end-of-pipe industrial situation. More creative thinking is clearly needed to develop a wider range of indicators that not only reflects all aspects of industrial production that generates hazardous waste but considers socio-economic implications of the waste as well. Sets of useful and innovative indicators are proposed that could be applied to the emerging paradigm shift away from conventional end-of-pipe management actions and towards preventive strategies that are being increasingly adopted by industry often in association with local and national governments. A methodological and conceptual framework for the development of a core-set of hazardous waste indicators has been developed. Some of the indicator sets outlined quantify preventive waste management strategies (including indicators for cleaner production, hazardous waste reduction/minimization and life cycle analysis), whilst other sets address proactive strategies (including changes in production and consumption patterns, eco-efficiency, eco-intensity and resource productivity). Indicators for quantifying transport of hazardous wastes are also described. It was concluded that a number of the indicators proposed could now be usefully implemented as management tools using existing industrial and economic data. 
As cleaner production technologies and waste minimization approaches are more widely deployed, and industry integrates environmental concerns at all levels of decision-making, it is expected that the necessary data for construction of the remaining indicators will soon become available.

  7. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  8. [Experience in using cluster analysis and the prognosis of the stages of epizootic risk for Newcastle disease in poultry].

    PubMed

    Elitsina, P; Parannzhilova, M; Vukov, M

    1985-01-01

    An attempt was made to use cluster analysis and prognosis of the stages of epizootic risk with regard to Newcastle disease. There have been no records of the disease since 1982, but it is still the object of consideration by specialists. A cluster analysis was made on the basis of a factorial model of Newcastle disease in this country for the period 1970-1979, using a programme already developed and practised at the Pilot Computer Center of the Medical Academy, Sofia. The districts of the country were grouped in a dendrogram, 16 groups being distinguished out of a total of 27 districts. This showed that, regardless of the small territory of the country, the districts differ sufficiently from each other (due to the various degrees of integration) that they could not be grouped together by similar values of intensity of poultry breeding and epizootic conjuncture with regard to Newcastle disease. Districts of epizootic hazard ranging from 1st to 5th degree substantiate the use of a differential approach in building up the tactics of prophylaxis and control of the disease, while measures concerning the poultry dressing combines should be at a high level regardless of the category of the respective district. The grouping of neighboring districts on a territorial principle disclosed the existence of climatic and geographic factors that could predispose them to equal degrees of intensity of poultry breeding and epizootic hazard.

  9. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2017-04-01

    Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is best done during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system at either the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
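    A minimal version of the coupled FT-to-ET calculation, with fault-tree gates producing the initiating-event probability that an event-tree branch then propagates to a top hazard probability, might look like the following. All basic-event probabilities are hypothetical, not the paper's case-study data.

```python
def or_gate(*p):
    """P(at least one basic event occurs), assuming independence."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """P(all basic events occur), assuming independence."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

# Fault tree for the initiating event "methane accumulation":
# ventilation fails only if both fans fail (AND), and accumulation
# occurs if ventilation or the gas sensor fails (OR). Hypothetical
# basic-event probabilities, per year.
p_fan, p_backup, p_sensor = 0.05, 0.10, 0.02
p_accumulation = or_gate(and_gate(p_fan, p_backup), p_sensor)

# Event tree: given accumulation, an ignition-control barrier either
# holds or fails; only the failure branch leads to an explosion.
p_barrier_fails = 0.01
top_hazard_probability = p_accumulation * p_barrier_fails
```

    Redundancy allocation then shows up naturally: adding a redundant sensor replaces `p_sensor` with `and_gate(p_sensor, p_sensor)`, shrinking the top hazard probability.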

  10. Molecular Classification Substitutes for the Prognostic Variables Stage, Age, and MYCN Status in Neuroblastoma Risk Assessment.

    PubMed

    Rosswog, Carolina; Schmidt, Rene; Oberthuer, André; Juraeva, Dilafruz; Brors, Benedikt; Engesser, Anne; Kahlert, Yvonne; Volland, Ruth; Bartenhagen, Christoph; Simon, Thorsten; Berthold, Frank; Hero, Barbara; Faldum, Andreas; Fischer, Matthias

    2017-12-01

    Current risk stratification systems for neuroblastoma patients consider clinical, histopathological, and genetic variables, and additional prognostic markers have been proposed in recent years. We here sought to select highly informative covariates in a multistep strategy based on consecutive Cox regression models, resulting in a risk score that integrates hazard ratios of prognostic variables. A cohort of 695 neuroblastoma patients was divided into a discovery set (n=75) for multigene predictor generation, a training set (n=411) for risk score development, and a validation set (n=209). Relevant prognostic variables were identified by stepwise multivariable L1-penalized least absolute shrinkage and selection operator (LASSO) Cox regression, followed by backward selection in multivariable Cox regression, and then integrated into a novel risk score. The variables stage, age, MYCN status, and two multigene predictors, NB-th24 and NB-th44, were selected as independent prognostic markers by LASSO Cox regression analysis. Following backward selection, only the multigene predictors were retained in the final model. Integration of these classifiers in a risk scoring system distinguished three patient subgroups that differed substantially in their outcome. The scoring system discriminated patients with diverging outcome in the validation cohort (5-year event-free survival, 84.9±3.4 vs 63.6±14.5 vs 31.0±5.4; P<.001), and its prognostic value was validated by multivariable analysis. We here propose a translational strategy for developing risk assessment systems based on hazard ratios of relevant prognostic variables. Our final neuroblastoma risk score comprised two multigene predictors only, supporting the notion that molecular properties of the tumor cells strongly impact clinical courses of neuroblastoma patients. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
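    The idea of a risk score that "integrates hazard ratios" can be sketched generically as an additive score whose weights are log hazard ratios, cut into risk groups. The marker names, hazard ratios, and cut-points below are hypothetical illustrations, not the published NB-th24/NB-th44 coefficients.

    ```python
    import math

    # Hypothetical sketch: combine Cox-model hazard ratios of binary classifiers
    # into an additive risk score (weights = log hazard ratios), then cut the
    # score into three risk groups. All numbers are invented for illustration.

    hazard_ratios = {"NB-th24_unfavorable": 4.0, "NB-th44_unfavorable": 3.0}

    def risk_score(markers):
        """Sum of log hazard ratios over the markers positive for a patient."""
        return sum(math.log(hazard_ratios[m]) for m, positive in markers.items() if positive)

    def risk_group(score, low_cut=1.0, high_cut=2.0):
        """Map a score to one of three risk groups (cut-points are assumptions)."""
        if score < low_cut:
            return "low"
        return "intermediate" if score < high_cut else "high"

    patient = {"NB-th24_unfavorable": True, "NB-th44_unfavorable": False}
    s = risk_score(patient)
    print(f"score={s:.3f}, group={risk_group(s)}")
    ```
    
    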

  11. An Integrated Approach for Urban Earthquake Vulnerability Analyses

    NASA Astrophysics Data System (ADS)

    Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.

    2009-04-01

    The earthquake risk for urban areas has increased over the years with the growing complexity of urban environments. The main reasons are the location of major cities in hazard-prone areas, growth in urbanization and population, and rising wealth. In recent years these factors have manifested in the growing costs of major disasters in urban areas, which have stimulated demand for in-depth evaluation of possible strategies to manage the large-scale damaging effects of earthquakes. Understanding and formulating urban earthquake risk requires consideration of a wide range of risk aspects, which can be handled by developing an integrated approach that incorporates an interdisciplinary view into the risk assessment. Risk assessment for an urban area requires predicting the vulnerabilities of the elements at risk and integrating the individual vulnerability assessments. However, owing to the complex nature of an urban environment, estimating vulnerabilities and integrating them necessitates integrated approaches in which the social, economic, structural (building stock and infrastructure), cultural, and historical-heritage vulnerabilities are estimated for a given urban area over a given time period. In this study an integrated urban earthquake vulnerability assessment framework is proposed that considers the vulnerability of the urban environment in a holistic manner and performs the assessment at the smallest administrative unit, namely the neighborhood scale. The main motivation behind this approach is the difficulty of implementing existing vulnerability assessment methodologies in countries like Turkey, where the required data are usually missing or inadequate and decision makers seek prioritization of their limited risk-reduction resources across the administrative districts for which they are responsible. 
The methodology integrates socio-economic, structural, coastal, ground-condition, and organizational vulnerabilities, as well as accessibility to critical services, within the framework. The proposed framework has eight components: seismic hazard analysis, soil response analysis, tsunami inundation analysis, structural vulnerability analysis, socio-economic vulnerability analysis, accessibility to critical services, GIS-based integrated vulnerability assessment, and visualization of vulnerabilities in a 3D virtual city model. The integrated model of the various vulnerabilities of the urban area is developed in a GIS environment from the individual vulnerability assessments of the considered elements at risk and serves as the backbone of the spatial decision support system. The stages followed in the model are: determination of a common mapping unit for each aspect of urban earthquake vulnerability; formation of a geo-database for the vulnerabilities; evaluation of urban vulnerability based on multi-attribute utility theory with various weighting algorithms; and mapping of the evaluated integrated earthquake risk in a geographic information system (GIS) at the neighborhood scale. The framework is also applicable at larger geographical mapping scales, for example the building scale; when illustrating results at the building scale, 3D visualizations with remote sensing data are used so that decision makers can easily interpret the outputs. The proposed vulnerability assessment framework is flexible and can easily be applied to urban environments at various geographical scales with different mapping units. The resulting total vulnerability maps for the urban area provide a baseline for decision makers to develop risk-reduction strategies. Moreover, because several aspects of the elements at risk are considered through the vulnerability analyses, the effect of changes in vulnerability conditions on the total can easily be determined. 
The developed approach also enables decision makers to monitor temporal and spatial changes in the urban environment due to implementation of risk reduction strategies.
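    The multi-attribute utility aggregation step named above can be sketched as a weighted sum of normalized per-aspect vulnerabilities for each neighborhood. The aspects, weights, and scores below are illustrative assumptions, not values from the study.

    ```python
    # Minimal sketch of a GIS-style integration step: weighted additive
    # multi-attribute utility aggregation of normalized vulnerability scores
    # per neighborhood. Weights and scores are invented for illustration.

    weights = {                 # must sum to 1
        "structural": 0.35,
        "socio_economic": 0.25,
        "ground_condition": 0.20,
        "accessibility": 0.20,
    }

    def total_vulnerability(scores, weights):
        """Weighted sum of per-aspect vulnerabilities, each already scaled to [0, 1]."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[k] * scores[k] for k in weights)

    neighborhood = {"structural": 0.8, "socio_economic": 0.6,
                    "ground_condition": 0.4, "accessibility": 0.5}
    print(round(total_vulnerability(neighborhood, weights), 3))
    ```
    
    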

  12. Evaluation of Prototype Head Shield for Hazardous Material Tank Car

    DOT National Transportation Integrated Search

    1976-12-01

    The structural integrity of a prototype tank car head shield for hazardous material railroad tank cars was evaluated under conditions of freight car coupling at moderate to high speeds. This is one of the most severe environments encountered in norma...

  13. Using the Triad Approach to Improve the Cost-effectiveness of Hazardous Waste Site Cleanups

    EPA Pesticide Factsheets

    U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, a paradigm based on using an integrated triad of systematic planning...

  14. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument for comparing the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, as expected for a study aiming at average trends at the country level across the globe. Populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. By contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented; the GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is discussed.
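    The 500-year return period used in the global study can be related to exposure-window probabilities through the standard Poisson occurrence model; a small sketch (the window lengths are generic, not from the study):

    ```python
    import math

    # Standard Poisson relation between a hazard return period and the
    # probability of at least one exceedance during an exposure window.

    def exceedance_probability(return_period_yr, exposure_yr):
        """P(at least one event within the exposure window), Poisson occurrence model."""
        return 1.0 - math.exp(-exposure_yr / return_period_yr)

    # A 500-year tsunami has roughly a 9.5% chance of occurring in any
    # given 50-year window:
    print(f"{exceedance_probability(500, 50):.3f}")
    ```
    
    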

  15. Integrated risk analysis of a heavy-metal-contaminated site in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ching-Tsan Tsai; Wang, J.H.C.

    1996-12-31

    The Love Canal episode began the long battle over hazardous wastes in the United States, and the potential danger of hazardous wastes remains one of the hottest issues among environmental professionals as well as the public. The problems of hazardous wastes in economically booming Taiwan are also alarming. Several farmlands in northern Taiwan were heavily contaminated by industrial effluents containing heavy metals (cadmium and lead) in the early 1980s. Despite the many studies that have been conducted on these polluted farmlands, there has not been any remediation, just a passive abandonment of farming activities with minimal compensation. This paper addresses a heavy-metal-contaminated farming area. A pollution profile across time is delineated using information from the abundance of reports, and the contamination is modeled mathematically. Past, present, and future exposures are also modeled. The results are presented in terms of societal impacts and health effects. Reasonable soil guidelines for cleanup are estimated, and recommendations for rational mitigation solutions are presented. The current strategies for cleanup actions are also described. 23 refs., 4 figs., 5 tabs.

  16. Integrating Behavioral Health in Primary Care Using Lean Workflow Analysis: A Case Study

    PubMed Central

    van Eeghen, Constance; Littenberg, Benjamin; Holman, Melissa D.; Kessler, Rodger

    2016-01-01

    Background: Primary care offices are integrating behavioral health (BH) clinicians into their practices. Implementing such a change is complex, difficult, and time consuming. Lean workflow analysis may be an efficient, effective, and acceptable method for integration. Objective: Observe BH integration into primary care and measure its impact. Design: Prospective, mixed-methods case study in a primary care practice. Measurements: Change in treatment initiation (referrals generating BH visits within the system). Secondary measures: primary care visits resulting in BH referrals, referrals resulting in scheduled appointments, time from referral to scheduled appointment, and time from referral to first visit. Providers and staff were surveyed on the Lean method. Results: Referrals increased from 23 to 37 per 1000 visits (P<.001). Referrals resulted in more scheduled (60% to 74%, P<.001) and arrived visits (44% to 53%, P=.025). Time from referral to first scheduled visit decreased (hazard ratio (HR) 1.60; 95% confidence interval (CI) 1.37, 1.88; P<.001), as did time to first arrived visit (HR 1.36; 95% CI 1.14, 1.62; P=.001). Surveys and comments were positive. Conclusions: This pilot integration of BH showed significant improvements in treatment initiation and other measures. Strengths of Lean included workflow improvement, system perspective, and project success. Further evaluation is indicated. PMID:27170796

  17. Re-evaluation and updating of the seismic hazard of Lebanon

    NASA Astrophysics Data System (ADS)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system on the seismic hazard of Lebanon and the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located at close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activities that affect Lebanon were integrated along with any/all newly established characteristics within an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10 % probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2 % probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
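    The two map conventions cited (10% probability of exceedance in 50 years per UBC 1997, 2% in 50 years per IBC 2012) correspond to specific return periods under the usual Poisson occurrence assumption; a quick sketch of the conversion:

    ```python
    import math

    # Standard conversion from "P% probability of exceedance in t years" to a
    # mean return period, assuming a Poisson occurrence model: T = -t / ln(1 - P).

    def return_period(p_exceed, exposure_yr):
        """Mean return period (years) for a given exceedance probability over an exposure window."""
        return -exposure_yr / math.log(1.0 - p_exceed)

    print(f"10% in 50 yr -> {return_period(0.10, 50):.0f}-year return period")
    print(f" 2% in 50 yr -> {return_period(0.02, 50):.0f}-year return period")
    ```

    This recovers the familiar values of roughly 475 and 2475 years associated with those two code provisions.
    
    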

  18. Interdisciplinary modeling and analysis to reduce loss of life from tsunamis

    NASA Astrophysics Data System (ADS)

    Wood, N. J.

    2016-12-01

    Recent disasters have demonstrated the significant loss of life and community impacts that can occur from tsunamis. Minimizing future losses requires an integrated understanding of the range of potential tsunami threats, how individuals are specifically vulnerable to these threats, what is currently in place to improve their chances of survival, and what risk-reduction efforts could be implemented. This presentation will provide a holistic perspective of USGS research enabled by recent advances in geospatial modeling to assess and communicate population vulnerability to tsunamis and the range of possible interventions to reduce it. Integrated research includes efforts to characterize the magnitude and demography of at-risk individuals in tsunami-hazard zones, their evacuation potential based on landscape conditions, nature-based mitigation to improve evacuation potential, evacuation pathways and population demand at assembly areas, siting considerations for vertical-evacuation refuges, community implications of multiple evacuation zones, car-based evacuation modeling for distant tsunamis, and projected changes in population exposure to tsunamis over time. Collectively, this interdisciplinary research supports emergency managers in their efforts to implement targeted risk-reduction efforts based on local conditions and needs, instead of generic regional strategies that only focus on hazard attributes.

  19. Assessing exposure risks for freshwater tilapia species posed by mercury and methylmercury.

    PubMed

    Cheng, Yi-Hsien; Lin, Yi-Jun; You, Shu-Han; Yang, Ying-Fei; How, Chun Ming; Tseng, Yi-Ting; Chen, Wei-Yu; Liao, Chung-Min

    2016-08-01

    Waterborne and dietborne exposures of freshwater fish to mercury (Hg), in its inorganic (Hg(II)) and organic (methylmercury, MeHg) forms, affect their growth, development, and reproduction. However, an integrated mechanistic risk model framework to predict the impact of Hg(II)/MeHg on freshwater fish is lacking. Here, we integrated biokinetic, physiological, and biogeographic data to calibrate and then establish key risk indices (hazard quotient and exceedance risk) for freshwater tilapia species across the geographic ranges of several major rivers in Taiwan. We found that the Hg(II) burden was highest in kidney, followed by gill, intestine, liver, blood, and muscle. Our results showed that Hg was unlikely to pose a mortality risk (mortality rate less than 5%) for freshwater tilapia species. However, Hg is likely to pose a potential hazard to aquatic environments constrained by safety levels for aquatic organisms. Sensitivity analysis showed that the amount of Hg accumulated in tilapia was most influenced by the sediment uptake rate. Our approach opens up new possibilities for predicting future fish population health under continued Hg exposure and for informing which fish are deemed safe for human consumption.
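    The hazard quotient index named above divides an exposure level by a safety (no-effect) level, with HQ > 1 flagging a potential hazard; exceedance risk is then the fraction of exposures above that line. A minimal sketch with invented numbers (not the study's data):

    ```python
    # Hazard quotient (HQ) and exceedance-risk sketch. The tissue concentrations
    # and safety level below are hypothetical illustrations.

    def hazard_quotient(exposure, safe_level):
        """HQ = exposure / safety level; HQ > 1 indicates potential hazard."""
        return exposure / safe_level

    def exceedance_risk(samples, safe_level):
        """Fraction of sampled exposures whose HQ exceeds 1."""
        return sum(1 for x in samples if hazard_quotient(x, safe_level) > 1.0) / len(samples)

    tissue_hg = [0.08, 0.12, 0.30, 0.05]  # hypothetical mg/kg in muscle
    safe = 0.10                           # hypothetical safety level, mg/kg
    print(hazard_quotient(0.30, safe))
    print(exceedance_risk(tissue_hg, safe))
    ```
    
    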

  20. 77 FR 55371 - System Safety Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...

  1. An Inertial Dual-State State Estimator for Precision Planetary Landing with Hazard Detection and Avoidance

    NASA Technical Reports Server (NTRS)

    Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John

    2016-01-01

    The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and, over the course of 15 free flights, was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing concluded by demonstrating autonomous hazard detection and avoidance; integration of altimeter, surface relative velocity (velocimeter), and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software; and landing within 2 meters of the vertical testbed's GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landing. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor, originally intended to locate safe landing sites, into the navigation system as a navigation sensor. The dual-state inertial extended Kalman filter was formulated to address the precision planetary landing problem viewed as a rendezvous problem with an intended landing site. For a precision navigation system capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. The filter is also implemented using inertial states rather than states relative to the target. 
External measurements include an altimeter, a velocimeter, a star camera, a terrain relative navigation sensor, and a hazard relative navigation sensor providing information on hazards from a map generated on the fly.
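    The corrective step of such a filter can be illustrated with a scalar Kalman measurement update, here fusing a hypothetical altimeter reading into an inertially propagated altitude. This is a textbook simplification, not the ALHAT dual-state filter itself.

    ```python
    # Greatly simplified scalar analogue of the measurement update in an
    # inertial navigation filter: an inertially propagated altitude estimate
    # (with grown variance) is corrected by an altimeter measurement.
    # All numbers are illustrative.

    def kalman_update(x_pred, p_pred, z, r):
        """Scalar Kalman measurement update.
        x_pred, p_pred: predicted state and variance; z, r: measurement and its variance."""
        k = p_pred / (p_pred + r)      # Kalman gain
        x = x_pred + k * (z - x_pred)  # corrected state
        p = (1.0 - k) * p_pred         # reduced uncertainty
        return x, p

    # Inertial-only prediction says 105 m (variance 25 m^2); the altimeter
    # reads 100 m (variance 4 m^2). The update weights the altimeter more:
    x, p = kalman_update(105.0, 25.0, 100.0, 4.0)
    print(round(x, 2), round(p, 2))
    ```
    
    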

  2. 76 FR 22944 - Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity Management...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...

  3. 77 FR 72435 - Pipeline Safety: Using Meaningful Metrics in Conducting Integrity Management Program Evaluations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Evaluations AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice... improve performance. For gas transmission pipelines, Sec. Sec. 192.911(i) and 192.945 define the...

  4. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  5. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  6. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  7. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  8. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  9. Decreased pain sensitivity due to trimethylbenzene exposure: case study on quantitative approaches for hazard identification

    EPA Science Inventory

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...

  10. A systems approach to risk reduction of transportation infrastructure networks subject to multiple hazards : final report, December 31, 2008.

    DOT National Transportation Integrated Search

    2008-12-31

    Integrity, robustness, reliability, and resiliency of infrastructure networks are vital to the economy, security and well-being of any country. Faced with threats caused by natural and man-made hazards, transportation infrastructure network manag...

  11. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  12. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  15. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  16. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies.

    PubMed

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-02-02

    In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides visualization throughout construction lifecycle management. This paper integrates BIM and WSN into a single system that enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed throughout an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an alarm and a ventilator on site start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications.
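    The detection-and-response loop described above can be sketched as a per-node threshold check that queues alarm and ventilation actions. The node IDs and methane threshold are assumptions, not the paper's deployment values:

    ```python
    # Minimal sketch of a WSN monitoring loop: each reading from a sensor node
    # is checked against a gas threshold, and an abnormal region triggers the
    # alarm/ventilator actions. Threshold and node IDs are hypothetical.

    METHANE_ALARM_PPM = 5000  # assumed alarm threshold

    def check_reading(node_id, methane_ppm, actions):
        """Append alarm/ventilation actions for a node whose reading is abnormal."""
        if methane_ppm >= METHANE_ALARM_PPM:
            actions.append(("alarm", node_id))
            actions.append(("ventilator_on", node_id))
        return actions

    actions = []
    for node, ppm in [("N1", 1200), ("N2", 7800), ("N3", 300)]:
        check_reading(node, ppm, actions)
    print(actions)  # only N2 triggers
    ```
    
    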

  17. An Integrative Research Framework to Unravel the Interplay of Natural Hazards and Vulnerabilities

    NASA Astrophysics Data System (ADS)

    Di Baldassarre, Giuliano; Nohrstedt, Daniel; Mârd, Johanna; Burchardt, Steffi; Albin, Cecilia; Bondesson, Sara; Breinl, Korbinian; Deegan, Frances M.; Fuentes, Diana; Lopez, Marc Girons; Granberg, Mikael; Nyberg, Lars; Nyman, Monika Rydstedt; Rhodes, Emma; Troll, Valentin; Young, Stephanie; Walch, Colin; Parker, Charles F.

    2018-03-01

    Climate change, globalization, urbanization, social isolation, and increased interconnectedness between physical, human, and technological systems pose major challenges to disaster risk reduction (DRR). Consequently, economic losses caused by natural hazards are increasing in many regions of the world, despite scientific progress, persistent policy action, and international cooperation. We argue that these dramatic figures call for novel scientific approaches and new types of data collection to integrate the two main approaches that still dominate the science underpinning DRR: the hazard paradigm and the vulnerability paradigm. Building on these two approaches, we propose a research framework that specifies the scope of enquiry, concepts, and general relations among phenomena, and we discuss the essential steps to advance systematic empirical research and evidence-based DRR policy action.

  18. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work presented here concerns a case study in which a complete multidisciplinary workflow was applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical, fractured rocky cliffs. The studied area lies in a high-relief zone of southern Italy (Sacco, Salerno, Campania) characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry, and GNSS; b) geological surveying, characterization of single instabilities, and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geostatistical and spatial analyses and mapping of the whole data set; g) 3D rockfall analysis. The main goals of the study were a) to set up an investigation method achieving a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes the most up-to-date methods of field surveying, remote sensing, 3D modelling, and geospatial data analysis were integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach was applied, fusing deterministic and statistical surveying methods. This approach made it possible to handle the wide extent of the studied area (nearly 200,000 m²) without compromising the accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, was applied to delineate the peculiarities of each single feature. 
The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches were referenced and filed in a single spatial database and considered in global geostatistical analyses to derive a fully modelled, comprehensive evaluation of rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological, and mechanical features along with the expected failure modes; b) a high-resolution characterization of the rockslide susceptibility of the whole slope, based on partitioning the area according to stability and mechanical conditions that can be related directly to specific hazard mitigation systems; c) the exact extent of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.
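    Classifying single instabilities on a rock face typically rests on kinematic checks of this kind; below is a sketch of Markland's test for planar sliding, a standard method, though not necessarily the exact criterion used in this study. The angles and the ±20° lateral limit are the usual rule-of-thumb values.

    ```python
    # Markland's kinematic test for planar sliding (sketch): sliding is
    # feasible when the discontinuity daylights on the face (dips less steeply
    # than the face), dips more steeply than the friction angle, and strikes
    # roughly parallel to the face. Angles in degrees.

    def planar_slide_feasible(slope_dip, slope_dir, joint_dip, joint_dir,
                              friction_angle, lateral_limit=20.0):
        daylights = joint_dip < slope_dip
        frictional = joint_dip > friction_angle
        # smallest angular difference between the two dip directions
        d = abs((joint_dir - slope_dir + 180.0) % 360.0 - 180.0)
        return daylights and frictional and d <= lateral_limit

    # A 70-degree face dipping toward 120; joint at 45/130 with 30-degree friction:
    print(planar_slide_feasible(70, 120, 45, 130, 30))  # feasible
    ```
    
    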

  19. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
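    The balance the abstract describes (spending a fixed budget across hazards by potential impact) can be sketched as a simple expected-loss-reduction allocation. This is not the LDRD framework itself: all names, probabilities, and costs below are hypothetical, and a greedy benefit-per-cost heuristic stands in for the report's interdependent-infrastructure optimization.

    ```python
    # Hypothetical sketch: allocate a fixed protection budget across candidate
    # investments so as to maximize the expected loss avoided over several hazards.

    def allocate_budget(candidates, budget):
        """Greedy allocation by expected-loss reduction per unit cost.

        candidates: list of (name, cost, {hazard: (event_probability, loss_avoided)})
        """
        scored = []
        for name, cost, effects in candidates:
            # Expected loss avoided if this investment is made.
            reduction = sum(p * avoided for p, avoided in effects.values())
            scored.append((reduction / cost, name, cost, reduction))
        scored.sort(reverse=True)  # best benefit/cost ratio first

        chosen, spent, total_reduction = [], 0.0, 0.0
        for ratio, name, cost, reduction in scored:
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
                total_reduction += reduction
        return chosen, spent, total_reduction

    # Hypothetical candidates: (name, cost in $M, per-hazard (probability, loss avoided in $M))
    candidates = [
        ("harden_substation", 2.0, {"hurricane": (0.05, 80.0), "attack": (0.01, 120.0)}),
        ("elevate_roadway",   3.0, {"flood":     (0.10, 40.0)}),
        ("hospital_backup",   1.0, {"hurricane": (0.05, 30.0), "flood": (0.10, 10.0)}),
    ]
    chosen, spent, saved = allocate_budget(candidates, budget=4.0)
    ```

    A real version of this problem is an integer program coupled through infrastructure interdependencies; the greedy heuristic here only illustrates the trade-off the report describes.
    
    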

  20. Integrated approaches for reducing sample size for measurements of trace elemental impurities in plutonium by ICP-OES and ICP-MS

    DOE PAGES

    Xu, Ning; Chamberlin, Rebecca M.; Thompson, Pam; ...

    2017-10-07

    This study has demonstrated that bulk plutonium chemical analysis can be performed at small scales (<50 mg of material) through three case studies. Analytical methods were developed for ICP-OES and ICP-MS instruments to measure trace impurities and gallium content in plutonium metals with comparable or improved detection limits, measurement accuracy and precision. In two case studies, the sample size has been reduced by a factor of 10, and in the third case study, by as much as a factor of 5000, so that the plutonium chemical analysis can be performed in a facility rated for lower-hazard and lower-security operations.

  1. Integrated approaches for reducing sample size for measurements of trace elemental impurities in plutonium by ICP-OES and ICP-MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Ning; Chamberlin, Rebecca M.; Thompson, Pam

    This study has demonstrated that bulk plutonium chemical analysis can be performed at small scales (<50 mg of material) through three case studies. Analytical methods were developed for ICP-OES and ICP-MS instruments to measure trace impurities and gallium content in plutonium metals with comparable or improved detection limits, measurement accuracy and precision. In two case studies, the sample size has been reduced by a factor of 10, and in the third case study, by as much as a factor of 5000, so that the plutonium chemical analysis can be performed in a facility rated for lower-hazard and lower-security operations.

  2. ROBOTICS IN HAZARDOUS ENVIRONMENTS - REAL DEPLOYMENTS BY THE SAVANNAH RIVER NATIONAL LABORATORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriikku, E.; Tibrea, S.; Nance, T.

    The Research & Development Engineering (R&DE) section in the Savannah River National Laboratory (SRNL) engineers, integrates, tests, and supports deployment of custom robotics, systems, and tools for use in radioactive, hazardous, or inaccessible environments. Mechanical and electrical engineers, computer control professionals, specialists, machinists, welders, electricians, and mechanics adapt and integrate commercially available technology with in-house designs, to meet the needs of Savannah River Site (SRS), Department of Energy (DOE), and other governmental agency customers. This paper discusses five R&DE robotic and remote system projects.

  3. 75 FR 27504 - Substantial Product Hazard List: Hand-Held Hair Dryers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ...The Consumer Product Safety Improvement Act of 2008 ("CPSIA") authorizes the United States Consumer Product Safety Commission ("Commission") to specify, by rule, for any consumer product or class of consumer products, characteristics whose existence or absence shall be deemed a substantial product hazard under certain circumstances. In this document, the Commission is proposing a rule to determine that any hand-held hair dryer without integral immersion protection presents a substantial product hazard.

  4. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...

  5. Petroleum and hazardous material releases from industrial facilities associated with Hurricane Katrina.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Sengul, Hatice

    2010-04-01

    Hurricane Katrina struck an area dense with industry, causing numerous releases of petroleum and hazardous materials. This study integrates information from a number of sources to describe the frequency, causes, and effects of these releases in order to inform analysis of risk from future hurricanes. Over 200 onshore releases of hazardous chemicals, petroleum, or natural gas were reported. Storm surge was responsible for the majority of petroleum releases and failure of storage tanks was the most common mechanism of release. Of the smaller number of hazardous chemical releases reported, many were associated with flaring from plant startup, shutdown, or process upset. In areas impacted by storm surge, 10% of the facilities within the Risk Management Plan (RMP) and Toxic Release Inventory (TRI) databases and 28% of SIC 1311 facilities experienced accidental releases. In areas subject only to hurricane strength winds, a lower fraction (1% of RMP and TRI and 10% of SIC 1311 facilities) experienced a release while 1% of all facility types reported a release in areas that experienced tropical storm strength winds. Of industrial facilities surveyed, more experienced indirect disruptions such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations or reconstruction (55%), than experienced releases. To reduce the risk of hazardous material releases and speed the return to normal operations under these difficult conditions, greater attention should be devoted to risk-based facility design and improved prevention and response planning.

  6. The reactivity of sodium alanates with O2, H2O, and CO2: an investigation of complex metal hydride contamination in the context of automotive systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dedrick, Daniel E.; Bradshaw, Robert W.; Behrens, Richard, Jr.

    2007-08-01

    Safe and efficient hydrogen storage is a significant challenge inhibiting the use of hydrogen as a primary energy carrier. Although energy storage performance properties are critical to the success of solid-state hydrogen storage systems, operator and user safety is of highest importance when designing and implementing consumer products. As researchers are now integrating high-energy-density solid materials into hydrogen storage systems, quantification of the hazards associated with the operation and handling of these materials becomes imperative. The experimental effort presented in this paper focuses on identifying the hazards associated with producing, storing, and handling sodium alanates, thus allowing for the development and implementation of hazard mitigation procedures. The chemical changes of sodium alanates associated with exposure to oxygen and water vapor have been characterized by thermal decomposition analysis using simultaneous thermogravimetric modulated beam mass spectrometry (STMBMS) and X-ray diffraction methods. Partial oxidation of sodium alanates, an alkali-metal complex hydride, results in destabilization of the remaining hydrogen-containing material. At temperatures below 70 °C, reaction of sodium alanate with water generates potentially combustible mixtures of H2 and O2. In addition to identifying the reaction hazards associated with the oxidation of alkali-metal-containing complex hydrides, potential treatment methods are identified that chemically stabilize the oxidized material and reduce the hazard associated with handling the contaminated metal hydrides.

  7. Ecosystems for Early Warning: Potential Use of Bioindicators

    NASA Astrophysics Data System (ADS)

    Zommers, Z. A.; Sitati, A. M.; Habilov, M.

    2014-12-01

    Bioindicators are biological processes, species or communities that are used to assess changes in the environment or environmental quality. Theoretically, they could also be used to provide advance warning of hazards. They are inexpensive, locally relevant, and can encourage stakeholder participation in early warning system development and maintenance. While bioindicators have been identified for environmental problems such as air pollution and water pollution, and have been used to assess the health of ecosystems, little information is available on bioindicators for climate-related hazards. This presentation reviews possible bioindicators for droughts, wildfires and tropical cyclones, based on the results of a literature review. It will also present results from a household survey of 36 communities in Kenya, Ghana and Burkina Faso. Indigenous knowledge offers a wealth of potential bioindicators, including animal and insect behavior and plant phenology. Yet significant study is needed to verify these indicators and evaluate them against criteria such as specificity, variability, monotonicity, practicality and relevance. Bioindicators may not be specific to individual hazards and may provide limited advance warning, as response often occurs after the actual onset of the hazard. Furthermore, indicators may become increasingly unreliable due to climate change itself. There is a need for a large-scale assessment of hazard bioindicators, which should also include forecasts of bioindicator change under global warming, and a cost-benefit analysis of the value of integrating bioindicators into early warning systems. Lessons can be drawn from ethnopharmacology. Coordinated research on this topic could contribute to the resilience of both ecosystems and human livelihoods.

  8. 40 CFR 63.7823 - What test methods and other procedures must I use to demonstrate initial compliance with the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities Initial...

  9. 40 CFR 63.7823 - What test methods and other procedures must I use to demonstrate initial compliance with the...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities Initial...

  10. 49 CFR 192.911 - What are the elements of an integrity management program?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.911 What are the elements of an integrity management program...

  11. 49 CFR 192.933 - What actions must be taken to address integrity issues?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.933 What actions must be taken to address integrity issues? (a...

  12. 49 CFR 192.909 - How can an operator change its integrity management program?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.909 How can an operator change its integrity management...

  13. Comparing capacity coefficient and dual task assessment of visual multitasking workload

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaha, Leslie M.

    Capacity coefficient analysis could offer a theoretically grounded alternative to subjective measures and dual-task assessment of cognitive workload. Workload capacity, or workload efficiency, is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system in a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual-task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision-making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in the capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-Attribute Task Battery.
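    The integrated hazard function mentioned in the abstract is the negative log of the response-time survivor function, and the standard (Townsend-style) OR-design capacity coefficient compares the dual-task integrated hazard to the sum of the single-task ones. A minimal empirical sketch, using hypothetical response-time samples:

    ```python
    import math

    def integrated_hazard(rts, t):
        """Empirical integrated hazard H(t) = -log(1 - F(t)),
        with F the empirical CDF of the response times."""
        n = len(rts)
        f = sum(1 for rt in rts if rt <= t) / n
        f = min(f, 1 - 1e-9)  # keep the log finite when all RTs are <= t
        return -math.log(1.0 - f)

    def capacity_or(rt_dual, rt_a, rt_b, t):
        """OR (first-terminating) capacity coefficient:
        C(t) = H_dual(t) / (H_a(t) + H_b(t)).
        C(t) = 1 is the unlimited-capacity baseline; C > 1 indicates
        super-capacity, C < 1 limited capacity."""
        denom = integrated_hazard(rt_a, t) + integrated_hazard(rt_b, t)
        return integrated_hazard(rt_dual, t) / denom if denom > 0 else float("nan")

    # Hypothetical response times (seconds) for two single tasks and the dual task.
    rt_a    = [1.0, 2.0, 3.0, 4.0]
    rt_b    = [1.0, 2.0, 3.0, 4.0]
    rt_dual = [0.5, 1.5, 2.5, 3.5]
    c = capacity_or(rt_dual, rt_a, rt_b, t=2.0)  # → 0.5 (limited capacity)
    ```

    In practice the empirical CDFs would be smoothed or evaluated over a grid of t values to obtain the functional assessment the abstract describes.
    
    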

  14. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  15. Drought Risk Assessment based on Natural and Social Factors

    NASA Astrophysics Data System (ADS)

    Huang, Jing; Wang, Huimin; Han, Dawei

    2015-04-01

    In many parts of the world, drought hazard is becoming more frequent and severe due to climate change and human activities. It is crucial to monitor and assess drought conditions, especially to support decision making in the agriculture sector. The vegetation index (VI) decreases, and the land surface temperature (LST) increases, when vegetation is under drought stress; therefore both of these remotely sensed indices are widely used in drought monitoring and assessment. The Temperature-Vegetation Dryness Index (TVDI) is obtained by establishing the feature space of the normalized difference vegetation index (NDVI) and LST, and reflects agricultural drought conditions through soil-moisture inversion. However, these indices consider only the natural hazard-causing factors. Our society is a complex large-scale system with various natural and social elements. Drought risk is the joint consequence of hazard-causing factors and hazard-affected bodies. For example, as the population increases, the exposure of the hazard-affected bodies also tends to increase. A high GDP enhances the response ability of government, and irrigation and water conservancy reduce vulnerability. Such characteristics of hazard-affected bodies should be coupled with the natural factors. In this study, 16-day moderate-resolution imaging spectroradiometer (MODIS) NDVI and LST data are combined to establish the NDVI-Ts space according to different land use types in Yunnan Province, China. TVDIs are then calculated from dry and wet edges modeled as linear fits to the data for each land cover type. Next, an integrated drought assessment index combining social factors and TVDI is established by ascertaining attribute weights based on rough set theory. The resulting comprehensive drought index (CDI) for spring 2010 and the spatial variations in drought are analyzed and compared with the TVDI dataset. 
Moreover, the actual drought risk situation in the study area is used to verify the effectiveness of the CDI. In addition, GIS is applied to provide geographically referenced information, i.e. information involving location, elevation, land use, distance to water resources and so on, which are essential inputs for spatial analysis in drought risk assessment. On the whole, this study proposes a new approach to drought risk assessment integrating natural factors with social factors, as well as providing a real-time drought monitoring method in a social context.
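    The dry-edge/wet-edge construction behind TVDI can be sketched as follows. This is an illustrative implementation under simplifying assumptions (per-bin max/min LST as the edges, ordinary linear fits), not the paper's actual processing chain:

    ```python
    import numpy as np

    def tvdi(ndvi, lst, nbins=20):
        """Sketch of a Temperature-Vegetation Dryness Index computation:
        fit the dry edge (max LST) and wet edge (min LST) as linear functions
        of NDVI, then TVDI = (Ts - Ts_wet) / (Ts_dry - Ts_wet)."""
        bins = np.linspace(ndvi.min(), ndvi.max(), nbins + 1)
        centers, ts_max, ts_min = [], [], []
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (ndvi >= lo) & (ndvi <= hi)
            if mask.any():
                centers.append((lo + hi) / 2.0)
                ts_max.append(lst[mask].max())   # candidate dry-edge point
                ts_min.append(lst[mask].min())   # candidate wet-edge point
        centers = np.asarray(centers)
        b_d, a_d = np.polyfit(centers, ts_max, 1)  # dry edge: Ts = a_d + b_d * NDVI
        b_w, a_w = np.polyfit(centers, ts_min, 1)  # wet edge: Ts = a_w + b_w * NDVI
        ts_dry = a_d + b_d * ndvi
        ts_wet = a_w + b_w * ndvi
        return np.clip((lst - ts_wet) / (ts_dry - ts_wet), 0.0, 1.0)
    ```

    TVDI near 1 corresponds to pixels on the dry edge (strong drought stress) and near 0 to the wet edge; the study computes separate edges per land cover type, which this sketch omits.
    
    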

  16. Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment

    NASA Astrophysics Data System (ADS)

    Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.

    2012-04-01

    Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvement of detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; the study of these questions has in recent years emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis and determination of the risk level. The presented research focuses on drought, which is at present the most widespread, and still unpredictable, natural hazard. Its primary aims were to assess the frequency and the consequences of droughts in Slovenia based on drought events in the past, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to provide guidelines to reduce the vulnerability of the crops. Using the amounts of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought of 2006. 
Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented, like the vulnerability, as a GIS-based map. Risk maps show geographic regions in Slovenia where droughts pose a major threat to agriculture, and together with the vulnerability maps they provide the basis for drought management, in particular for appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
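    The GIS-based weighted linear combination used for the vulnerability maps can be sketched as below. The layer names, values, and equal weights are hypothetical, and min-max normalization is one common choice rather than the authors' documented procedure:

    ```python
    import numpy as np

    def vulnerability_map(layers):
        """Weighted linear combination of raster layers: each input raster is
        min-max normalized to [0, 1], multiplied by its weight, and summed.

        layers: list of (raster, weight) pairs; weights should sum to 1.
        """
        total = None
        for raster, weight in layers:
            r = np.asarray(raster, dtype=float)
            lo, hi = r.min(), r.max()
            # Guard against a constant layer (zero range).
            norm = (r - lo) / (hi - lo) if hi > lo else np.zeros_like(r)
            total = weight * norm if total is None else total + weight * norm
        return total

    # Hypothetical 2x2 input rasters: plant-available-water deficit and slope.
    water_deficit = np.array([[0.0, 1.0], [2.0, 3.0]])
    slope         = np.array([[3.0, 2.0], [1.0, 0.0]])
    vmap = vulnerability_map([(water_deficit, 0.5), (slope, 0.5)])
    ```

    In the study the weights were tuned against observed crop damage from the 2006 drought; the same function would simply be re-evaluated for each candidate weight configuration.
    
    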

  17. AGU: Comments Requested on Natural Hazards Position Statement

    NASA Astrophysics Data System (ADS)

    2004-11-01

    Natural hazards (earthquakes, floods, hurricanes, landslides, meteors, space weather, tornadoes, volcanoes, and other geophysical phenomena) are an integral component of our dynamic planet. These can have disastrous effects on vulnerable communities and ecosystems. By understanding how and where hazards occur, what causes them, and what circumstances increase their severity, we can develop effective strategies to reduce their impact. In practice, mitigating hazards requires addressing issues such as real-time monitoring and prediction, emergency preparedness, public education and awareness, post-disaster recovery, engineering, construction practices, land use, and building codes. Coordinated approaches involving scientists, engineers, policy makers, builders, lenders, insurers, news media, educators, relief organizations, and the public are therefore essential to reducing the adverse effects of natural hazards.

  18. Design and analysis of a personnel blast shield for different explosives applications

    NASA Astrophysics Data System (ADS)

    Lozano, Eduardo

    The use of explosives brings countless benefits to our everyday lives in areas such as mining, oil and gas exploration, demolition, and avalanche control. However, because of the potential destructive power of explosives, strict safety procedures must be an integral part of any explosives operation. The goal of this work is to provide a solution to protect against the hazards that accompany the general use of explosives, specifically in avalanche control. For this reason, a blast shield was designed and tested to protect Colorado Department of Transportation personnel against these unpredictable effects. This document develops a complete analysis to answer the following questions: what are the potential hazards from the detonation of high explosives, what are their effects, and how can we protect ourselves against them. To answer these questions, theoretical, analytical, and numerical calculations were performed. Finally, a full blast shield prototype was tested under different simulated operational environments, proving its effectiveness as a safety device. The Colorado Department of Transportation currently owns more than fifteen shields that are used during every operation involving explosive materials.

  19. UNCERTAINTY AND SENSITIVITY ANALYSES FOR INTEGRATED HUMAN HEALTH AND ECOLOGICAL RISK ASSESSMENT OF HAZARDOUS WASTE DISPOSAL

    EPA Science Inventory

    While there is a high potential for exposure of humans and ecosystems to chemicals released from hazardous waste sites, the degree to which this potential is realized is often uncertain. Conceptually divided among parameter, model, and modeler uncertainties imparted during simula...

  20. Fuels planning: science synthesis and integration; social issues fact sheet 06: Important considerations for communicating about hazards

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Effective public education and communication campaigns about wildland fire and fuels management should have clear objectives, and use the right techniques to achieve these objectives. This fact sheet lists seven important considerations for planning or implementing a hazard communication effort.

  1. Integration of social dimension in an index oriented methodology for consequence analysis of natural hazards: application to the Upper Guil Catchment (Southern French Alps)

    NASA Astrophysics Data System (ADS)

    Carlier, Benoit; Puissant, Anne; Dujarric, Constance

    2017-04-01

    Vulnerability assessment and hazard exposure are generally accepted as the two main steps of risk analysis. While quantitative methods to estimate hazard exposure are now well-defined, this is not the case for vulnerability assessment. Vulnerability is a complex concept involving a variety of disciplines from the physical and socio-economic sciences (i.e. engineering, economics, social and health sciences, etc.). Currently, two opposite trends exist: the 'physical' approach, in which vulnerability is analysed as potential impacts (i.e. structural and functional) on the elements at risk (buildings, networks, land cover); and the 'social' approach, in which vulnerability is a combination of socio-economic variables determining people's ability to anticipate a catastrophic event, to react during it, and to recover after it. For a complete analysis of vulnerability it is essential to combine these two approaches, but in reality few such works exist. The objective of this research is to improve the Potential Damage Index (PDI), detailed in Puissant et al. (2013) and originally developed to assess physical injury and structural and functional consequences of landslide hazard, by including socio-economic characteristics of the population. Data from the French census (INSEE, 2012) and a survey on risk perception (100 questionnaires obtained between 2014 and 2015/16) were used to propose an overall index taking into account the three main phases of risk management: preparedness, crisis management and recovery. This new index, called the Global Potential Damage Index (GPDI), is applied to the Upper Guil Catchment to assess potential torrential flood hazard in the context of the French-funded project SAMCO (Society Adaptation for coping with Mountain risks in a global change Context). Results of the PDI are compared with the GPDI and show significant differences. 
Mapped GPDI scores are lower than PDI scores, indicating that a resilient population may moderate the results obtained for physical consequences. In the GPDI, the social and institutional component is expressed by a single value applied to all the stakes of a given community. Consequently, socio-economic differences between the communities of the Upper Guil catchment are highlighted, making results easily understandable for local managers.

  2. Preliminary volcanic hazards evaluation for Los Alamos National Laboratory Facilities and Operations : current state of knowledge and proposed path forward

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.

    2010-09-01

    The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.

  3. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    As with seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, because even once the design-basis tsunami height is set, there remains a possibility that the actual tsunami height will exceed it, owing to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
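    The fractile hazard curves described (5-, 16-, 50-, 84- and 95-percentile plus the mean) can be sketched by taking weighted fractiles across the logic-tree branch curves at each tsunami height. The branch curves and weights below are hypothetical, and this simple weighted-fractile scheme stands in for whatever aggregation the authors actually used:

    ```python
    import numpy as np

    def hazard_fractiles(branch_curves, weights,
                         fractiles=(0.05, 0.16, 0.5, 0.84, 0.95)):
        """Weighted mean and fractile hazard curves over logic-tree branches.

        branch_curves: array (n_branches, n_heights) of exceedance probabilities
        evaluated at a common set of tsunami heights.
        weights: logic-tree branch weights (normalized internally).
        """
        curves = np.asarray(branch_curves, dtype=float)
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        result = {"mean": w @ curves}  # weighted mean hazard curve
        for f in fractiles:
            col = np.empty(curves.shape[1])
            for j in range(curves.shape[1]):
                order = np.argsort(curves[:, j])   # sort branches at this height
                cum = np.cumsum(w[order])          # weighted CDF over branches
                k = min(np.searchsorted(cum, f), len(order) - 1)
                col[j] = curves[order[k], j]
            result[f] = col
        return result

    # Hypothetical: 3 branches, exceedance probabilities at 2 tsunami heights.
    curves = [[0.10, 0.01],
              [0.20, 0.02],
              [0.40, 0.08]]
    res = hazard_fractiles(curves, weights=[0.25, 0.5, 0.25])
    ```

    The spread between the 5th- and 95th-percentile curves then displays the epistemic uncertainty in the hazard, as the abstract describes.
    
    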

  4. Challenges to Integrating Geographically-Dispersed Data and Expertise at U.S. Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Murray, T. L.; Ewert, J. W.

    2010-12-01

    During the past 10 years the data and information available to volcano observatories to assess hazards and forecast activity has grown dramatically, a trend that will likely continue. Similarly, the ability of observatories to draw upon external specialists who can provide needed expertise is also increasing. Though technology easily provides the ability to move large amounts of information to the observatory, the challenge remains to efficiently and quickly integrate useful information and expertise into the decision-making process. The problem is further exacerbated by the use of new research techniques during times of heightened activity. Eruptive periods typically accelerate research into volcanic processes as scientists use the opportunity to test new hypotheses and develop new tools. Such experimental methods can be extremely insightful, but may be less easily integrated into the normal data streams that inform decisions. Similarly, there is an increased need for collaborative tools that allow efficient and effective communication between the observatory and external experts. Observatories will continue to be the central focus for integrating information, assessing hazards, and communicating with the public, but will increasingly draw on experts at other observatories, government agencies, academia and even the private sector, both foreign and domestic, to provide analysis and assistance. Fostering efficient communication among such a diverse and geographically dispersed group is a challenge. Addressing these challenges is one of the goals of the U.S. National Volcano Early Warning System, falling under the effort to improve interoperability among the five U.S. volcano observatories and their collaborators. 
In addition to providing the mechanisms to handle the flow of data, efforts will be directed at simplifying - though retaining the required nuance - information and merging data streams while developing tools that enable observatory staff to quickly integrate the data into the decision-making process. Also, advances in the use of collaborative tools and organizational structure will be required if observatories are to tap into the intellectual resources throughout the volcanological community. The last 10 years saw a continuing explosion in the quantity and quality of data and expertise available to address volcano hazards and volcanic activity; the challenge over the next 10 years will be for us to make the best use of it.

  5. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e., building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g., flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  6. Application of a plume model for decision makers' situation awareness during an outdoor airborne HAZMAT release.

    PubMed

    Meris, Ronald G; Barbera, Joseph A

    2014-01-01

    In a large-scale outdoor, airborne, hazardous materials (HAZMAT) incident, such as ruptured chlorine rail cars during a train derailment, the local Incident Commanders and HAZMAT emergency responders must obtain accurate information quickly to assess the situation and act promptly and appropriately. HAZMAT responders must have a clear understanding of key information and how to integrate it into timely and effective decisions for action planning. This study examined the use of HAZMAT plume modeling as a decision support tool during incident action planning in this type of extreme HAZMAT incident. The concept of situation awareness as presented by Endsley's dynamic situation awareness model contains three levels: perception, comprehension, and projection. It was used to examine the actions of incident managers related to adequate data acquisition, current situational understanding, and accurate situation projection. Scientists and engineers have created software to simulate and predict HAZMAT plume behavior, the projected hazard impact areas, and the associated health effects. Incorporating the use of HAZMAT plume projection modeling into an incident action plan may be a complex process. The present analysis used a mixed qualitative and quantitative methodological approach and examined the use and limitations of a "HAZMAT Plume Modeling Cycle" process that can be integrated into the incident action planning cycle. HAZMAT response experts were interviewed using a computer-based simulation. One of the research conclusions indicated the "HAZMAT Plume Modeling Cycle" is a critical function so that an individual/team can be tasked with continually updating the hazard plume model with evolving data, promoting more accurate situation awareness.

  7. Alaska Center for Unmanned Aircraft Systems Integration (ACUASI): Operational Support and Geoscience Research

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Cahill, C. F.; Rogers, M.; Hatfield, M. C.

    2016-12-01

    Unmanned Aircraft Systems (UAS) have enormous potential for use in geoscience research and in supporting operational needs, from natural hazard assessment to the mitigation of critical infrastructure failure. They provide a new tool for universities and local, state, federal, and military organizations to collect measurements not readily available from other sensors. We will present the UAS capabilities and research of the Alaska Center for Unmanned Aircraft Systems Integration (ACUASI, http://acuasi.alaska.edu/). Our UAS range from the Responder, with its dual visible/infrared payload that can provide simultaneous data, to our new SeaHunter UAS with a 90 lb. payload and multiple-hour flight time. ACUASI, as a designated US Federal Aviation Administration (FAA) test center, works closely with the FAA on integrating UAS into the national airspace. ACUASI covers all aspects of working with UAS, from pilot training, airspace navigation, flight operations, and remote sensing analysis to payload design and integration, with staff ranging from engineers to policy experts. ACUASI's recent missions range from supporting the mapping of sea ice cover for safe passage of Alaskans across the hazardous winter ice to demonstrating how UAS can provide support during oil spill response. Additionally, we will present how ACUASI has worked with local authorities in Alaska to integrate UAS into search and rescue operations, and with NASA and the FAA on their UAS Traffic Management (UTM) project to fly UAS within the manned airspace. ACUASI is also working on developing new capabilities to sample volcanic plumes and clouds, map forest fire impacts and burn areas, and develop a new citizen network for monitoring snow extent and depth during Northern Hemisphere winters. We will demonstrate how UAS can be integrated into operational support systems and at the same time be used in geoscience research projects to provide high-precision, accurate, and reliable observations.

  8. Earthquake Hazard Analysis Use Vs30 Data In Palu

    NASA Astrophysics Data System (ADS)

    Rusydi, Muhammad; Efendi, Rustan; Sandra; Rahmawati

    2018-03-01

    Palu City is an area crossed by the Palu-Koro fault and several small faults around it, so the city is often hit by earthquakes. Therefore, this study is intended to map earthquake hazard zones; delineating these zones is one aspect that can be used to reduce the risk of earthquake disaster. The research was conducted by integrating Vs30 data from the USGS with Vs30 derived from microtremor data, with the microtremor Vs30 used to correct the USGS Vs30. The results were then used to determine Peak Ground Acceleration (PGA), which can be used to estimate the impact of an earthquake disaster. The results of the study show that Palu City is in a high hazard class: of the eight sub-districts in Palu City, seven have a high hazard level, namely Palu Barat, Palu Timur, Palu Selatan, Palu Utara, Tatanga, Mantikulore, and Tawaeli.
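    A minimal sketch of the kind of Vs30 correction and site classification described above, assuming a simple mean-ratio correction of the USGS proxy grid against co-located microtremor measurements (the abstract does not state the exact correction procedure); all values are invented.

```python
# Illustrative sketch (not the study's exact procedure): correct a
# proxy-based Vs30 grid with site measurements via a mean ratio, then
# assign NEHRP site classes commonly used in ground-motion (PGA) work.

def nehrp_class(vs30):
    """NEHRP site class from Vs30 (m/s)."""
    if vs30 > 1500: return "A"
    if vs30 > 760:  return "B"
    if vs30 > 360:  return "C"
    if vs30 > 180:  return "D"
    return "E"

usgs_vs30   = [420.0, 380.0, 300.0]   # proxy grid values (m/s), hypothetical
microtremor = [378.0, 342.0, 270.0]   # co-located measurements (m/s)

# Mean multiplicative bias between measurements and the proxy grid
ratio = sum(m / u for m, u in zip(microtremor, usgs_vs30)) / len(usgs_vs30)
corrected = [u * ratio for u in usgs_vs30]
classes = [nehrp_class(v) for v in corrected]
```

    The corrected Vs30 (and hence the site class) would then feed a ground-motion model to map PGA across the sub-districts.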

  9. Integration of rainfall/runoff and geomorphological analyses flood hazard in small catchments: case studies from the southern Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Palumbo, Manuela; Ascione, Alessandra; Santangelo, Nicoletta; Santo, Antonio

    2017-04-01

    We present the first results of an analysis of flood hazard in ungauged mountain catchments that are associated with intensely urbanized alluvial fans. Assessment of the hydrological hazard has been based on the integration of rainfall/runoff modelling of drainage basins with geomorphological analysis and mapping. Some small, steep, ungauged mountain catchments located in various areas of the southern Apennines, in southern Italy, have been chosen as test sites. In recent centuries, the selected basins have been subject to heavy and intense precipitation events, which have caused flash floods with serious damage in the associated alluvial fan areas. Available spatial information (regional technical maps, DEMs, land use maps, geological/lithological maps, orthophotos) and an automated GIS-based procedure (ArcGIS tools and ArcHydro tools) have been used to extract morphological, hydrological, and hydraulic parameters. These parameters have been used to run the HEC (Hydrologic Engineering Center of the US Army Corps of Engineers) software (GeoHMS, GeoRAS, HMS, and RAS), based on rainfall-runoff models, which has allowed the hydrological and hydraulic simulations. As the floods that occurred in the studied catchments were debris-flow dominated, solid-load simulation has also been performed. In order to validate the simulations, we have compared the results of the modelling with the effects produced by past floods. Such effects have been quantified through estimates of both the sediment volumes within each catchment that have the potential to be mobilised during a sediment-transfer event (pre-event), and the volume of sediment delivered by the debris flows at the basins' outlets (post-event). The post-event sediment volume has been quantified through post-event surveys and Lidar data. Evaluation of the pre-event sediment volumes in single catchments has been based on mapping of sediment storages that may constitute source zones of bed-load transport and debris flows.
    This estimation relies on process-based geomorphological mapping, using data derived from GIS analysis of high-resolution DEMs, field measurements, and aerial photograph interpretation. Our integrated approach, which allows quantification of the flow rate and a semi-quantitative assessment of the sediment that can be mobilized during hydro-meteorological events, is applied for the first time to torrential catchments of the southern Apennines and may significantly contribute to predictive studies aimed at risk mitigation in the study region.
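    The event rainfall-runoff step performed by HEC-HMS in such simulations is often based on the SCS Curve Number loss method; the following is a hedged sketch under that assumption (the abstract does not state which loss model was used), with hypothetical CN and rainfall values.

```python
# Hedged sketch of an SCS Curve Number rainfall-runoff step, the kind of
# event loss model HEC-HMS can apply; CN and rainfall are hypothetical.

def scs_runoff(p_mm, cn):
    """Direct runoff depth (mm) from event rainfall p_mm via the CN method."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Steep, sparsely vegetated mountain catchment (high CN) vs forested slope
print(round(scs_runoff(120.0, 85), 1))   # high runoff
print(round(scs_runoff(120.0, 60), 1))   # lower runoff
```

    The resulting runoff depth, multiplied by catchment area, gives the event volume that the hydraulic (HEC-RAS) step then routes down-fan.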

  10. K Basin Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PECH, S.H.

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  11. Advancing Alternative Analysis: Integration of Decision Science.

    PubMed

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis, a systematic approach to solving complex problems, offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  12. A Cascading Storm-Flood-Landslide Guidance System: Development and Application in China

    NASA Astrophysics Data System (ADS)

    Zeng, Ziyue; Tang, Guoqiang; Long, Di; Ma, Meihong; Hong, Yang

    2016-04-01

    Flash floods and landslides, triggered by storms, often interact and cause cascading effects on human lives and property. Satellite remote sensing data have significant potential for use in the analysis of these natural hazards. As one of the regions continuously affected by severe flash floods and landslides, Yunnan Province, located in Southwest China, has a complex mountainous hydrometeorology and suffers frequent heavy rainfall from May through late September. Taking Yunnan as a test-bed, this study proposed a Cascading Storm-Flood-Landslide Guidance System to progressively analyze and evaluate the risks of these multi-hazards based on multisource satellite remote sensing data. First, three standardized rainfall amounts (average daily amount in flood seasons, maximum 1-h amount, and maximum 6-h amount) from the products of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) were used as rainfall indicators to derive the StorM Hazard Index (SMHI). In this process, an integrated approach combining the Analytic Hierarchy Process (AHP) and information-entropy theory was adopted to determine the weight of each indicator. Then, land cover and vegetation cover data from Moderate Resolution Imaging Spectroradiometer (MODIS) products, soil type from the Harmonized World Soil Database (HWSD) soil map, and slope from Shuttle Radar Topography Mission (SRTM) data were added as semi-static geo-topographical indicators to derive the Flash Flood Hazard Index (FFHI). Furthermore, three more landslide-controlling indicators, namely elevation, slope angle, and soil texture, were included to derive the LandSlide Hazard Index (LSHI). Further inclusion of GDP, population, and prevention measures as vulnerability indicators enabled consecutive prediction of the risks of storms, flash floods, and landslides, respectively.
    Consequently, the spatial patterns of the hazard indices show that the southeast of Yunnan is more likely to encounter storms than other parts, while the northeast of Yunnan is most susceptible to floods and landslides, which agrees with the distribution of observed flood and landslide events. Moreover, risks for the multi-hazards were classified into four categories. The results show a strong correlation between the distributions of flash-flood-prone and landslide-prone regions and also highlight the counties with high risk of storms (e.g., Funing and Malipo), flash floods (e.g., Gongshan and Yanjing), and landslides (e.g., Zhaotong and Luxi). Compared to other approaches, the Cascading Storm-Flood-Landslide Guidance System uses a straightforward yet useful indicator-based weighted linear combination method and could be a useful prototype for mapping characteristics of storm-triggered hazards for users at different administrative levels (e.g., catchment, town, county, province, and even nation) in China.
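    The information-entropy weighting and weighted linear combination used to build the hazard indices can be sketched as follows; the indicator matrix is invented for illustration (the study additionally blends in AHP-derived weights, which is omitted here).

```python
import math

# Sketch of information-entropy weighting plus weighted linear combination
# for an indicator-based hazard index; the matrix below is invented.

def entropy_weights(matrix):
    """Column weights from Shannon entropy; rows = sites, cols = indicators."""
    n = len(matrix)
    k = 1.0 / math.log(n)
    raw = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [c / total for c in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        raw.append(1.0 - e)              # more dispersion -> larger weight
    s = sum(raw)
    return [w / s for w in raw]

# Rows: sites; columns: normalized rainfall indicators (daily, max-1h, max-6h)
m = [[0.9, 0.8, 0.7],
     [0.4, 0.5, 0.2],
     [0.1, 0.3, 0.6]]
w = entropy_weights(m)
smhi = [sum(wj * row[j] for j, wj in enumerate(w)) for row in m]
```

    The FFHI and LSHI would be built the same way, with the geo-topographical and landslide-controlling indicators appended as extra columns.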

  13. Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin

    NASA Astrophysics Data System (ADS)

    Neupane, N.

    2010-12-01

    This paper presents an illustration of typical water-induced hazard mapping of the East Rapti River basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes preparation of a base map of water-induced disasters in the basin. Landslide hazard maps were prepared using the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared using two approaches: HEC-RAS and satellite imagery interpretation. Composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The monsoon average rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in 1993 at the station was 1220 cu m/sec; this discharge nearly corresponds to the discharge of a 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the watershed, the debris flow hazard zone constitutes 6.58%, and the high flood hazard zone occupies 4.28%. Mitigation measures are recommended according to an Integrated Watershed Management Approach, under which non-structural and structural measures are proposed. The non-structural measures include disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agricultural management practices, protection of water sources, slope protection, and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures, and river training at some locations are recommended.
    The major factors that have contributed to the high incidence of various types of mass movements and inundation in the basin are rock and soil properties, prolonged and high-intensity rainfall, steep topography, and various anthropogenic factors.

  14. Association of Osteopontin, Neopterin, and Myeloperoxidase With Stroke Risk in Patients With Prior Stroke or Transient Ischemic Attacks: Results of an Analysis of 13 Biomarkers From the Stroke Prevention by Aggressive Reduction in Cholesterol Levels Trial.

    PubMed

    Ganz, Peter; Amarenco, Pierre; Goldstein, Larry B; Sillesen, Henrik; Bao, Weihang; Preston, Gregory M; Welch, K Michael A

    2017-12-01

    Established risk factors do not fully identify patients at risk for recurrent stroke. The SPARCL trial (Stroke Prevention by Aggressive Reduction in Cholesterol Levels) evaluated the effect of atorvastatin on stroke risk in patients with a recent stroke or transient ischemic attack and no known coronary heart disease. This analysis explored the relationships between 13 plasma biomarkers assessed at trial enrollment and the occurrence of outcome strokes. We conducted a case-cohort study of 2176 participants; 562 had outcome strokes and 1614 were selected randomly from those without outcome strokes. Time to stroke was evaluated by Cox proportional hazards models. There was no association between time to stroke and lipoprotein-associated phospholipase A2, monocyte chemoattractant protein-1, resistin, matrix metalloproteinase-9, N-terminal fragment of pro-B-type natriuretic peptide, soluble vascular cell adhesion molecule-1, soluble intercellular adhesion molecule-1, or soluble CD40 ligand. In adjusted analyses, osteopontin (hazard ratio per SD change, 1.362; P < 0.0001), neopterin (hazard ratio, 1.137; P = 0.0107), myeloperoxidase (hazard ratio, 1.177; P = 0.0022), and adiponectin (hazard ratio, 1.207; P = 0.0013) were independently associated with outcome strokes. After adjustment for the Stroke Prognostic Instrument-II and treatment, osteopontin, neopterin, and myeloperoxidase remained independently associated with outcome strokes. The addition of these 3 biomarkers to Stroke Prognostic Instrument-II increased the area under the receiver operating characteristic curve by 0.023 (P = 0.015) and yielded a continuous net reclassification improvement (29.1%; P < 0.0001) and an integrated discrimination improvement (42.3%; P < 0.0001). Osteopontin, neopterin, and myeloperoxidase were independently associated with the risk of recurrent stroke and improved risk classification when added to a clinical risk algorithm. URL: http://www.clinicaltrials.gov.
Unique Identifier: NCT00147602. © 2017 American Heart Association, Inc.
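    As a note on the "hazard ratio per SD change" reported above: in a Cox model, the per-SD hazard ratio is exp(beta x SD), where beta is the per-unit log-hazard coefficient. A minimal sketch with illustrative numbers (not the SPARCL estimates):

```python
import math

# Minimal sketch of how a per-SD hazard ratio relates to a Cox model
# coefficient: HR per SD = exp(beta * SD). Numbers are illustrative.

def hr_per_sd(beta_per_unit, sd):
    return math.exp(beta_per_unit * sd)

beta = 0.02     # hypothetical log-hazard increase per unit of biomarker
sd = 15.0       # hypothetical SD of the biomarker in the cohort
print(round(hr_per_sd(beta, sd), 3))   # exp(0.3)
```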

  15. Radiation predictions and shielding calculations for RITS-6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maenchen, John Eric; O'Malley, John; Kensek, Ronald Patrick

    2005-06-01

    The mission of the Radiographic Integrated Test Stand-6 (RITS-6) facility is to provide the underlying science and technology for pulsed-power-driven flash radiographic X-ray sources for the National Nuclear Security Administration (NNSA). Flash X-ray radiography is a penetrating diagnostic used to discern the internal structure in dynamic experiments. Short (~50 nanosecond (ns) duration) bursts of very high intensity X-rays from mm-scale source sizes are required at a variety of voltages to address this mission. RITS-6 was designed and is used both to develop the accelerator technology needed for these experiments and to serve as the principal test stand for developing the high-intensity electron beam diodes that generate the required X-ray sources. RITS is currently in operation with three induction cavities (RITS-3), with a maximum voltage output of 5.5 MV, and is classified as a low-hazard non-nuclear facility in accordance with CPR 400.1.1, Chapter 13, Hazards Identification/Analysis and Risk Management. The facility will be expanded from three to six cavities (RITS-6), effectively doubling the operating voltage. The increase in the operating voltage to above 10 MV has resulted in RITS-6 being classified as an accelerator facility; RITS-6 will come under DOE Order 420.2B, Safety of Accelerator Facilities. The hazards of RITS are detailed in the "Safety Assessment Document for the Radiographic Integrated Test Stand Facility." The principal non-industrial hazard is prompt X-ray radiation. As the operating voltage is increased, both the penetration power and the total amount (dose) of X-rays increase, thereby increasing the risk to local personnel. Fixed site shielding (predominantly concrete walls and a steel/lead skyshine shield) is used to attenuate these X-rays and mitigate this risk.
    This SAND Report details the anticipated X-ray doses, the shielding design, and the anticipated X-ray doses external to the shielding structure, both in areas where administrative access control restricts occupation and in adjacent uncontrolled areas.

  16. Flood extent and water level estimation from SAR using data-model integration

    NASA Astrophysics Data System (ADS)

    Ajadi, O. A.; Meyer, F. J.

    2017-12-01

    Synthetic Aperture Radar (SAR) images have long been recognized as a valuable data source for flood mapping. Compared to other sources, SAR's weather and illumination independence and large-area coverage at high spatial resolution support reliable, frequent, and detailed observations of developing flood events. Accordingly, SAR has the potential to greatly aid the near real-time monitoring of natural hazards, such as flood detection, if combined with automated image processing. This research works towards increasing the reliability and temporal sampling of SAR-derived flood hazard information by integrating information from multiple SAR sensors and SAR modalities (images and Interferometric SAR (InSAR) coherence) and by combining SAR-derived change detection information with hydrologic and hydraulic flood forecast models. First, the combination of multi-temporal SAR intensity images and coherence information for generating flood extent maps is introduced. Least-squares estimation integrates flood information from multiple SAR sensors, thus increasing the temporal sampling. SAR-based flood extent information is then combined with a Digital Elevation Model (DEM) to reduce false alarms and to estimate water depth and flood volume. The SAR-based flood extent map is assimilated into the Hydrologic Engineering Center River Analysis System (HEC-RAS) model to aid hydraulic model calibration. The developed technology improves the accuracy of flood information by exploiting information from both data and models. It also provides enhanced flood information to decision-makers, supporting flood response and improving emergency relief efforts.
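    The water-depth and flood-volume estimation from a SAR-derived extent and a DEM can be sketched as below; the grid, water surface elevation, and cell size are invented, and a single flat water surface over the flooded area is an assumption.

```python
# Hedged sketch: estimate per-cell water depth by differencing an assumed
# flat water surface elevation against a DEM, masked by a SAR-derived
# flood extent. All values are illustrative.

def depth_grid(dem, water_surface, flood_mask):
    """Per-cell water depth (m); 0 where not flooded or surface below ground."""
    return [[max(water_surface - z, 0.0) if m else 0.0
             for z, m in zip(dem_row, mask_row)]
            for dem_row, mask_row in zip(dem, flood_mask)]

dem = [[101.2, 100.4],    # ground elevations (m)
       [ 99.8,  99.1]]
mask = [[False, True],    # SAR-derived flood extent
        [True,  True]]
d = depth_grid(dem, 100.0, mask)
volume = sum(sum(row) for row in d) * 25.0   # flood volume, 25 m2 cells
```

    Masked cells whose ground sits above the water surface come out at zero depth, which is one way such a differencing step suppresses false alarms in the SAR extent.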

  17. A conceptual framework for economic optimization of single hazard surveillance in livestock production chains.

    PubMed

    Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W

    2014-06-01

    Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
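    Deriving the "efficient set of surveillance setups" from technical surveillance performance parameters (TSPPs) and costs amounts to a Pareto filter; a minimal sketch with hypothetical setups and a single detection-probability TSPP:

```python
# Sketch of deriving the efficient set of surveillance setups: keep a
# setup only if no other setup detects at least as well for no more
# cost, with at least one strict improvement. Setups are hypothetical.

def efficient_set(setups):
    """setups: list of (name, detection_probability, annual_cost)."""
    keep = []
    for name, det, cost in setups:
        dominated = any(d2 >= det and c2 <= cost and (d2 > det or c2 < cost)
                        for _, d2, c2 in setups)
        if not dominated:
            keep.append(name)
    return keep

setups = [("A", 0.90, 120.0),   # high detection, high cost
          ("B", 0.75,  60.0),   # cheaper, lower detection
          ("C", 0.70,  80.0)]   # dominated by B
print(efficient_set(setups))    # ['A', 'B']
```

    The subjective step described in the abstract (multi-criteria evaluation of surveillance impacts) would then choose among the surviving setups.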

  18. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies

    PubMed Central

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-01-01

    In recent years, many studies have focused on the application of advanced technology as a way to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates databases and geometry into a digital model that provides visualization across the whole construction lifecycle. This paper integrates BIM and WSN into a unique system that enables a construction site to monitor its safety status visually via a spatial, colored interface and to remove any hazardous gas automatically. Many wireless sensor nodes were placed on an underground construction site to collect hazardous-gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications. PMID:29393887
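    The per-zone alerting logic such a system might apply can be sketched as follows; the gas species, thresholds, readings, and zone name are invented, not taken from the paper.

```python
# Minimal sketch of the threshold logic a WSN/BIM safety monitor might
# apply per zone: flag the zone and trigger ventilation when a gas
# reading exceeds its limit. Thresholds and readings are hypothetical.

THRESHOLDS = {"CO": 35.0, "CH4": 5000.0}   # ppm, hypothetical limits

def check_zone(zone, readings):
    """Return (zone, gas) alarms for readings above their threshold."""
    return [(zone, gas) for gas, ppm in readings.items()
            if ppm > THRESHOLDS.get(gas, float("inf"))]

alarms = check_zone("B2-tunnel", {"CO": 48.2, "CH4": 1200.0})
for zone, gas in alarms:
    print(f"ALARM {zone}: {gas} above limit -> start ventilator")
```

    In the paper's system, each alarm would additionally recolor the corresponding region of the BIM model so the hazard is visible spatially.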

  19. Geospatial techniques for allocating vulnerability zoning of geohazards along the Karakorum Highway, Gilgit-Baltistan-Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.

    2016-12-01

    The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by agreement between Pakistan and China in 1979 as a Friendship Highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region of China. Owing to its manifold geology/geomorphology, soil formation, steep slopes, and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e., land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. The damaging effects of these geohazards repeatedly jeopardize life in the region. To ascertain the nature and frequency of disasters and delineate vulnerability zones, a rating and logistic analysis was made to investigate the spatiotemporal distribution of the natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils, and climate were carefully examined, while slope, aspect, elevation, profile curvature, and rock hardness were calculated by different techniques. Geospatial analyses were conducted to assess the nature and intensity of the hazards, the magnitude of every factor was gauged using logistic regression, and every relevant variable was integrated into the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of hazards documented in recent years, and a precision of more than 88.1% was realized. The study validated the model by highlighting the close agreement between the vulnerability mapping and past documented hazards. Based on a receiver operating characteristic curve, the logistic regression model produced satisfactory results. The outcomes will be useful for sustainable land-use and infrastructure planning, mainly in high-risk zones, to reduce economic damage and improve community welfare.
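    Scoring a highway segment's susceptibility with a fitted logistic model, as described above, can be sketched as follows; the coefficients and feature values are invented, not the study's estimates.

```python
import math

# Sketch of scoring a segment's geohazard susceptibility with a fitted
# logistic model; coefficients and features here are hypothetical.

def susceptibility(features, coef, intercept):
    """Probability from a logistic model: sigmoid of the linear predictor."""
    z = intercept + sum(c * x for c, x in zip(coef, features))
    return 1.0 / (1.0 + math.exp(-z))

# Features: normalized slope, elevation, profile curvature, rock softness
coef = [2.1, 0.4, 1.3, 1.8]
intercept = -3.0
p = susceptibility([0.8, 0.5, 0.2, 0.9], coef, intercept)
zone = "high" if p > 0.5 else "low"
```

    Mapping this score over every cell along the highway, then thresholding, yields the vulnerability zones; the ROC curve mentioned in the abstract evaluates the choice of threshold against documented hazards.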

  20. Smallholder Food and Water Security in the Face of Climatic Stress and the Coffee Leaf Rust: Lessons from Nicaragua

    NASA Astrophysics Data System (ADS)

    Stewart, I. T.; Bacon, C. M.; Sundstrom, W.

    2015-12-01

    Smallholder farmers in Nicaragua and throughout much of Central America preserve forest biodiversity and contribute to the sustainable production of coffee and other crops while, paradoxically, they themselves must cope with recurring periods of seasonal hunger. Smallholder food and water security in the region is affected by hurricanes, periodic drought events, climatic changes, an ongoing outbreak of the coffee leaf rust, and fluctuations in food prices. Using regression analysis, our research examines what factors strengthened resilience to these hazards at the household level over the 1981-2014 period. To this end, we integrate qualitative research on coping responses and local institutions, a participatory survey of 368 households, and an analysis of hydro-climatic data. Our results indicate that coping responses to the coffee leaf rust outbreak and the 2014 drought were comparable in severity to those used to endure Hurricane Mitch in 1998 and a severe 2009 drought. Higher smallholder resilience to stresses affecting food and water security is associated with larger farms, off-farm employment, more on-farm food production, higher numbers of fruit trees, and greater coffee harvests. Households that reported more severe coping responses to hazards earlier in the study period tended to be more strongly impacted by later hazards and reported generally greater seasonal hunger. Affiliation with local farmer-to-farmer institutions prioritizing either subsistence-oriented production or sales to international fair-trade markets did not correlate strongly with coping responses; however, subsistence-oriented institutions promote several resilience-enhancing practices. Lessons learned in adapting to past hazards may be used to develop adaptation and mitigation strategies for smallholders under continued climate variability and change.

  1. Development of requirements on safety cases of machine industry products for power engineering

    NASA Astrophysics Data System (ADS)

    Aronson, K. E.; Brezgin, V. I.; Brodov, Yu. M.; Gorodnova, N. V.; Kultyshev, A. Yu.; Tolmachev, V. V.; Shablova, E. G.

    2016-12-01

    This article considers security assurance for power engineering machinery in the design and production phases. The Federal Law "On Technical Regulation" and the Customs Union Technical Regulations "On Safety of Machinery and Equipment" are analyzed in their legal, technical, and economic aspects with regard to power engineering machine industry products. From the legal standpoint, it is noted that the practical enforcement of most norms of the Law "On Technical Regulation" requires subordinate statutory instruments that are currently unavailable; moreover, the current level of adoption of technical regulations leaves much to be desired. The intensive integration processes observed in the Eurasian Region in recent years have made it more pressing to harmonize the laws of the region's countries, including their technical regulation frameworks. The technical aspect of the Customs Union's technical regulation has been appraised by the IDEF0 functional modeling method. The object of research is a steam turbine plant produced at a turbine works. When developing the described model, we considered the elaboration of safety case (SC) requirements from the standpoint of the chief designer of the turbine works, the person generally responsible for the SC document. The economic context relies on risk analysis and appraisal methods; the choice among them is determined by the purposes and objectives of the analysis, the complexity of the objects considered, the availability of required data, and the expertise of the specialists hired to conduct the analysis. For safety assurance purposes, the article proposes describing all sources of hazard and the scenarios of their actualization in all phases of the machinery life cycle. Identifying risks and hazards allows a list of unwanted events to be formed, describing the sources of hazard, the various risk factors, the conditions for their emergence and development, tentative risk appraisals, and tentative guidelines for reducing hazard and risk levels.

  2. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...

  3. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for computing multi-risk and for managing, visualizing, and comparing all the inputs (e.g., hazards, fragilities, and exposure) as well as the corresponding results (e.g., risk curves, risk indexes). For all inputs, complete management of inter-model epistemic uncertainty is provided. BYMUR will be one of the final products of the homonymous ByMuR project (http://bymur.bo.ingv.it/), funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic, and tsunami risks in Naples (Italy). More specifically, the software will separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the corresponding (time-dependent or time-independent) vulnerabilities and exposure data, and their possible (predefined) interactions; it will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, the software can be connected to further tools (e.g., a full hazard analysis), allowing dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit equips the tool with a comprehensive Graphical User Interface (GUI) able to control and visualize (as tables, maps, and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will grant interoperability with other FOSS software and tools and, at the same time, make the tool readily available to the geoscientific community. An already available example of connection is the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be applied to the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes, and tsunamis) and assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).
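
    As a minimal sketch of the kind of single-hazard computation such a tool automates, the code below combines a discretized hazard curve with a fragility function to obtain an annual damage probability and an expected annual loss. All numbers are hypothetical and the function name is illustrative, not BYMUR's actual API:

    ```python
    def annual_damage_probability(hazard_curve, fragility):
        """Combine a discretized hazard curve (annual occurrence probability
        per intensity bin) with a fragility function (probability of damage
        given that intensity)."""
        return sum(p_occ * fragility[level] for level, p_occ in hazard_curve.items())

    # Hypothetical intensity bins (PGA in g) with annual occurrence probabilities.
    seismic_hazard = {0.1: 0.02, 0.2: 0.005, 0.4: 0.001}
    # Hypothetical fragility: probability of building damage at each intensity.
    fragility = {0.1: 0.01, 0.2: 0.15, 0.4: 0.60}

    p_damage = annual_damage_probability(seismic_hazard, fragility)
    exposure_value = 1_000_000  # hypothetical exposed value
    expected_annual_loss = p_damage * exposure_value
    ```

    A multi-risk index would repeat this per hazard (seismic, volcanic, tsunami) in comparable units and then account for the predefined interactions among them.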

  4. Ecosystem processes at the watershed scale: mapping and modeling ecohydrological controls

    Treesearch

    Lawrence E. Band; T. Hwang; T.C. Hales; James Vose; Chelcy Ford

    2012-01-01

    Mountain watersheds are sources of a set of valuable ecosystem services as well as potential hazards. The former include high quality freshwater, carbon sequestration, nutrient retention, and biodiversity, whereas the latter include flash floods, landslides and forest fires. Each of these ecosystem services and hazards represents different elements of the integrated...

  5. Recording and cataloging hazards information, revision A

    NASA Technical Reports Server (NTRS)

    Stein, R. J.

    1974-01-01

    A data collection process is described for the purpose of discerning causation factors of accidents, and the establishment of boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a discipline approach to hazard identification based on energy interrelationships together with an integrated control technique which takes the form of checklists.

  6. Fuels planning: science synthesis and integration; social issues fact sheet 07: The "laws" of effective public education about fire hazards

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Within the past 10 years, breakthrough research has identified factors that are most important for effectively communicating about wildland fire hazards. This fact sheet discusses seven "Laws" of effective public communication that should be considered in any state-of-the-art education campaign.

  7. Nasreya: a treatment and disposal facility for industrial hazardous waste in Alexandria, Egypt: phase I.

    PubMed

    Ramadan, Adham R; Kock, Per; Nadim, Amani

    2005-04-01

    A facility for the treatment and disposal of industrial hazardous waste has been established in Alexandria, Egypt. Phase I of the facility encompassing a secure landfill and solar evaporation ponds is ready to receive waste, and Phase II encompassing physico-chemical treatment, solidification, and interim storage is underway. The facility, the Nasreya Centre, is the first of its kind in Egypt, and represents the nucleus for the integration, improvement and further expansion of different hazardous waste management practices and services in Alexandria. It has been developed within the overall legal framework of the Egyptian Law for the Environment, and is expected to improve prospects for enforcement of the regulatory requirements specified in this law. It has been developed with the overall aim of promoting the establishment of an integrated industrial hazardous waste management system in Alexandria, serving as a demonstration to be replicated elsewhere in Egypt. For Phase I, the Centre only accepts inorganic industrial wastes. In this respect, a waste acceptance policy has been developed, which is expected to be reviewed during Phase II, with an expansion of the waste types accepted.

  8. Integrating asthma hazard characterization methods for consumer products.

    PubMed

    Maier, A; Vincent, M J; Gadagbui, B; Patterson, J; Beckett, W; Dalton, P; Kimber, I; Selgrade, M J K

    2014-10-01

    Despite extensive study, definitive conclusions regarding the relationship between asthma and consumer products remain elusive. Uncertainties reflect the multi-faceted nature of asthma (i.e., contributions of immunologic and non-immunologic mechanisms). Many substances used in consumer products are associated with occupational asthma or asthma-like syndromes. However, risk assessment methods do not adequately predict the potential for consumer product exposures to trigger asthma and related syndromes under lower-level end-user conditions. A decision tree system is required to characterize asthma and respiratory-related hazards associated with consumer products. A system can be built to incorporate the best features of existing guidance, frameworks, and models using a weight-of-evidence (WoE) approach. With this goal in mind, we have evaluated chemical hazard characterization methods for asthma and asthma-like responses. Despite the wealth of information available, current hazard characterization methods do not definitively identify whether a particular ingredient will cause or exacerbate asthma, asthma-like responses, or sensitization of the respiratory tract at lower levels associated with consumer product use. Effective use of hierarchical lines of evidence relies on consideration of the relevance and potency of assays, organization of assays by mode of action, and better assay validation. It is anticipated that the analysis of existing methods will support the development of a refined WoE approach. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Application of HACCP principles to control visitor health threats on dairy farms open to the general public.

    PubMed

    Barten, M; Noordhuizen, J P M; Lipman, L J A

    2008-10-01

    An increasing number of Dutch dairy farmers have diversified their activities, often opening their farm up to visitors (tourist accommodation, farm shop, contact with livestock, etc). It is essential to prevent these visitors from having accidents or becoming ill, which could result in financial claims and might harm the reputation of the agricultural sector. This article describes how the hazard analysis critical control points concept and principles (HACCP) can be applied to these activities and integrated with on-farm operational herd health and production management programmes.

  10. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness for software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory; it treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise from unsafe and unintended interactions among the system components. STPA also provides more guidance to the analyst than traditional fault tree analysis; functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) that also assists in the hazard analysis. One advantage of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required.

  11. Advancing Alternative Analysis: Integration of Decision Science

    PubMed Central

    Zaunbrecher, Virginia M.; Batteate, Christina M.; Blake, Ann; Carroll, William F.; Corbett, Charles J.; Hansen, Steffen Foss; Lempert, Robert J.; Linkov, Igor; McFadden, Roger; Moran, Kelly D.; Olivetti, Elsa; Ostrom, Nancy K.; Romero, Michelle; Schoenung, Julie M.; Seager, Thomas P.; Sinsheimer, Peter; Thayer, Kristina A.

    2017-01-01

    Background: Decision analysis—a systematic approach to solving complex problems—offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. Objectives: We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. Methods: A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups’ findings. Results: We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. Conclusions: We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483 PMID:28669940

  12. Space vehicle propulsion systems: Environmental space hazards

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Bahr, G. K.

    1990-01-01

    The hazards that exist in geolunar space which may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to the geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards in relation to their importance in specified regions of geolunar space. Additionally, the effect of the various space hazards in relation to spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.

  13. Shallow subsurface structure of the Wasatch fault, Provo segment, Utah, from integrated compressional and shear-wave seismic reflection profiles with implications for fault structure and development

    USGS Publications Warehouse

    McBride, J.H.; Stephenson, W.J.; Williams, R.A.; Odum, J.K.; Worley, D.M.; South, J.V.; Brinkerhoff, A.R.; Keach, R.W.; Okojie-Ayoro, A. O.

    2010-01-01

    Integrated vibroseis compressional and experimental hammer-source, shear-wave, seismic reflection profiles across the Provo segment of the Wasatch fault zone in Utah reveal near-surface and shallow bedrock structures caused by geologically recent deformation. Combining information from the seismic surveys, geologic mapping, terrain analysis, and previous seismic first-arrival modeling provides a well-constrained cross section of the upper ~500 m of the subsurface. Faults are mapped from the surface, through shallow, poorly consolidated deltaic sediments, and cutting through a rigid bedrock surface. The new seismic data are used to test hypotheses on changing fault orientation with depth, the number of subsidiary faults within the fault zone and the width of the fault zone, and the utility of integrating separate elastic methods to provide information on a complex structural zone. Although previous surface mapping has indicated only a few faults, the seismic section shows a wider and more complex deformation zone with both synthetic and antithetic normal faults. Our study demonstrates the usefulness of a combined shallow and deeper penetrating geophysical survey, integrated with detailed geologic mapping to constrain subsurface fault structure. Due to the complexity of the fault zone, accurate seismic velocity information is essential and was obtained from a first-break tomography model. The new constraints on fault geometry can be used to refine estimates of vertical versus lateral tectonic movements and to improve seismic hazard assessment along the Wasatch fault through an urban area. We suggest that earthquake-hazard assessments made without seismic reflection imaging may be biased by the previous mapping of too few faults. © 2010 Geological Society of America.

  14. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard was calculated using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in ground-motion prediction equations (GMPEs), as well as the possibility of defining new ones by providing information in tabular format. Our study shows that in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values; consequently, the two attenuation laws give quite different PGA and SA values.
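
    The two core ingredients named above, a truncated exponential Gutenberg-Richter magnitude distribution and Poisson temporal occurrence, can be sketched as follows. The source parameters are hypothetical, chosen only for illustration, and this is not CRISIS2007 code:

    ```python
    import math

    def gr_truncated_rate(m, lam0, b, m_min, m_max):
        """Annual rate of events with magnitude >= m under a truncated
        exponential Gutenberg-Richter distribution. lam0 is the total
        annual rate of events above m_min."""
        if m >= m_max:
            return 0.0
        num = 10 ** (-b * (m - m_min)) - 10 ** (-b * (m_max - m_min))
        den = 1.0 - 10 ** (-b * (m_max - m_min))
        return lam0 * num / den

    def poisson_exceedance(rate, years):
        """Probability of at least one occurrence in `years`, assuming
        earthquake occurrence follows a Poisson process."""
        return 1.0 - math.exp(-rate * years)

    # Hypothetical source: 0.1 events/yr above m_min=4.5, b=1.0, m_max=7.0.
    rate_m6 = gr_truncated_rate(6.0, lam0=0.1, b=1.0, m_min=4.5, m_max=7.0)
    p_50yr = poisson_exceedance(rate_m6, 50)
    ```

    A full PSHA would integrate such rates over all sources and, through a GMPE, convert them into annual frequencies of exceedance of PGA or SA at the site.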

  15. Integrating Powered Descent Vehicle with Back Shell of Mars Spacecraft

    NASA Image and Video Library

    2011-11-10

    The powered descent vehicle of NASA Mars Science Laboratory spacecraft is being prepared for final integration into the spacecraft back shell in this photograph from inside the Payload Hazardous Servicing Facility at NASA Kennedy Space Center, Fla.

  16. Mars Science Laboratory Heat Shield Integration for Flight

    NASA Image and Video Library

    2011-11-10

    During final stacking of NASA Mars Science Laboratory spacecraft, the heat shield is positioned for integration with the rest of the spacecraft in this photograph from inside the Payload Hazardous Servicing Facility at NASA Kennedy Space Center, Fla.

  17. Revision of the geological context of the Port-au-Prince metropolitan area, Haiti: implications for slope failures and seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Terrier, M.; Bialkowski, A.; Nachbaur, A.; Prépetit, C.; Joseph, Y. F.

    2014-09-01

    Following the earthquake of 12 January 2010 in the Port-au-Prince area, the Haitian government, in close cooperation with BRGM, the French geological Survey, decided to undertake a seismic microzonation study of the metropolitan area of the capital in order to take more fully into account the seismic risk in the urbanization and planning of the city under reconstruction. As the first step of the microzonation project, a geological study has been carried out. Deposits of Miocene and Pliocene formations in a marine environment have been identified. These deposits are affected by the Enriquillo-Plantain Garden N80° E fault system and N110° E faults. Tectonic observations and morphological analysis indicate Quaternary activity of several faults mapped in the area of Port-au-Prince. These faults have a N110° trend and show a reverse-sinistral strike-slip motion. Moreover, on the basis of these geological results and of new topographical data, a hazard assessment of ground movements has been made. Along with the map of active faults, the hazard map of ground movements is an integral component of the seismic microzonation study.

  18. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance, these toxic and harmful substances escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the spread of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance, and its prediction, were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed on the basis of the numerical simulation, and the optimal control measure was obtained. Analyzing occupational exposure through ambient-air modeling offers a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance. © The Author(s) 2012.

  19. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as the basis for formalising a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
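
    A toy calculation (all probabilities invented) illustrates why neglecting hazard interactions underestimates risk: if a primary event amplifies the damage probability of a subsequent hazard, the combined annual probability exceeds the independent single-hazard estimate:

    ```python
    def combined_annual_probability(p_primary, p_secondary, amplification):
        """Toy model: annual probability of damage from a secondary hazard
        whose damage probability is amplified if a primary event (e.g., an
        earthquake weakening flood defences) occurred in the same year."""
        p_amplified = min(1.0, p_secondary * amplification)
        # Weight the two regimes by whether the primary event occurred.
        return (1.0 - p_primary) * p_secondary + p_primary * p_amplified

    # Hypothetical numbers: 1% annual earthquake probability, 2% flood-damage
    # probability, and a five-fold amplification of flood damage after a quake.
    p_interacting = combined_annual_probability(0.01, 0.02, 5.0)
    p_independent = 0.02  # what a single-hazard analysis would report
    ```

    Real interaction modelling, as in the study's Indonesian and Italian cases, would derive the amplification from observed changes in exposure and vulnerability rather than from a single scalar.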

  20. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District

    DOT National Transportation Integrated Search

    1985-10-01

    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  1. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  2. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  3. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  4. The Inverse Relation Between Risks and Benefits: The Role of Affect and Expertise.

    PubMed

    Sokolowska, Joanna; Sleboda, Patrycja

    2015-07-01

    Although the risks and benefits of risky activities are positively correlated in the real world, empirical results indicate that people perceive them as negatively correlated. The common explanation is that the confounding of benefits and losses stems from affect. In this article, we address an issue that has not been clearly established in studies on the affect heuristic: to what extent boundary conditions, such as the generality of judgments and expertise, influence the presence of the inverse relation in judgments of hazards. These conditions were examined in four studies in which respondents evaluated general or specific benefits and risks of "affect-rich" and "affect-poor" hazards (ranging from investments to applications of stem cell research). In line with previous research, affect is defined as good or bad feelings integral to a stimulus. In contrast to previous research, affect is considered as related both to personal feelings and to social controversies associated with a hazard. Expertise is related to personal knowledge (laypersons vs. experts) as well as to objective knowledge (targets well vs. poorly known to science). The direct comparison of the input from personal and objective ignorance into the inverse relation has not been investigated previously. It was found that affect invoked by a hazard guides general but not specific judgments of its benefits and risks. Technical expertise helps to avoid simplified evaluations of consequences as long as they are well known to science. For new, poorly understood hazards (e.g., stem cell research), expertise does not protect from the perception of an inverse relation between benefits and risks. © 2015 Society for Risk Analysis.

  5. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 2-Hazard Modeling.

    PubMed

    Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia

    2018-04-25

    Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.
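
    The "best-case"/"worst-case" envelope mentioned above can be sketched as a pointwise minimum and maximum across ensemble members. The depths below are invented; real output would come from the WRF + CREST + ADCIRC chain described in the abstract:

    ```python
    def ensemble_envelope(members):
        """Pointwise best-case/worst-case envelope across ensemble members.
        Each member is a list of inundation depths (m) at the same grid points."""
        best = [min(depths) for depths in zip(*members)]
        worst = [max(depths) for depths in zip(*members)]
        return best, worst

    # Hypothetical depths (m) at four grid points from three perturbed runs.
    members = [
        [0.0, 1.2, 2.5, 0.4],
        [0.1, 0.9, 3.1, 0.2],
        [0.0, 1.5, 2.8, 0.6],
    ]
    best, worst = ensemble_envelope(members)
    ```

    A probabilistic evacuation model would instead use the full distribution across members at each point, not only its extremes.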

  6. Incident pregnancy and time to death or AIDS among HIV-positive women receiving antiretroviral therapy.

    PubMed

    Westreich, Daniel; Maskew, Mhairi; Evans, Denise; Firnhaber, Cindy; Majuba, Pappie; Sanne, Ian

    2013-01-01

    Little is known about the impact of pregnancy on response to highly active antiretroviral therapy (HAART) in sub-Saharan Africa. We examined the effect of incident pregnancy after HAART initiation on clinical response to HAART. We evaluated a prospective clinical cohort of adult women initiating HAART in Johannesburg, South Africa between 1 April 2004 and 31 March 2011, and followed up until an event, transfer, drop-out, or administrative end of follow-up on 30 September 2011. Women over age 45 and women who were pregnant at HAART initiation were excluded from the study. Main exposure was having experienced pregnancy after HAART initiation; main outcome was death and (separately) death or new AIDS event. We calculated adjusted hazard ratios (HRs) and 95% confidence limits (CL) using marginal structural Cox proportional hazards models. The study included 7,534 women, and 20,813 person-years of follow-up; 918 women had at least one recognized pregnancy during follow-up. For death alone, the weighted (adjusted) HR was 0.84 (95% CL 0.44, 1.60). Sensitivity analyses confirmed main results, and results were similar for analysis of death or new AIDS event. Incident pregnancy was associated with a substantially reduced hazard of drop-out (HR = 0.62, 95% CL 0.51, 0.75). Recognized incident pregnancy after HAART initiation was not associated with increases in hazard of clinical events, but was associated with a decreased hazard of drop-out. High rates of pregnancy after initiation of HAART may point to a need to better integrate family planning services into clinical care for HIV-infected women.
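Hazard-ratio confidence limits of this kind are Wald intervals, symmetric on the log scale, so the reported point estimate should sit at the geometric midpoint of the limits. A quick arithmetic check on the death-alone result (standard epidemiological algebra, not code from the study):

```python
import math

# Wald-type 95% limits for a hazard ratio are symmetric on the log scale:
# exp(beta_hat +/- 1.96 * SE). Given the reported limits, the implied log-HR
# is their log-midpoint and the SE follows from their log-width.
Z = 1.959964  # 97.5th percentile of the standard normal

lo, hi = 0.44, 1.60                               # reported 95% CL for death alone
beta_hat = 0.5 * (math.log(lo) + math.log(hi))    # implied log hazard ratio
se_hat = (math.log(hi) - math.log(lo)) / (2 * Z)  # implied standard error

hr = math.exp(beta_hat)
ci_lo = math.exp(beta_hat - Z * se_hat)
ci_hi = math.exp(beta_hat + Z * se_hat)
print(round(hr, 2), round(ci_lo, 2), round(ci_hi, 2))  # recovers 0.84 (0.44, 1.60)
```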

  7. Risk Assessment in the 21st Century | Science Inventory | US ...

    EPA Pesticide Factsheets

    For the past ~50 years, risk assessment has depended almost exclusively on animal testing for hazard identification and dose-response assessment. Originally sound and effective, this traditional approach is no longer adequate given the increasing dependence on chemical tools and the number of chemicals in commerce. This presentation provides an update on current progress in achieving the goals outlined in the NAS report on Toxicology Testing in the 21st Century, highlighting many of the advances led by the EPA. Topics covered include the evolution of the mode-of-action framework into a chemically agnostic adverse outcome pathway (AOP), a systems-based data framework that facilitates integration of modifiable factors (e.g., genetic variation, life stages), and an understanding of networks and mixtures. Further, the EDSP pivot is used to illustrate how AOPs drive development of predictive models for risk assessment based on assembly of high-throughput assays representing AOP key elements. The birth of computational exposure science, capable of large-scale predictive exposure models, is reviewed. Although still in its infancy, development of non-targeted analysis to begin addressing the exposome is also presented. Finally, the systems-based AEP is described, which integrates exposure, toxicokinetics, and AOPs into a comprehensive framework.

  8. Comparative analysis of geodynamic activity of the Caucasian and Eastern Mediterranean segments of the Alpine-Himalayan convergence zone

    NASA Astrophysics Data System (ADS)

    Chelidze, Tamaz; Eppelbaum, Lev

    2013-04-01

    The Alpine-Himalayan convergence zone (AHCZ) underwent recent transverse shortening under the effect of collisional compression. The process was accompanied by rotation of separate microplates. The Caucasian and Eastern Mediterranean regions are segments of the AHCZ and are characterized by intensive endogenous and exogenous geodynamic processes, which manifest themselves in the occurrence of powerful (magnitude 8-9) earthquakes accompanied by the development of secondary catastrophic processes. Large landslides, rock falls, avalanches, mud flows, etc. cause human deaths and great material losses. The development of these endogenous processes is governed by peculiarities of the deep structure of the region and the impact of deep geological processes. The Caucasus is divided into several main tectonic terranes: platform (sub-platform, quasi-platform) and fold-thrust units. Existing data enable a division of the Caucasian region into two large-scale geological provinces: southern Tethyan and northern Tethyan, located to the south and to the north of the Lesser Caucasian ophiolite suture, respectively. Recent investigations show that the seismic hazard assessments in these regions are not entirely correct; in the West Caucasus, for example, the seismic hazard may be significantly underestimated, which affects the corresponding risk assessments. Integrated analysis of gravity, magnetic, seismic, and thermal data enables refinement of the seismic hazard assessment of the region, taking into account real rates of geodynamic movements. Recent rheological models also play an important role. According to the tectonic scheme of Reilinger et al. (2006), the western flank of the Arabian Plate manifests strike-slip motion, while the East Caucasian block is converging and shortening. The Eastern Mediterranean is a tectonically complex region located in the midst of the progressive Afro-Eurasian collision. 
The recent increase in geotectonic activity in this region highlights the need for combined analysis of seismo-neotectonic signatures. For this purpose, this article presents the key features of the tectonic zonation of the Eastern Mediterranean. A map of derivatives of the gravity field retracked from the Geosat satellite and a novel map of the Moho discontinuity illustrate the most important tectonic features of the region. The post-Jurassic map of surface-leveling deformation reflects the modern tectonic stage of Eastern Mediterranean evolution. The developed tectono-geophysical zonation map integrates potential geophysical field analysis and seismic section utilization, as well as tectonic-structural, paleogeographical, and facies analyses. Tectonically, the map agrees with the earlier model of continental accretion (Ben-Avraham and Ginzburg, 1990). Overlaying the seismicity map of the Eastern Mediterranean tectonic region (for the period 1900-2012) on the tectonic zonation chart reveals the key features of the seismo-neotectonic pattern of the Eastern Mediterranean. The results have important implications for tectonic-seismological analysis in this region (Eppelbaum and Katz, 2012). The difference in geotectonic patterns makes a comparison of the geodynamic activity and seismic hazard of the Caucasian and Eastern Mediterranean segments of the AHCZ of particular interest.

  9. Satellite-driven modeling approach for monitoring lava flow hazards during the 2017 Etna eruption

    NASA Astrophysics Data System (ADS)

    Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.; Zago, V.

    2017-12-01

    The integration of satellite data and modeling represents an efficient strategy that may provide immediate answers to the main issues raised at the onset of a new effusive eruption. Satellite-based thermal remote sensing of hotspots related to effusive activity can effectively provide a variety of products suited to timing, locating, and tracking the radiant character of lava flows. Hotspots show the location and occurrence of eruptive events (vents). Discharge rate estimates may indicate the current intensity (effusion rate) and potential magnitude (volume). High-spatial-resolution multispectral satellite data can complement field observations for monitoring the front position (length) and extension of flows (area). Physics-based models driven, or validated, by satellite-derived parameters are now capable of fast and accurate forecasts of lava flow inundation scenarios (hazard). Here, we demonstrate the potential of the integrated application of satellite remote-sensing techniques and lava flow models during the 2017 effusive eruption at Mount Etna in Italy. This combined approach provided insights into lava flow field evolution by supplying detailed views of flow field construction (e.g., the opening of ephemeral vents) that were useful for more accurate and reliable forecasts of eruptive activity. Moreover, we gave a detailed chronology of the lava flow activity based on field observations and satellite images, assessed the potential extent of impacted areas, mapped the evolution of the lava flow field, and executed hazard projections. The downside of this combination is the high sensitivity of lava flow inundation scenarios to uncertainties in vent location, discharge rate, and other parameters, which can make interpreting hazard forecasts difficult during an effusive crisis. However, such integration at last makes timely forecasts of lava flow hazards during effusive crises possible at the great majority of volcanoes for which no monitoring exists.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Julie, E-mail: jgoodman@gradientcorp.com

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. 
• IARC should develop transparent and systematic guidelines for using the framework. • It should better address biological significance, study quality, and relevance. • It should better address integrating mechanistic evidence with other evidence.

  11. Approach-Phase Precision Landing with Hazard Relative Navigation: Terrestrial Test Campaign Results of the Morpheus/ALHAT Project

    NASA Technical Reports Server (NTRS)

    Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel

    2016-01-01

    The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion, and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 and Morpheus 1.5 vehicles were conducted from April 2011 to December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014 with completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. 
This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.

  12. Knowledge Integration and Use-Case Analysis for a Customized Drug-Drug Interaction CDS Service

    NASA Astrophysics Data System (ADS)

    Kam, Hye Jin; Park, Man Young; Kim, Woojae; Yoon, Duk Yong; Ahn, Eun Kyoung; Park, Rae Woong

    Clinical decision support systems (CDSSs) are thought to reduce adverse drug events (ADEs) by monitoring drug-drug interactions (DDIs). However, clinically improper or excessive alerts can result in high alert overrides. A tailored CDS service, appropriate for clinicians and their ordering situations, is required to increase alert acceptance. In this study, we conducted a 12-week pilot project adopting a tailored CDSS at an emergency department. The new CDSS was deployed via a stepwise integration of additional new rules, and the alert status, with changes in acceptance rate, was analyzed. The most frequent DDI alerts were related to prescriptions of anti-inflammatory drugs. The percentages of alert overrides for each stage were 98.0%, 96.0%, 96.9%, and 98.1%, respectively; 91.5% of overridden alerts were related to discharge medications. To reduce the potential hazards of ADEs, the development of an effective customized DDI CDSS is required, via in-depth analysis of alert patterns and override reasons.

  13. Portable standoff spectrometer for hazard identification using integrated quantum cascade laser arrays from 6.5 to 11 µm.

    PubMed

    Witinski, Mark F; Blanchard, Romain; Pfluegl, Christian; Diehl, Laurent; Li, Biao; Krishnamurthy, Kalyani; Pein, Brandt C; Azimi, Masud; Chen, Peili; Ulu, Gokhan; Vander Rhodes, Greg; Howle, Chris R; Lee, Linda; Clewes, Rhea J; Williams, Barry; Vakhshoori, Daryoosh

    2018-04-30

    This article presents new spectroscopic results in standoff chemical detection that are enabled by monolithic arrays of Distributed Feedback (DFB) Quantum Cascade Lasers (QCLs), with each array element at a slightly different wavelength than its neighbor. The standoff analysis of analyte/substrate pairs requires a laser source with characteristics offered uniquely by a QCL Array. This is particularly true for time-evolving liquid chemical warfare agent (CWA) analysis. In addition to describing the QCL array source developed for long wave infrared coverage, a description of an integrated prototype standoff detection system is provided. Experimental standoff detection results using the man-portable system for droplet examination from 1.3 meters are presented using the CWAs VX and T-mustard as test cases. Finally, we consider three significant challenges to working with droplets and liquid films in standoff spectroscopy: substrate uptake of the analyte, time-dependent droplet spread of the analyte, and variable substrate contributions to retrieved signals.

  14. Special Issue "Natural Hazards' Impact on Urban Areas and Infrastructure" in Natural Hazards

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

    In 2006 and 2007, at the 3rd and 4th General Assemblies of the European Geosciences Union respectively, the session on "Natural Hazards' Impact on Urban Areas and Infrastructure" was convened by Maria Bostenaru Dan, then at the Istituto Universitario di Studi Superiori di Pavia, ROSE School, Italy, who conducts research on earthquake management, and Heidi Kreibich from the GFZ Potsdam, Germany, who conducts research on flood hazards; in 2007 it was also co-convened by Agostino Goretti from the Civil Protection in Rome, Italy. The session initially started from an idea of Friedemann Wenzel from the Universität Karlsruhe (TH), Germany, the former speaker of the SFB 461 "Strong earthquakes"; Maria Bostenaru graduated from and worked at that university, which runs CEDIM, the Center for Disaster Management and Risk Reduction Technology, together with the GFZ Potsdam. Selected papers from these two sessions, as well as invited papers from other specialists, were gathered for a special issue to be published in the journal "Natural Hazards" under the guest editorship of Heidi Kreibich and Maria Bostenaru Dan. Unlike the former special issue, this one contains a well-balanced mixture of many hazards: climate change, floods, mountain hazards like avalanches, volcanoes, and earthquakes. The aim of the issue was to enlarge the prospects for co-operation between the geosciences and other professions in the field of natural hazards. Earthquake engineering and engineering seismology co-operate frequently, but in the field of natural hazards there is also a need to co-operate with urban planners and, looking to the future, in the field of integrated conservation, which implies co-operation between architecture and urban planning for the preservation of our environment. Integrated conservation has been stipulated since the 1970s, the years when participatory approaches, and with them the involvement of the social sciences, began.

  15. The New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX) Project - An overview of its major findings

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Zschau, Jochen; Gasparini, Paolo

    2014-05-01

    Recent major natural disasters, such as the 2011 Tōhoku earthquake, tsunami and subsequent Fukushima nuclear accident, have raised awareness of the frequent and potentially far-reaching interconnections between natural hazards. Such interactions occur at the hazard level, where an initial hazard may trigger other events (e.g., an earthquake triggering a tsunami) or several events may occur concurrently or nearly so (e.g., severe weather around the same time as an earthquake). Interactions also occur at the vulnerability level, where the initial event may make the affected community more susceptible to the negative consequences of another event (e.g., an earthquake weakens buildings, which are then damaged further by windstorms). There is also a temporal element involved, where changes in exposure may alter the total risk to a given area. In short, the total risk estimated when considering multiple hazards and risks and their interactions is likely greater than the sum of their individual parts. It is with these issues in mind that the European Commission, under its FP7 program, supported the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project (10.2010 to 12.2013). MATRIX set out to tackle multiple natural hazards (i.e., those of concern to Europe, namely earthquakes, landslides, volcanoes, tsunamis, wildfires, storms, and fluvial and coastal flooding) and risks within a common theoretical framework. The MATRIX work plan proceeded from an assessment of single-type risk methodologies (including how uncertainties should be treated) to cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and an assessment of how the multi-hazard and risk viewpoint may be integrated into current decision-making and risk-mitigation programs, given their existing single-hazard and risk focus. 
Three test sites were considered during the project: Naples, Cologne, and the French West Indies. In addition, a software platform, the MATRIX-Common IT sYstem (MATRIX-CITY), was developed to allow the evaluation of characteristic multi-hazard and risk scenarios in comparison to single-type analyses. This presentation therefore outlines the more significant outcomes of the project, in particular those dealing with the harmonization of single-type hazards, cascade event analysis, time-dependent vulnerability changes and the response of the disaster management community to the MATRIX point of view.

  16. GAPHAZ: improving knowledge management of glacier and permafrost hazards and risks in mountains

    NASA Astrophysics Data System (ADS)

    Huggel, Christian; Burn, Chris; Clague, John J.; Hewitt, Ken; Kääb, Andreas; Krautblatter, Michael; Kargel, Jeffrey S.; Reynolds, John; Sokratov, Sergey

    2014-05-01

    High-mountain environments worldwide are undergoing changes at an historically unprecedented pace due to the sensitivity of the high-mountain cryosphere to climate change. Humans settled in many mountain regions hundreds, even thousands, of years ago, but recent intensive socio-economic developments have increased the exposure and vulnerability of people and infrastructure to a large range of natural hazards related to high-mountain processes. The resulting risks are therefore increasing and highly dynamic. GAPHAZ, the Standing Group on Glacier and Permafrost Hazards in Mountains of the International Association of Cryospheric Sciences (IACS) and International Permafrost Association (IPA), is positioned in this context. The objectives of GAPHAZ are to: • improve international scientific communication on glacier and permafrost hazards; • stimulate and strengthen research collaborations in the field of glacier and permafrost hazards; • compile a state of knowledge related to glacier and permafrost hazards in high mountains; • work towards a greater transfer of information and improved communication between the scientific and governmental/policy communities; • signpost sources of advice to international and national agencies, responsible authorities, and private companies; and • act as a focal point for information for international media during relevant crises. GAPHAZ has initiated a variety of activities over the past years to meet these objectives. One of the important issues is the development of standards for (1) how to make and portray technical assessments of glacier and permafrost related hazards and risks; and (2) how to communicate these to the public and a range of actors, including those who implement measures. In doing so, the difficulties of, and need for, better translation between techno-scientific understanding and the situations and concerns of the people most at risk in cold regions need to be recognized. 
Knowledge transfer from the few well-researched and monitored regions to the more extensive and diverse regions also needs to be addressed. Standards are required to ensure an adequate level of quality and to avoid incorrect assessments with potentially adverse consequences, as past experience has shown. Concepts and terminologies related to hazard and risk assessments must follow recently issued consensus statements, such as those of UN-ISDR and IPCC. Hazard assessments must be undertaken routinely and regularly, combined with appropriate ground-based and remote-sensing monitoring. Assessments need to adequately consider the physical processes and their interactions. Integrative risk assessments should be achieved through interdisciplinary cooperation. There is still a lack of integration between the physical/engineering and social aspects of glacier and permafrost hazards; therefore, communication and exchange between natural and social science experts must be strengthened. In the design and implementation of risk reduction and adaptation measures, close collaboration among scientists, policy makers, and local populations is necessary. Recognizing the different perceptions of risk among actors is particularly important if risk reduction efforts are to be successful. Measures should generally be adapted to the local social, cultural, economic, political, and institutional context. Early warning systems are becoming increasingly important, and a growing number of experiences are available also for high-mountain environments. A systematic analysis and exchange of experiences using dedicated expert networks will be fostered by GAPHAZ in collaboration with other initiatives and actors.

  17. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
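The weight-sensitivity idea can be sketched in a few lines: perturb the criteria weights, renormalize, and observe the spread of the weighted-linear-combination susceptibility score. The criteria names, scores, and perturbation level below are hypothetical, and plain Monte Carlo sampling stands in for the paper's GIS implementation and Global Sensitivity Analysis:

```python
import random

# Hypothetical criteria scores (0-1) for a single map cell and AHP-style
# weights; names and numbers are illustrative, not the paper's data.
scores = {"slope": 0.8, "lithology": 0.6, "land_use": 0.4, "rainfall": 0.7}
weights = {"slope": 0.40, "lithology": 0.25, "land_use": 0.15, "rainfall": 0.20}

def susceptibility(scores, weights):
    """Weighted linear combination, the simplest GIS-MCDA aggregation rule."""
    return sum(scores[c] * weights[c] for c in scores)

def perturbed(weights, rel_sd, rng):
    """Perturb each weight by a relative Gaussian error, then renormalize."""
    raw = {c: max(1e-9, rng.gauss(w, rel_sd * w)) for c, w in weights.items()}
    total = sum(raw.values())
    return {c: v / total for c, v in raw.items()}

rng = random.Random(42)
samples = [susceptibility(scores, perturbed(weights, 0.10, rng)) for _ in range(5000)]
mean = sum(samples) / len(samples)
sd = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
# The spread (sd) quantifies how sensitive the susceptibility score is to
# uncertainty in the criteria weights.
print(f"baseline={susceptibility(scores, weights):.3f} mean={mean:.3f} sd={sd:.3f}")
```

Running this per map cell, and decomposing the output variance by criterion, is the essence of attributing susceptibility uncertainty to the model's weights.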

  18. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  19. IRIS Toxicological Review of Dichloromethane (Methylene ...

    EPA Pesticide Factsheets

    EPA is conducting a peer review and public comment of the scientific basis supporting the human health hazard and dose-response assessment of Dichloromethane that when finalized will appear on the Integrated Risk Information System (IRIS) database. The draft Toxicological Review of Dichloromethane provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to dichloromethane.

  20. Precision Landing and Hazard Avoidance (PL&HA) Domain

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.; Carson, John M., III

    2016-01-01

    The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C (Guidance, Navigation and Control) functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking.

  1. LABORATORY AND FIELD AUDITS AS PART OF THE EPA (ENVIRONMENTAL PROTECTION AGENCY) HAZARDOUS WASTE ENGINEERING RESEARCH LABORATORY (HWERL) QUALITY ASSURANCE PROGRAM

    EPA Science Inventory

    Audits are an important and integral part of the EPA Hazardous Waste Engineering Research Laboratory (HWERL) Quality Assurance (QA) Program. As part of the overall QA program, audits are used to determine contractor compliance with quality assurance plans and to assess the overal...

  2. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
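The three metrics can be combined into a simple ranking. The scoring rule below (importance divided by modeling difficulty) and the scenario names and values are illustrative assumptions, not the paper's actual formula or data:

```python
# Hypothetical hazard scenarios scored 1-5 on the paper's three metrics:
# severity and likelihood (higher = more important) and modeling difficulty
# (higher = harder to analyze quantitatively). Names and values are invented.
scenarios = [
    {"name": "wake encounter on parallel approach", "severity": 4, "likelihood": 3, "difficulty": 2},
    {"name": "runway incursion", "severity": 5, "likelihood": 2, "difficulty": 4},
    {"name": "missed-approach conflict", "severity": 3, "likelihood": 4, "difficulty": 1},
]

def priority(s):
    # One plausible combination rule (an assumption, not the paper's method):
    # risk importance discounted by how hard the scenario is to model.
    return s["severity"] * s["likelihood"] / s["difficulty"]

ranked = sorted(scenarios, key=priority, reverse=True)
for s in ranked:
    print(f'{priority(s):5.1f}  {s["name"]}')
```

Under this rule, a moderately severe but easily modeled scenario can outrank a severe one that is intractable to model, which is the point of adding the modeling-difficulty metric.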

  3. An epigenetic signature of adhesion molecules predicts poor prognosis of ovarian cancer patients

    PubMed Central

    Chang, Ping-Ying; Liao, Yu-Ping; Wang, Hui-Chen; Chen, Yu-Chih; Huang, Rui-Lan; Wang, Yu-Chi; Yuan, Chiou-Chung; Lai, Hung-Cheng

    2017-01-01

DNA methylation is a promising biomarker for cancer. The epigenetic effects of cell adhesion molecules may affect therapeutic outcomes, and the present study examined their effects on survival in ovarian cancer. We integrated methylomics and genomics datasets in The Cancer Genome Atlas (n = 391) and identified 106 highly methylated adhesion-related genes in ovarian cancer tissues. Univariate analysis revealed that the methylation status of eight genes was related to progression-free survival. In multivariate Cox regression analysis, four highly methylated genes (CD97, CTNNA1, DLC1, HAPLN2) and three genes with low methylation (LAMA4, LPP, MFAP4) were significantly associated with poor progression-free survival. Low methylation of VTN was an independent poor prognostic factor for overall survival after adjustment for age and stage. Patients carrying any two of CTNNA1, DLC1, or MFAP4 had significantly poorer progression-free survival (hazard ratio: 1.59; 95% confidence interval: 1.23, 2.05). This prognostic methylation signature was validated in a methylomics dataset generated in our lab (n = 37; hazard ratio: 16.64; 95% confidence interval: 2.68, 103.14) and in another from the Australian Ovarian Cancer Study (n = 91; hazard ratio: 2.43; 95% confidence interval: 1.11, 5.36). The epigenetics of cell adhesion molecules is related to ovarian cancer prognosis. More comprehensive methylomics of cell adhesion molecules is needed and may advance personalized treatment with adhesion molecule-related drugs. PMID:28881822
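The hazard ratios and confidence intervals quoted above follow the standard Cox-model relationship HR = exp(β), with the interval derived from the coefficient's standard error. A minimal sketch, where the β and SE values are back-calculated for illustration rather than taken from the study:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Hazard ratio exp(beta) and its ~95% CI from a Cox coefficient and its SE."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# beta and se chosen so the output reproduces the abstract's HR 1.59 (1.23, 2.05);
# they are not values reported by the study.
hr, lo, hi = hazard_ratio_ci(beta=0.464, se=0.130)
```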

  4. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays system performance visually on a graphical platform. Moreover, the software helps the user "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values agree well with extensive experimental investigations and were consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott index of agreement (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
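Both fit statistics quoted above have standard definitions. A minimal sketch, with hypothetical observed/predicted values standing in for the plant data:

```python
def relative_error(obs, pred):
    """Mean absolute relative error between observed and predicted values."""
    return sum(abs(o - p) / o for o, p in zip(obs, pred)) / len(obs)

def willmott_d(obs, pred):
    """Willmott's index of agreement: d = 1 - sum((O-P)^2) / sum((|P-Obar| + |O-Obar|)^2)."""
    obar = sum(obs) / len(obs)
    num = sum((o - p) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - num / den

# Hypothetical observed vs. model-predicted values (not the paper's data):
obs = [12.1, 13.4, 15.0, 16.2, 18.3]
pred = [12.4, 13.1, 15.3, 16.0, 18.8]
```

d ranges from 0 (no agreement) to 1 (perfect agreement), which is why values near 0.98 indicate a reliable model.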

  5. Advancing alternatives analysis: The role of predictive toxicology in selecting safer chemical products and processes.

    PubMed

    Malloy, Timothy; Zaunbrecher, Virginia; Beryt, Elizabeth; Judson, Richard; Tice, Raymond; Allard, Patrick; Blake, Ann; Cote, Ila; Godwin, Hilary; Heine, Lauren; Kerzic, Patrick; Kostal, Jakub; Marchant, Gary; McPartland, Jennifer; Moran, Kelly; Nel, Andre; Ogunseitan, Oladele; Rossi, Mark; Thayer, Kristina; Tickner, Joel; Whittaker, Margaret; Zarker, Ken

    2017-09-01

Alternatives analysis (AA) is a method used in regulation and product design to identify, assess, and evaluate the safety and viability of potential substitutes for hazardous chemicals. It requires toxicological data for the existing chemical and potential alternatives. Predictive toxicology uses in silico and in vitro approaches, computational models, and other tools to expedite toxicological data generation in a more cost-effective manner than traditional approaches. The present article briefly reviews the challenges associated with using predictive toxicology in regulatory AA, then presents 4 recommendations for its advancement. It recommends using case studies to advance the integration of predictive toxicology into AA, adopting a stepwise process to employing predictive toxicology in AA beginning with prioritization of chemicals of concern, leveraging existing resources to advance the integration of predictive toxicology into the practice of AA, and supporting transdisciplinary efforts. The further incorporation of predictive toxicology into AA would advance the ability of companies and regulators to select alternatives to harmful ingredients, and potentially increase the use of predictive toxicology in regulation more broadly. Integr Environ Assess Manag 2017;13:915-925. © 2017 SETAC.

  6. The Mediterranean Supersite Volcanoes (MED-SUV) Project: an overview

    NASA Astrophysics Data System (ADS)

    Puglisi, Giuseppe

    2013-04-01

In response to the EC call ENV.2012.6.4-2 (Long-term monitoring experiments in geologically active regions of Europe prone to natural hazards: the Supersite concept - FP7-ENV-2012-two-stage), a wide community of volcanological institutions proposed the project Mediterranean Supersite Volcanoes (MED-SUV), which is in the negotiation phase at the time of writing. The consortium comprises 18 European universities and research institutes, four small or medium enterprises (SMEs), and two non-European universities and research institutes. MED-SUV will improve the consortium's capacity to assess volcanic hazards at Supersites in southern Italy by optimising and integrating existing and new observation/monitoring systems, by achieving a breakthrough in the understanding of volcanic processes, and by increasing the effectiveness of coordination between the scientific and end-user communities. More than 3 million people are exposed to potential volcanic hazards in a large region of the Mediterranean that hosts two of the largest European volcanic areas: Mt. Etna and Campi Flegrei/Vesuvius. The project will fully exploit the unique, detailed long-term in-situ monitoring data sets available for these volcanoes and integrate them with Earth Observation (EO) data, setting up the basic tools for a significant step forward in discriminating pre-, syn- and post-eruptive phases. The wide range of styles and intensities of volcanic phenomena observed at these volcanoes, which can be taken as archetypes of 'closed-conduit' and 'open-conduit' volcanoes, together with the long-term multidisciplinary data sets, offers an exceptional opportunity to improve the understanding of a very wide spectrum of geo-hazards, as well as to implement and test a large variety of innovative models of ground deformation and motion.
Important impacts on the European industrial sector are expected, arising from a partnership that integrates the scientific community and SMEs to implement new observation/monitoring sensors and systems together. Specific experiments and studies will be carried out to improve our understanding of the volcanic internal structure and dynamics, and to recognise signals of impending unrest or eruption. Quantitative hazard assessment will benefit from the outcomes of these studies and from their integration into cutting-edge monitoring approaches, leading to a step-change in hazard awareness and preparedness and leveraging the close relationship between scientists, SMEs, and end-users.

  7. Accident analysis and control options in support of the sludge water system safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HEY, B.E.

A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  8. 3D Simulation of External Flooding Events for the RISMC Pathway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad

    2015-09-01

Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, they can be analyzed with existing, validated physics-simulation toolkits. In this report, we describe these approaches for flooding-based analysis using a method called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides visual insight that helps validate the flooding analysis.
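Smoothed Particle Hydrodynamics estimates field quantities by summing kernel-weighted contributions from neighbouring particles. A minimal 1D density sketch using the standard cubic-spline kernel, offered as an illustration of the technique rather than the report's actual implementation:

```python
def cubic_spline_w(r, h):
    """Standard cubic-spline SPH smoothing kernel in 1D (normalization 2/(3h),
    compact support of radius 2h)."""
    q = r / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def density(x_i, positions, mass, h):
    """SPH density estimate: rho(x_i) = sum_j m_j * W(|x_i - x_j|, h)."""
    return sum(mass * cubic_spline_w(abs(x_i - xj), h) for xj in positions)

# Uniform 1D particle lattice, spacing 0.1, per-particle mass 0.1 (invented units):
positions = [0.1 * i for i in range(-20, 21)]
rho = density(0.0, positions, mass=0.1, h=0.2)  # recovers mass/spacing = 1.0 at center
```

The same summation pattern, with gradients of W, drives the pressure and viscosity forces in a full SPH flooding simulation.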

  9. [Assessment of eco-environmental vulnerability of Hainan Island, China].

    PubMed

    Huang, Bao-rong; Ouyang, Zhi-yun; Zhang, Hui-zhi; Zhang, Li-hua; Zheng, Hua

    2009-03-01

Based on the environmental vulnerability assessment method developed by SOPAC and UNEP, this paper constructed an indicator system spanning three sub-themes (hazard, resistance, and damage) to assess the eco-environmental vulnerability of Hainan Island. The results showed that Hainan Island was suffering a middling level of eco-environmental hazard, the main hazards coming from intensive human activities such as intensive agriculture, mass tourism, mining, and large quantities of solid waste discarded by islanders and tourists. Geographical characteristics such as a large land area, a large altitude range, an integrated geographical form, and abundant habitat types endowed Hainan Island with high resistance to environmental hazards. However, owing to the historical accumulation of anthropogenic and natural hazards, the island ecosystem showed serious ecological damage, such as soil degradation and biodiversity loss. Considering hazard, resistance, damage, and degradation together, the overall environmental vulnerability of the island was at a middling level: some indicators showed lower vulnerability, while others showed higher vulnerability.

  10. Capturing spatiotemporal variation in wildfires for improving postwildfire debris-flow hazard assessments: Chapter 20

    USGS Publications Warehouse

    Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.

    2017-01-01

    Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.
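The idea of simulating many fire seasons to characterize variability in watershed area burned can be caricatured in a few lines. This toy Monte Carlo (all parameters invented) illustrates only the shape of the approach, not the chapter's stochastic wildfire simulation system:

```python
import random

def burned_fractions(n_seasons, p_burn, mean_frac, seed=42):
    """Toy Monte Carlo: fraction of a watershed burned at moderate-to-high
    severity in each simulated fire season (illustrative assumptions only)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_seasons):
        if rng.random() < p_burn:  # a fire reaches the watershed this season
            out.append(min(1.0, rng.expovariate(1.0 / mean_frac)))
        else:
            out.append(0.0)        # no fire this season
    return out

fracs = sorted(burned_fractions(10_000, p_burn=0.15, mean_frac=0.30))
p90 = fracs[int(0.90 * len(fracs))]  # example hazard metric: 90th-percentile burned fraction
```

Percentiles of the simulated distribution, rather than a single deterministic burn estimate, can then feed the postwildfire debris-flow models.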

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Camp

Over the past four years, the Electrical Safety Program at PPPL has evolved to address changing regulatory requirements and lessons learned from accident events, particularly with regard to arc flash hazards and implementing NFPA 70E requirements. This presentation will discuss PPPL's approaches to electrical hazard evaluation, both shock and arc flash; engineered solutions for hazard mitigation, such as remote racking of medium-voltage breakers; operational changes for hazard avoidance; targeted personnel training; and hazard-appropriate personal protective equipment. Practical solutions for nominal voltage identification and zero-voltage checks for lockout/tagout will also be covered. Finally, we will review the value of a comprehensive electrical drawing program, employee attitudes expressed as a personal safety work ethic, integrated safety management, and sustained management support for continuous safety improvement.

  12. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  13. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...

  14. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  15. Seismic hazard exposure for the Trans-Alaska Pipeline

    USGS Publications Warehouse

    Cluff, L.S.; Page, R.A.; Slemmons, D.B.; Grouse, C.B.; ,

    2003-01-01

The discovery of oil on Alaska's North Slope and the construction of a pipeline to transport that oil across Alaska coincided with the National Environmental Policy Act of 1969 and a destructive southern California earthquake in 1971, leading to stringent stipulations, state-of-the-art investigations, and innovative design for the pipeline. The magnitude 7.9 earthquake on the Denali fault in November 2002 was remarkably consistent with the design earthquake and fault displacement postulated for the Denali crossing of the Trans-Alaska Pipeline route. The pipeline maintained its integrity, and disaster was averted. Recent probabilistic studies to update previous hazard-exposure conclusions suggest continuing pipeline integrity.

  16. Improving education and resources for health care providers.

    PubMed Central

    Paul, M; Welch, L

    1993-01-01

    Workers and citizens are turning increasingly to the health care system for information about occupational and environmental reproductive hazards, yet most primary care providers and specialists know little about the effects of occupational/environmental toxicants on the reproductive system or how to evaluate and manage patients at potential risk. Although it is unrealistic to expect all clinicians to become experts in this area, practitioners should know how to take a basic screening history, identify patients at potential risk, and make appropriate referrals. At present, occupational and environmental health issues are not well integrated into health professional education in the United States, and clinical information and referral resources pertaining to reproductive hazards are inadequate. In addressing these problems, the conference "Working Group on Health Provider Education and Resources" made several recommendations that are detailed in this report. Short-term goals include enhancement of existing expertise and resources at a regional level and better integration of information on occupational/environmental reproductive hazards into curricula, meetings, and publications of medical and nursing organizations. Longer term goals include development of a comprehensive, single-access information and referral system for clinicians and integration of occupational and environmental medicine into formal health professional education curricula at all levels. PMID:8243391

  17. Screening and Assessment of Young Children.

    ERIC Educational Resources Information Center

    Friedlander, Bernard Z.

    Most language development hazards in infancy and early childhood fall into the categories of auditory impairment, central integrative dysfunction, inadequate environmental support, and peripheral expressive impairment. Existing knowledge and techniques are inadequate to meet the screening and assessment problems of central integrative dysfunction,…

  18. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    USGS Publications Warehouse

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.

  19. The Saguenay Fjord, Quebec, Canada: Integrating marine geotechnical and geophysical data for spatial seismic slope stability and hazard assessment

    USGS Publications Warehouse

    Urgeles, R.; Locat, J.; Lee, H.J.; Martin, F.

    2002-01-01

In 1996 a major flood occurred in the Saguenay region, Quebec, Canada, delivering several km3 of sediment to the Saguenay Fjord. These sediments covered large areas of the previously contaminated fjord bottom, providing a natural capping layer. Recent swath bathymetry data have also shown that sediment landslides are widespread in the upper section of the Saguenay Fjord; therefore, should a new event occur, it would probably expose the old contaminated sediments. Landslides in the upper Saguenay Fjord are most probably due to earthquakes, given the fjord's proximity to the Charlevoix seismic region and to the source of the 1988 Saguenay earthquake. In consequence, this study characterizes the permanent ground deformations induced by different earthquake scenarios from which shallow sediment landslides could be triggered. The study follows a Newmark analysis in which, first, the seismic slope performance is assessed; second, the seismic hazard is analyzed; and finally, the seismic landslide hazard is evaluated. The study is based on slope gradients obtained from EM1000 multibeam bathymetry data as well as water content and undrained shear strength measurements made on box and gravity cores. Ground motions integrating local site conditions were simulated using synthetic time histories. The study assumes the region of the 1988 Saguenay earthquake as the most likely source area for earthquakes capable of inducing large ground motions in the upper Saguenay region. Accordingly, we have analyzed several shaking intensities and deduce that generalized sediment displacements will begin to occur when moment magnitudes exceed 6. Major displacements, failure, and subsequent landslides could occur only for earthquake moment magnitudes exceeding 6.75. © 2002 Elsevier Science B.V. All rights reserved.
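A Newmark analysis treats the sliding mass as a rigid block that accumulates permanent displacement whenever ground acceleration exceeds a critical (yield) acceleration. A minimal rigid-block integration sketch, driven here by a hypothetical acceleration pulse rather than the study's synthetic time histories:

```python
def newmark_displacement(accel, dt, a_crit):
    """Newmark rigid-block analysis: integrate ground acceleration in excess of
    the critical (yield) acceleration a_crit to obtain permanent displacement.
    accel in m/s^2, dt in s; returns displacement in m."""
    v = 0.0  # velocity of the block relative to the ground
    d = 0.0  # accumulated permanent displacement
    for a in accel:
        if v > 0.0 or a > a_crit:              # block is sliding (or starts to)
            v = max(0.0, v + (a - a_crit) * dt)  # relative accel = a - a_crit while sliding
            d += v * dt
    return d

# A 1 s pulse at 2 m/s^2 against a 1 m/s^2 yield acceleration (hypothetical):
pulse = [2.0] * 100 + [0.0] * 200
disp = newmark_displacement(pulse, dt=0.01, a_crit=1.0)
```

In the study's workflow, the yield acceleration would come from the slope gradient and undrained shear strength, and the input motion from the simulated time histories.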

  20. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

This work illustrates the advantages of using a Geographic Information System (GIS) in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. Because the input data come in different formats, cover distinct geographical areas, and are subject to different interpretations, inconsistencies may appear and data management becomes complicated. Homogenizing the data and integrating them in a GIS first requires developing a conceptual model, which is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with the ISO 19100 series of standards and allows the designer to define model architecture and interoperability. The GIS provides a framework for combining large volumes of geographically based data with a uniform geographic reference and without duplication. All this information carries its own metadata following the ISO 19115 standard. In this work, integrating active-fault and subduction-slab geometries with epicentre locations in the same environment has facilitated the definition of seismogenic regions, greatly easing teamwork among national specialists from different countries. The GIS capacity for queries (by location and by attributes) and geostatistical analyses is used to interpolate the discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete, homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution, and results dissemination.

  1. Application of a Rapid Knowledge Synthesis and Transfer Approach to Assess the Microbial Safety of Low-Moisture Foods

    PubMed Central

    Young, Ian; Waddell, Lisa; Cahill, Sarah; Kojima, Mina; Clarke, Renata; Rajic, Andrijana

    2016-01-01

    Low-moisture foods (LMF) are increasingly implicated in outbreaks of foodborne illness resulting in a significant public health burden. To inform the development of a new Codex Alimentarius code of hygienic practice for LMF, we applied a rapid knowledge synthesis and transfer approach to review global research on the burden of illness, prevalence, and interventions to control nine selected microbial hazards in eight categories of LMF. Knowledge synthesis methods included an integrated scoping review (search strategy, relevance screening and confirmation, and evidence mapping), systematic review (detailed data extraction), and meta-analysis of prevalence data. Knowledge transfer of the results was achieved through multiple reporting formats, including evidence summary cards. We identified 214 unique outbreaks and 204 prevalence and 126 intervention studies. ‘Cereals and grains’ (n=142) and Salmonella spp. (n=278) were the most commonly investigated LMF and microbial hazard categories, respectively. Salmonella spp. was implicated in the most outbreaks (n=96, 45%), several of which were large and widespread, resulting in the most hospitalizations (n=895, 89%) and deaths (n=14, 74%). Salmonella spp. had a consistently low prevalence across all LMF categories (0-3%), while other hazards (e.g. B. cereus) were found at highly variable levels. A variety of interventions were investigated in small challenge trials. Key knowledge gaps included under-reporting of LMF outbreaks, limited reporting of microbial concentration data from prevalence studies, and a lack of intervention-efficacy research under commercial conditions. Summary cards were a useful knowledge transfer format to inform complementary risk ranking activities. This review builds upon previous work in this area by synthesizing a broad range of evidence using a structured, transparent, and integrated approach to provide timely evidence-informed inputs into international guidelines. PMID:26613924

  2. Vulnerability of port and harbor communities to earthquake and tsunami hazards: The use of GIS in community hazard planning

    USGS Publications Warehouse

    Wood, Nathan J.; Good, James W.

    2004-01-01

Earthquakes and tsunamis pose significant threats to Pacific Northwest coastal port and harbor communities. Developing holistic mitigation and preparedness strategies to reduce the potential for loss of life and property damage requires community-wide vulnerability assessments that transcend traditional site-specific analyses. The ability of a geographic information system (GIS) to integrate natural, socioeconomic, and hazards information makes it an ideal assessment tool to support community hazard planning efforts. This article summarizes how GIS was used to assess the vulnerability of an Oregon port and harbor community to earthquake and tsunami hazards, as part of a larger risk-reduction planning initiative. The primary purposes of the GIS were to highlight community vulnerability issues and to identify areas that both are susceptible to hazards and contain valued port and harbor community resources. Results of the GIS analyses can help decision makers with limited mitigation resources set priorities for increasing community resiliency to natural hazards.
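The core GIS operation described (overlaying hazard zones with valued community assets) reduces to point-in-polygon tests. A minimal sketch with invented coordinates; the facility names and the inundation polygon are hypothetical, not data from the study:

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test, the basic operation behind a simple
    hazard-zone overlay."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending left from pt.
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Hypothetical tsunami-inundation polygon and port facilities (coordinates invented):
inundation = [(0, 0), (10, 0), (10, 4), (0, 4)]
facilities = {"fuel-dock": (2, 1), "fish-plant": (8, 3), "hilltop-school": (5, 9)}
exposed = [name for name, p in facilities.items() if point_in_polygon(p, inundation)]
```

A production GIS performs this overlay across whole asset layers at once, but the exposure question it answers is the same.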

  3. From Invention to Innovation: Risk Analysis to Integrate One Health Technology in the Dairy Farm.

    PubMed

    Lombardo, Andrea; Boselli, Carlo; Amatiste, Simonetta; Ninci, Simone; Frazzoli, Chiara; Dragone, Roberto; De Rossi, Alberto; Grasso, Gerardo; Mantovani, Alberto; Brajon, Giovanni

    2017-01-01

Current Hazard Analysis Critical Control Points (HACCP) approaches mainly fit the food industry, while their application in primary food production is still rudimentary. The European food safety framework calls for science-based support of the primary producers' mandate for legal, scientific, and ethical responsibility in the food supply. The multidisciplinary and interdisciplinary project ALERT pivots on the development of a technological invention (the BEST platform) and the application of its measurable (bio)markers, together with scientific advances in risk analysis, at strategic points of the milk chain for time- and cost-effective early identification of unwanted and/or unexpected events of both microbiological and toxicological nature. Health-oriented innovation is complex and subject to multiple variables. Through field activities on a dairy farm in central Italy, we explored individual components of the dairy farm system to overcome concrete challenges in applying translational science to real life and (veterinary) public health. Based on an HACCP-like approach to animal production, the farm characterization focused on points of particular attention (POPAs) and critical control points to draw a farm-management decision tree under the One Health view (environment, animal health, food safety). The analysis was based on the integrated use of checklists (environment; agricultural and zootechnical practices; animal health and welfare) and laboratory analyses of well water, feed and silage, individual fecal samples, and bulk milk. Understanding complex systems is a condition for accomplishing true innovation through new technologies. BEST is a detection and monitoring system in support of production security, quality, and safety: a grid of its (bio)markers can find direct application at critical points for early identification of potential hazards or anomalies.
The HACCP-like self-monitoring in primary production is feasible, as well as the biomonitoring of live food producing animals as sentinel population for One Health.

  4. Approaches to decision support in the case of natural hazards

    NASA Astrophysics Data System (ADS)

    Vyazilov, Evgeny; Chunyaev, Nikita

    2013-04-01

    Despite the use of highly automated systems for measuring, collecting, storing, processing, forecasting, and delivering information on the marine environment, including natural hazards, the damage caused by natural phenomena continues to grow. This is because the information delivered to industrial facilities is not used effectively: individual decision-makers pay little attention to it and do not always carry out the preventive measures needed to reduce or prevent damage. Automating information support would improve the efficiency of managing marine activities. Russia is developing the Unified System of Information about the World Ocean (ESIMO, http://esimo.ru/), which integrates observational, analytical, prognostic, and climate data. Several tools are needed: automatic detection of natural hazards across all integrated data; notification of decision-makers about emerging hazards by a software agent; presentation of information in a compact form; assessment of possible damage and of the cost of preventive measures; provision of information on environmental impacts on economic facilities together with recommendations for decision-making; and the use of maps, diagrams, and tables for reporting. The automatic detection tools identify natural phenomena from ESIMO resources and the corresponding critical values of environmental indicators; their output is a continuously updated database of critical environmental situations for each facility or technological process. For prompt notification and delivery of current information about natural hazards, a software agent is proposed that is installed on the decision-maker's computer, is activated when a critical situation arises, and initially provides a minimum of information.
In the event of a natural disaster, the software agent should inform decision-makers, describe the current situation, and offer access to more detailed information about the hazard. The agent must be configurable by the user for a specific object, whether a single station or drilling rig or an entire region, regardless of the object's size, and must apply the critical index values appropriate to the selected settings. There is also a need for a real-time display of hydrometeorological information as a compact scheme of the environmental indicators affecting an industrial facility, rendered as familiar instruments (thermometer, aneroid barometer, tide gauge, speedometer, hygrometer, thermograph, barograph, rain gauge, wind vane, etc.) that highlight indicators exceeding their critical values. Such a scheme conveys the maximum information in the shortest time and provides an intuitive interface based on generally accepted, standardized instruments. ESIMO's analytical services, combined with a geographic information service, provide decision-makers with information about an area or point, showing the dynamics of parameter changes as graphics, diagrams, maps, and tables. When the data sources are updated, the analytical services automatically keep the analytical views current. The analytical services work with the integrated database and are used to prepare and present information on the World Ocean to consumers at the federal and regional levels, displaying a set of indicators on the state of the World Ocean and supporting the processing and analysis of qualitative and quantitative characteristics (indexes) of the marine environment and the maritime activities of the Russian Federation.
To assess the need for preventive measures, decision-makers require information on the potential economic consequences of a natural hazard and on the cost of the preventive measures that would avert the damage; economic-mathematical models can be used for this. When deciding, decision-makers must understand the possible consequences of natural phenomena and can rely on the advice they receive from a decision support system. Such information about impacts and recommendations is drawn from a knowledge base of if-then rules formed from past experience. It must take into account not only the critical values for each facility and technological process but also the type of information (observations, forecasts, climate, post-event), the level of decision-making, the season, and the climate zone in which the facility is located. Within ESIMO, an integrated database has been created containing the results of monitoring the marine environment, including observational, analytical, forecast, and climate data. Operational data are updated automatically at intervals ranging from a few minutes to a week, depending on the frequency of observations; climatic data are updated as needed, for example once a year or once every five years. The integrated database relates a set of indicators to a specific facility or to a particular geographic area, and contains socio-economic information in addition to hydrometeorological parameters. To date, a database of critical values and of impact information and recommendations has been created for 30 natural hazards; the data structures have been developed and the software is under development. The interaction of all components of this information support will reduce the economic damage from natural disasters to individual economic objects (ships, ports, etc.) and improve public safety through more prompt delivery of information to decision-makers and the public, along with more informative data (areas showing signs of natural phenomena, assessments of possible damage, and impact information and recommendations).
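
    The core of the software agent described above is a comparison of current indicator values against per-facility critical values. The following sketch illustrates that idea only; all names (`Reading`, `CRITICAL_VALUES`, `check_indicators`) and the threshold values are hypothetical, not part of ESIMO's actual interfaces.

```python
from dataclasses import dataclass

# Hypothetical critical values for two environmental indicators at one facility
# (e.g., a drilling rig); a real system would load these per object from a database.
CRITICAL_VALUES = {
    "wind_speed_m_s": 25.0,
    "wave_height_m": 6.0,
}

@dataclass
class Reading:
    indicator: str
    value: float

def check_indicators(readings):
    """Return (indicator, value, limit) for every reading exceeding its critical value."""
    alerts = []
    for r in readings:
        limit = CRITICAL_VALUES.get(r.indicator)
        if limit is not None and r.value > limit:
            alerts.append((r.indicator, r.value, limit))
    return alerts

readings = [Reading("wind_speed_m_s", 31.0), Reading("wave_height_m", 4.2)]
for name, value, limit in check_indicators(readings):
    # In the described system, this is where the agent would notify the
    # decision-maker and attach impact information and recommendations.
    print(f"ALERT: {name} = {value} exceeds critical value {limit}")
```

    In the described architecture this check would run continuously against the integrated database, feeding the "database of critical situations" per facility or technological process.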

  5. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is primarily a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  6. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption, made in conventional seismic hazard analysis, of a constant hazard of random earthquake occurrence. In the present study, a time-dependent recurrence model is proposed to account for series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
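
    The three-part hazard function described above can be sketched as the sum of a decaying term, a growing term, and a constant background. The exponential and logistic forms, time constants, and parameter values below are illustrative assumptions for showing the shape of such a function; they are not the authors' actual recurrence model.

```python
import math

def cluster_hazard(t, t_last, t_next_mean, h0, a, b):
    """Illustrative time-dependent hazard rate at time t (years).

    h_decay : decreasing hazard following the last large-earthquake cluster
    h_rise  : increasing hazard approaching the next expected cluster
    h0      : constant background hazard from small-to-moderate earthquakes
    The functional forms here are assumptions for illustration only.
    """
    h_decay = a * math.exp(-(t - t_last) / 10.0)               # fades after last cluster
    h_rise = b / (1.0 + math.exp(-(t - t_next_mean) / 20.0))   # grows toward next cluster
    return h0 + h_decay + h_rise
```

    With parameters of this shape, the hazard is elevated just after a cluster, drops toward the constant background mid-cycle, and rises again as the next expected cluster approaches, reproducing the quasi-periodic, clustered behavior the abstract describes.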

  7. Integrate urban‐scale seismic hazard analyses with the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Luco, Nicolas; Frankel, Arthur; Petersen, Mark D.; Aagaard, Brad T.; Baltay, Annemarie S.; Blanpied, Michael; Boyd, Oliver; Briggs, Richard; Gold, Ryan D.; Graves, Robert; Hartzell, Stephen; Rezaeian, Sanaz; Stephenson, William J.; Wald, David J.; Williams, Robert A.; Withers, Kyle

    2018-01-01

    For more than 20 yrs, damage patterns and instrumental recordings have highlighted the influence of the local 3D geologic structure on earthquake ground motions (e.g., M 6.7 Northridge, California, Gao et al., 1996; M 6.9 Kobe, Japan, Kawase, 1996; M 6.8 Nisqually, Washington, Frankel, Carver, and Williams, 2002). Although this and other local-scale features are critical to improving seismic hazard forecasts, historically they have not been explicitly incorporated into the U.S. National Seismic Hazard Model (NSHM, national model and maps), primarily because the necessary basin maps and methodologies were not available at the national scale. Instead,...

  8. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters, and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  9. Towards a cross-domain interoperable framework for natural hazards and disaster risk reduction information

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela

    2014-05-01

    The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the accessibility, comparability, quality, organisation, and dissemination of natural hazards spatial data. To mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aims to establish the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural Risk Zones - Technical Guidelines (DS) document2*, finalized and published on 10 December 2013. This framework provides means for facilitating the access, integration, harmonisation, and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. First, it demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Second, it "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Five main aspects of the interoperable framework are then presented. First, the issue of a common terminology for the natural hazards domain is addressed. Second, a common data model to facilitate cross-domain data integration is described. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards is presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, is shown. Finally, the applicability of the interoperable framework for the various stakeholder groups is presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, as well as missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf

  10. Tropical Airborne LiDAR for Landslide Assessment in Malaysia: a technical perspective

    NASA Astrophysics Data System (ADS)

    Abd Manap, Mohamad; Azhari Razak, Khamarrul; Mohamad, Zakaria; Ahmad, Azhari; Ahmad, Ferdaus; Mohamad Zin, Mazlan; A'zad Rosle, Qalam

    2015-04-01

    Malaysia faces a substantial number of landslide events every year. Cameron Highlands, Pahang, is one of the areas worst affected by slope failures, characterized by an extreme climate, rugged topography, and weathered geological structures in a tropical environment. The high frequency of landslide occurrence in the hilly areas is predominantly due to the geological materials, tropical monsoon seasons, and uncontrolled agricultural activities. The Government of Malaysia, through the Prime Minister's Department, has therefore allocated a special budget to conduct a national-level hazard and risk mapping project through the Minerals and Geoscience Department Malaysia under the Ministry of Natural Resources and Environment. The primary aim of this project is to provide slope hazard and risk information for better slope management in Malaysia. In addition, the project will establish a national infrastructure for geospatial information on geological terrain and slopes, emphasizing disaster risk throughout the country. The areas of interest are three selected areas: Cameron Highlands (275 square kilometers), Ipoh (200 square kilometers), and Cheras Kajang -- Batang Kali (650 square kilometers), chosen on the basis of the National Slope Master Plan (2009 -- 2023) endorsed by the Malaysian Government Cabinet. The national hazard and risk mapping project comprises six major tasks: (1) desk study and mobilization, (2) airborne LiDAR data acquisition and analysis, (3) field data acquisition and verification, (4) hazard and risk analysis for natural terrain, (5) hazard and risk analysis for man-made slopes, and (6) man-made slope mitigation and preventive measures. The project was authorized in September 2014 and will end in March 2016. The main focus of this paper is to evaluate the suitability of the integrated capability of airborne and terrestrial LiDAR data acquisition and analysis, together with digital photography, for regional landslide assessment. 
The study produced a LiDAR point cloud with a density of 4 points/m2, very detailed DEMs and DSMs on a 0.5 m grid, high-resolution digital aerial photographs on a 7 cm grid, and a landslide inventory. A reliable landslide inventory was developed from LiDAR-derived data and field investigation, emphasizing crucial attributes such as landslide type, depth, and the style, state, and distribution of landslide activity. As a result of this study, guidelines and recommendations on the technical aspects of LiDAR-derived landslide assessment are explicitly presented and critically discussed. The findings of this project will be very useful for the future planning of slope management and for sustainable land use planning and development by the relevant government agencies and local authorities in Malaysia. Keywords: Airborne LiDAR; landslide assessment; hazard and risk analysis; 3D point cloud density; slope failures; Malaysia

  11. The Necessity of Functional Analysis for Space Exploration Programs

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Breidenthal, Julian C.

    2011-01-01

    As NASA moves toward expanded commercial spaceflight within its human exploration capability, there is increased emphasis on how to allocate responsibilities between government and commercial organizations to achieve coordinated program objectives. The practice of program-level functional analysis offers an opportunity for improved understanding of collaborative functions among heterogeneous partners. Functional analysis is contrasted with the physical analysis more commonly done at the program level, and is shown to provide theoretical performance, risk, and safety advantages beneficial to a government-commercial partnership. Performance advantages include faster convergence to acceptable system solutions; discovery of superior solutions with higher commonality, greater simplicity and greater parallelism by substituting functional for physical redundancy to achieve robustness and safety goals; and greater organizational cohesion around program objectives. Risk advantages include avoidance of rework by revelation of some kinds of architectural and contractual mismatches before systems are specified, designed, constructed, or integrated; avoidance of cost and schedule growth by more complete and precise specifications of cost and schedule estimates; and higher likelihood of successful integration on the first try. Safety advantages include effective delineation of must-work and must-not-work functions for integrated hazard analysis, the ability to formally demonstrate completeness of safety analyses, and provably correct logic for certification of flight readiness. The key mechanism for realizing these benefits is the development of an inter-functional architecture at the program level, which reveals relationships between top-level system requirements that would otherwise be invisible using only a physical architecture. 
This paper describes the advantages and pitfalls of functional analysis as a means of coordinating the actions of large heterogeneous organizations for space exploration programs.

  12. A new approach to flood vulnerability assessment for historic buildings in England

    NASA Astrophysics Data System (ADS)

    Stephenson, V.; D'Ayala, D.

    2014-05-01

    The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is presented here that studies the nature of the vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. Key findings of the work include determining the applicability of these indicators to fragility analysis, and the determination of the relative vulnerability of the three case study sites.

  13. Cross-border impacts of the restriction of hazardous substances: a perspective based on Japanese solders.

    PubMed

    Fuse, Masaaki; Tsunemi, Kiyotaka

    2013-08-20

    Despite the relevance of the global economy, Regulatory Impact Assessments of the restriction of hazardous substances (RoHS) in the European Union (EU) are based only on domestic impacts. This paper explores the cross-border environmental impacts of the RoHS by focusing on the shifts to lead-free solders in Japan, which exports many electronics to the EU. The regulatory impacts are quantified by integrating a material flow analysis for metals constituting a solder with a scenario analysis with and without the RoHS. The results indicate that the EU regulation, the RoHS, has triggered shifts in Japan to lead-free solders, not only for electronics subject to this regulation, but for other products as well. We also find that the RoHS leads to a slow reduction in environmental emissions of the target, lead, but results in a rapid increase in the use of tin and silver in lead-free solders. This indicates the importance of assessing potential alternative substances, the use of which may increase as a result of adhering to the RoHS. The latter constitutes a negative impact because of recent concerns regarding resource criticality.

  14. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is fire or detonation, because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrides. The low temperatures of liquid and slush hydrogen bring additional concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards, and it is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers: it can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  15. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding involving hillslope deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport, and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; accordingly, the interest of the scientific community in this process has grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth, and unit weight) of HD was gathered by integrating field survey, in situ tests, and laboratory tests. This spatial database was collected from a study area of about 350 km2 encompassing different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g., field data collection, regionalization of site-specific information, and numerical modelling of hillslope stability) was carefully characterized. 
The most appropriate probability density function (PDF) was chosen for each numerical variable, and the propagation of uncertainty onto HD strength parameters obtained by empirical relations with geotechnical index properties was assessed. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology, and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation, and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g., SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian networks was proposed and validated. Different models, such as intervals, convex models, and fuzzy sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimate was evaluated. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in estimating the confidence of shallow landslide hazard mapping at the regional level. This allows (i) discriminating regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and (ii) assessing the reliability of the procedure used for hazard assessment.
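
    The Monte Carlo approach described above can be sketched with the classic dry infinite-slope stability model, which underlies tools such as SINMAP and SHALSTAB. The model form is standard, but every parameter value and distribution below is an illustrative assumption, not data from this study.

```python
import math
import random

def factor_of_safety(c, phi_deg, gamma, z, beta_deg):
    """Dry infinite-slope factor of safety:
    FS = c / (gamma * z * sin(beta) * cos(beta)) + tan(phi) / tan(beta)
    with cohesion c (kPa), friction angle phi, unit weight gamma (kN/m^3),
    soil depth z (m), and slope angle beta.
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    return (c / (gamma * z * math.sin(beta) * math.cos(beta))
            + math.tan(phi) / math.tan(beta))

def failure_probability(n=20_000, seed=42):
    """Estimate P(FS < 1) by sampling assumed normal PDFs for the
    geotechnical parameters (cohesion and friction angle)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(5.0, 2.0))   # cohesion, kPa (truncated at 0)
        phi = rng.gauss(30.0, 3.0)          # friction angle, degrees
        fs = factor_of_safety(c, phi, gamma=18.0, z=1.5, beta_deg=35.0)
        failures += fs < 1.0
    return failures / n

print(f"Estimated P(failure) = {failure_probability():.3f}")
```

    Repeating this per map cell, with parameter PDFs drawn from the regionalized field data, yields the kind of probabilistic hazard map (and its uncertainty) the study evaluates.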

  16. 43 CFR 2.51 - Assuring integrity of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...

  17. 43 CFR 2.51 - Assuring integrity of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...

  18. 43 CFR 2.51 - Assuring integrity of records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...

  19. 43 CFR 2.226 - Assuring integrity of records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...

  20. 43 CFR 2.226 - Assuring integrity of records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...

  1. ISEA2007 panel: Integration of better exposure characterizations into disaster preparedness for responders and the public

    PubMed Central

    Rodes, Charles E.; Pellizzari, Edo D.; Dellarco, Michael J.; Erickson, Mitchell D.; Vallero, Daniel A.; Reissman, Dori B.; Lioy, Paul J.; Lippmann, Morton; Burke, Thomas A.; Goldstein, Bernard D.

    2014-01-01

    An expert panel was convened in October 2007 at the International Society for Exposure Analysis Annual Meeting in Durham, NC, entitled “The Path Forward in Disaster Preparedness Since WTC—Exposure Characterization and Mitigation: Substantial Unfinished Business!” The panel prospectively discussed the critical exposure issues being overlooked during disaster responses and highlighted the needs for an optimal blending of exposure characterizations and hazard controls within disaster settings. The cases were made that effective and timely exposure characterizations must be applied during responses to any disaster, whether terrorist, manmade, or natural in origin. The consistent application of exposure sciences across acute and chronic disaster timelines will assure that the most effective strategies are applied to collect the needed information to guide risk characterization and management approaches. Exposure sciences must be effectively applied across all phases of a disaster (defined as rescue, reentry, recovery, and rehabitation—the four Rs) to appropriately characterize risks and guide risk-mitigation approaches. Failure to adequately characterize and control hazardous exposures increases the likelihood of excess morbidity and mortality. Advancing the infrastructure and the technologies to collect the right exposure information before, during, and immediately after disasters would advance our ability to define risks and protect responders and the public better. The panel provided conclusions, recommendations, and next steps toward effective and timely integration of better exposure science into disaster preparedness, including the need for a subsequent workshop to facilitate this integration. All panel presentations and a summary were uploaded to the ISES1 website (http://www.iseaweb.org/Disaster_Preparedness/index.php). PMID:18685563

  2. ISEA2007 panel: integration of better exposure characterizations into disaster preparedness for responders and the public.

    PubMed

    Rodes, Charles E; Pellizzari, Edo D; Dellarco, Michael J; Erickson, Mitchell D; Vallero, Daniel A; Reissman, Dori B; Lioy, Paul J; Lippmann, Morton; Burke, Thomas A; Goldstein, Bernard D

    2008-11-01

    An expert panel was convened in October 2007 at the International Society for Exposure Analysis Annual Meeting in Durham, NC, entitled "The Path Forward in Disaster Preparedness Since WTC-Exposure Characterization and Mitigation: Substantial Unfinished Business!" The panel prospectively discussed the critical exposure issues being overlooked during disaster responses and highlighted the needs for an optimal blending of exposure characterizations and hazard controls within disaster settings. The cases were made that effective and timely exposure characterizations must be applied during responses to any disaster, whether terrorist, manmade, or natural in origin. The consistent application of exposure sciences across acute and chronic disaster timelines will assure that the most effective strategies are applied to collect the needed information to guide risk characterization and management approaches. Exposure sciences must be effectively applied across all phases of a disaster (defined as rescue, reentry, recovery, and rehabitation-the four Rs) to appropriately characterize risks and guide risk-mitigation approaches. Failure to adequately characterize and control hazardous exposures increases the likelihood of excess morbidity and mortality. Advancing the infrastructure and the technologies to collect the right exposure information before, during, and immediately after disasters would advance our ability to define risks and protect responders and the public better. The panel provided conclusions, recommendations, and next steps toward effective and timely integration of better exposure science into disaster preparedness, including the need for a subsequent workshop to facilitate this integration. All panel presentations and a summary were uploaded to the ISES(1) website (http://www.iseaweb.org/Disaster_Preparedness/index.php).

  3. Incorporation of in vitro digestive enzymes in an intestinal epithelial cell line model for protein hazard identification.

    PubMed

    Markell, Lauren K; Wezalis, Stephanie M; Roper, Jason M; Zimmermann, Cindi; Delaney, Bryan

    2017-10-01

    Relatively few proteins in nature produce adverse effects following oral exposure. Of those that do, effects are often observed in the gut, particularly on intestinal epithelial cells (IEC). Previous studies reported that addition of protein toxins to IEC lines disrupted monolayer integrity but innocuous dietary proteins did not. The studies presented here investigated the effects of innocuous (bovine serum albumin, β-lactoglobulin, RuBisCO, fibronectin) or hazardous (phytohaemagglutinin-E, concanavalin A, wheat germ agglutinin, melittin) proteins that were either untreated or exposed to digestive enzymes prior to addition to Caco-2 human IEC line monolayers. At high concentrations, intact fibronectin caused an increase in monolayer permeability, but the other innocuous proteins did not, whether or not they were exposed to digestive enzymes. In contrast, all untreated hazardous proteins, and those that were resistant to digestion (e.g., wheat germ agglutinin), disrupted monolayer integrity. However, proteins sensitive to degradation by digestive enzymes (e.g., melittin) did not adversely affect monolayers when exposed to these enzymes prior to addition to IEC line monolayers. These results indicate that in vitro exposure of proteins to digestive enzymes can assist in differentiating between innocuous and hazardous proteins as another component to consider in the overall weight-of-evidence approach to protein hazard assessment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. Promoting Integrated Approaches to Reducing Health Inequities among Low-Income Workers: Applying a Social Ecological Framework

    PubMed Central

    Baron, Sherry L; Beard, Sharon; Davis, Letitia K.; Delp, Linda; Forst, Linda; Kidd-Taylor, Andrea; Liebman, Amy K.; Linnan, Laura; Punnett, Laura; Welch, Laura S.

    2013-01-01

    Nearly one of every three workers in the United States is low-income. Low-income populations have a lower life expectancy and greater rates of chronic disease compared to those with higher incomes. Low-income workers face hazards in their workplaces as well as in their communities. Developing integrated public health programs that address these combined health hazards, especially the interaction of occupational and non-occupational risk factors, can promote greater health equity. We apply a social-ecological perspective in considering ways to improve the health of the low-income working population through integrated health protection and health promotion programs initiated in four different settings: the worksite, state and local health departments, community health centers, and community-based organizations. An example of successful approaches to developing integrated programs in each of these settings is described. Recommendations for improved research, training, and coordination among health departments, health practitioners, worksites, and community organizations are proposed. PMID:23532780

  5. Managing forest structure and fire hazard--a tool for planners.

    Treesearch

    M.C. Johnson; D.L. Peterson; C.L. Raymond

    2006-01-01

    Fire planners and other resource managers need to examine a range of potential fuel and vegetation treatments to select options that will lead to desired outcomes for fire hazard and natural resource conditions. A new approach to this issue integrates concepts and tools from silviculture and fuel science to quantify outcomes for a large number of treatment options in...

  6. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 05: fuel treatment principles for complex landscapes

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Appropriate types of thinning and surface fuel treatments are clearly useful in reducing surface and crown fire hazards under a wide range of fuels and topographic situations. This paper provides well-established scientific principles and simulation tools that can be used to adjust fuel treatments to attain specific risk levels.

  7. 46 CFR 129.520 - Hazardous areas.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...

  8. 46 CFR 129.520 - Hazardous areas.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...

  9. 46 CFR 129.520 - Hazardous areas.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...

  10. The Redesigned Hazard Mapping System (HMS) for Fire and Smoke Analysis

    NASA Astrophysics Data System (ADS)

    Ruminski, M.; Cheng, Z.; Salemi, T.

    2017-12-01

    Through November 2016, the HMS incorporated a wide variety of satellite data for use in fire and smoke detection, including 30-minute-interval GOES-East and GOES-West, five AVHRR satellites (NOAA and METOP), and MODIS Aqua/Terra. NESDIS' Satellite Analysis Branch (SAB) analysts utilize the HMS to analyze and quality-control the automated fire detections from each of the sensors and to create the fire/smoke products that enable users to mitigate disasters and environmental hazards. The new HMS design eliminates inefficiencies and increases the accuracy of the fire/smoke analysis. The new system can display the higher-resolution data available from VIIRS and other future sensors while maintaining individual pixel integrity, which improves the representation of fire size. This benefits the input to smoke forecast models and may be useful as input for fire spread models. The new system also provides greater analysis control of layers and display properties and allows for the display of all GOES images, even when in Rapid Scan Operations (RSO) mode. To enhance the efficiency and improve the accuracy of the fire and smoke product, the new HMS eliminates the sector boundaries that the old HMS used as part of the GUI and now displays the full analysis domain (North and Central America, the Caribbean, and Hawaii). There is also now functionality to edit smoke plumes on a finer scale. In the presentation we will highlight the new features of the updated HMS.

  11. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, for land use planning, and for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparisons of the landslide susceptibility maps obtained with both MCDA techniques against known landslides show that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
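
    The AHP weighting step described above can be sketched in a few lines: criterion weights are derived from a reciprocal pairwise comparison matrix. This is a generic illustration using the geometric-mean approximation of the principal eigenvector; the 3x3 matrix of judgments below is invented for illustration and is not the criteria or weights used in the study.

```python
import math

def ahp_weights(matrix):
    """Priority weights for a reciprocal pairwise comparison matrix,
    via the geometric-mean (row) approximation of the principal eigenvector."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgments on Saaty's 1-9 scale for three criteria
# (e.g. slope vs. lithology vs. land cover):
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

    In a full AHP workflow one would also compute the consistency ratio of the judgment matrix before accepting the weights; here only the weighting step is sketched.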

  12. An evaluation of treatment strategies for head and neck cancer in an African American population.

    PubMed

    Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A

    2013-07-01

    This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at Howard University Hospital. The Kaplan-Meier method was used for survival analysis, and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis revealed that stage II patients had a poorer outcome than stage IV patients, while Cox proportional hazards regression analysis (p = 0.4662) showed that stage I had a significantly lower hazard of death than stage IV (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol, but body mass index (BMI) was inversely related to the hazard of death. There was no difference in survival using any treatment modality for African Americans.
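
    The Kaplan-Meier method cited above is the product-limit estimator of the survival function. A minimal sketch follows; the follow-up times and event flags are invented for illustration and are not patient data from the study.

```python
# Kaplan-Meier product-limit estimator (minimal sketch).
def kaplan_meier(times, events):
    """Return (time, survival) pairs; events[i] is 1 for death, 0 for censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t, observed in data:
        if observed:
            # Survival drops only at observed event times; processing tied
            # events one by one yields the same product as the grouped form.
            survival *= (n_at_risk - 1) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= 1
    return curve

# Hypothetical months to death (1) or censoring (0):
curve = kaplan_meier([5, 8, 12, 12, 20, 30], [1, 0, 1, 1, 0, 1])
for t, s in curve:
    print(t, round(s, 3))
```

    A Cox proportional hazards fit, as used in the study, would additionally regress the log-hazard on covariates such as stage, tobacco use, and BMI; dedicated survival-analysis libraries are the practical route for that step.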

  13. RiskLab - a joint Teaching Lab on Hazard and Risk Management

    NASA Astrophysics Data System (ADS)

    Baruffini, Mi.; Baruffini, Mo.; Thuering, M.

    2009-04-01

    In the future, natural disasters are expected to increase due to climatic changes that strongly affect environmental, social, and economic systems. For this reason, and because of limited resources, governments require analytical risk analysis for better mitigation planning. Risk analysis is a process to determine the nature and extent of risk by estimating potential hazards and evaluating existing conditions of vulnerability that could pose a potential threat or harm to people, property, livelihoods, and the environment. This process has become a generally accepted approach for the assessment of cost-benefit scenarios; originating with technical risks, it has been applied to natural hazards for several years now in Switzerland. Starting from these premises, "Risk Lab", a joint collaboration between the Institute of Earth Sciences of the University of Applied Sciences of Southern Switzerland and the Institute for Economic Research of the University of Lugano, was started in 2006, aiming to become a competence centre for Risk Analysis and Evaluation. The main issue studied by the lab is the question "What security at what price?", and its activities follow the philosophy of integral risk management as proposed by PLANAT, which defines the process as a cycle of different, interrelated phases. The final aim is to shift how the population and technicians think about risk, from "defending against danger" to "being aware of risks", through a dedicated academic course specially addressed to young people. In fact, the most important activity of the laboratory is a degree course, offered both to Engineering and Architecture students of the University of Applied Sciences of Southern Switzerland and to Economics students of the University of Lugano.
    The course is structured in two main parts: an introductory, theoretical part, composed of class lessons, where the main aspects of natural hazards, risk perception and evaluation, and risk management are presented and analyzed, and a second part, composed of practical activities, where students can learn specific statistical methods and test and use technical software. Special importance is given to seminars held by experts or members of Civil Protection and risk management institutes. Excursions are often organized to directly see and study practical case studies (e.g., the city of Locarno and the Lake Maggiore inundations). The course is organized following a "classical" structure (it is mainly held in a classroom or in an informatics lab), but students can also benefit from a special web portal, powered by "eCourses", the official USI/SUPSI Learning Management System, where they can find issues and documents about natural hazards and risk management. The main pedagogical value is that students can attend a course entirely devoted to dealing with natural and man-made hazards and risk, allowing them to bring together geological, spatial planning, and economic issues and to face real case studies in a challenging and holistic environment. The final aim of the course is to provide students with a useful and integrated "toolbox", essential for coping with and resolving the problems posed by the increasing vulnerability and exposure of present-day society. The course has by now reached its third academic year, and the initial results are encouraging: beyond the knowledge and expertise acquired, the graduates, who are now for the most part working in engineering firms or private companies, have shown that they have acquired a mentality devoted to understanding and managing risk.
    References:
    PLANAT: http://www.cenat.ch/index.php?userhash=79598753&l=d&navid=154
    eCourses: http://corsi.elearninglab.org/
    NAHRIS: http://www.nahris.ch/

  14. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can range from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining socially acceptable, it must be managed so that possible impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the traffic-light concept. This system provides a tool to decide the level of the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate, as well as the propagation-medium properties, is not constant with time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average the peak ground-motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in the medium properties. These changes can be modeled and directly introduced into the hazard integral. 
    We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values with fixed probabilities of exceedance. Those values can finally be compared with the observations using appropriate statistical tests.
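
    The time-evolving hazard computation described above rests on a non-homogeneous Poisson model: the probability of at least one ground-motion exceedance in a window is obtained by integrating the event rate over time. A minimal sketch, in which the rate function and the per-event exceedance probability are hypothetical placeholders rather than values from The Geysers:

```python
import math

def exceedance_probability(rate_fn, p_exceed_per_event, t0, t1, steps=1000):
    """P(at least one exceedance in [t0, t1]) under a non-homogeneous Poisson model."""
    dt = (t1 - t0) / steps
    # Trapezoidal integration of the time-varying event rate.
    integral = sum(
        0.5 * (rate_fn(t0 + i * dt) + rate_fn(t0 + (i + 1) * dt)) * dt
        for i in range(steps)
    )
    return 1.0 - math.exp(-integral * p_exceed_per_event)

# Hypothetical rate ramping up during stimulation (events/day):
rate = lambda t: 2.0 + 0.1 * t
p = exceedance_probability(rate, p_exceed_per_event=0.01, t0=0.0, t1=30.0)
print(round(p, 4))
```

    In the full analysis the per-event exceedance probability itself comes from a ground-motion model whose parameters may drift with time, which is exactly what the moving-window residual analysis is meant to track.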

  15. Methods for routine control of irradiated food: Determination of the irradiation status of shellfish by thermoluminescence analysis

    NASA Astrophysics Data System (ADS)

    Schreiber, G. A.; Hoffmann, A.; Helle, N.; Bögl, K. W.

    1994-06-01

    In some countries, clearance has been given for treating certain types of shellfish with ionizing radiation in order to increase shelf-life and to reduce health hazards that might be caused by contaminating microorganisms. In the present study, thermoluminescence (TL) analysis was used to examine the irradiation status of shellfish products purchased from local suppliers. For analysis, minerals were isolated from the guts of the animals. Although an irradiation treatment prior to analysis could not be demonstrated for any of the examined products, the results obtained on non-irradiated and irradiated products revealed that irradiation within the commercially used dose range can clearly be detected. First-glow TL intensities of the minerals already indicated irradiation treatments. Normalized TL signals of non-irradiated and irradiated samples were clearly separated. By calculating the differences in TL intensities and TL signals between non-irradiated and irradiated samples as a function of integration temperature, an optimized integration area for the glow curves was determined. The results of this study agree well with results obtained in two large-scale intercomparisons between food control laboratories to detect irradiation treatment of spices and herbal products, as well as of fruit and vegetables, by TL analysis of contaminating minerals.

  16. Assessing floods and droughts in the Mékrou River basin (West Africa): a combined household survey and climatic trends analysis approach

    NASA Astrophysics Data System (ADS)

    Markantonis, Vasileios; Farinosi, Fabio; Dondeynaz, Celine; Ameztoy, Iban; Pastori, Marco; Marletta, Luca; Ali, Abdou; Carmona Moreno, Cesar

    2018-05-01

    The assessment of natural hazards such as floods and droughts is a complex issue that demands integrated approaches and high-quality data. Especially in African developing countries, where information is limited, the assessment of floods and droughts, though an overarching issue that influences economic and social development, is even more challenging. This paper presents an integrated approach to assessing crucial aspects of floods and droughts in the transboundary Mékrou River basin (a portion of the Niger River basin in West Africa), combining climatic trends analysis and the findings of a household survey. The multivariable trend analysis estimates, at the biophysical level, the climate variability and the occurrence of floods and droughts. These results are coupled with an analysis of household survey data that reveals the behaviour and opinions of local residents regarding the observed climate variability and occurrence of flood and drought events, household mitigation measures, and the impacts of floods and droughts. Based on survey data analysis, the paper provides a per-household cost estimation of floods and droughts that occurred over a 2-year period (2014-2015). Furthermore, two econometric models are set up to identify the factors that influence the costs of floods and droughts to impacted households.
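
    The climatic trends analysis mentioned above typically relies on non-parametric tests for monotonic trends. As a hedged illustration (the abstract does not name the specific test used), the Mann-Kendall S statistic, a common choice for hydro-climatic series, can be computed as:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an increasing trend, negative for decreasing."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each pairwise difference
    return s

# Invented annual rainfall series (mm/yr), not Mékrou basin data:
annual_rainfall = [820, 790, 760, 800, 710, 690, 705, 660]
print(mann_kendall_s(annual_rainfall))  # negative S suggests a drying trend
```

    Significance is then judged by comparing S against its variance under the no-trend null hypothesis, with a tie correction when values repeat.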

  17. The integration of scientific knowledge on hydrogeomorphological processes in fluvial risk management strategies through the "Freedom space for rivers" concept

    NASA Astrophysics Data System (ADS)

    Massé, Simon; Buffin-Bélanger, Thomas; Biron, Pascale; Ruiz, Julie

    2017-04-01

    Extensive knowledge and tools developed by hydrogeomorphologists have led to new approaches for fluvial hazard mapping that recognize the diversity of river systems and consider temporal morphodynamic adjustments. Hydrogeomorphological mapping can be integrated into a management approach by considering distinct processes with specific regulation and management practices. The Freedom space for rivers (FSR) concept promotes the integration of multiple processes into a single space by combining the flooding and mobility spaces as well as the riparian wetlands. Flooding spaces are delimited by a combination of methods, calling for the use of LiDAR elevation models and geomorphological observations related to past flood events. Mobility spaces are defined by the analysis of historical river positions and the interpretation of landforms associated with morphodynamics. In the FSR approach, fluvial processes can operate naturally, thus limiting risk for citizens and infrastructure while providing a series of ecological services and socioeconomic benefits. Many methodological and institutional challenges arise in applying the FSR concept to the management of rivers. To investigate these challenges, working groups bringing together different water stakeholders were created in collaboration with local watershed organizations and municipal authorities in three contrasting river environments in Québec (Canada). Stakeholder engagement helps to identify local concerns regarding FSR management, to collectively set up implementation strategies, and to transfer knowledge gained on river dynamics and fluvial hazards. The collaborative research approach aims to better understand the challenges and opportunities of FSR management. 
    Farmers' reluctance to limit their interventions and practices along watercourses, a lack of political will at the local level, the absence of government incentives to support local FSR actions, and the institutional challenge of combining both flood and erosion into a single regulatory framework are the main obstacles to the integration of FSR concepts into management practices. In the three studied territories, however, regulatory protection of FSR is feasible on undeveloped or uncultivated land to prevent future development in high-risk areas and to ensure the quality of the ecological services associated with floodplains. Raising stakeholders' awareness of natural hazards and hydrogeomorphological processes appears essential in order to incorporate new management practices that aim to give more space to rivers, thereby reducing vulnerability and improving the resilience of riparian communities.

  18. Need for an "integrated safety assessment" of GMOs, linking food safety and environmental considerations.

    PubMed

    Haslberger, Alexander G

    2006-05-03

    Evidence for substantial environmental influences on health and food safety comes from work with environmental health indicators, which shows that agroenvironmental practices have direct and indirect effects on human health, concluding that "the quality of the environment influences the quality and safety of foods" (Fennema, O. Environ. Health Perspect. 1990, 86, 229-232). In the field of genetically modified organisms (GMOs), Codex principles have been established for the assessment of GM food safety, and the Cartagena Protocol on Biosafety outlines international principles for an environmental assessment of living modified organisms. Both concepts also contain starting points for an assessment of the health/food safety effects of GMOs in cases where the environment is involved in the chain of events that could lead to hazards. The environment can act as a route of unintentional entry of GMOs into the food supply, such as in the case of gene flow via pollen or seeds from GM crops, but the environment can also be involved in changes of GMO-induced agricultural practices with relevance for health/food safety. Examples include potential regional changes in pesticide use and the reduction in pesticide poisonings resulting from the use of Bt crops, or influences on immune responses via cross-reactivity. Clearly, modern methods of biotechnology in breeding are involved in the reasons behind the rapid reduction of local varieties in agrodiversity, which constitutes an identified hazard for food safety and food security. The health/food safety assessment of GM foods in cases where the environment is involved needs to be informed by data from environmental assessment. Such data might be especially important for hazard identification and exposure assessment. 
International organizations working in these areas will very likely be needed to initiate and enable cooperation between those institutions responsible for the different assessments, as well as for exchange and analysis of information. An integrated assessment might help to focus and save capacities in highly technical areas such as molecular characterization or profiling, which are often necessary for both assessments. In the area of establishing international standards for traded foods, such as for the newly created Standards in Trade and Development Facility (STDF), an integrated assessment might help in the consideration of important environmental aspects involved in health and food safety. Furthermore, an established integrated view on GMOs may create greater consumer confidence in the technology.

  19. Hydro-geomorphic connectivity and landslide features extraction to identifying potential threats and hazardous areas

    NASA Astrophysics Data System (ADS)

    Tarolli, Paolo; Fuller, Ian C.; Basso, Federica; Cavalli, Marco; Sofia, Giulia

    2017-04-01

    Hydro-geomorphic connectivity has emerged as a significant new concept for understanding the transfer of surface water and sediment through landscapes. A further scientific challenge is determining how the concept can be used to enable sustainable land and water management. This research proposes an approach that integrates remote sensing techniques, connectivity theory, and geomorphometry based on high-resolution digital terrain models (HR-DTMs) to automatically extract landslide crowns and gully erosion, to determine the different rates of connectivity between the main extracted features and the river network, and thus to derive a possible categorization of hazardous areas. The study takes place in two mountainous catchments in the Wellington Region (New Zealand). The methodology is a three-step approach. First, we performed an automatic detection of likely landslide crowns using thresholds obtained from statistical analysis of the variability of landform curvature. Second, the research used the Connectivity Index to analyse how a complex and rugged topography induces large variations in erosion and sediment delivery in the two catchments. Last, the two methods were integrated into a single procedure able to classify the different rates of connectivity between the main features and the river network, thus identifying potential threats and hazardous areas. The methodology is fast, and it can produce a detailed and updated inventory map that could be a key tool for erosion and sediment-delivery hazard mitigation. This fast and simple method can be a useful tool for managing emergencies by giving priority to the most failure-prone zones. Furthermore, it could support preliminary interpretation of geomorphological phenomena and, more generally, serve as the basis for developing inventory maps.
    References:
    Cavalli M, Trevisani S, Comiti F, Marchi L. 2013. Geomorphometric assessment of spatial sediment connectivity in small Alpine catchments. Geomorphology 188: 31-41. DOI: 10.1016/j.geomorph.2012.05.007
    Sofia G, Dalla Fontana G, Tarolli P. 2014. High-resolution topography and anthropogenic feature extraction: testing geomorphometric parameters in floodplains. Hydrological Processes 28(4): 2046-2061. DOI: 10.1002/hyp.9727
    Tarolli P, Sofia G, Dalla Fontana G. 2012. Geomorphic features extraction from high-resolution topography: landslide crowns and bank erosion. Natural Hazards 61(1): 65-83. DOI: 10.1007/s11069-010-9695-2
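
    The curvature-threshold idea in the first step above can be caricatured on a toy grid: compute a simple curvature proxy (a discrete Laplacian) over a DTM and flag cells whose curvature deviates from the grid's own statistics by more than k standard deviations. The 4x4 elevation grid and k = 1 are invented; the actual method operates on high-resolution LiDAR DTMs with carefully derived thresholds.

```python
import statistics

def laplacian(dtm, i, j):
    """4-neighbour discrete Laplacian as a crude curvature proxy."""
    return (dtm[i-1][j] + dtm[i+1][j] + dtm[i][j-1] + dtm[i][j+1]
            - 4 * dtm[i][j])

def flag_high_curvature(dtm, k=1.0):
    """Flag interior cells whose curvature is a statistical outlier."""
    rows, cols = len(dtm), len(dtm[0])
    curv = {(i, j): laplacian(dtm, i, j)
            for i in range(1, rows - 1) for j in range(1, cols - 1)}
    mean = statistics.mean(curv.values())
    sd = statistics.pstdev(curv.values())
    return [cell for cell, c in curv.items() if abs(c - mean) > k * sd]

# Toy elevation grid with one sharp depression:
dtm = [
    [10, 10, 10, 10],
    [10,  9, 10, 10],
    [10, 10,  6, 10],
    [10, 10, 10, 10],
]
print(flag_high_curvature(dtm))
```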

  20. Modelling the changing cumulative vulnerability to climate-related hazards for river basin management using a GIS-based multicriteria decision approach

    NASA Astrophysics Data System (ADS)

    Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan

    2017-04-01

    1. Background The Asia-Pacific region is one of the most vulnerable areas of the world to climate-related hazards and extremes due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked. This is especially critical from the perspective of integrated river basin management. A range of studies has targeted existing vulnerability assessments. However, limited attention has been paid to the cumulative effects of the multiple vulnerability factors, and their dynamics, faced by local communities. This study proposes a novel methodology to assess the changing cumulative vulnerability to climate-related hazards, and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability variability, with a focus on a river basin management unit. 2. Methods and data The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built with the goal of characterizing and comparing vulnerability to climate-related hazards within river basin regions based on a composition of multiple indicators. We organize these indicator metrics into three categories: (1) hazard exposure; (2) socioeconomic sensitivity; and (3) adaptive capacity. Second, the CVAF is applied by combining a geographical information system (GIS)-based spatial statistics technique with a multicriteria decision analysis (MCDA) to assess and map the changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors affecting vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how the process of urban development leads to vulnerability change. 
The factors are transport networks, land uses, production values of industries, and infrastructure. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as from the National Land Use Investigation and official census statistics. 3. Results and policy implications Results of the CVAF analysis demonstrate heterogeneous patterns of vulnerability in the region and highlight trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in the densely populated, downstream reaches of the Danshui River (such as Taipei City) in both time periods. Testing the VCM indicates that upstream and more remote areas generally show low vulnerability, although increases are observed in some areas between 1996 and 2006 due to land-use intensification and industrial and infrastructure expansion. These findings suggest that land use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl, and should address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods, and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and to contribute towards improving river basin management.
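
    The three-category indicator scheme described above lends itself to a simple composite index. The sketch below is a hypothetical, minimal illustration of how exposure, sensitivity, and adaptive-capacity indicators might be min-max normalized and combined with MCDA-style weights; the indicator names, district values, and equal weights are invented assumptions, not the study's data.

    ```python
    def normalize(values):
        """Min-max normalize raw indicator values to [0, 1]."""
        lo, hi = min(values), max(values)
        if hi == lo:
            return [0.0 for _ in values]
        return [(v - lo) / (hi - lo) for v in values]

    def cumulative_vulnerability(exposure, sensitivity, adaptive_capacity,
                                 weights=(1 / 3, 1 / 3, 1 / 3)):
        """Weighted composite score per district. Higher adaptive capacity
        reduces vulnerability, so that component enters negated."""
        we, ws, wa = weights
        return [we * e + ws * s + wa * (1 - a)
                for e, s, a in zip(exposure, sensitivity, adaptive_capacity)]

    # Three illustrative districts with invented raw indicator values:
    exposure = normalize([120, 45, 300])      # e.g., residents in flood zone
    sensitivity = normalize([0.4, 0.1, 0.7])  # e.g., share of elderly residents
    capacity = normalize([5, 9, 2])           # e.g., shelters per 10,000 people

    scores = cumulative_vulnerability(exposure, sensitivity, capacity)
    print(scores)  # district 3 (high exposure/sensitivity, low capacity) ranks highest
    ```

    Mapping such scores for two census years and differencing them would then give a changing-vulnerability surface of the kind the study maps with GIS.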

  1. Addressing Uniqueness and Unison of Reliability and Safety for a Better Integration

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Safie, Fayssal

    2016-01-01

    Over time, it has been observed that Safety and Reliability have not been clearly differentiated, which leads to confusion, inefficiency, and, sometimes, counterproductive practices in executing each of these two disciplines. It is imperative to address this situation to help the Reliability and Safety disciplines improve their effectiveness and efficiency. The paper poses an important question: "Safety and Reliability - Are they unique or unisonous?" To answer it, the paper reviews several of the most commonly used analyses from each discipline, namely FMEA, reliability allocation and prediction, reliability design involvement, system safety hazard analysis, Fault Tree Analysis, and Probabilistic Risk Assessment. The paper points out the uniqueness and unison of Safety and Reliability in their respective roles, requirements, approaches, and tools, and presents suggestions for enhancing and improving the individual disciplines, as well as for promoting the integration of the two. The paper concludes that Safety and Reliability are unique but complement each other in many aspects, and need to be integrated. In particular, their individual roles need to be differentiated: Safety ensures and assures that the product meets safety requirements, goals, or desires, while Reliability ensures and assures maximum achievability of the intended design functions. With the integration of Safety and Reliability, personnel can be shared, tools and analyses have to be integrated, and both skill sets can be possessed by the same person, with the purpose of providing the best value to product development.

  2. GEOTAIL Spacecraft historical data report

    NASA Technical Reports Server (NTRS)

    Boersig, George R.; Kruse, Lawrence F.

    1993-01-01

    The purpose of this GEOTAIL Historical Report is to document ground processing operations information gathered on the GEOTAIL mission during processing activities at the Cape Canaveral Air Force Station (CCAFS). It is hoped that this report may aid management analysis, improve integration processing and forecasting of processing trends, and reduce real-time schedule changes. The GEOTAIL payload is the third Delta 2 Expendable Launch Vehicle (ELV) mission to document historical data. Comparisons of planned versus as-run schedule information are displayed. Information will generally fall into the following categories: (1) payload stay times (payload processing facility/hazardous processing facility/launch complex-17A); (2) payload processing times (planned, actual); (3) schedule delays; (4) integrated test times (experiments/launch vehicle); (5) unique customer support requirements; (6) modifications performed at facilities; (7) other appropriate information (Appendices A & B); and (8) lessons learned (reference Appendix C).

  3. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact, and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and, 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts, and environmental degradation as the priority disaster hazards.
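
    The simple ranking method described above can be illustrated with a toy prioritization: each hazard receives expert scores for likelihood and potential impact, and hazards are ordered by their product. The 1-5 scores below are invented for illustration and are not the study's data.

    ```python
    # Invented expert scores (likelihood, impact) on a 1-5 scale per hazard.
    hazards = {
        "epidemics": (5, 5),
        "drought/famine": (4, 5),
        "conflict": (3, 5),
        "environmental degradation": (4, 3),
    }

    # Prioritize by the likelihood-impact product, highest risk first.
    ranked = sorted(hazards.items(),
                    key=lambda kv: kv[1][0] * kv[1][1],
                    reverse=True)

    for name, (likelihood, impact) in ranked:
        print(f"{name}: risk score {likelihood * impact}")
    ```

    A vulnerability dimension could be folded in the same way, as a third factor in the product or as a separate ranked list, mirroring the study's two rankings.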

  4. Preliminary hazards analysis -- vitrification process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coordes, D.; Ruggieri, M.; Russell, J.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR in turn leads to the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of the DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximum facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on its radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. The hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  5. Integrating geological and geophysical data to improve probabilistic hazard forecasting of Arabian Shield volcanism

    NASA Astrophysics Data System (ADS)

    Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.

    2016-02-01

    During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimations of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values are between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on the volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.
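
    A Monte Carlo resampling comparison is one common way to implement the kind of non-parametric relationship test the abstract describes: ask whether a covariate (here, a gravity anomaly) sampled at vent locations differs from what randomly chosen locations in the field would give. The sketch below uses invented, deliberately biased data and illustrates the general idea only, not the paper's specific method.

    ```python
    import random

    random.seed(0)

    # Covariate values over the whole field (e.g., gridded gravity anomalies, mGal)
    field_values = [random.gauss(-60, 40) for _ in range(5000)]
    # Covariate values at observed vent locations, deliberately biased for illustration
    vent_values = [random.gauss(-85, 10) for _ in range(50)]

    observed = sum(vent_values) / len(vent_values)

    # Null distribution: mean covariate at 50 random field locations, resampled
    null_means = []
    for _ in range(2000):
        sample = random.sample(field_values, len(vent_values))
        null_means.append(sum(sample) / len(sample))

    # Two-sided Monte Carlo p-value: how often does chance look at least this extreme?
    mu = sum(null_means) / len(null_means)
    extreme = sum(1 for m in null_means if abs(m - mu) >= abs(observed - mu))
    p_value = (extreme + 1) / (len(null_means) + 1)
    print(f"observed mean {observed:.1f} mGal, p = {p_value:.4f}")
    ```

    A small p-value indicates the covariate at vent sites is unlikely under the null of no spatial relationship; repeating this per covariate gives a screening step before admitting a dataset into the hazard model.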

  6. Physical applications of GPS geodesy: a review.

    PubMed

    Bock, Yehuda; Melgar, Diego

    2016-10-01

    Geodesy, the oldest science, has become an important discipline in the geosciences, in large part by enhancing Global Positioning System (GPS) capabilities over the last 35 years well beyond the satellite constellation's original design. The ability of GPS geodesy to estimate 3D positions with millimeter-level precision with respect to a global terrestrial reference frame has contributed to significant advances in geophysics, seismology, atmospheric science, hydrology, and natural hazard science. Monitoring the changes in the positions or trajectories of GPS instruments on the Earth's land and water surfaces, in the atmosphere, or in space, is important for both theory and applications, from an improved understanding of tectonic and magmatic processes to developing systems for mitigating the impact of natural hazards on society and the environment. Besides accurate positioning, all disturbances in the propagation of the transmitted GPS radio signals from satellite to receiver are mined for information, from troposphere and ionosphere delays for weather, climate, and natural hazard applications, to disturbances in the signals due to multipath reflections from the solid ground, water, and ice for environmental applications. We review the relevant concepts of geodetic theory, data analysis, and physical modeling for a myriad of processes at multiple spatial and temporal scales, and discuss the extensive global infrastructure that has been built to support GPS geodesy consisting of thousands of continuously operating stations. We also discuss the integration of heterogeneous and complementary data sets from geodesy, seismology, and geology, focusing on crustal deformation applications and early warning systems for natural hazards.

  7. Analysis of a Hypergolic Propellant Explosion During Processing of Launch Vehicles in the VAB

    NASA Technical Reports Server (NTRS)

    Chrostowski, Jon D.; Gan, Wenshui; Campbell, Michael D.

    2010-01-01

    NASA is developing launch vehicles to support missions to Low Earth Orbit (LEO), the Moon, and deep space. Whether manned or unmanned, the vehicle components will likely be integrated in the Vehicle Assembly Building (VAB) at Kennedy Space Center (KSC) and typically include a fueled spacecraft (SC) that sits on top of one or more stages. The processing of a fueled SC involves hazardous operations when it is brought into the VAB Transfer Aisle and lifted a significant height for mating with the lower stages. Accidents resulting from these hazardous operations could impact unrelated personnel working in buildings adjacent to the VAB. Safe separation distances based on the DOD Explosives Standards Quantity-Distance (Q-D) approach result in large inhabited building distance (IBD) arcs. This paper presents site-specific air blast and fragmentation hazard analyses for comparison with the Q-D arcs, as well as consequence and risk analyses to provide added information for the decision maker. A new physics-based fragmentation model is presented that includes: a) the development of a primary fragment list (which defines the fragment characteristics) associated with a hypergolic propellant explosion; b) a description of a 3D fragment bounce model; and c) the results of probabilistic Monte Carlo simulations (which include uncertainties in the fragment characteristics) to determine: i) the hazardous fragment density distance, ii) the expected number of wall/roof impacts and penetrations for over 40 buildings adjacent to the VAB, and iii) the risk to building occupants.
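
    The probabilistic Monte Carlo idea described above can be sketched in miniature: sample uncertain fragment properties and propagate each fragment through a simple ballistic model, then summarize the distances reached. The drag-free flat-ground model and all distributions and parameters below are invented assumptions for illustration; the paper's model additionally includes drag, 3D bounce, and building impact and penetration.

    ```python
    import math
    import random

    random.seed(1)
    G = 9.81  # gravitational acceleration, m/s^2

    def throw_distance(speed, angle_deg):
        """Drag-free ballistic range on flat ground, in meters."""
        a = math.radians(angle_deg)
        return speed ** 2 * math.sin(2 * a) / G

    # Sample uncertain fragment launch conditions (invented distributions).
    distances = []
    for _ in range(10000):
        speed = random.lognormvariate(math.log(80), 0.4)  # m/s, uncertain
        angle = random.uniform(10, 80)                    # launch angle, degrees
        distances.append(throw_distance(speed, angle))

    # A percentile of the throw-distance distribution is one simple summary
    # from which a hazardous-fragment-distance criterion could be derived.
    distances.sort()
    p99 = distances[int(0.99 * len(distances))]
    print(f"99th-percentile throw distance: {p99:.0f} m")
    ```

    In a fuller model, each simulated fragment's trajectory would be intersected with surrounding building footprints to count expected impacts, as the paper does for the buildings adjacent to the VAB.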

  8. Engineered Nanomaterials, Sexy New Technology and Potential Hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaulieu, R A

    Engineered nanomaterials enable exciting new applications that can greatly benefit society in areas such as cancer treatment, solar energy, energy storage, and water purification. While nanotechnology shows incredible promise in these and other areas by exploiting nanomaterials' unique properties, these same properties can potentially cause adverse health effects to workers who may be exposed during work. Dispersed nanoparticles in air can cause adverse health effects in animals not merely due to their chemical properties but due to their size, structure, shape, surface chemistry, solubility, carcinogenicity, reproductive toxicity, mutagenicity, dermal toxicity, and parent material toxicity. Nanoparticles have a greater likelihood of lung deposition and blood absorption than larger particles due to their size. Nanomaterials can also pose physical hazards due to their unusually high reactivity, which makes them useful as catalysts but has the potential to cause fires and explosions. Characterization of the hazards (and potential for exposures) associated with nanomaterial development and incorporation into other products is an essential step in the development of nanotechnologies. Developing controls for these hazards is equally important. Engineered controls should be integrated into nanomaterial manufacturing process design according to 10 CFR 851, DOE Policy 456.1, and DOE Notice 456.1 as safety-related hardware or administrative controls for worker safety. Nanomaterial hazards in a nuclear facility must also meet control requirements per DOE standards 3009, 1189, and 1186. Integration of safe designs into manufacturing processes for new applications, concurrent with the developing technology, is essential for worker safety. This paper presents a discussion of nanotechnology, nanomaterial properties/hazards, and controls.

  9. Managing Space System Faults: Coalescing NASA's Views

    NASA Technical Reports Server (NTRS)

    Muirhead, Brian; Fesq, Lorraine

    2012-01-01

    Managing faults and their resultant failures is a fundamental and critical part of developing and operating aerospace systems. Yet recent studies have shown that the engineering "discipline" required to manage faults is neither widely recognized nor evenly practiced within the NASA community. Attempts simply to name this discipline in recent years have been fraught with controversy among members of the Integrated Systems Health Management (ISHM), Fault Management (FM), Fault Protection (FP), Hazard Analysis (HA), and Aborts communities. Approaches to managing space system faults are typically unique to each organization, with little commonality in architectures, processes, and practices across the industry.

  10. Safety considerations in the design and operation of large wind turbines

    NASA Technical Reports Server (NTRS)

    Reilly, D. H.

    1979-01-01

    The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, the use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and the use of a fail-safe and redundant component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are also described.

  11. Development of a geophysical data fusion system for assessment and remediation studies of polluted groundwater aquifers. Scientific report No. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, J.R.; O'Neill, D.C.; Barker, B.W.

    1994-10-01

    The research described in this report is directed toward the development of a workstation-based data management, analysis, and visualization system that can be used to improve the Air Force's capability to evaluate site-specific environmental hazards. The initial prototype system described in this report is directed toward a specific application at the Massachusetts Military Reservation (formerly Otis Air Force Base) on Cape Cod, Massachusetts. This system integrates a comprehensive, on-line environmental database for the site with a map-based graphical user interface which facilitates analyst access to the databases and analysis tools needed to characterize the subsurface geologic and hydrologic environments at the site.

  12. 16 CFR 1511.5 - Structural integrity tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Structural integrity tests. 1511.5 Section 1511.5 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT... lowest position in the cylinder. If the uppermost edge of the component or fragment is below the plane of...

  13. 25 CFR 700.263 - Assuring integrity of records.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...

  14. 25 CFR 700.263 - Assuring integrity of records.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...

  15. 25 CFR 700.263 - Assuring integrity of records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...

  16. 25 CFR 700.263 - Assuring integrity of records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...

  17. 25 CFR 700.263 - Assuring integrity of records.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...

  18. 25 CFR 43.22 - Assuring integrity of records.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., “Computer Security Guidelines for Implementing the Privacy Act of 1974” (May 30, 1975), and any supplements... appropriate administrative, technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which...

  19. 46 CFR 111.105-5 - System integrity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL... individual electrical installation in a hazardous location must comply specifically with Articles 500-505 of...

  20. Global Volcano Model

    NASA Astrophysics Data System (ADS)

    Sparks, R. S. J.; Loughlin, S. C.; Cottrell, E.; Valentine, G.; Newhall, C.; Jolly, G.; Papale, P.; Takarada, S.; Crosweller, S.; Nayembil, M.; Arora, B.; Lowndes, J.; Connor, C.; Eichelberger, J.; Nadim, F.; Smolka, A.; Michel, G.; Muir-Wood, R.; Horwell, C.

    2012-04-01

    Over 600 million people live close enough to active volcanoes to be affected when they erupt. Volcanic eruptions cause loss of life, significant economic losses, and severe disruption to people's lives, as highlighted by the recent eruption of Mount Merapi in Indonesia. The eruption of Eyjafjallajökull, Iceland, in 2010 illustrated the potential of even small eruptions to have a major impact on the modern world through disruption of complex critical infrastructure and business. The effects in the developing world on economic growth and development can be severe. There is evidence that large eruptions can cause a change in the Earth's climate for several years afterwards. Aside from meteor impact and possibly an extreme solar event, very large magnitude explosive volcanic eruptions may be the only natural hazard that could cause a global catastrophe. The Global Volcano Model (GVM) is a growing international collaboration that aims to create a sustainable, accessible information platform on volcanic hazard and risk. We are designing and developing an integrated database system of volcanic hazards, vulnerability, and exposure with internationally agreed metadata standards. GVM will establish methodologies for analysis of the data (e.g., vulnerability indices) to inform risk assessment, develop complementary hazard models, and create relevant hazard and risk assessment tools. GVM will develop the capability to anticipate future volcanism and its consequences. NERC is funding the start-up of this initiative for three years from November 2011. GVM builds directly on the VOGRIPA project, started as part of the GRIP (Global Risk Identification Programme) in 2004 under the auspices of the World Bank and UN. Major international initiatives and partners such as the Smithsonian Institution - Global Volcanism Program, State University of New York at Buffalo - VHub, Earth Observatory of Singapore - WOVOdat, and many others underpin GVM.
