Sample records for hazard analysis based

  1. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk-Based Preventive Controls for Human Food" and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food." IV. How To...

  2. 77 FR 55371 - System Safety Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...

  3. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food" that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food" with a 120-day...

  4. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food." FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food" with a 120-day comment...

  5. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, so hazard assessment is necessary for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, calculating the connection degree (CD) requires hazard grades and their corresponding thresholds to be classified in advance. For hazard assessment using ESPA, a novel algorithm is worked out to calculate the CD when hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked according to the value of its ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
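
    A minimal sketch of the ED-based ranking step described in this abstract, assuming the standard SPA connection degree mu = a + b*i + c*j (identity, discrepancy, and contrary degrees summing to 1) and measuring each station's distance from the ideal degree (1, 0, 0); the station values are hypothetical and the paper's exact CD-to-ED conversion may differ:

      import math

      def euclidean_distance(cd):
          # Distance of a connection degree (a, b, c) from the ideal point (1, 0, 0),
          # assuming mu = a + b*i + c*j with a + b + c = 1.
          a, b, c = cd
          return math.sqrt((1.0 - a) ** 2 + b ** 2 + c ** 2)

      # Hypothetical connection degrees for six stations (illustrative values only).
      stations = {"S1": (0.62, 0.25, 0.13), "S2": (0.48, 0.30, 0.22),
                  "S3": (0.71, 0.20, 0.09), "S4": (0.55, 0.28, 0.17),
                  "S5": (0.40, 0.33, 0.27), "S6": (0.66, 0.22, 0.12)}

      # A larger distance from the ideal point is read here as a higher hazard.
      ranking = sorted(stations, key=lambda s: euclidean_distance(stations[s]), reverse=True)
      print(ranking)  # most to least hazardous under these assumptions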

  6. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...

  7. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  8. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  9. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  10. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  11. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  12. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  14. A conceptual framework for economic optimization of single hazard surveillance in livestock production chains.

    PubMed

    Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W

    2014-06-01

    Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and the concurrent or consecutive occurrence of extreme events significantly increases the risk of infrastructure/system failure. It is common practice to use univariate analysis, based upon a perceived ruling driver, to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios, together with their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
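
    As an illustration of the joint-probability step this abstract describes, the sketch below pairs two drivers through a Gumbel copula and computes the "AND" return period of both exceeding their marginal quantiles; the copula family, theta, and quantiles are assumptions for illustration, not the toolbox's defaults:

      import numpy as np

      def gumbel_copula(u, v, theta=2.0):
          # Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence.
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def and_return_period(u, v, theta=2.0, mu=1.0):
          # Return period (years) of both drivers exceeding their quantiles:
          # P(U > u, V > v) = 1 - u - v + C(u, v); mu is the mean interarrival
          # time of events in years (1.0 for annual maxima).
          return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

      # Two 100-year marginal levels (u = v = 0.99), e.g. storm surge and river flow:
      print(and_return_period(0.99, 0.99))  # ~170 yr, far below the 10,000 yr implied
                                            # by independence, because of tail dependence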

  16. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  17. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology, it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
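
    A minimal sketch of the conventional empirical hazard-curve step (the Bayesian "robust" fitting is not reproduced here): given simulated intensities for many stochastic scenarios and an assumed annual rate of tsunamigenic events, the mean annual exceedance rate at each intensity level is the event rate times the fraction of simulations exceeding that level. All numbers are illustrative:

      import numpy as np

      rng = np.random.default_rng(7)
      # Hypothetical simulated inundation depths (m) for 5,000 stochastic scenarios.
      depths = rng.lognormal(mean=0.0, sigma=0.8, size=5000)
      rate = 0.1  # assumed mean annual rate of tsunamigenic events

      # Empirical tsunami hazard curve: intensity level vs. mean annual exceedance rate.
      for level in (0.5, 1.0, 2.0, 4.0):
          lam = rate * (depths > level).mean()
          print(f"depth > {level:3.1f} m : {lam:.2e} / yr")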

  18. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  19. Agent-based simulation for human-induced hazard analysis.

    PubMed

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that uses agent-based modeling (ABM) to analyze terrorist attacks. The basic approach of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  20. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • for scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release; • for scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of “critical controls” that prevent the occurrence or mitigate the effects of events with significant consequences was identified for these scenarios (see Section 4).

  1. Installation-Restoration Program. Preliminary assessment; records search for the 155th Tactical Reconnaissance Group, Nebraska Air National Guard, Lincoln Municipal Airport, Lincoln, Nebraska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-11-01

    The Hazardous Materials Technical Center (HMTC) was retained in May 1986 to conduct the Installation-Restoration Program (IRP) Preliminary Assessment (PA) - Records Search for the 155th Tactical Reconnaissance Group (TRG), Nebraska Air National Guard, Lincoln Municipal Airport, Lincoln, Nebraska (hereinafter referred to as the Base). The Records Search included: an onsite visit including interviews with 19 Base personnel conducted by HMTC personnel on 21-23 May 1986; the acquisition and analysis of pertinent information and records on hazardous materials use and hazardous-waste generation and disposal at the Base; the acquisition and analysis of available geologic, hydrologic, meteorologic, and environmental data from pertinent Federal, State, and local agencies; and the identification of sites on the Base that may be potentially contaminated with hazardous materials/hazardous wastes (HM/HW).

  2. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Parts 1, 16, 106, 110... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of...-Based Preventive Controls for Human Food," that appeared in the Federal Register of January 16, 2013...

  3. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, a category that describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.

  4. 78 FR 17142 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Correction AGENCY... manufacturing, packing, or holding human food (CGMPs) to modernize it and to add requirements for domestic and... "food-production purposes (i.e., manufacturing, processing, packing, and holding) to consistently use...

  5. Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements

    NASA Astrophysics Data System (ADS)

    Krause, Marcin

    2017-11-01

    This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.

  6. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Parts 16, 225, 500, 507, and 579 [Docket No. FDA-2011-N-0922] Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Food for Animals; Public Meeting on Proposed Rule AGENCY: Food and...

  7. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser, the operating parameters of the laser had changed, requiring a hazard analysis based on the new operating conditions. The ARES laser system is a van/truck-based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  8. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.

  9. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    PubMed

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment for it is necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective for reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific.

  10. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    PubMed Central

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment for it is necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective for reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific. PMID:28076440
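
    A minimal sketch of the "normal cloud" representation the two records above rely on, using the standard forward normal cloud generator driven by the three cloud descriptors (Ex = expectation, En = entropy, He = hyper-entropy); the descriptor values are hypothetical:

      import numpy as np

      def normal_cloud(Ex, En, He, n=1000, seed=0):
          # Forward normal cloud generator: n drops (x, certainty degree).
          rng = np.random.default_rng(seed)
          En_prime = rng.normal(En, He, n)                       # randomized entropy per drop
          x = rng.normal(Ex, np.abs(En_prime))                   # drop positions
          mu = np.exp(-((x - Ex) ** 2) / (2.0 * En_prime ** 2))  # certainty degree of each drop
          return x, mu

      # Hypothetical descriptors for one hazard grade (illustrative values only).
      drops, certainty = normal_cloud(Ex=0.5, En=0.08, He=0.01)
      print(drops[:3].round(3), certainty[:3].round(3))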

  11. Environmental Assessment for Airborne Laser Debris Management Vandenberg Air Force Base, California

    DTIC Science & Technology

    2008-07-01

    hazardous waste management, water resources, air quality, and biological resources. Based on the analysis of the Proposed Action and No-Action... aesthetics, hazardous materials management, soils and geology, noise, cultural resources, and environmental justice. The resources analyzed in more detail... include: health and safety, hazardous waste management, water resources, air quality, and biological resources. Environmental Effects Under the...

  12. An evaluation of HACCP implementation status in UK small and medium enterprises in food manufacturing.

    PubMed

    Fielding, L M; Ellis, L; Beveridge, C; Peters, A C

    2005-04-01

    To reduce foodborne illnesses, hazard- and risk-based quality management systems are essential. Small and medium sized companies (SMEs) tend to have a poor understanding of such systems and limited adoption of the Hazard Analysis Critical Control Point system (HACCP). The requirement for full HACCP implementation by 2006 will place an even greater burden on these businesses. The aim of this project is to assess the current levels of understanding of hazards and risks in SMEs in the manufacturing sector. A questionnaire survey of 850 SMEs, including microbusinesses, was conducted. This determined the industry sector and processes carried out, whether the company operated hazard-based quality management, and the knowledge of the technical manager regarding the associated hazards and risks. Follow-up visits to the manufacturing plants observed the processes and the operatives to determine their level of understanding. A benchmarking audit was carried out and each company was rated. The results show that the majority of respondents stated that they operated hazard analysis-based quality management. The ability of the respondents to correctly define a hazard or risk or to identify different types of hazard was, however, poor. There was no correlation between business type and audit score. The microbusinesses did, however, perform significantly less well than the larger SMEs.

  13. Manned space flight nuclear system safety. Volume 3: Reactor system preliminary nuclear safety analysis. Part 2: Accident Model Document (AMD)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined, and abort sequence trees are developed to determine the sequence of events leading to each hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. These data are used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low power acceptance testing.

  14. Software for occupational health and safety risk analysis based on a fuzzy model.

    PubMed

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards, and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented a developed software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.

  15. Introduction: Hazard mapping

    USGS Publications Warehouse

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  16. Flood hazard assessment in areas prone to flash flooding

    NASA Astrophysics Data System (ADS)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in the areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces, which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
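
    The empirical approach referred to in this abstract is typified by hazard-rating formulas; a minimal sketch using the widely cited UK formula HR = d x (v + 0.5) + DF (depth d in m, velocity v in m/s, DF a debris factor) follows. Whether this is the exact empirical method tested in the study is not stated here, so it stands only as a representative example:

      def flood_hazard_rating(depth_m, velocity_ms, debris_factor=0.0):
          # Empirical hazard rating for people in floodwater: HR = d * (v + 0.5) + DF.
          return depth_m * (velocity_ms + 0.5) + debris_factor

      # Flash-flood-like conditions: shallow but very fast flow with debris.
      hr = flood_hazard_rating(0.7, 3.5, debris_factor=1.0)
      print(hr)  # 3.8 -> would fall in the highest hazard band in typical UK guidance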

  17. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large scale studies. These deficiencies result in a lack of ability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.

  18. Experimental Fuels Facility Re-categorization Based on Facility Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiss, Troy P.; Andrus, Jason

    The Experimental Fuels Facility (EFF) (MFC-794) at the Materials and Fuels Complex (MFC) located on the Idaho National Laboratory (INL) Site was originally constructed to provide controlled-access, indoor storage for radiologically contaminated equipment. Use of the facility was expanded to provide a controlled environment for repairing contaminated equipment and characterizing, repackaging, and treating waste. The EFF facility is also used for research and development services, including fuel fabrication. EFF was originally categorized as a LTHC-3 radiological facility based on facility operations and facility radiological inventories. Newly planned program activities identified the need to receive quantities of fissionable materials in excess of the single parameter subcritical limit in ANSI/ANS-8.1, “Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors” (identified as “criticality list” quantities in DOE-STD-1027-92, “Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports,” Attachment 1, Table A.1). Since the proposed inventory of fissionable materials inside EFF may be greater than the single parameter sub-critical limit of 700 g of U-235 equivalent, the initial re-categorization is Hazard Category (HC) 2 based upon a potential criticality hazard. This paper details the facility hazard categorization performed for the EFF. The categorization was necessary to determine (a) the need for further safety analysis in accordance with LWP-10802, “INL Facility Categorization,” and (b) compliance with 10 Code of Federal Regulations (CFR) 830, Subpart B, “Safety Basis Requirements.” Based on the segmentation argument presented in this paper, the final hazard categorization for the facility is LTHC-3. Department of Energy Idaho (DOE-ID) approval of the final hazard categorization determined by this hazard assessment document (HAD) was required per the DOE-ID Supplemental Guidance for DOE-STD-1027-92, based on the proposed downgrade of the initial facility categorization of Hazard Category 2.

  19. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser safety and hazard analysis is presented for the Coherent(r)-driven Acculite(r) laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer-based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) were calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter into the laser's NHZ during testing outside the trailer.

  20. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases: risk pre-screening, indicator selection, characterization, classification, and validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as the aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis, and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of the risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method is feasible and valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
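
    A minimal sketch of the two quantitative steps named in this abstract (K-means for indicator grade boundaries, PCA for indicator weights), on a hypothetical 37-site x 14-indicator matrix; the boundary and weighting recipes below are one plausible reading, and the paper's exact scheme may differ:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      X = rng.random((37, 14))  # hypothetical indicator matrix (illustrative data only)

      def grade_boundaries(col, k=3):
          # Grade boundaries for one indicator: K-means into k clusters, boundaries
          # taken as midpoints between sorted cluster centres.
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(col.reshape(-1, 1))
          centres = np.sort(km.cluster_centers_.ravel())
          return (centres[:-1] + centres[1:]) / 2.0

      boundaries = np.array([grade_boundaries(X[:, j]) for j in range(X.shape[1])])

      # Indicator weights from PCA: squared loadings weighted by explained variance,
      # summed over the retained components and normalized to 1.
      pca = PCA(n_components=0.85).fit(StandardScaler().fit_transform(X))
      w = ((pca.components_ ** 2) * pca.explained_variance_ratio_[:, None]).sum(axis=0)
      w /= w.sum()
      print(boundaries.shape, w.round(3))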

  1. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    NASA Astrophysics Data System (ADS)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    2015-06-01

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.

  2. Accident analysis and control options in support of the sludge water system safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HEY, B.E.

    A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  3. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.

  4. Final Environmental Assessment for Camp Rudder Master Plan at Eglin Air Force Base, FL

    DTIC Science & Technology

    2005-06-07

    warranting detailed analysis. Hazardous Materials: The 6th RTB currently generates hazardous materials in the form of weapons cleaning products and... wastes. There would be no increase in the use of weapons cleaning products; therefore, this area does not require analysis. Additionally...

  5. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23% maximum O2, one-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test-specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, one-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test-specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  6. Application of systems and control theory-based hazard analysis to radiation oncology.

    PubMed

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) that can lead to the hazards are identified, as well as causal scenarios that can lead to those unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, with a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve safety and prevent accidents and warrants further investigation.

  7. Safety distance assessment of industrial toxic releases based on frequency and consequence: a case study in Shanghai, China.

    PubMed

    Yu, Q; Zhang, Y; Wang, X; Ma, W C; Chen, L M

    2009-09-15

    A case study on the safety distance assessment of a chemical industry park in Shanghai, China, is presented in this paper. Toxic releases were taken into consideration. A safety criterion based on the frequency and consequence of major hazard accidents was set up for consequence analysis. The exposure limits for accidents with frequencies of more than 10⁻⁴, 10⁻⁵-10⁻⁴ and 10⁻⁶-10⁻⁵ per year were mortalities of 1% (or SLOT), 50% (SLOD) and 75% (twice SLOD), respectively. Accidents with a frequency of less than 10⁻⁶ per year were considered incredible and were ignored in the consequence analysis. Considering the safety distances of all the hazard installations in a chemical plant, the results based on the new criterion were in most cases smaller than those based on LC50 or SLOD. The combination of the consequence- and risk-based results indicated that the hazard installations in two of the chemical plants may be dangerous to the protection targets, and measures had to be taken to reduce the risk. The case study showed that taking account of the frequency of occurrence in the consequence analysis gives more feasible safety distances for major hazard accidents, and the results are more comparable to those calculated by risk assessment.
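
    The frequency-banded criterion described above lends itself to a simple lookup. The sketch below encodes the band edges and exposure limits exactly as stated in the abstract; SLOT and SLOD are the UK HSE dangerous-dose measures corresponding to roughly 1% and 50% mortality.

```python
# Sketch of the frequency-banded consequence criterion described above;
# band edges and limits follow the abstract.
def exposure_limit(freq_per_year: float) -> str:
    if freq_per_year >= 1e-4:
        return "1% mortality (SLOT)"
    if freq_per_year >= 1e-5:
        return "50% mortality (SLOD)"
    if freq_per_year >= 1e-6:
        return "75% mortality (2 x SLOD)"
    return "incredible event -- excluded from consequence analysis"

for f in (2e-4, 3e-5, 5e-6, 1e-7):
    print(f"{f:.0e}/yr -> {exposure_limit(f)}")
```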

  8. 24 CFR 35.1315 - Collection and laboratory analysis of samples.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1315...

  9. 24 CFR 35.1315 - Collection and laboratory analysis of samples.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1315...

  10. 24 CFR 35.1315 - Collection and laboratory analysis of samples.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1315...

  11. 24 CFR 35.1315 - Collection and laboratory analysis of samples.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1315...

  12. 24 CFR 35.1315 - Collection and laboratory analysis of samples.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1315...

  13. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than the empirical attenuation relationships used in PSHA, to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites with little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
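
    As a rough illustration of the computational branch (Monte Carlo simulation of tsunami hazards), the sketch below draws a synthetic event catalog and estimates annual exceedance probabilities of runup height. The event rate and the lognormal runup model are assumptions for demonstration, not values from the study.

```python
# Illustrative Monte Carlo hazard curve: sample source events over a long
# synthetic catalog, map each to a runup height, and estimate annual
# exceedance probabilities. Rate and runup distribution are assumed.
import numpy as np

rng = np.random.default_rng(42)
rate = 0.1                      # assumed tsunamigenic events per year
n_years = 100_000               # simulated catalog length
n_events = rng.poisson(rate * n_years)
runup = rng.lognormal(mean=0.5, sigma=0.8, size=n_events)  # metres

levels = np.linspace(0.5, 10.0, 20)
for h in levels:
    p = (runup > h).sum() / n_years     # annual exceedance probability
    if p > 0:
        print(f"runup > {h:4.1f} m : P = {p:.2e}/yr (T ~ {1/p:,.0f} yr)")
```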

  14. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is exacerbated in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are scarce. This study therefore aims at analysing fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed for Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for two-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation was simulated by a two-dimensional hydrodynamic model implemented on a graphics processing unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation (median) shown in one set of maps and the uncertainty shown in percentile maps. The results are critically discussed, and ways to use them in flood risk management are outlined.
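
    A minimal sketch of the peak-over-threshold step used for the pluvial hazard is given below, with synthetic daily rainfall standing in for the local rain-gauge record. The threshold choice and the Generalized Pareto fit follow standard POT practice rather than the paper's exact configuration.

```python
# Sketch of peak-over-threshold (POT) frequency estimation for daily rain:
# exceedances over a high threshold are fitted with a Generalized Pareto
# distribution. Synthetic data stand in for the local rain-gauge record.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
daily_rain = rng.gamma(shape=0.4, scale=12.0, size=365 * 30)  # 30 yrs, mm

threshold = np.quantile(daily_rain, 0.99)
excess = daily_rain[daily_rain > threshold] - threshold
events_per_year = excess.size / 30.0

shape, loc, scale = genpareto.fit(excess, floc=0.0)
# Daily rainfall with a 100-year return period:
p = 1.0 / (100.0 * events_per_year)       # exceedance prob. per event
x100 = threshold + genpareto.ppf(1 - p, shape, loc=0, scale=scale)
print(f"threshold {threshold:.1f} mm, ~{events_per_year:.1f} events/yr")
print(f"estimated 100-yr daily rainfall: {x100:.1f} mm")
```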

  16. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran.

    PubMed

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, for which GIS is an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening that eliminates unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed, representing the cumulative effect of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.
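
    The two-stage screening can be illustrated with a small raster sketch: exclusionary masks remove land first, then a weighted sum of normalized criteria scores what remains. The grids, the number of layers and the weights below are illustrative stand-ins, not the study's actual 21 exclusionary and 14 non-exclusionary criteria.

```python
# Minimal weighted-overlay sketch of the two-stage GIS/MCDA screening.
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)

# Stage 1: exclusionary criteria (True = excluded), e.g. buffers around
# rivers, towns, faults; random masks stand in for the real layers.
excluded = np.zeros(shape, dtype=bool)
for _ in range(5):
    excluded |= rng.random(shape) > 0.9

# Stage 2: suitability index from weighted non-exclusionary criteria.
criteria = rng.random((3,) + shape)        # normalized 0-1 scores
weights = np.array([0.5, 0.3, 0.2])        # expert weights, sum to 1
suitability = np.tensordot(weights, criteria, axes=1)
suitability[excluded] = np.nan             # mask out excluded land

best = np.unravel_index(np.nanargmax(suitability), shape)
print(f"most suitable cell: {best}, score {np.nanmax(suitability):.2f}")
```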

  17. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important of the two in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is primarily a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  18. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    NASA Astrophysics Data System (ADS)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area was formed. The Group has so far identified the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The obtained results must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, called the "Iblea" seismic area, is nowadays considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is therefore highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of soil in a central area of the city, where some historical buildings of great importance are sited. An investigation was also performed based on the inspection of more than one hundred historical ecclesiastical buildings of great importance located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the foundation soil time-history response, a 1-D dynamic soil model was employed for all these buildings, considering the non-linearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. The analysis of the results shows that high hazard zones are mainly clayey sites.

  19. Clusters of community exposure to coastal flooding hazards based on storm and sea level rise scenarios—implications for adaptation networks in the San Francisco Bay region

    USGS Publications Warehouse

    Hummel, Michelle; Wood, Nathan J.; Schweikert, Amy; Stacey, Mark T.; Jones, Jeanne; Barnard, Patrick L.; Erikson, Li H.

    2018-01-01

    Sea level is projected to rise over the coming decades, further increasing the extent of flooding hazards in coastal communities. Efforts to address potential impacts from climate-driven coastal hazards have called for collaboration among communities to strengthen the application of best practices. However, communities currently lack practical tools for identifying potential partner communities based on similar hazard exposure characteristics. This study uses statistical cluster analysis to identify similarities in community exposure to flooding hazards for a suite of sea level rise and storm scenarios. We demonstrate this approach using 63 jurisdictions in the San Francisco Bay region of California (USA) and compare 21 distinct exposure variables related to residents, employees, and structures for six hazard scenario combinations of sea level rise and storms. Results indicate that cluster analysis can provide an effective mechanism for identifying community groupings. Cluster compositions changed based on the selected societal variables and sea level rise scenarios, suggesting that a community could participate in multiple networks to target specific issues or policy interventions. The proposed clustering approach can serve as a data-driven foundation to help communities identify other communities with similar adaptation challenges and to enhance regional efforts that aim to facilitate adaptation planning and investment prioritization.
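
    As a rough sketch of the clustering step, the example below standardizes a synthetic 63-community by 21-variable exposure matrix and groups it with k-means. The abstract does not specify the study's actual clustering algorithm or number of clusters, so both are assumptions here.

```python
# Sketch of grouping communities by exposure profile with k-means, in the
# spirit of the cluster analysis described above. Data are synthetic
# stand-ins for the 21 exposure variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_communities, n_vars = 63, 21
exposure = rng.lognormal(mean=2.0, sigma=1.0, size=(n_communities, n_vars))

X = StandardScaler().fit_transform(exposure)   # scale before clustering
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

for k in range(5):
    members = np.flatnonzero(labels == k)
    print(f"cluster {k}: {members.size} communities, e.g. ids {members[:5]}")
```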

  20. A new multi-disciplinary model for the assessment and reduction of volcanic risk: the example of the island of Vulcano, Italy

    NASA Astrophysics Data System (ADS)

    Simicevic, Aleksandra; Bonadonna, Costanza; di Traglia, Federico; Rosi, Mauro

    2010-05-01

    Volcanic eruptions are accompanied by numerous hazards which pose short- and long-term threats to people and property. Recent experience has shown that successful responses to hazard events correlate strongly with the degree to which proactive policies of risk reduction are already in place before an eruption occurs. Effective proactive risk-reduction strategies require contributions from numerous disciplines. A volcanic eruption is not a hazard per se, but rather an event capable of producing a variety of hazards (e.g. earthquakes, pyroclastic density currents, lava flows, tephra fall, lahars, landslides, gas release, and tsunamis) that can affect the built environment in a variety of ways, over different time scales and with different degrees of intensity. Our proposed model for the assessment and mitigation of exposure-based volcanic risk rests mainly on the compilation of three types of maps: hazard maps, hazard-specific vulnerability maps and exposure-based risk maps. Hazard maps identify the spatial distribution of individual volcanic hazards and include both event analysis and impact analysis. Hazard-specific vulnerability maps represent the systematic evaluation of the physical vulnerability of the built environment to a range of volcanic phenomena, i.e. the spatial distribution of buildings vulnerable to a given hazard based on the analysis of selected building elements. Buildings are classified on the basis of their major components relevant to different volcanic hazards, their strength and their construction materials, taking into account the potential damage that each group of building elements (e.g. walls, roof, load-bearing structure) will suffer under a volcanic hazard. All these factors are enumerated in a checklist and used for the building survey. Hazard-specific vulnerability maps are then overlaid on hazard maps in order to compile exposure-based risk maps and so quantify the potential damage. This quantification is the starting point for the identification of suitable mitigation measures, which are analysed through a cost-benefit analysis to assess their financial feasibility. Information about public networks is also recorded in order to give an overall picture of the condition of the island's built environment. The vulnerability assessment of the technical systems describes the potential damage that could affect systems such as electricity supply, water distribution, communication networks or transport systems. Such damage can also be described as functional disruption of the system: the important aspect is not only the physical capacity of a system to resist, but also its capacity to continue functioning. The model will be tested on the island of Vulcano in southern Italy. Vulcano is characterized by clear signs of volcanic unrest and is the type locality for a deadly style of eruption. The main active system of Vulcano Island (La Fossa cone) is known to produce a variety of eruption styles and intensities, each posing its own hazards and threats. Six different hazard scenarios have been identified based on detailed stratigraphic work. Urbanization on Vulcano took place in the 1980s with no real planning, and its population mostly subsists on tourism. Our preliminary results show that Vulcano is not characterized by a great variability of architectural typologies and construction materials. Three main types of buildings are present (masonry with concrete frame, masonry with manufactured stone units, masonry with hollow clay bricks), and no statistically significant trends were found between physical and morphological characteristics. The recent signs of volcanic unrest, combined with the complex vulnerability of the island due to uncontrolled urban development and the significant seasonal variation of the exposed population in the summer months, result in a high volcanic risk. As a result, Vulcano represents an ideal environment in which to test a multi-hazard risk model and to study the transition between the micro (building) and macro (urban environment) scales of analysis, which is still an unexplored field in the study of volcanic risk. Different levels of vulnerability need to be analysed in order to increase the level of preparedness, plan a potential evacuation, manage a potential volcanic crisis and assess the best mitigation measures to put in place to reduce the volcanic risk.

  1. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
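
    A minimal sketch of the notional risk product R = C × V × T is shown below, with a flat first-order uplift standing in for the portfolio interdependency model. All assets and numbers are invented for illustration; this is not the paper's calibrated model.

```python
# Sketch of the notional all-hazards risk product R = C x V x T for a
# small asset portfolio, with an assumed first-order interdependency
# uplift on consequence. Numbers are illustrative only.
assets = {
    # asset: (consequence $M, vulnerability P(damage|event), threat /yr)
    "substation": (50.0, 0.40, 1e-3),
    "data_center": (120.0, 0.15, 5e-4),
    "bridge": (200.0, 0.05, 2e-4),
}
interdependency_factor = 1.25   # assumed first-order portfolio uplift

portfolio_risk = 0.0
for name, (c, v, t) in assets.items():
    r = c * v * t               # expected loss in $M per year
    portfolio_risk += r * interdependency_factor
    print(f"{name:12s} risk = {r:.4f} $M/yr")
print(f"portfolio risk (with interdependency) = {portfolio_risk:.4f} $M/yr")
```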

  2. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  3. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate the damage produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated with the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment and complemented with an analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each exposed element. The methodology is then automated in a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in case of an eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. The results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.

  4. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  5. Pyrotechnic hazards classification and evaluation program. Phase 3, segments 1-4: Investigation of sensitivity test methods and procedures for pyrotechnic hazards evaluation and classification, part A

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The findings, conclusions, and recommendations relative to the investigations conducted to evaluate tests for classifying pyrotechnic materials and end items as to their hazard potential are presented. Information required to establish an applicable means of determining the potential hazards of pyrotechnics is described. Hazard evaluations are based on the peak overpressure or impulse resulting from the explosion as a function of distance from the source. Other hazard classification tests include dust ignition sensitivity, impact ignition sensitivity, spark ignition sensitivity, and differential thermal analysis.

  6. Topography- and nightlight-based national flood risk assessment in Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich

    2017-04-01

    In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
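
    As an illustration of the final overlay, the sketch below combines classified hazard and exposure grids into five risk classes. The combination rule used here (class product re-binned into five bands) is an assumption, since the abstract does not state the exact rule.

```python
# Sketch of combining classified hazard and exposure grids into a 5-class
# risk map, mirroring the hazard x exposure overlay described above.
import numpy as np

rng = np.random.default_rng(3)
hazard = rng.integers(1, 6, size=(200, 200))    # 1 = very low .. 5 = severe
exposure = rng.integers(1, 6, size=(200, 200))

product = hazard * exposure                     # values 1..25
# Re-bin the product into five risk classes of equal width.
risk = np.digitize(product, bins=[5, 10, 15, 20]) + 1

for c in range(1, 6):
    share = (risk == c).mean() * 100
    print(f"risk class {c}: {share:5.1f}% of cells")
```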

  7. Laser Safety and Hazardous Analysis for the ARES (Big Sky) Laser System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AUGUSTONI, ARNOLD L.

    A laser safety and hazard analysis was performed for the ARES laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.

  8. The influence of maximum magnitude on seismic-hazard estimates in the Central and Eastern United States

    USGS Publications Warehouse

    Mueller, C.S.

    2010-01-01

    I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies seen in comparisons between USGS-based and EPRI/SOG-based hazard results.

  9. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. Its results provide parameters for seismic design at the micro scale, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared with respect to their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS's Model Builder platform.
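
    A minimal sketch of the fuzzy comprehensive evaluation step is given below: index weights are combined with a grade-membership matrix to yield grade memberships for a site. The four indices, weights and memberships are illustrative stand-ins for the paper's 11-index model.

```python
# Sketch of a fuzzy comprehensive evaluation step: B = w . R, where w is
# the weight vector and R the membership matrix. Values are illustrative.
import numpy as np

weights = np.array([0.35, 0.25, 0.25, 0.15])   # e.g. from expert judgment

# membership of each index in hazard grades (low, medium, high)
R = np.array([
    [0.1, 0.3, 0.6],   # fault density
    [0.2, 0.5, 0.3],   # historical seismicity
    [0.4, 0.4, 0.2],   # geological setting
    [0.3, 0.5, 0.2],   # gravity anomaly gradient
])

B = weights @ R                                # weighted fuzzy synthesis
grades = ["low", "medium", "high"]
print(dict(zip(grades, np.round(B, 3))))
print("assessed grade:", grades[int(np.argmax(B))])
```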

  10. Preliminary hazards analysis -- vitrification process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coordes, D.; Ruggieri, M.; Russell, J.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on its radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. The hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts on offsite personnel and the environment.

  11. Landslide and flood hazard assessment in urban areas of Levoča region (Eastern Slovakia)

    NASA Astrophysics Data System (ADS)

    Magulova, Barbora; Caporali, Enrica; Bednarik, Martin

    2010-05-01

    This case study presents the use of statistical methods and analysis tools for the hazard assessment of "urbanization units", implemented in a Geographic Information Systems (GIS) environment. The Levoča region (Slovakia) is selected as the study area. The region, with a total area of about 351 km2, is widely affected by landslides and floods. For small urbanization areas, the problem is particularly significant from the socio-economic point of view, and it is an increasing one, mainly because of climate change and more frequent extreme rainfall events. The geo-hazards are evaluated using a multivariate analysis. The landslide hazard assessment is based on the comparison and subsequent statistical elaboration of the territorial dependence among different input factors influencing the instability of the slopes. In particular, five factors influencing slope stability are evaluated, i.e. lithology, slope aspect, slope angle, hypsographic level and present land use. As a result, a new landslide susceptibility map is compiled, and zones of stable, dormant and non-stable areas are defined. For the flood hazard map, a detailed digital elevation model is created. A composite index of flood hazard is derived from topography, land cover and pedology-related data. To estimate flood discharge, time series of streamflow and precipitation measurements are used. The assessment results are prognostic maps of landslide hazard and flood hazard, which present an optimal basis for urbanization planning.

  12. Back analysis of Swiss flood danger map to define local flood hazards

    NASA Astrophysics Data System (ADS)

    Choffet, Marc; Derron, Marc-Henri; Jaboyedoff, Michel; Leroi, Eric; Mayis, Arnaud

    2010-05-01

    The flood hazard maps for the whole of Switzerland will be available at the end of 2011. Furthermore, the Swiss territory has been covered by aerial laser scanning (ALS), providing a high-resolution digital elevation model (HR-DEM). This paper describes the development of a method for analysing the local flood hazard based on the Swiss hazard maps and the HR-DEM. In their original state, Swiss hazard maps are constructed on the basis of aggregated information in an intensity-frequency matrix. The degree of danger represented by the yellow, blue and red zones gives no information on the water level at each point of the territory. The developed method is based on a superposition of the danger map with the HR-DEM to determine the water level in a hazard area. To perform this method, (1) a triangulation is based on the intersection of the hazard map with the HR-DEM, using the limits of the areas where information is constrained. The hazard map perimeter and the boundaries of the hazard areas give information on the widest possible overflow in case of flooding, and can also be associated with a return period. (2) Based on these areas and the difference with the DEM, it is possible to calibrate the highest flood level and extract water levels for the entire area. This analysis of existing documents opens up interesting perspectives for understanding how infrastructure is threatened by flood hazard, by predicting water levels and potential damage to buildings while proposing remedial measures. Indeed, this method allows estimating the water level at each point of a building in case of flooding. It is designed to provide spatial information on water levels, which offers a different view of buildings in danger zones. Indeed, it makes it possible to discern several elements, such as areas of water accumulation involving longer flood duration, possible structural damage to buildings due to high hydrostatic pressure, determination of a local hazard, or the display of water levels in 3D.
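
    The core back-analysis step can be sketched compactly: within a mapped danger zone, the flood level is calibrated from the highest terrain inside the zone, and the local water depth follows as that level minus the DEM elevation. The grids below are synthetic, and taking the zone maximum as the calibrated level is a simplifying assumption.

```python
# Sketch of the map back-analysis: calibrate a flood level from a mapped
# danger zone and derive water depth as level minus terrain.
import numpy as np

rng = np.random.default_rng(5)
dem = np.cumsum(rng.random((50, 50)) * 0.1, axis=1)   # gently rising terrain
flood_zone = dem < np.quantile(dem, 0.3)              # mapped danger area

# Calibrate the flood level as the highest terrain inside the zone,
# approximating the elevation of the zone boundary for a filling flood.
flood_level = dem[flood_zone].max()
depth = np.where(flood_zone, flood_level - dem, 0.0)

print(f"calibrated flood level: {flood_level:.2f} m")
print(f"max / mean water depth in zone: "
      f"{depth.max():.2f} / {depth[flood_zone].mean():.2f} m")
```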

  13. Real-time Position Based Population Data Analysis and Visualization Using Heatmap for Hazard Emergency Response

    NASA Astrophysics Data System (ADS)

    Ding, R.; He, T.

    2017-12-01

    With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources like the census bureau, which often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system that more accurately processes and assesses issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture, including data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data, based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system has been developed that geographically covers the entire region of China and combines the population heatmap with data from the Earthquake Catalogs database. Preliminary results indicate that the generation of dynamic population density heatmaps by the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helping responders and decision makers to evaluate and assess earthquake damage. Correlation analyses revealed that the aggregation and movement of people depended on various factors, including earthquake occurrence time and epicenter location. This research hopes to build upon the success of the prototype system in order to improve and extend it to support the analysis of earthquakes and other types of natural hazard events.
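
    As a sketch of the heatmap layer, the example below aggregates synthetic location pings into a density grid, which is the basic operation behind a position-based population heatmap. The coordinates and cluster parameters are invented stand-ins for the LBS feed.

```python
# Sketch of the heatmap layer: location pings aggregated into a grid of
# population density. Coordinates are synthetic stand-ins for LBS data.
import numpy as np

rng = np.random.default_rng(11)
# pings clustered around two hypothetical population centres (lon, lat)
pings = np.vstack([
    rng.normal([116.40, 39.90], 0.05, size=(5000, 2)),
    rng.normal([116.55, 39.80], 0.03, size=(2000, 2)),
])

density, xedges, yedges = np.histogram2d(
    pings[:, 0], pings[:, 1], bins=50)

hot = np.unravel_index(np.argmax(density), density.shape)
print(f"hottest cell: lon~{xedges[hot[0]]:.3f}, lat~{yedges[hot[1]]:.3f}, "
      f"{int(density[hot])} pings")
```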

  14. Updated laser safety & hazard analysis for the ARES laser system based on the 2007 ANSI Z136.1 standard.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2007 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.

  15. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks, to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually based either on multivariate analysis or on field observations, and in either case a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits produced by recent rockfall activity are found at the bottom of the rock wall. Large blocks can be found several tens of metres away from the cliff in Sylte, the main locality of the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analysed separately to assess best-fit probability density functions (PDFs) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analysed: planar sliding, wedge sliding and toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as the source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas; they also reduce the processing time of the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway, which will be used in upcoming years for hazard mapping in areas where larger population groups are exposed to mass movements from steep slopes.
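
    A rough sketch of the stochastic kinematic test for planar sliding is shown below: discontinuity orientations and friction angles are drawn from assumed distributions and checked against the standard feasibility conditions (the plane daylights in the face, its dip exceeds the friction angle, and its dip direction lies within roughly 20 degrees of the slope face). The slope geometry and distribution parameters are illustrative, not the Norddal values.

```python
# Sketch of a stochastic kinematic test for planar sliding, with joint
# orientations and friction angle as random variables. Parameters are
# illustrative stand-ins for the fitted PDFs described above.
import numpy as np

rng = np.random.default_rng(9)
slope_dip, slope_dir = 70.0, 180.0          # degrees

n = 10_000
joint_dip = rng.normal(45.0, 8.0, n)
joint_dir = rng.normal(175.0, 15.0, n)
friction = rng.normal(32.0, 3.0, n)

feasible = (
    (joint_dip < slope_dip) &               # plane daylights in the face
    (joint_dip > friction) &                # dip exceeds friction angle
    (np.abs(((joint_dir - slope_dir + 180) % 360) - 180) < 20)
)
print(f"planar-sliding feasibility: {feasible.mean():.1%} of samples")
```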

  16. Effects of metric change on safety in the workplace for selected occupations

    NASA Astrophysics Data System (ADS)

    Lefande, J. M.; Pokorney, J. L.

    1982-04-01

    The study assesses the potential safety issues of metric conversion in the workplace. A purposive sample of 35 occupations, selected on the basis of injury and illness indexes, was assessed. After an analysis of the workforce population, hazard profile and measurement sensitivity of the occupations, jobs were analyzed by industrial hygienists, safety engineers and academics to identify potential safety hazards. The study's major findings were as follows: no metric hazard experience was identified; increased exposure might occur when particular jobs and their tasks are undergoing the transition from customary to metric measurement; well-planned metric change programs reduce hazard potential; and metric safety issues remain unresolved in the aviation industry.

  17. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments, and they bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions, i.e. relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and object-specific restoration values. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively and spatially resolved by the use of a large set of force transducers. The experimental tests are carried out with artificial vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of this work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. Several buildings are entirely reconstructed within the physical model at a scale of 1:30; they include the basement and first floor and thereby all relevant openings in the building envelope. The results from the experimental modelling represent the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.

  18. Rock cliffs hazard analysis based on remote geostructural surveys: The Campione del Garda case study (Lake Garda, Northern Italy)

    NASA Astrophysics Data System (ADS)

    Ferrero, A. M.; Migliazza, M.; Roncella, R.; Segalini, A.

    2011-02-01

    The town of Campione del Garda (located on the west coast of Lake Garda) and its access road have historically been subject to rockfall phenomena, with risk to public safety in several areas of the coast. This paper presents a study devoted to the determination of risk for the coastal cliffs and the design of mitigation measures. Our study was based on a statistical rockfall analysis performed with a commercial code and on a stability analysis of the rock slopes based on the key block method. Hazards from block kinematics and rock-slope failure are coupled by applying the Rockfall Hazard Assessment Procedure (RHAP). Because of the huge dimensions of the slope, its morphology and the geostructural survey were particularly complicated and demanding. For these reasons, non-contact measurement methods, based on aerial photogrammetry by helicopter, were adopted. A special software program, developed by the authors, was applied for discontinuity identification and orientation measurement. The potential of aerial photogrammetric surveys in rock mechanics applications, and their contribution to improved knowledge of the rock mass, is analysed in the article.

  19. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
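
    As a toy illustration of why nonstationarity changes return periods, the sketch below imposes a linear trend on the annual exceedance probability and computes the resulting reliability and expected first-exceedance time. The linear trend is an assumed illustration only; the paper's actual derivation uses a Generalized Pareto hazard model, which this sketch does not reproduce.

```python
# Sketch: under a trend in annual exceedance probability p_t, the failure
# time no longer follows a simple geometric law. Trend values are assumed.
import numpy as np

p0, trend, horizon = 0.01, 0.0002, 200      # initial P(exceed)/yr, trend/yr
p_t = np.clip(p0 + trend * np.arange(horizon), 0, 1)

# Reliability: probability of no exceedance through year t.
reliability = np.cumprod(1 - p_t)
# P(first exceedance occurs in year t) = prod_{i<t}(1-p_i) * p_t
f_t = reliability / (1 - p_t) * p_t
# Expected time to first exceedance (truncated at the horizon).
ert = np.sum(np.arange(1, horizon + 1) * f_t)

print(f"stationary return period would be {1/p0:.0f} yr")
print(f"expected first-exceedance time under trend: {ert:.0f} yr "
      f"(P(no event in horizon) = {reliability[-1]:.2f})")
```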

  20. Occupational hazard evaluation model underground coal mine based on unascertained measurement theory

    NASA Astrophysics Data System (ADS)

    Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya

    2017-05-01

    To comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system based on unascertained measurement theory was established for quantitative and qualitative analysis. Indicator weights were determined by information entropy, and the occupational hazard level was estimated by credible-degree recognition criteria. The evaluation model was programmed in Visual Basic and applied to the comprehensive occupational hazard evaluation of six posts in an underground coal mine; the occupational hazard degree was graded, and the evaluation results are consistent with the actual situation. The results show that dust and noise are the most significant of the coal mine occupational hazard factors. Excavation face support workers are most affected, followed by heading machine drivers, coal cutter drivers and coalface support-moving workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model evaluates underground coal mine occupational hazards objectively and accurately, and can be employed in actual engineering.
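
    The entropy-weighting step can be sketched compactly: indicators whose values are more dispersed across posts carry more information and therefore receive larger weights. The indicator matrix below is an illustrative stand-in for the mine's measured data.

```python
# Sketch of the information-entropy weighting step: lower entropy
# (more dispersion across posts) yields a larger indicator weight.
import numpy as np

# rows: posts, cols: hazard indicators (e.g. dust, noise, heat, vibration)
X = np.array([
    [8.5, 95.0, 28.0, 0.6],
    [6.0, 92.0, 27.0, 0.8],
    [4.0, 88.0, 26.0, 0.4],
    [2.5, 85.0, 29.0, 0.3],
])

P = X / X.sum(axis=0)                        # column-normalized proportions
n = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)
weights = (1 - entropy) / (1 - entropy).sum()
print("entropy weights:", np.round(weights, 3))
```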

  1. Applicability of the Common Safety Method for Risk Evaluation and Assessment (CSM-RA) to the Space Domain

    NASA Astrophysics Data System (ADS)

    Moreira, Francisco; Silva, Nuno

    2016-08-01

    Safety-critical systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially concerning completeness of the hazards, formalization, and timeliness of the analysis so that it can influence the specification and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever a significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, based on and complemented with hazard analysis. It highlights when and how to apply these processes, and their relations and similarities with industry standards and system life cycles. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.

  2. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed anew; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  3. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    PubMed

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  4. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fits at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function showed better fits at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
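
    A small sketch of how a Frank copula yields bivariate joint return periods of the kind the study computes. The dependence parameter and marginal probabilities are invented for illustration; in practice they are fitted to the hazard-factor series:

```python
import numpy as np

def frank_cdf(u, v, theta):
    """Frank copula C(u, v; theta), theta != 0."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

mu = 1.0       # mean inter-arrival time of events, in years
u = 0.98       # marginal non-exceedance prob. of hazard factor 1
v = 0.95       # marginal non-exceedance prob. of hazard factor 2
theta = 5.0    # Frank dependence parameter (fitted in practice)

# "OR" case: either factor exceeds its threshold.
T_or = mu / (1.0 - frank_cdf(u, v, theta))
# "AND" case: both factors exceed their thresholds.
T_and = mu / (1.0 - u - v + frank_cdf(u, v, theta))
print(f"T_or = {T_or:.1f} yr, T_and = {T_and:.1f} yr")
```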

  5. ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction

    NASA Astrophysics Data System (ADS)

    Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd

    2017-11-01

    The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, augmenting engine performance and efficiency in comparison to conventional carburetor-based engines. The ETCS is regarded as a safety-critical component: an ETCS malfunction can cause an unintended acceleration or deceleration event, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment, an analysis prescribed by the ISO 26262 functional safety standard, was applied to a motorbike's ETCS to determine the required automotive safety integrity level (ASIL). The established ASIL can help to derive technical and functional safety measures for ETCS development.
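
    For readers unfamiliar with the HARA classification step, the sketch below shows how severity (S), exposure (E) and controllability (C) ratings map to an ASIL; it uses the well-known additive shortcut that reproduces the table in ISO 26262-3, and the example ratings for a throttle malfunction are assumptions, not taken from the paper:

```python
# Additive shortcut for the ISO 26262 ASIL table: sum the S/E/C levels
# and look the total up (<= 6 maps to QM, i.e. no ASIL required).
def asil(severity: int, exposure: int, controllability: int) -> str:
    """severity S1-S3, exposure E1-E4, controllability C1-C3."""
    total = severity + exposure + controllability
    return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}.get(total, "QM")

# Unintended wide-open throttle at speed: life-threatening (S3), riding
# is a high-exposure situation (E4), hard for the rider to control (C3).
print(asil(3, 4, 3))   # -> ASIL D
```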

  6. Landslide hazard assessment of the Black sea coastline (Caucasus, Russia) via drones

    NASA Astrophysics Data System (ADS)

    Kazeev, Andrey; Postoev, German; Fedotova, Ksenia

    2017-04-01

    Landslide hazard assessment of the slopes near Sochi was performed along the railway between the cities of Tuapse and Adler (total length 103 km). The railway passes through territory with active development of hazardous geological processes such as landslides, rock falls and debris flows. By the beginning of 2016, 36 landslide sites had been discovered along the railway (total length 34 km), along with 48 rock-fall sites (length 31 km) and 5 debris-flow sites (length 0.14 km). In recent years an intensification of deformations has been observed. For instance, during the previous decade (1996-2005), 28 sudden deformations occurred due to slope processes, causing interruptions in traffic, whereas in the present decade (2006-2015), 72 deformations were recorded. High landslide activity and economic loss determined the necessity of complex investigations of the engineering-geological conditions of landslide development and the causes of its intensification, and a protection strategy was needed to minimize negative consequences. Thus, the investigations of the landslide situation along the "Tuapse - Adler" railway included the categorization of landslide sites by level of hazard, with risk assessment based on numerical criteria. A preliminary evaluation of landslide hazard for the railway was conducted via the analysis of archived engineering-geological documents. 13 of the 36 landslide sites (total length 13 km) were selected, reflecting the variety and peculiarities of landslide displacements on slopes (both active and inactive sites). Visual field observations of landslide slopes using the drone "DJI Phantom 4" were completed during the second stage of the investigation. High-resolution photographs of landslide cirques, cracks, scarp walls and vegetation features were obtained via drone; these would have been impossible to obtain from the ground under the dense subtropical vegetation cover. Possible approaches to landslide activity and hazard assessment were evaluated: slope stability analysis, geophysical monitoring methods, analysis of critical deformations and critical velocities of displacement, analysis of changes in the conditions of landslide development during displacement, as well as scoring approaches to landslide hazard and risk assessment. As a result, a method of probabilistic estimation of landslide activity and hazard has been proposed, based on the selection and analysis of the main factors influencing landslide displacements. Slope steepness, landslide thickness, slope length, bedrock dip, slope relief, cracks, vegetation patterns and other factors were used to assess the activity of landslide sites. The considered landslide sites were ranked by rate of activity as inactive, potentially active or active, and the most active sites were used to identify the potentially most hazardous ones. Furthermore, the following factors were additionally considered: damage to railroad facilities due to landslides, landslide activity, thickness of the landslide at the toe of the slope, bedrock stratification, the conditions for cirque development, the position of the sliding surface relative to the railway, and the involvement of bedrock in the displaced mass. As a result, the investigated railroad sites were divided into three categories: non-hazardous, potentially hazardous and hazardous. The research was supported by the Russian Scientific Foundation (Project № 16-17-00125).
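
    A minimal sketch of the weighted factor-scoring idea behind such an activity ranking. The factor list echoes the abstract, but the weights, scores and category cut-offs are invented purely to illustrate the mechanics:

```python
# factor: (weight, normalised score in [0, 1]) -- all values invented.
factors = {
    "slope steepness":     (0.25, 0.8),
    "landslide thickness": (0.20, 0.6),
    "slope length":        (0.15, 0.5),
    "bedrock dip":         (0.20, 0.7),
    "cracks":              (0.20, 0.9),
}

# Weighted activity index, then a three-way ranking as in the abstract.
activity = sum(w * s for w, s in factors.values())
category = ("inactive", "potentially active", "active")[
    0 if activity < 0.4 else 1 if activity < 0.7 else 2]
print(f"activity index = {activity:.2f} -> {category}")
```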

  7. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  8. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  9. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    Reinsurance companies report a large increase in natural-hazard-related losses, both insured and economic, over the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not meet the scale-dependent demands of regional analyses, as is appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, and therefore scientific evaluation of the methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Especially in mountain regions like the European Alps, with their inherent (i) specific small-scale characteristics, (ii) relatively high process dynamics in general, (iii) occurrence of gravitative mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) small proportion of permanent settlement area within the overall area, (v) high concentration of values in the valley floors, and (vi) exposure of important infrastructures and lifelines, among others, analyses must consider these circumstances adequately. Risk-based analyses therefore estimate the potential consequences of hazard processes on the built environment in a standardized way using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress has been made in the field of hazard analysis, whereas the other two components are not as well developed. Since these three components enter the risk concept without any weighting, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of the risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politics about potential consequences, and are the basis for appropriate risk management strategies. Thereby, results based (i) on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components.
Besides this analysis, an additional portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were combined with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock of elements and values is furthermore the consistent basis for all natural hazard analyses and enables comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, where the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with the potential for corresponding consequence effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and their spatial extents were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and their implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than € 7 bn. on buildings and inventory alone. Possible extreme flood events could lead to losses between € 2 and 2.5 bn., whereas a severe hail swath affecting the central Inn valley could result in losses of ca. € 455 mill. (thereof € 285 mill. on vehicles). The potentially most serious rockslide with additional consequence effects would result in losses of up to ca. € 185 mill., and extreme winter storms can induce losses between € 100 mill. and € 150 mill.

  10. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways located in urban and intercity settings are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur from highway design through construction and operation, and later during maintenance and repair, have to be taken into consideration, and the assessment of risks arising from adverse situations is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of probable hazards on the highways; however, the assets at risk and the impacts of events must also be examined and rated in their own right. With these activities, intended improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economic and effective solution. Besides supporting measures for the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments, most importantly:
    • Natural disasters: 1. meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.).
    • Human-originated disasters: 1. transport (traffic) accidents, hazards originating from road surface defects (icing, signaling malfunctions), fire or explosion, etc.
    In this study, a risk analysis of urban and intercity motorways against natural disasters and hazards was performed with the FMEA method (a sketch of the FMEA prioritization follows below), and solutions to these risks were proposed. Keywords: Failure Modes Effects Analysis (FMEA), Pareto Analysis (PA), Highways, Risk Management.
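
    A minimal sketch of the FMEA prioritization step referenced above. The severity, occurrence and detection ratings for the example failure modes are invented; the RPN = S x O x D ranking itself is the standard FMEA rule:

```python
# Each failure mode carries severity (S), occurrence (O) and detection
# (D) ratings on 1-10 scales; the risk priority number RPN = S*O*D
# orders where mitigation effort should go first.
failure_modes = [
    ("road icing",             8, 6, 4),
    ("flood over carriageway", 9, 3, 3),
    ("rockfall onto lanes",   10, 2, 5),
    ("signaling fault",        6, 4, 7),
]
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:24s} RPN = {s * o * d}")
```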

  11. An integrated knowledge system for the Space Shuttle hazardous gas detection system

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Bangasser, Carl; Fensky, Connie; Cegielski, Eric; Overbey, Glenn

    1993-01-01

    A computer-based integrated knowledge-based system, the Intelligent Hypertext Manual (IHM), was developed for the Space Shuttle Hazardous Gas Detection System (HGDS) at NASA Marshall Space Flight Center (MSFC). The IHM stores HGDS-related knowledge and presents it in an interactive and intuitive manner. The manual is a combination of hypertext and an expert system that stores experts' knowledge and experience in hazardous gas detection and analysis. The IHM's purpose is to provide HGDS personnel with the capabilities of: locating applicable documentation related to procedures, constraints, and previous fault histories; assisting in the training of personnel; enhancing the interpretation of real-time data; and recognizing and identifying possible faults in the Space Shuttle subsystems related to hazardous gas detection.

  12. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
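
    A toy sketch of Monte Carlo sampling over a logic tree of the kind described. The branch values, weights and the ground-motion proxy below are placeholders for illustration, not the study's parameters or attenuation relations:

```python
import numpy as np

rng = np.random.default_rng(42)

magnitudes = [7.4, 7.7, 8.0]        # candidate characteristic magnitudes
mag_weights = [0.25, 0.50, 0.25]    # logic-tree branch weights
recurrence = [250.0, 500.0, 750.0]  # recurrence intervals, years
rec_weights = [0.30, 0.50, 0.20]

def hazard(m, t):
    # Placeholder ground-motion proxy, NOT a real attenuation relation.
    return np.exp(m - 6.0) / t

# Draw one branch per parameter for each simulation, compute the hazard,
# and summarise the spread with a coefficient of variation (COV).
samples = [
    hazard(rng.choice(magnitudes, p=mag_weights),
           rng.choice(recurrence, p=rec_weights))
    for _ in range(10_000)
]
cov = np.std(samples) / np.mean(samples)
print(f"hazard COV across logic-tree branches ~ {cov:.2f}")
```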

  13. Preharvest food safety.

    PubMed

    Childers, A B; Walsh, B

    1996-07-23

    Preharvest food safety is essential for the protection of our food supply. The production and transport of livestock and poultry play an integral part in the safety of these food products. The goals of this safety assurance include freedom from pathogenic microorganisms, disease, and parasites, and from potentially harmful residues and physical hazards. Its functions should be based on hazard analysis and critical control points from producer to slaughter plant, with emphasis on the prevention of identifiable hazards rather than on the removal of contaminated products. The production goal is to minimize infection and ensure freedom from potentially harmful residues and physical hazards. The marketing goal is control of exposure to pathogens and stress. Both groups should have functional hazard analysis and critical control point management programs which include personnel training and certification of producers. These programs must cover production procedures, chemical usage, feeding, treatment practices, drug usage, assembly and transportation, and animal identification. Plans must use risk assessment principles, and the procedures must be defined. Other elements would include preslaughter certification, environmental protection, control of chemical hazards, live-animal drug-testing procedures, and identification of physical hazards.

  14. An empirical comparison of statistical tests for assessing the proportional hazards assumption of Cox's model.

    PubMed

    Ng'andu, N H

    1997-03-30

    In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics to check the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and the following types of departures from the proportional hazard assumption: increasing relative hazards; decreasing relative hazards; crossing hazards; diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test and the linear correlation test have equally good power for detection of non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.
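
    As a concrete illustration of one of the compared diagnostics, the time-dependent covariate / scaled Schoenfeld residual test, here is a short sketch using the Python lifelines library on its bundled Rossi recidivism data set (a stand-in for the paper's simulated data):

```python
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test
from lifelines.datasets import load_rossi

# Fit a Cox PH model, then test the PH assumption by correlating the
# scaled Schoenfeld residuals with a transform of the failure times.
df = load_rossi()
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

# 'rank' transforms the failure times before the correlation, one of
# several time partitions such comparisons contrast.
result = proportional_hazard_test(cph, df, time_transform="rank")
result.print_summary()
```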

  15. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

    NASA Astrophysics Data System (ADS)

    Faruk, Alfensi

    2018-03-01

    Survival analysis is a branch of statistics focused on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular model for analyzing the effects of several covariates on survival time. However, the assumption of a constant hazard ratio (proportional hazards) in the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. The accelerated failure time (AFT) models, on the other hand, do not assume proportional hazards, and can be used as an alternative to the PH model when that assumption is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the considered models. Results of the best-fitted model (log-normal AFT) showed that covariates such as the woman’s educational level, the husband’s educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
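
    The AIC-based model selection the abstract describes can be sketched with the lifelines library. The bundled Rossi data set stands in for the (unavailable) FBI data, and the exponential model is omitted since it is the Weibull AFT with its shape parameter fixed to one:

```python
from lifelines import WeibullAFTFitter, LogNormalAFTFitter
from lifelines.datasets import load_rossi

df = load_rossi()

# Fit each parametric AFT model and compare AICs; the lowest AIC
# identifies the best-fitting distribution for the event times.
for Fitter in (WeibullAFTFitter, LogNormalAFTFitter):
    aft = Fitter().fit(df, duration_col="week", event_col="arrest")
    print(f"{Fitter.__name__:20s} AIC = {aft.AIC_:.1f}")
```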

  16. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
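
    A compact sketch of the standard two-stage decomposition underlying such an analysis, written on the survival-probability scale. The notation is assumed here: S_{a,M(a*)}(t) denotes the counterfactual survival probability under exposure a with the mediator set to its value under exposure a*:

```latex
% Minimal sketch; TE(t) = NDE(t) + NIE(t) by construction.
\begin{align}
  \mathrm{TE}(t)  &= S_{1,M(1)}(t) - S_{0,M(0)}(t), \\
  \mathrm{NDE}(t) &= S_{1,M(0)}(t) - S_{0,M(0)}(t), \\
  \mathrm{NIE}(t) &= S_{1,M(1)}(t) - S_{1,M(0)}(t).
\end{align}
```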

  17. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  18. 78 FR 24693 - Draft Qualitative Risk Assessment of Risk of Activity/Food Combinations for Activities (Outside...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Analysis and Risk-Based Preventive Controls for Human Food'' (the proposed preventive controls rule) and... Farm.'' The purpose of the draft RA is to provide a science-based risk analysis of those activity/food... Food, Drug, and Cosmetic Act for hazard analysis and risk-based preventive controls (the proposed...

  19. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its functions, and disruption in one infrastructure system may have severe consequences. Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to screen the natural hazard scenarios threatening the infrastructures identified in the level 1 analysis and to investigate the need for further, i.e. level 3 quantitative, analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, its importance, and the interdependencies between society and infrastructure that create the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators and quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure (see the sketch below). Case studies for two Norwegian municipalities are presented, in which the risk to a primary road, water supply and the power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful screening tool for undesirable events, with the ultimate goal of reducing societal vulnerability.
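
    A minimal sketch of such a level-2 screening aggregation. The indicator names, 1-5 rankings and the multiplicative combination rule are illustrative assumptions; the paper's exact aggregation is not reproduced here:

```python
# Semi-quantitative vulnerability indicators, each ranked 1 (favourable)
# to 5 (unfavourable) against pre-defined criteria.
indicators = {
    "physical robustness": 4,
    "redundancy":          3,
    "societal dependency": 5,
    "cascading potential": 2,
}
vulnerability = sum(indicators.values()) / (5 * len(indicators))  # 0-1

frequency = 0.05   # hazard events per year (e.g. storm closing the road)
users = 12_000     # daily users of the infrastructure

# Combine vulnerability with the quantitative inputs into a relative
# screening score used to flag scenarios for level-3 analysis.
risk_score = frequency * vulnerability * users
print(f"vulnerability={vulnerability:.2f}, screening score={risk_score:.0f}")
```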

  20. Workplace health hazards: Analysis of hotline calls over a six-year period

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quint, J.; Handley, M.; Cummings, K.

    1990-02-01

    Between 1981 and 1986, a state-based occupational health telephone hotline received more than 8,000 inquiries on over 3,000 hazardous agents. Major caller groups were employees (37%), employers (20%), health care providers, primarily physicians (19%), government agencies (12%), and labor unions (6%). Employees were the fastest growing caller group. Callers inquired about the general health hazards of chemicals (65%), the relation of symptoms to work (22%), and risks to pregnancy (13%).

  1. Implementation of health and safety management system to reduce hazardous potential in PT.XYZ Indonesia

    NASA Astrophysics Data System (ADS)

    Widodo, L.; Adianto; Sartika, D. I.

    2017-12-01

    PT. XYZ is a large automotive manufacturing company that manufactures and assembles cars and exports them; its other products are spare parts, jigs and dies. PT. XYZ has long implemented an Occupational Safety and Health Management System (OSHMS) to reduce the potential hazards that cause work accidents. However, this does not mean that the implemented OSHMS does not need to be upgraded and improved, as the potential danger posed by the work is quite high. This research was conducted in the Sunter 2 Plant, where production activities carry a high level of potential hazard. Based on Hazard Identification, Risk Assessment, and Risk Control (HIRARC), 10 potential hazards were found in the Stamping Production plant, consisting of 4 very-high-risk potential hazards (E), 5 high-risk potential hazards (H), and 1 moderate-risk potential hazard (M). In the Casting Production plant, 22 potential hazards were found, consisting of 7 very-high-risk (E), 12 high-risk (H), and 3 moderate-risk (M) potential hazards. Based on the results of a Fault Tree Analysis (FTA), the main priorities are the high-risk (H) and very-high-risk (E) potential hazards. The proposed improvements are to create visual displays on the importance of always using the correct Personal Protective Equipment (PPE), to establish good working procedures, to conduct OSH training for workers on a regular basis, and to continue conducting safety campaigns.
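
    A minimal sketch of the HIRARC risk-rating step behind the E/H/M counts above. HIRARC guidance typically tabulates likelihood against severity; the band boundaries and example ratings below are assumptions, since the paper's exact matrix is not given here:

```python
# Likelihood and severity are each rated 1 (low) to 5 (high); their
# product is banded into the risk classes used in the abstract.
def hirarc_risk(likelihood: int, severity: int) -> str:
    score = likelihood * severity
    if score >= 17:
        return "E (very high risk)"
    if score >= 10:
        return "H (high risk)"
    if score >= 5:
        return "M (moderate risk)"
    return "L (low risk)"

print(hirarc_risk(4, 5))   # e.g. unguarded stamping press -> E
print(hirarc_risk(3, 2))   # e.g. manual handling strain   -> M
```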

  2. A Multihazard Regional Level Impact Assessment for South Asia

    NASA Astrophysics Data System (ADS)

    Amarnath, Giriraj; Alahacoon, Niranga; Aggarwal, Pramod; Smakhtin, Vladimir

    2016-04-01

    To prioritize climate adaptation strategies, there is a need for quantitative and systematic regional-level assessments that are comparable across multiple climatic hazard regimes. Assessing which countries in a region are most vulnerable to climate change requires analysis of multiple climatic hazards, including droughts, floods, extreme temperature, extreme rainfall and sea-level rise. These five climatic hazards, along with population densities, were modelled using GIS, which enabled a summary of the associated human exposure and agricultural losses. A combined index based on hazard, exposure and adaptive capacity is introduced to identify areas of extreme risk. The analysis yields population climate-hazard exposure, defined as the relative likelihood that a person in a given location was exposed to a given climate-hazard event in a given period of time. The study presents a detailed and coherent approach to fine-scale climate hazard mapping and identification of risk areas for the regions of South Asia that, for the first time, combines the following unique features: (a) methodological consistency across different climate-related hazards, (b) assessment of total exposure in terms of population and agricultural losses, (c) regional-level spatial coverage, and (d) development of customized tools using an ArcGIS toolbox that allow assessment of changes in exposure over time and easy replacement of existing datasets with newly released or superior datasets. The resulting maps enable comparison of the most vulnerable regions in South Asia to climate-related hazards, which is among the most urgent of policy needs. Subnational areas (regions/districts/provinces) most vulnerable to climate change impacts in South Asia are documented. The approach involves overlaying climate hazard maps, sensitivity maps, and adaptive capacity maps following the vulnerability assessment framework of the United Nations' Intergovernmental Panel on Climate Change (IPCC). The study used data on the spatial distribution of various climate-related hazards in 1,398 subnational areas of Bangladesh, Bhutan, India, Nepal, Pakistan and Sri Lanka. An analysis of country-level population exposure showed that approximately 750 million people are affected by combined climate hazards. Of the affected population, 72% are in India, followed by 12% each in Bangladesh and Pakistan. Due in part to the economic importance of agriculture, it was found to be the sector most vulnerable and exposed to climate extremes. An analysis of individual hazards indicates that floods and droughts are the dominant hazards impacting agricultural areas, followed by extreme rainfall, extreme temperature and sea-level rise. Based on this vulnerability assessment, all the regions of Bangladesh; the Indian states of Andhra Pradesh, Bihar, Maharashtra, Karnataka and Orissa; Ampara, Puttalam, Trincomalee, Mannar and Batticaloa in Sri Lanka; Sind and Baluchistan in Pakistan; central and east Nepal; and the transboundary river basins of the Indus, Ganges and Brahmaputra are among the most vulnerable regions in South Asia.

  3. Programmatic Environmental Assessment (EA) for Hazardous Materials Removal at F. E. Warren Air Force Base, Wyoming

    DTIC Science & Technology

    2013-05-31

ACM). The FEW Environmental Planning Function (EPF) conducted the analysis of this proposed action. 2. PURPOSE AND NEED FOR ACTION. The... EPF determined that the proposed action has the potential to affect Air Quality, Occupational Safety and Health, Cultural Resources and Hazardous Waste

  4. NANOSTRUCTURED PLANAR WAVEGUIDE DEVICE FOR MOLECULAR IDENTIFICATION OF HAZARDOUS COMPOUNDS IN WATER BY EVANESCENT SURFACE ENHANCED RAMAN SPECTROSCOPY - PHASE I

    EPA Science Inventory

    Senspex, Inc. proposes to investigate a novel diagnostic tool based upon evanescent field planar waveguide sensing and complementary nanostructured mediated molecular vibration spectroscopy methods for rapid detection and analysis of hazardous biological and chemical targets i...

  5. Maps Showing Seismic Landslide Hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.; Michael, John A.

    2009-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  6. Maps showing seismic landslide hazards in Anchorage, Alaska

    USGS Publications Warehouse

    Jibson, Randall W.

    2014-01-01

    The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.

  7. The role of emotions in risk communication.

    PubMed

    Xie, Xiao-Fei; Wang, Mei; Zhang, Ruo-Gu; Li, Jie; Yu, Qing-Yuan

    2011-03-01

    We present two experiments investigating the role of emotions concerning technological and natural hazards. In the first experiment, technological hazards aroused stronger emotions and were considered to be riskier than natural hazards. No differences were found between text and audio presentations. However, the presence of pictures aroused stronger emotions and increased the perceived risk. Emotions play a mediating role between hazard types and perceived risk, as well as between pictures and perceived risk. The second experiment adopted real-world materials from webpages and TV. Emotions again played a mediating role between pictorial information and risk perception. Moreover, specific emotions were found to be associated with different types of action tendencies. For example, loss-based emotions (e.g., fear, regret) tend to lead to prevention strategies, whereas ethical emotions (e.g., anger) lead to aggressive behavior. We also find that loss-based emotions in the technological hazard scenario trigger more coping strategies (from prevention to retaliation) than in the natural hazard scenario. © 2010 Society for Risk Analysis.

  8. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    PubMed

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462) (p < 0.00001). Three quarters (76%) of retail premises had hazard analysis systems in place compared with 59% of catering premises (p < 0.00001). In 87% of retail premises the manager had received some form of food hygiene training compared with 80% of catering premises (p < 0.00001). From premises where the manager had received no food hygiene training a greater proportion of samples were of unsatisfactory and unacceptable microbiological quality (20% retail, 27% catering) compared with premises where the manager had received food hygiene training (11% retail, 19% catering) (p < 0.00001). Where the manager of the premises had received food hygiene training, documented hazard analysis systems were more likely to be in place (p < 0.00001). Higher proportions of samples of unsatisfactory and unacceptable microbiological quality (17% retail, 22% catering) were from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering) (p < 0.00001). Our meta-analysis suggests that the lower microbiological quality of ready-to-eat foods from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.

  9. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount of displacement and the associated probabilities based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions present a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
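
    A compact sketch of the rigid sliding-block (Newmark) calculation at the core of such maps: the block accumulates displacement whenever ground acceleration exceeds the slope's yield acceleration. The synthetic ground motion and yield value below are placeholders for real records and site properties:

```python
import numpy as np

dt = 0.01                                   # time step, s
t = np.arange(0.0, 20.0, dt)
g = 9.81
# Synthetic, decaying sinusoidal ground motion (stand-in for a record).
a_ground = 0.4 * g * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)

a_crit = 0.12 * g                           # yield acceleration of slope

# Newmark integration: the block slides while its relative velocity is
# positive, gaining velocity whenever a > a_crit and braking otherwise.
vel, disp = 0.0, 0.0
for a in a_ground:
    if vel > 0.0 or a > a_crit:             # block is sliding
        vel += (a - a_crit) * dt            # relative acceleration
        vel = max(vel, 0.0)                 # one-directional sliding
        disp += vel * dt
print(f"Newmark displacement ~ {disp:.2f} m")
```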

  10. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  11. Hazard function theory for nonstationary natural hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Laura K.; Vogel, Richard M.

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  12. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software, or that software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found a number of traceability risks in the hazard reports that may impede verification of software and system safety.

  13. An Integrated GIS-Expert System Framework for Live Hazard Monitoring and Detection.

    PubMed

    McCarthy, James D; Graniero, Phil A; Rozic, Steven M

    2008-02-08

    In the context of hazard monitoring, using sensor web technology to monitor and detect hazardous conditions in near-real-time can result in large amounts of spatial data that can be used to drive analysis at an instrumented site. These data can be used for decision making and problem solving; however, as with any analysis problem, the success of analyzing hazard potential is governed by many factors such as: the quality of the sensor data used as input; the meaning that can be derived from those data; the reliability of the model used to describe the problem; the strength of the analysis methods; and the ability to effectively communicate the end results of the analysis. For decision makers to make use of sensor web data these issues must be dealt with to some degree. The work described in this paper addresses all of these areas by showing how raw sensor data can be automatically transformed into a representation which matches a predefined model of the problem context. This model can be understood by analysis software that leverages rule-based logic and inference techniques to reason with, and draw conclusions about, spatial data. These tools are integrated with a well-known Geographic Information System (GIS) and existing geospatial and sensor web infrastructure standards, providing expert users with the tools needed to thoroughly explore a problem site and investigate hazards in any domain.

  14. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5, and 1x10^-5.

  15. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5, and 1x10^-5.

  16. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum

    Landslide is one of the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in research on landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR); the study area is Penang Island, Malaysia. The frequency ratio approach is a variant of the probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four hazard categories, i.e. highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The map was assessed using the receiver operating characteristic (ROC) curve, based on the area under the curve (AUC) method. The result indicates an increase in accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.
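
    As a hedged illustration of the frequency-ratio computation described above (synthetic rasters, not Penang Island data): the ratio for each class of a causative factor is the share of landslide pixels falling in that class divided by the share of total pixels in that class.

```python
# Sketch of the frequency-ratio (FR) computation for one causative
# factor. The rasters below are synthetic stand-ins for the DEM-derived
# factor maps and landslide inventory the abstract describes.
import numpy as np

rng = np.random.default_rng(0)
slope_class = rng.integers(0, 4, size=(100, 100))   # classified factor raster
landslide = rng.random((100, 100)) < 0.05           # landslide inventory mask

fr = {}
for c in np.unique(slope_class):
    in_class = slope_class == c
    share_of_landslides = landslide[in_class].sum() / landslide.sum()
    share_of_area = in_class.sum() / in_class.size
    fr[int(c)] = share_of_landslides / share_of_area  # FR > 1: landslide-prone

# A cell's hazard index is the sum of FR values over all factors;
# only one factor is shown here.
hazard_index = np.vectorize(lambda v: fr[int(v)])(slope_class)
print(fr)
```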

  17. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    NASA Astrophysics Data System (ADS)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation, and are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards as well as web-based risk maps and decision support systems. In addition, the European Commission implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that integrates the user into the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web-processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data, other geospatial data on geo-hazards, as well as descriptions and protocols for the data processing and analysis. An interface to extend the data integration from external sources (e.g. Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. The service tier comprises sub-services and main services. Sub-services include, for example, map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources and (4) an infrastructure service to identify infrastructure affected by landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards makes use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, so that it can assist targeted mapping and monitoring of geo-hazards in a global context.

  18. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings and also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunamis. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
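
    The Green's-function summation at the heart of this approach is linear, which is what makes thousands of scenarios tractable; below is a minimal sketch, with synthetic arrays standing in for the precomputed subfault waveform database the authors describe.

```python
# Sketch of Green's-function summation: the tsunami waveform at a
# coastal point is the slip-weighted sum of precomputed unit-slip
# subfault waveforms. All values here are synthetic placeholders.
import numpy as np

n_subfaults, n_samples = 20, 1024
rng = np.random.default_rng(0)

# Precomputed unit-slip tsunami waveforms, one row per subfault.
unit_waveforms = rng.standard_normal((n_subfaults, n_samples))

# One earthquake scenario = one slip distribution over the subfaults.
slip = rng.uniform(0.0, 5.0, size=n_subfaults)   # metres of slip (assumed)

# Linearity lets thousands of scenarios be synthesized as cheap
# weighted sums instead of full hydrodynamic runs.
scenario_waveform = slip @ unit_waveforms        # shape: (n_samples,)
print(scenario_waveform.max())                   # peak height for this scenario
```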

  19. A New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically based strain rate model. Small- and medium-sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
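
    For readers unfamiliar with the TGR form, the following sketch (illustrative parameters only, not the model's fitted values) shows how a zone's activity rate tapers above a corner magnitude.

```python
# Sketch of the tapered Gutenberg-Richter (TGR) relation: the survivor
# function in seismic moment is (Mt/M)**beta * exp((Mt - M)/Mc), i.e. a
# power law tapered by a corner moment Mc. Parameters are illustrative.
import numpy as np

def moment(mw):
    """Seismic moment in N*m from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.05)

def tgr_rate(mw, rate_mt, b, mw_t, mw_corner):
    """Annual rate of events with magnitude >= mw in a source zone."""
    beta = 2.0 / 3.0 * b                 # moment-space slope from the b-value
    m, mt, mc = moment(mw), moment(mw_t), moment(mw_corner)
    return rate_mt * (mt / m) ** beta * np.exp((mt - m) / mc)

# Example: a zone with 10 events/yr above Mw 5, b = 1, corner at Mw 8.
for mw in (5.0, 6.0, 7.0, 8.0):
    print(mw, tgr_rate(mw, rate_mt=10.0, b=1.0, mw_t=5.0, mw_corner=8.0))
```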

  20. System hazards in managing laboratory test requests and results in primary care: medical protection database analysis and conceptual model.

    PubMed

    Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John

    2015-11-27

    To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients, and to integrate these data with other published evidence sources to inform the design of a systems-based conceptual model of related hazards. A retrospective database analysis. General practices in the UK and Ireland. 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. CRSA visits were undertaken to 778 UK and Ireland general practices, and a range of systems hazards were recorded across the laboratory test ordering and results management systems in 647 of those practices (83.2%). A total of 45 discrete hazard categories was identified, with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was an inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the 'postanalytical test stage' (n=702, 43.8%), followed closely by 'communication outcomes issues' (n=628, 39.1%). Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  1. Workplace health hazards: analysis of hotline calls over a six-year period.

    PubMed Central

    Quint, J; Handley, M; Cummings, K

    1990-01-01

    Between 1981 and 1986, a state-based occupational health telephone hotline received more than 8,000 inquiries on over 3,000 hazardous agents. Major caller groups were employees (37%), employers (20%), health care providers, primarily physicians (19%), government agencies (12%), and labor unions (6%). Employees were the fastest growing caller group. Callers inquired about general health hazards of chemicals (65%), the relation of symptoms to work (22%), and risks to pregnancy (13%). PMID:2297067

  2. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.

  3. Rapid field-based landslide hazard assessment in response to post-earthquake emergency

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Gambini, Stefano; Cancelliere, Giorgio

    2016-04-01

    On April 25, 2015, a Mw 7.8 earthquake occurred 80 km to the northwest of Kathmandu (Nepal). The largest aftershock, which occurred on May 12, 2015, was the Mw 7.3 Nepal earthquake (SE of Zham, China), 80 km to the east of Kathmandu. The earthquakes killed ~9,000 people and severely damaged a 10,000 km2 region in Nepal and neighboring countries. Several thousand landslides were triggered during the event, causing widespread damage to mountain villages and the evacuation of thousands of people. Rasuwa was one of the most damaged districts. This contribution describes the landslide hazard analysis of the Saramthali, Yarsa and Bhorle VDCs (122 km2, Rasuwa district). Hazard is expressed in terms of qualitative classes (low, medium, high), through a simple matrix approach that combines frequency classes and magnitude classes. The hazard analysis is based primarily on the experience gained during a field survey conducted in September 2015. During the survey, local knowledge was systematically exploited through interviews with local people who had experienced the earthquake and the coseismic landslides. People helped us recognize fractures and active deformations, and allowed us to reconstruct a correct chronicle of landslide events, in order to assign each landslide event to the first shock, the second shock, or the post-earthquake 2015 monsoon. The field experience was complemented with a standard analysis of the relationship between potential controlling factors and the distribution of landslides reported in Kargel et al. (2016). This analysis allowed us to recognize the most important controlling factors. This information was integrated with the field observations to verify the mapped units and to complete the mapping in areas not accessible for field activity. Finally, the work was completed with the analysis and use of a detailed landslide inventory produced by the University of Milano-Bicocca that covers most of the area affected by coseismic landslides in Nepal (Valagussa et al., 2016). As a result, a 1:10,000 hazard map was produced. About 47% of the area is classified as high hazard, almost 19% as medium hazard, and 34% as low hazard. In addition, the hazard map reports 262 polygons of active coseismic or postseismic landslides.
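
    A minimal sketch of the matrix approach mentioned above, with assumed class boundaries and cell values, since the paper's actual matrix is not reproduced here.

```python
# Sketch: qualitative hazard read off a frequency-class x
# magnitude-class lookup table. Cell values are assumptions.

HAZARD_MATRIX = {
    #  magnitude:  low        medium     high
    "low":       ("low",     "low",     "medium"),
    "medium":    ("low",     "medium",  "high"),
    "high":      ("medium",  "high",    "high"),
}
MAG_INDEX = {"low": 0, "medium": 1, "high": 2}

def hazard_class(freq_class: str, mag_class: str) -> str:
    """Combine a frequency class and a magnitude class into a hazard class."""
    return HAZARD_MATRIX[freq_class][MAG_INDEX[mag_class]]

print(hazard_class("high", "medium"))   # -> "high"
```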

  4. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  5. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  6. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  7. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  8. Design and application analysis of prediction system of geo-hazards based on GIS in the Three Gorges Reservoir

    NASA Astrophysics Data System (ADS)

    Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun

    2009-10-01

    Although the Three Gorges Dam across the Yangtze River in China harnesses a huge source of hydroelectric power and reduces the loss of life and damage caused by floods, it also causes environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. To predict and prevent geo-hazards, the establishment of a geo-hazard prediction system is necessary. To implement functions for regional and urban geo-hazard prediction, single geo-hazard prediction, prediction of landslide surge, and risk evaluation, the logical layers of the system consist of a data capturing layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because the system draws on multi-source spatial data, this paper also addresses the transformation and fusion of such data. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of historical records of past landslides, terrain parameters, geology, rainfall, and anthropogenic activity. A detailed discussion is given of the spatial distribution characteristics of landslide hazard in the new town of Badong. These results can be used for risk evaluation. The system can be implemented as an early-warning and emergency management tool by the relevant authorities of the Three Gorges Reservoir in the future.

  9. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... associated with the product, the product's subsystems, or the product's components, in order to preserve the... if they have been agreed to previously with FRA. Based on these analyses, the reviewer shall identify...) The reviewer shall analyze the Hazard Log and/or any other hazard analysis documents for...

  10. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... associated with the product, the product's subsystems, or the product's components, in order to preserve the... if they have been agreed to previously with FRA. Based on these analyses, the reviewer shall identify...) The reviewer shall analyze the Hazard Log and/or any other hazard analysis documents for...

  11. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... associated with the product, the product's subsystems, or the product's components, in order to preserve the... if they have been agreed to previously with FRA. Based on these analyses, the reviewer shall identify...) The reviewer shall analyze the Hazard Log and/or any other hazard analysis documents for...

  12. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... associated with the product, the product's subsystems, or the product's components, in order to preserve the... if they have been agreed to previously with FRA. Based on these analyses, the reviewer shall identify...) The reviewer shall analyze the Hazard Log and/or any other hazard analysis documents for...

  13. Canister Storage Building (CSB) Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members that have been involved over the two-year long process. The second list is a subset of the first and consists of those hazard analysis team members that reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.

  14. Airborne Forward-Looking Interferometer for the Detection of Terminal-Area Hazards

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Lane, Sarah; Smith, Bill L.; Kireev, Stanislav; Daniels, Taumi S.; Cornman, Larry; Sharman, Bob

    2014-01-01

    The Forward Looking Interferometer (FLI) program was a multi-year cooperative research effort to investigate the use of imaging radiometers with high spectral resolution, using both modeling/simulation and field experiments, along with sophisticated data analysis techniques that were originally developed for analysis of data from space-based radiometers and hyperspectral imagers. This investigation has advanced the state of knowledge in this technical area: the FLI program developed a greatly improved understanding of the radiometric signal strength of aviation hazards in a wide range of scenarios, in addition to a much better understanding of the real-world functionality requirements for hazard detection instruments. The project conducted field experiments on three hazards (turbulence, runway conditions, and wake vortices) and analytical studies on several others, including volcanic ash, reduced visibility conditions, and in-flight icing conditions.

  15. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
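
    The limit rests on simple growth arithmetic; here is a small worked example, with an assumed generation time rather than the study's measured values.

```python
# Worked example of the growth arithmetic behind such a limit: from an
# initial count N0, a doubling (generation) time g gives
# N(t) = N0 * 2**(t / g). The 20-minute generation time is an assumed
# value for an abusive temperature, not a figure from the Skylab study.
def population(n0, hours, generation_time_h):
    return n0 * 2 ** (hours / generation_time_h)

# Starting at the 100 CFU/g Skylab limit, two hours of warming with an
# assumed 20-minute generation time yields six doublings:
print(population(100, hours=2.0, generation_time_h=1.0 / 3.0))  # 6400 CFU/g
```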

  16. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  17. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, in a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by the coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V, at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city from gauge data alone. In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed by a 2D hydrodynamic model for Can Tho city. As the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e., they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peaks-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps were then generated with the same 2D hydrodynamic model for the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events. These scenarios were also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
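
    One way to realize the bivariate Monte-Carlo step described above is with a Gaussian copula over assumed Gumbel marginals; the sketch below uses placeholder parameters, not the fitted Mekong values, and is only one of several possible dependence models.

```python
# Sketch: correlated (Q, V) pairs drawn via a Gaussian copula and mapped
# to assumed Gumbel marginals. Marginals, parameters, and correlation
# are illustrative placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, rho = 10_000, 0.8                      # scenarios and assumed Q-V dependence

# Correlated standard normals -> uniforms -> marginal quantiles.
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)

q = stats.gumbel_r.ppf(u[:, 0], loc=45_000, scale=8_000)   # peak discharge, m3/s
v = stats.gumbel_r.ppf(u[:, 1], loc=300, scale=60)         # flood volume, km3

# Empirical joint exceedance probability of a critical (Q, V) pair.
p_exc = np.mean((q > 60_000) & (v > 400))
print(p_exc)
```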

  18. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point approach, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force, and by starting the implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
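
    A minimal sketch of the RPN scoring underlying this analysis; the scales and example scores below are assumptions for illustration, not the blood center's data.

```python
# Sketch of Risk Priority Number (RPN) scoring: severity, occurrence,
# and detectability are scored on quantitative scales and multiplied.
# Example scores are invented for illustration.
def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """RPN on 1-10 scales; higher scores are addressed first."""
    return severity * occurrence * detectability

hazards = {
    "loss of dose":              rpn(9, 4, 6),
    "tracking/labelling error":  rpn(8, 3, 7),
    "manual data transcription": rpn(6, 6, 5),
}
for name, score in sorted(hazards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: RPN = {score}")
```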

  19. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    PubMed

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  20. The role of building models in the evaluation of heat-related risks

    NASA Astrophysics Data System (ADS)

    Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix

    2016-04-01

    Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the risk concept developed can still be based on the outdoor conditions but also includes exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models with differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard better explains the variability in the risk data compared to outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.
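
    A "simplified parameter model" in this sense can be as small as a first-order thermal lag; the sketch below (assumed time constant and synthetic outdoor series) low-pass filters outdoor temperature to produce an indoor hazard series for the regression.

```python
# Sketch of a first-order building model: discretized
# dT_in/dt = (T_out - T_in) / tau, so indoor temperature is a damped,
# delayed version of outdoor temperature. tau and the input series are
# assumptions for illustration.
import numpy as np

def indoor_series(t_out, tau_h=60.0, dt_h=1.0, t0=22.0):
    """Forward-Euler integration of the first-order lag model."""
    t_in = np.empty_like(t_out)
    t_in[0] = t0
    for k in range(1, len(t_out)):
        t_in[k] = t_in[k - 1] + dt_h / tau_h * (t_out[k - 1] - t_in[k - 1])
    return t_in

hours = np.arange(24 * 10)
t_out = 25.0 + 8.0 * np.sin(2 * np.pi * hours / 24)   # synthetic heat wave
t_in = indoor_series(t_out)
print(t_in.max(), t_out.max())   # indoor peak is damped and delayed
```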

  1. Mars Exploration Rovers Landing Dispersion Analysis

    NASA Technical Reports Server (NTRS)

    Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.

    2004-01-01

    Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.
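
    The convolution step described above reduces to a probability-weighted sum over hazard areas; here is a minimal sketch with invented numbers, not MER mission data.

```python
# Sketch: overall landing risk per ellipse as the sum over hazard areas
# of P(land in area) * P(out-of-spec landing | area). All probabilities
# below are illustrative placeholders.
import numpy as np

p_land_in_area = np.array([0.62, 0.25, 0.10, 0.03])   # from dispersion analysis
p_out_of_spec = np.array([0.01, 0.05, 0.20, 0.60])    # per terrain type

overall_risk = float(p_land_in_area @ p_out_of_spec)
print(f"P(out-of-spec landing) = {overall_risk:.4f}")
```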

  2. Hazard detection and avoidance sensor for NASA's planetary landers

    NASA Technical Reports Server (NTRS)

    Lau, Brian; Chao, Tien-Hsin

    1992-01-01

    An optical terrain-analysis-based sensor system specifically designed for landing hazard detection, as required for NASA's autonomous planetary landers, is introduced. This optical hazard detection and avoidance (HDA) sensor utilizes an optoelectronic wedge-and-ring detector (WRD) filter for Fourier-transform feature extraction and an electronic neural network processor for pattern classification. A fully implemented optical HDA sensor would assure safe landing of the planetary landers. Computer simulation results of a successful feasibility study are reported. Future research for hardware system implementation is also outlined.

  3. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  4. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; ...

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
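
    A hedged sketch of the ranking-and-clustering workflow both records describe, using a synthetic feature matrix in place of the study's actual EZ Metric scores.

```python
# Sketch: nanomaterials described by toxicity-endpoint scores, grouped
# by hierarchical clustering and ranked by a combined hazard score.
# The 68 x 5 feature matrix is synthetic, not the study's data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
scores = rng.random((68, 5))           # 68 nanomaterials x 5 endpoint scores

z = linkage(scores, method="ward")     # agglomerative (Ward) clustering
clusters = fcluster(z, t=4, criterion="maxclust")

# Rank materials by a combined hazard score (row sum of endpoint scores).
hazard_rank = np.argsort(-scores.sum(axis=1))
print(clusters[:10], hazard_rank[:10])
```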

  5. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  6. Extended GTST-MLD for aerospace system safety analysis.

    PubMed

    Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo

    2012-06-01

    The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying aerospace system hazard interactions during the design stage, based on extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. An ontology is used to extend GTST-MLD's ability to describe system interactions by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.

  7. Consistency between subjectively and objectively measured hazard perception skills among young male drivers.

    PubMed

    Ābele, Līva; Haustein, Sonja; Møller, Mette; Martinussen, Laila M

    2018-03-01

    Young male drivers have lower hazard perception skills (HPS) than older and more experienced drivers and a tendency to overestimate their skills in hazardous situations. Both factors contribute to an over-representation in traffic accidents. Based on a sample of 63 drivers aged 18-24, this study compares HPS measured by objective and subjective means; the link between these measures is the key contribution of the study. Both visible and hidden hazards are included. Objective measures of HPS include responsiveness and eye movements while driving in a driving simulator. Subjective measures of HPS include self-reports derived from the Hazard Perception Questionnaire (HPQ), the Driving Skill Questionnaire (DSQ), and the Brief Sensation Seeking Scale (BSSS). Results show that drivers who respond to the hazards in time, as compared to drivers who do not respond, have higher scores on subjective measures of HPS and higher driving skills in the visible but not in the hidden condition. Eye movement analysis confirms this difference and shows that responding to hazards in time indicates higher HPS, and that young drivers are poor at detecting hidden hazards. Drivers who respond in time locate the hazard faster and have more fixations, but dwell less on the hazard, while those who do not respond have a later first fixation and fewer but longer fixations on the hazard. High-sensation-seeking drivers respond to visible hazards in time, suggesting that sensation seeking does not affect HPS negatively when the hazard is visible. To enhance HPS among young drivers, the results of this study suggest that specific hazard perception training is relevant, especially for hazards that require more advanced HPS. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Early identification systems for emerging foodborne hazards.

    PubMed

    Marvin, H J P; Kleter, G A; Prandini, A; Dekkers, S; Bolton, D J

    2009-05-01

    This paper provides a non-exhaustive overview of early warning systems for emerging foodborne hazards that are operating in various places around the world. Special attention is given to endpoint-focussed early warning systems (i.e. ECDC, ISIS and GPHIN) and hazard-focussed early warning systems (i.e. FVO, RASFF and OIE), and their merit in successfully identifying a food safety problem at an early stage is discussed. Besides these early warning systems, which are based on monitoring of either disease symptoms or hazards, early warning systems and/or activities that intend to predict the occurrence of a food safety hazard at the very beginning of its development, or before that, are also described. Examples are trend analysis, horizon scanning, early warning systems for mycotoxins in maize and/or wheat, and information exchange networks (e.g. OIE and GIEWS). Furthermore, recent initiatives that aim to develop predictive early warning systems based on the holistic principle are discussed. The assumption of the researchers applying this principle is that developments outside the food production chain that are either directly or indirectly related to the development of a particular food safety hazard may also provide valuable information to predict the development of this hazard.

  9. Hazard Analysis Guidelines for Transit Projects

    DOT National Transportation Integrated Search

    2000-01-01

    These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...

  10. Non-front-fanged colubroid snakes: a current evidence-based analysis of medical significance.

    PubMed

    Weinstein, Scott A; White, Julian; Keyler, Daniel E; Warrell, David A

    2013-07-01

    Non-front-fanged colubroid snakes (NFFC; formerly and artificially taxonomically assembled as "colubrids") comprise about 70% of extant snake species and include several taxa now known to cause lethal or life threatening envenoming in humans. Although the medical risks of bites by only a handful of species have been documented, a growing number of NFFC are implicated in medically significant bites. The majority of these snakes have oral products (Duvernoy's secretions, or venoms) with unknown biomedical properties, and their potential for causing harm in humans is unknown. Increasingly, multiple NFFC species are entering the commercial snake trade, posing an uncertain risk. Published case reports describing NFFC bites were assessed for evidence-based value, clinical detail and verified species identification. These data were subjected to meta-analysis and a hazard index was generated for select taxa. Cases on which we consulted or personally treated were included and subjected to the same assessment criteria. Cases involving approximately 120 species met the selection criteria, and a small subset, designated Hazard Level 1 (most hazardous), contained 5 species with lethal potential. Recommended management of these cases included antivenom for 3 species, Dispholidus typus, Rhabdophis tigrinus and Rhabdophis subminiatus, whereas others in this subset without commercially available antivenoms (Thelotornis spp.) were treated with plasma/erythrocyte replacement therapy and supportive care. Heparin, antifibrinolytics and/or plasmapheresis/exchange transfusion have been used in the management of some Hazard Level 1 envenomings, but evidence-based analysis positively contraindicates the use of any of these interventions. Hazard Level 2/3 species were involved in cases containing mixed quality data that implicated these taxa (e.g. Boiga irregularis, Philodryas olfersii, Malpolon monspessulanus) in bites that caused rare systemic effects. Recommended management may include use of acetylcholinesterase inhibitors (e.g. neostigmine) and wound care on a case-by-case basis. Hazard Level 3 species comprised a larger group capable of producing significant local effects only, often associated with a protracted bite (e.g. Heterodon nasicus, Borikenophis (Alsophis) portoricensis, Platyceps (Coluber) rhodorachis). Management is restricted to wound care. Hazard Level 4 species comprised the majority of surveyed taxa, and their bites showed only minor effects of no clinical importance. This study has produced a comprehensive evidence-based listing of NFFC snakes tabulated against the medical significance of bites, together with best-practice management recommendations. This analysis assumes increasing importance, as there is growing exposure to lesser-known NFFC snakes, particularly in captive collections, which may uncover further species of significance in the future. Careful and accurate documentation of bites by verified species of NFFC snakes is required to increase the evidence base and establish the best medical management approach for each species. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  11. Evaluation of In Vitro Biotransformation Using HepaRG Cells to Improve High-Throughput Chemical Hazard Prediction: A Toxicogenomics Analysis (SOT)

    EPA Science Inventory

    The US EPA’s ToxCast program has generated a wealth of data in >600 in vitro assays on a library of 1060 environmentally relevant chemicals and failed pharmaceuticals to facilitate hazard identification. An inherent criticism of many in vitro-based strategies is the inability of a...

  12. Broad-Scale Assessment of Fuel Treatment Opportunities

    Treesearch

    Patrick D. Miles; Kenneth E. Skog; Wayne D. Shepperd; Elizabeth D. Reinhardt; Roger D. Fight

    2006-01-01

    The Forest Inventory and Analysis (FIA) program has produced estimates of the extent and composition of the Nation's forests for several decades. FIA data have been used with a flexible silvicultural thinning option, a fire hazard model for preharvest and postharvest fire hazard assessment, a harvest economics model, and geospatial data to produce a Web-based tool to...

  13. Statistical analysis of earthquakes after the 1999 MW 7.7 Chi-Chi, Taiwan, earthquake based on a modified Reasenberg-Jones model

    NASA Astrophysics Data System (ADS)

    Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq

    2015-12-01

    We investigated the temporal-spatial hazard of the earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) and a time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing the earthquake hazard after a large shock. However, we found that the b values in the frequency-magnitude distribution of the earthquakes in the study region dramatically decreased from background values after the Chi-Chi shock, and then gradually increased. The observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ model (MRJ) to assess the earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were separately used to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. The Receiver Operating Characteristic (ROC) curves (Swets, 1988) finally demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes in a short time after the Chi-Chi shock.
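
    The RJ rate underlying this comparison couples the Gutenberg-Richter magnitude-frequency law with Omori-Utsu temporal decay, lambda(t, M) = 10^(a + b(Mm - M)) / (t + c)^p; the MRJ variant additionally lets the b value vary with time. A minimal sketch of the baseline RJ form, with invented parameter values rather than the paper's fitted ones:

    ```python
    def rj_rate(t, m, a=-1.67, b=0.91, c=0.05, p=1.08, m_main=7.7):
        # Daily rate of aftershocks with magnitude >= m at t days after
        # the mainshock (Reasenberg-Jones form); parameters are illustrative.
        return 10.0 ** (a + b * (m_main - m)) / (t + c) ** p

    def expected_count(t1, t2, m, a=-1.67, b=0.91, c=0.05, p=1.08, m_main=7.7):
        # Closed-form integral of rj_rate over [t1, t2] (valid for p != 1).
        k = 10.0 ** (a + b * (m_main - m))
        return k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

    # Expected number of M >= 5 aftershocks in the first 30 days:
    print(expected_count(0.0, 30.0, 5.0))
    ```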

  14. Evaluation of a Home-Based Environmental and Educational Intervention to Improve Health in Vulnerable Households: Southeastern Pennsylvania Lead and Healthy Homes Program.

    PubMed

    Mankikar, Deepa; Campbell, Carla; Greenberg, Rachael

    2016-09-09

    This evaluation examined whether participation in a home-based environmental educational intervention would reduce exposure to health and safety hazards and asthma-related medical visits. The home intervention program focused on vulnerable, low-income households, where children had asthma, were at risk for lead poisoning, or faced multiple unsafe housing conditions. Home visitors conducted two home visits, two months apart, consisting of an environmental home assessment, Healthy Homes education, and distribution of Healthy Homes supplies. Measured outcomes included changes in participant knowledge and awareness of environmental home-based hazards, rate of children's asthma-related medical use, and the presence of asthma triggers and safety hazards. Analysis of 2013-2014 baseline and post-intervention program data for a cohort of 150 families revealed a significantly lower three-month rate (p < 0.05) of children's asthma-related doctor visits and hospital admissions at program completion. In addition, there were significantly reduced reports of the presence of home-based hazards, including basement or roof leaks (p = 0.011), plumbing leaks (p = 0.019), and use of an oven to heat the home (p < 0.001). Participants' pre- and post-test scores showed significant improvement (p < 0.05) in knowledge and awareness of home hazards. Comprehensive home interventions may effectively reduce environmental home hazards and improve the health of asthmatic children in the short term.

  15. RiskScape: a new tool for comparing risk from natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Stirling, M. W.; King, A.

    2010-12-01

    RiskScape is a joint venture between New Zealand’s GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard-based risk and impact analysis. It has basic GIS functionality, in that it has import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami and windstorm. The software converts hazard exposure information into the likely impacts for a region, for example, damage and replacement costs, casualties, economic losses, disruption, and number of people affected. It can therefore be used to assist with risk management, land-use planning, building codes and design, risk identification, prioritization of risk reduction/mitigation, determination of “best use” risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions have been used to develop and trial RiskScape in New Zealand, and each region is exposed to a different mix of natural hazards. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape developments have thus far focused on scenario-based risk, future developments will advance the software into providing probabilistic-based solutions.

  16. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  17. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  18. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of ΣPAH8 in surface waters of Taihu Lake. The probability distributions of the hazard quotient obtained from MCS and LHS indicated that the confidence intervals of the hazard quotient at the 90% confidence level were in the ranges of 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval was (0.0015, 0.0163) at the 90% confidence level calculated using fuzzy set theory, and (0.00016, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method, being based on a different theory, has its own advantages and limitations; therefore, the appropriate method should be selected on a case-by-case basis to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of ΣPAH8 in surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.
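
    As a minimal sketch of the two probabilistic approaches (with invented lognormal input distributions, not the Taihu Lake data), the hazard quotient HQ = exposure / toxicity threshold can be propagated by plain Monte Carlo and by Latin hypercube sampling:

    ```python
    import numpy as np
    from scipy.stats import qmc, lognorm

    n = 100_000
    expo = lognorm(s=1.2, scale=0.05)   # exposure concentration (hypothetical)
    tox = lognorm(s=0.8, scale=1.0)     # toxicity threshold (hypothetical)

    # Plain Monte Carlo sampling
    rng = np.random.default_rng(42)
    hq_mcs = expo.rvs(n, random_state=rng) / tox.rvs(n, random_state=rng)

    # Latin hypercube sampling: stratified uniforms mapped through the
    # inverse CDFs of the two input distributions
    u = qmc.LatinHypercube(d=2, seed=42).random(n)
    hq_lhs = expo.ppf(u[:, 0]) / tox.ppf(u[:, 1])

    for name, hq in [("MCS", hq_mcs), ("LHS", hq_lhs)]:
        lo, hi = np.percentile(hq, [5, 95])
        print(f"{name}: mean={hq.mean():.2f}  90% CI=({lo:.5f}, {hi:.2f})  "
              f"P(HQ>1)={np.mean(hq > 1):.4f}")
    ```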

  19. Research on geological hazard identification based on deep learning

    NASA Astrophysics Data System (ADS)

    Zhu, Cheng; Cheng, Tao

    2018-05-01

    Geological hazards such as landslides, debris flows and collapses are potential hazards affecting the safety of nearby roads and people. The Land and Resources Bureau and other relevant departments undertake the responsibility of preventing and controlling geological disasters; how to survey sudden geological disasters in a region quickly and accurately, according to their characteristics and pre-established emergency measures, is an important issue to be solved. Based on an analysis of the types and effects of typical geological disasters, this paper studies methods of identifying typical geological disasters with artificial neural networks, and proposes and designs an intelligent geological survey method and system based on deep learning to provide mountain geological survey and information support to relevant departments such as the Land and Resources Bureau.

  20. Time-dependent seismic hazard analysis for the Greater Tehran and surrounding areas

    NASA Astrophysics Data System (ADS)

    Jalalalhosseini, Seyed Mostafa; Zafarani, Hamid; Zare, Mehdi

    2018-01-01

    This study presents a time-dependent approach for seismic hazard in Tehran and surrounding areas. Hazard is evaluated by combining background seismic activity with larger earthquakes that may emanate from fault segments. Using available historical and paleoseismological data or empirical relations, the recurrence times and maximum magnitudes of characteristic earthquakes for the major faults have been explored. The Brownian passage time (BPT) distribution has been used to calculate equivalent fictitious seismicity rates for the major faults in the region. To include ground motion uncertainty, a logic tree and five ground motion prediction equations have been selected based on their applicability in the region. Finally, hazard maps are presented.
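
    The BPT renewal step can be sketched directly: the BPT distribution with mean recurrence mu and aperiodicity alpha is the inverse Gaussian distribution with mean mu and shape mu/alpha^2, and the resulting conditional rupture probability over a forecast window can be converted into an equivalent Poisson ("fictitious") rate. All values below are hypothetical, not the Tehran fault parameters:

    ```python
    import numpy as np
    from scipy.stats import invgauss

    def bpt_conditional_prob(mu, alpha, t_elapsed, dt):
        # P(rupture in next dt years | quiet for t_elapsed years) under a
        # BPT renewal model; scipy's invgauss(alpha**2, scale=mu/alpha**2)
        # has mean mu and aperiodicity alpha, as required.
        dist = invgauss(alpha**2, scale=mu / alpha**2)
        return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

    # Hypothetical fault: 1000-yr mean recurrence, alpha = 0.5, 800 yr elapsed.
    p50 = bpt_conditional_prob(1000.0, 0.5, 800.0, 50.0)
    rate_eff = -np.log(1.0 - p50) / 50.0   # equivalent Poisson rate per year
    print(p50, rate_eff)
    ```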

  1. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  2. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data previous to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters makes it possible to obtain posterior distributions of other derived parameters like occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
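
    Under this Poisson-GPD model the T-year return level has a closed form; in the Bayesian setting of the paper it would be evaluated over the posterior sample of (rate, scale, shape) to propagate exactly the uncertainty the abstract emphasizes. A sketch with invented point estimates:

    ```python
    import numpy as np

    def gpd_return_level(T, rate, u, sigma, xi):
        # Wave height exceeded on average once every T years, given
        # rate exceedances/yr over threshold u and GPD(sigma, xi) excesses.
        if abs(xi) < 1e-9:                       # xi -> 0: exponential tail
            return u + sigma * np.log(rate * T)
        return u + (sigma / xi) * ((rate * T) ** xi - 1.0)

    # Hypothetical: 3 storms/yr over a 4 m threshold, sigma = 0.8 m, xi = -0.1
    for T in (10, 50, 100):
        print(f"{T:3d}-yr return level: {gpd_return_level(T, 3.0, 4.0, 0.8, -0.1):.2f} m")
    ```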

  3. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as causing severe damage to natural resources. The local geology with a high degree of slope, coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The present study area attracts many tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping, since they enable the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method is used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road, and NDVI. These factor layers were extracted from the various related spatial data. The factors were evaluated, and then the individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and ratings assigned by the AHP method, as sketched below. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV: 3.56% of the area falls under hazard zone IV, followed by 48.19% under zone III, 43.63% under zone II, and 4.61% under hazard zone I. The resulting hazard zone map and the land use/land cover map were then overlaid to check the hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
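
    A minimal sketch of the AHP weighting step, using a hypothetical four-factor pairwise comparison matrix (invented judgments, not the paper's ten-factor matrix): the priority weights are the normalized principal eigenvector, the consistency ratio (CR < 0.1) checks the judgments, and the LHZI of a cell is the weighted sum of its class ratings.

    ```python
    import numpy as np

    # Saaty 1-9 pairwise comparisons: slope, precipitation, LULC, drainage.
    A = np.array([[1.0, 3.0, 5.0, 4.0],
                  [1/3, 1.0, 3.0, 2.0],
                  [1/5, 1/3, 1.0, 1/2],
                  [1/4, 1/2, 2.0, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                        # priority weights

    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)   # consistency index
    cr = ci / 0.90                      # random index RI = 0.90 for n = 4
    print("weights:", np.round(w, 3), " CR:", round(cr, 3))

    # LHZI for one grid cell, given its class ratings for the four factors:
    ratings = np.array([4, 3, 2, 3])    # hypothetical class ratings
    print("LHZI:", float(w @ ratings))
    ```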

  4. Mapping Natech risk due to earthquakes using RAPID-N

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2013-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential for release of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU, the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, which estimates the overall risk of natural-hazard impact to industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps which can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry, rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of U.S. EPA's Risk Management Program (RMP) Guidance for Offsite Consequence Analysis methodology. This custom implementation is based on the property estimation framework and allows the easy modification of model parameters and the substitution of equations with alternatives. RAPID-N can be applied at different stages of the Natech risk management process: It allows on the one hand the analysis of hypothetical Natech scenarios to prevent or prepare for a Natech accident by supporting land-use and emergency planning. On the other hand, once a natural disaster occurs RAPID-N can be used for rapidly locating facilities with potential Natech accident damage based on actual natural-hazard information. This provides a means to warn the population in the vicinity of the facilities in a timely manner. This presentation will introduce the specific features of RAPID-N and show the use of the tool by application to a case-study area.
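
    The fragility-curve step can be illustrated with the common lognormal form, in which the probability of a plant unit reaching a damage state given the on-site hazard parameter is Phi(ln(PGA/theta)/beta); theta and beta below are hypothetical values, not RAPID-N's calibrated parameters.

    ```python
    import numpy as np
    from scipy.stats import norm

    def damage_probability(pga, theta, beta):
        # Lognormal fragility curve: P(damage state reached | PGA), with
        # median capacity theta (g) and log standard deviation beta.
        return norm.cdf(np.log(pga / theta) / beta)

    # Hypothetical storage tank, damage state "shell buckling":
    for pga in (0.1, 0.3, 0.6):
        print(f"PGA {pga} g -> P(damage) = {damage_probability(pga, 0.68, 0.75):.3f}")
    ```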

  5. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    PubMed

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    This article considers the quality control and safety system implemented in one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Points (HACCP) principles and developed hygienic and anti-epidemic measures. The identification of hazard factors at each stage of the technological process is considered. Results are presented from the analysis of monitoring data for 6 critical control points over a five-year period. The quality control and safety system permits a reduction in the risk of food contamination during acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways of harmonizing and implementing HACCP principles at the plant are determined.

  6. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  7. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  8. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters, together with estimates of its slip rate. By default, in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters, constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2°-spaced grid considering 648 branches of the logic tree, for the mean value at the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which input parameters influence the final hazard results, and to what extent. The results of this comparison show that the deformation models, with their internal variability, together with the choice of the ground motion prediction equations (GMPEs), are the most influential parameters; both have a significant effect on the hazard results. Thus, good knowledge of the existence of active faults and of their geometric and activity characteristics is of key importance. We also show that PSHA models based exclusively on active faults and geodynamic inputs, which are thus not dependent on past earthquake occurrences, provide a valid method for seismic hazard calculation.

  9. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse a fluvial and a pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and, via percentile maps, the uncertainty. The results are critically discussed and their usage in flood risk management is outlined.
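
    Once independence is assumed, the combination step reduces to simple probability arithmetic; a stylized sketch with invented return periods and coincidence window (not the Can Tho values):

    ```python
    # Annual exceedance probabilities of the single hazards (hypothetical).
    p_fluvial = 1 / 50        # 50-yr Mekong flood stage
    p_pluvial = 1 / 20        # 20-yr local rainstorm

    # Fraction of the year in which the two events can actually coincide,
    # here: a rainstorm falling within the fluvial high-water window.
    coincidence = 30 / 365

    p_combined = p_fluvial * p_pluvial * coincidence
    print(f"annual probability: {p_combined:.2e}")
    print(f"return period: {1 / p_combined:.0f} years")
    ```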

  10. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels by using SEISRISK III software.
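
    Under the Poisson assumption standard in PSHA, these return periods convert directly into exceedance probabilities over an exposure time; a small sketch of the conversion:

    ```python
    import numpy as np

    def prob_exceedance(return_period, exposure_years):
        # P(at least one exceedance in t years) = 1 - exp(-t / T)
        return 1.0 - np.exp(-exposure_years / return_period)

    def return_period(prob, exposure_years):
        # Inverse relation, e.g. 10% in 50 years -> ~475 years
        return -exposure_years / np.log(1.0 - prob)

    for T in (75, 225, 475, 2475):
        print(f"T = {T:4d} yr -> {100 * prob_exceedance(T, 50):.1f}% in 50 yr")
    print(f"{return_period(0.10, 50):.0f} yr")   # ~475
    ```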

  11. Comparison of methods for estimating the attributable risk in the context of survival analysis.

    PubMed

    Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M

    2017-01-23

    The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage except for nonparametric methods, especially at the end of follow-up for a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas semiparametric and parametric approaches that both relied on the proportional hazards assumption performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears appropriate.
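
    As an illustration of the survival-function definition of AR, a stylized computation under proportional hazards with a constant (exponential) baseline hazard; all parameter values are invented, unrelated to the E3N cohort:

    ```python
    import numpy as np

    def ar_exponential(t, p_exposed, lam0, hr):
        # AR(t) = (F(t) - F0(t)) / F(t), with F(t) the population cumulative
        # risk and F0(t) the cumulative risk if nobody were exposed.
        f0 = 1.0 - np.exp(-lam0 * t)           # unexposed cumulative risk
        f1 = 1.0 - np.exp(-lam0 * hr * t)      # exposed cumulative risk
        f = p_exposed * f1 + (1.0 - p_exposed) * f0
        return (f - f0) / f

    for t in (5.0, 10.0, 20.0):
        print(f"AR({t:g} yr) = {ar_exponential(t, 0.3, 0.01, 1.5):.3f}")
    ```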

  12. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential source of greatest contribution to the site is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum considers both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easy to accept and provides a basis for the seismic engineering of hydraulic structures.

  13. A complete electrical shock hazard classification system and its application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, Lloyd; Cartelli, Laura; Graham, Nicole

    Current electrical safety standards evolved to address the hazards of 60-Hz power that are faced primarily by electricians, linemen, and others performing facility and utility work. As a result, this leaves a substantial gap in the management of electrical hazards in Research and Development (R&D) and specialized high voltage and high power equipment. We find substantial use of direct current (dc) electrical energy, and the use of capacitors, inductors, batteries, and radiofrequency (RF) power. The electrical hazards of these forms of electricity and their systems are different than for 50/60 Hz power. This paper proposes a method of classifying all of the electrical shock hazards found in all types of R&D and utilization equipment. Examples of the variation of these hazards from NFPA 70E include (a) high voltage can be harmless, if the available current is sufficiently low, (b) low voltage can be harmful if the available current/power is high, (c) high voltage capacitor hazards are unique and include severe reflex action, effects on the heart, and tissue damage, and (d) arc flash hazard analysis for dc and capacitor systems is not provided in existing standards. This work has led to a comprehensive electrical hazard classification system that is based on various research conducted over the past 100 years, on analysis of such systems in R&D, and on decades of experience. Lastly, the new comprehensive electrical shock hazard classification system uses a combination of voltage, shock current available, fault current available, power, energy, and waveform to classify all forms of electrical hazards.

  14. A complete electrical shock hazard classification system and its application

    DOE PAGES

    Gordon, Lloyd; Cartelli, Laura; Graham, Nicole

    2018-02-08

    Current electrical safety standards evolved to address the hazards of 60-Hz power that are faced primarily by electricians, linemen, and others performing facility and utility work. As a result, this leaves a substantial gap in the management of electrical hazards in Research and Development (R&D) and specialized high voltage and high power equipment. We find substantial use of direct current (dc) electrical energy, and the use of capacitors, inductors, batteries, and radiofrequency (RF) power. The electrical hazards of these forms of electricity and their systems are different than for 50/60 Hz power. This paper proposes a method of classifying all of the electrical shock hazards found in all types of R&D and utilization equipment. Examples of the variation of these hazards from NFPA 70E include (a) high voltage can be harmless, if the available current is sufficiently low, (b) low voltage can be harmful if the available current/power is high, (c) high voltage capacitor hazards are unique and include severe reflex action, effects on the heart, and tissue damage, and (d) arc flash hazard analysis for dc and capacitor systems is not provided in existing standards. This work has led to a comprehensive electrical hazard classification system that is based on various research conducted over the past 100 years, on analysis of such systems in R&D, and on decades of experience. Lastly, the new comprehensive electrical shock hazard classification system uses a combination of voltage, shock current available, fault current available, power, energy, and waveform to classify all forms of electrical hazards.

  15. Quantification of hazard prediction ability at hazard prediction training (Kiken-Yochi Training: KYT) by free-response receiver-operating characteristic (FROC) analysis.

    PubMed

    Hashida, Masahiro; Kamezaki, Ryousuke; Goto, Makoto; Shiraishi, Junji

    2017-03-01

    The ability to predict hazards in possible situations in a general X-ray examination room created for Kiken-Yochi training (KYT) is quantified by use of free-response receiver-operating characteristics (FROC) analysis for determining whether the total number of years of clinical experience, involvement in general X-ray examinations, occupation, and training each have an impact on the hazard prediction ability. Twenty-three radiological technologists (RTs) (years of experience: 2-28), four nurses (years of experience: 15-19), and six RT students observed 53 scenes of KYT: 26 scenes with hazardous points (hazardous points are those that might cause injury to patients) and 27 scenes without points. Based on the results of these observations, we calculated the alternative free-response receiver-operating characteristic (AFROC) curve and the figure of merit (FOM) to quantify the hazard prediction ability. The results showed that the total number of years of clinical experience did not have any impact on hazard prediction ability, whereas recent experience with general X-ray examinations greatly influenced this ability. In addition, the hazard prediction ability varied depending on the occupations of the observers while they were observing the same scenes in KYT. The hazard prediction ability of the radiologic technology students was improved after they had undergone patient safety training. This proposed method with FROC observer study enabled the quantification and evaluation of the hazard prediction capability, and the application of this approach to clinical practice may help to ensure the safety of examinations and treatment in the radiology department.

  16. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.

  17. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as the past archived information, as the design progresses, and to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of the removal and disposition of sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper will identify the most important elements of the tracking system and how it was used to assist the project in ensuring that current design data was reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project will be described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  18. USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS

    EPA Science Inventory

    Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...

  19. Dynamic wake prediction and visualization with uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)

    2005-01-01

    A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.

  20. Detecting Spatial Patterns of Natural Hazards from the Wikipedia Knowledge Base

    NASA Astrophysics Data System (ADS)

    Fan, J.; Stewart, K.

    2015-07-01

    The Wikipedia database is a data source of immense richness and variety. Included in this database are thousands of geotagged articles, including, for example, almost real-time updates on current and historic natural hazards. This includes user-contributed information about the location of natural hazards, the extent of the disasters, and many details relating to response, impact, and recovery. In this research, a computational framework is proposed to detect spatial patterns of natural hazards from the Wikipedia database by combining topic modeling methods with spatial analysis techniques. The computation is performed on the Neon Cluster, a high-performance computing cluster at the University of Iowa. This work uses wildfires as the exemplar hazard, but the framework is easily generalizable to other types of hazards, such as hurricanes or flooding. Latent Dirichlet Allocation (LDA) modeling is first employed to train on the entire English Wikipedia dump, transforming the database dump into a 500-dimensional topic model. Over 230,000 geotagged articles are then extracted from the Wikipedia database, spatially covering the contiguous United States. The geotagged articles are converted into the LDA topic space based on the topic model, with each article being represented as a weighted multidimensional topic vector. By treating each article's topic vector as an observed point in geographic space, a probability surface is calculated for each of the topics. In this work, Wikipedia articles about wildfires are extracted from the Wikipedia database, forming a wildfire corpus and creating a basis for the topic vector analysis. The spatial distribution of wildfire outbreaks in the US is estimated by calculating the weighted sum of the topic probability surfaces using a map algebra approach, and mapped using GIS. To provide an evaluation of the approach, the estimation is compared to wildfire hazard potential maps created by the USDA Forest Service.
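
    A minimal sketch of the topic-vector step, using scikit-learn in place of the full Wikipedia-scale pipeline (toy corpus and 2 topics instead of 500):

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "the wildfire burned thousands of acres of forest near the canyon",
        "firefighters contained the blaze after evacuations were ordered",
        "the hurricane made landfall causing storm surge and flooding",
        "the river flooded the town after days of heavy rainfall",
    ]

    X = CountVectorizer(stop_words="english").fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Each (geotagged) article becomes a weighted topic vector; treating
    # these vectors as observations in geographic space is what supports
    # the probability-surface and map-algebra steps described above.
    print(lda.transform(X).round(2))
    ```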

  1. An EM-based semi-parametric mixture model approach to the regression analysis of competing-risks data.

    PubMed

    Ng, S K; McLachlan, G J

    2003-04-15

    We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.

  2. [Relations of landslide and debris flow hazards to environmental factors].

    PubMed

    Zhang, Guo-ping; Xu, Jing; Bi, Bao-gui

    2009-03-01

    To clarify the relations of landslide and debris flow hazards to environmental factors is of significance to the prediction and evaluation of landslide and debris flow hazards. Based on the latitude and longitude information of 18431 landslide and debris flow hazards in China, and the 1 km x 1 km grid data of elevation, elevation difference, slope, slope aspect, vegetation type, and vegetation coverage, this paper analyzed the relations of landslide and debris flow hazards in this country to the above-mentioned environmental factors using the frequency ratio method. The results showed that the landslide and debris flow hazards in China occurred more in the lower-elevation areas of the first and second transitional zones. When the elevation difference within a 1 km x 1 km grid cell was about 300 m and the slope was around 30 degrees, there was the greatest possibility of the occurrence of landslide and debris flow hazards. Mountain forest land and slope cropland were the two land types on which the hazards most easily occurred. The occurrence frequency of the hazards was highest when the vegetation coverage was about 80%-90%.
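
    The frequency ratio of a factor class is the share of hazard occurrences falling in that class divided by the share of total area occupied by it, with FR > 1 marking a hazard-prone class; a sketch with invented slope-class counts, not the study's data:

    ```python
    import numpy as np

    classes    = ["0-10", "10-20", "20-30", "30-40", ">40"]         # slope, degrees
    hazards    = np.array([  310,   2150,   9800,   5200,    971])  # occurrences
    area_cells = np.array([41000,  52000,  48000,  21000,   8000])  # 1 km grid cells

    fr = (hazards / hazards.sum()) / (area_cells / area_cells.sum())
    for c, r in zip(classes, fr):
        print(f"slope {c:>6}: FR = {r:.2f}")
    ```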

  3. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and improved data availability have helped many scientists understand where and why earthquakes happen, the physics of earthquakes, etc., and to appreciate the role of uncertainty in seismic hazard analysis. However, how to handle existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Usually, attenuation curves are obtained in a statistical way: regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California data and compared with the conventional approach. In this study, the standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical way, are compared. The results show that, when we have insufficient data for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical way, which is direct evidence of less uncertainty.
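
    The border-overlap problem can be illustrated with membership functions: instead of assigning a site coefficient crisply to one class, overlapping (here triangular) memberships let a value near a class boundary belong partially to both neighboring classes. The class supports below are invented for illustration:

    ```python
    import numpy as np

    def triangular(x, left, center, right):
        # Triangular membership function on [left, right], peaking at center.
        return float(np.maximum(0.0, np.minimum((x - left) / (center - left),
                                                (right - x) / (right - center))))

    # Hypothetical site-coefficient classes with overlapping supports.
    classes = {"rock": (0.6, 0.9, 1.2),
               "stiff soil": (0.9, 1.3, 1.7),
               "soft soil": (1.3, 1.8, 2.3)}

    x = 1.05   # a value near the rock / stiff-soil border
    for name, (l, c, r) in classes.items():
        print(f"{name:>10}: membership = {triangular(x, l, c, r):.2f}")
    ```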

  4. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2017-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for the empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g. peak ground acceleration, PGA, or macro-seismic intensity, etc.). After rigorous testing against the available seismic evidence of the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on a census of population, buildings inventory, etc.). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan region of Russia. The study was supported by the Russian Science Foundation, Grant No. 15-17-30020.
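
    The USLE relationship translates directly into code; the coefficient values below are illustrative placeholders, not the fitted Altai-Sayan parameters:

    ```python
    import math

    def usle_annual_rate(m, L, A=-1.0, B=0.9, C=1.2):
        # log10 N(M, L) = A + B*(5 - M) + C*log10(L): expected annual number
        # of magnitude-M events in an area of linear dimension L (km).
        return 10.0 ** (A + B * (5.0 - m) + C * math.log10(L))

    # Hypothetical node of linear dimension 100 km:
    for m in (5.0, 6.0, 7.0):
        print(f"M {m}: N = {usle_annual_rate(m, 100.0):.4f} events/yr")
    ```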

  5. Reviewing and visualizing the interactions of natural hazards

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-12-01

    This paper presents a broad overview, characterization, and visualization of the interaction relationships between 21 natural hazards, drawn from six hazard groups (geophysical, hydrological, shallow Earth, atmospheric, biophysical, and space hazards). A synthesis is presented of the identified interaction relationships between these hazards, using an accessible visual format particularly suited to end users. Interactions considered are primarily those where a primary hazard triggers or increases the probability of secondary hazards occurring. In this paper we do the following: (i) identify, through a wide-ranging review of grey- and peer-review literature, 90 interactions; (ii) subdivide the interactions into three levels, based on how well we can characterize secondary hazards, given information about the primary hazard; (iii) determine the spatial overlap and temporal likelihood of the triggering relationships occurring; and (iv) examine the relationship between primary and secondary hazard intensities for each identified hazard interaction and group these into five possible categories. In this study we have synthesized, using accessible visualization techniques, large amounts of information drawn from many scientific disciplines. We outline the importance of constraining hazard interactions and reinforce the importance of a holistic (or multihazard) approach to natural hazard assessment. This approach allows those undertaking research into single hazards to place their work within the context of other hazards. It also communicates important aspects of hazard interactions, facilitating an effective analysis by those working on reducing and managing disaster risk within both the policy and practitioner communities.

  6. A stereo-vision hazard-detection algorithm to increase planetary lander autonomy

    NASA Astrophysics Data System (ADS)

    Woicke, Svenja; Mooij, Erwin

    2016-05-01

    For future landings on any celestial body, increasing the lander autonomy as well as decreasing risk are primary objectives. Both risk reduction and an increase in autonomy can be achieved by including hazard detection and avoidance in the guidance, navigation, and control loop. One of the main challenges in hazard detection and avoidance is the reconstruction of accurate elevation models, as well as slope and roughness maps. Multiple methods for acquiring the inputs for hazard maps are available. The main distinction can be made between active and passive methods. Passive methods (cameras) have budgetary advantages compared to active sensors (radar, light detection and ranging). However, it is necessary to prove that these methods deliver sufficiently good maps. Therefore, this paper discusses hazard detection using stereo vision. To facilitate a successful landing, not more than 1% wrong detections (hazards that are not identified) are allowed. Based on a sensitivity analysis, it was found that using a stereo set-up with a baseline of ≤ 2 m is feasible at altitudes of ≤ 200 m, yielding less than 1% false positives. It was thus shown that stereo-based hazard detection is an effective means to decrease the landing risk and increase the lander autonomy. In conclusion, the proposed algorithm is a promising candidate for future landers.
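
    The baseline-altitude trade-off follows from the first-order stereo depth-error relation sigma_z ≈ z²·sigma_d / (f·b): the error grows quadratically with range and shrinks with baseline. A small sketch with hypothetical camera parameters, not the paper's sensitivity-analysis setup:

    ```python
    def depth_error(z, baseline, focal_px, sigma_d=0.25):
        # First-order stereo depth uncertainty (m) at range z (m), for a
        # baseline (m), focal length (px) and disparity noise (px).
        return z**2 * sigma_d / (focal_px * baseline)

    # Hypothetical lander camera: 4000 px focal length, 2 m baseline.
    for z in (100.0, 200.0, 500.0):
        print(f"{z:5.0f} m altitude -> depth error {depth_error(z, 2.0, 4000.0):.2f} m")
    ```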

  7. Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.

    PubMed

    Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia

    2017-04-01

    Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), a geographic information system, and remote sensing. The north part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inferring and spatial mapping. It is superior to an existing spatial naïve Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
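
    The entropy weighting step can be sketched directly: each index is normalized over all cells, its Shannon entropy is computed, and indices with lower entropy (more discriminating power across cells) receive larger weights. The matrix below is invented for illustration:

    ```python
    import numpy as np

    # Rows = grid cells, columns = environmental indices (hypothetical).
    X = np.array([[0.62, 0.10, 0.55],
                  [0.80, 0.30, 0.40],
                  [0.45, 0.90, 0.70],
                  [0.70, 0.20, 0.95]])

    p = X / X.sum(axis=0)                     # normalize each index
    k = 1.0 / np.log(X.shape[0])
    e = -k * np.sum(np.where(p > 0, p * np.log(p), 0.0), axis=0)
    w = (1.0 - e) / np.sum(1.0 - e)           # entropy weights
    print("entropy:", e.round(3), "weights:", w.round(3))
    ```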

  8. Evaluation of a Home-Based Environmental and Educational Intervention to Improve Health in Vulnerable Households: Southeastern Pennsylvania Lead and Healthy Homes Program

    PubMed Central

    Mankikar, Deepa; Campbell, Carla; Greenberg, Rachael

    2016-01-01

    This evaluation examined whether participation in a home-based environmental educational intervention would reduce exposure to health and safety hazards and asthma-related medical visits. The home intervention program focused on vulnerable, low-income households, where children had asthma, were at risk for lead poisoning, or faced multiple unsafe housing conditions. Home visitors conducted two home visits, two months apart, consisting of an environmental home assessment, Healthy Homes education, and distribution of Healthy Homes supplies. Measured outcomes included changes in participant knowledge and awareness of environmental home-based hazards, rate of children’s asthma-related medical use, and the presence of asthma triggers and safety hazards. Analysis of 2013–2014 baseline and post-intervention program data for a cohort of 150 families revealed a significantly lower three-month rate (p < 0.05) of children’s asthma-related doctor visits and hospital admissions at program completion. In addition, there were significantly reduced reports of the presence of home-based hazards, including basement or roof leaks (p = 0.011), plumbing leaks (p = 0.019), and use of an oven to heat the home (p < 0.001). Participants’ pre- and post-test scores showed significant improvement (p < 0.05) in knowledge and awareness of home hazards. Comprehensive home interventions may effectively reduce environmental home hazards and improve the health of asthmatic children in the short term. PMID:27618087

  9. Risk assessment of major hazards and its application in urban planning: a case study.

    PubMed

    Zhou, Yafei; Liu, Mao

    2012-03-01

    With the rapid development of industry in China, the number of establishments that are proposed or under construction is increasing year by year, and many handle flammable, explosive, toxic, harmful, and dangerous substances. Accidents such as fire, explosion, and toxic diffusion inevitably happen. Accidents resulting from these major hazards in cities cause a large number of casualties and property losses. It is increasingly important to analyze the risk of major hazards in cities realistically and to suitably plan and utilize the surrounding land based on the risk analysis results, thereby reducing the hazards. A theoretical system for risk assessment of major hazards in cities is proposed in this article, and the major hazard risk for the entire city is analyzed quantitatively. Risks of various major accidents are considered together, their superposition effect is analyzed, individual risk contours for the entire city are drawn, and the level of risk in the city is assessed using "as low as reasonably practicable" guidelines. After the entire city's individual risk distribution is obtained, risk zones are delineated according to the corresponding HSE individual risk criteria, and land-use planning suggestions are proposed. Finally, a city in China is used as an example to illustrate the risk assessment process for a city's major hazards and its application in urban land-use planning. The proposed method has theoretical and practical significance in establishing and improving risk analysis of major hazards and urban land-use planning: major urban public risk is avoided, and the land is utilized in the best possible way in order to obtain the maximum benefit from its use. © 2011 Society for Risk Analysis.
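
    A minimal sketch of the superposition step, under the assumption that sources act independently so per-source individual risks combine as 1 - prod(1 - r_i) (close to a simple sum at the small annual probabilities typical of major hazards); the grids and thresholds below are illustrative, not the article's data.

      # Sketch: combine per-source individual risk grids into a city-wide field.
      import numpy as np

      def combined_individual_risk(per_source_risk):
          # per_source_risk: (n_sources, ny, nx) annual fatality probabilities.
          return 1.0 - np.prod(1.0 - per_source_risk, axis=0)

      grids = np.random.default_rng(2).uniform(0.0, 1e-5, size=(3, 50, 50))
      total = combined_individual_risk(grids)
      # Contours of, e.g., 1e-5 and 1e-6 per year could then delimit risk zones
      # for land-use planning under ALARP-style criteria.
      print(f"peak individual risk: {total.max():.2e} per year")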

  10. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2018-05-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use parameters A, B, and C of USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After a rigorous verification against the available seismic evidence of the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on the census of population or buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of Greater Caucasus and Crimea.
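
    A minimal sketch of evaluating the USLE relationship quoted above; the coefficient values passed in are placeholders, since A, B, and C are estimated per seismic region in the papers.

      # Sketch: expected annual earthquake count from USLE,
      # log10 N(M, L) = A + B*(5 - M) + C*log10 L.
      import math

      def usle_expected_annual_count(M, L, A, B, C):
          # L is the linear dimension (km) of the seismically prone area.
          return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L))

      # Illustrative coefficients only:
      print(usle_expected_annual_count(M=6.0, L=100.0, A=-1.0, B=0.9, C=1.0))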

  11. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  12. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  13. Overview of the U.S. Nuclear Regulatory Commission collaborative research program to assess tsunami hazard for nuclear power plants on the Atlantic and Gulf Coasts

    USGS Publications Warehouse

    Kammerer, A.M.; ten Brink, Uri S.; Titov, V.V.

    2017-01-01

    In response to the 2004 Indian Ocean Tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) with the goal of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. Necessarily, the US NRC research program includes both seismic- and landslide-based tsunamigenic sources in both the near and the far fields. The inclusion of tsunamigenic landslides, an important category of sources that impact tsunami hazard levels for the Atlantic and Gulf Coasts, is a key difference between this program and most other tsunami hazard assessment programs. The initial phase of this work consisted of collection, interpretation, and analysis of available offshore data, with significant effort focused on characterizing offshore near-field landslides and analyzing their tsunamigenic potential and properties. In the next phase of research, additional field investigations will be conducted in key locations of interest and additional analysis will be undertaken. Simultaneously, the MOST tsunami generation and propagation model used by NOAA will first be enhanced to include landslide-based initiation mechanisms and then will be used to investigate the impact of the tsunamigenic sources identified and characterized by the USGS. The potential for probabilistic tsunami hazard assessment will also be explored in the final phases of the program.

  14. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  15. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  16. 78 FR 38730 - Announcement of Funding Awards for Lead-Based Paint Hazard Control, and Lead Hazard Reduction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    ... Awards for Lead-Based Paint Hazard Control, and Lead Hazard Reduction Demonstration Grant Programs for... (OHHLHC) Lead-Based Paint Hazard Control, and Lead Hazard Reduction Demonstration Grant Program Notices of... Grants.gov on December 3, 2012, and amended on January 18, 2013, for the Lead Based Paint Hazard Control...

  17. WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlicki, T; Samost, A; Leveson, N

    Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High reliability industries such as aviation have approached this problem by using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems-Theoretic Accident Model and Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and hazards leading to those accidents. From there, hierarchical control structures were created, followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were created. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty-three control actions were identified that contributed to over 80 unsafe states of those control actions, resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software is highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety, so that a process design requirement can be traced back to the hazard and accident it is intended to mitigate. Funding: Varian Medical Systems, Inc.

  18. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  19. Probabilistic performance-based design for high performance control systems

    NASA Astrophysics Data System (ADS)

    Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice

    2017-04-01

    High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life cycle cost analysis within performance-based design (PBD) tailored to multi-level wind hazards, yielding a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
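
    A minimal sketch of the kind of life cycle cost comparison described: initial cost plus discounted maintenance and expected annual losses over the structure's life. All monetary values, the discount rate, and the function itself are illustrative assumptions, not figures from the paper.

      # Sketch: present-value life cycle cost for a damping system.
      def life_cycle_cost(initial, annual_maintenance, expected_annual_loss,
                          years=50, discount_rate=0.03):
          pv = sum((annual_maintenance + expected_annual_loss)
                   / (1.0 + discount_rate) ** t for t in range(1, years + 1))
          return initial + pv

      passive = life_cycle_cost(1.0e6, annual_maintenance=5e3, expected_annual_loss=8e4)
      hpcs = life_cycle_cost(2.5e6, annual_maintenance=3e4, expected_annual_loss=2e4)
      print(f"passive: {passive:,.0f}  HPCS: {hpcs:,.0f}")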

  20. Community exposure to potential climate-driven changes to coastal-inundation hazards for six communities in Essex County, Massachusetts

    USGS Publications Warehouse

    Abdollahian, Nina; Ratliff, Jamie L.; Wood, Nathan J.

    2016-11-09

    Understanding if and how community exposure to coastal hazards may change over time is crucial information for coastal managers tasked with developing climate adaptation plans. This report summarizes estimates of population and asset exposure to coastal-inundation hazards associated with sea-level-rise and storm scenarios in six coastal communities of the Great Marsh region of Essex County, Massachusetts. This U.S. Geological Survey (USGS) analysis was conducted in collaboration with National Wildlife Federation (NWF) representatives, who are working with local stakeholders to develop local climate adaptation plans for the Towns of Salisbury, Newbury, Rowley, Ipswich, and Essex and the City of Newburyport (hereafter referred to as communities). Community exposure was characterized by integrating various community indicators (land cover and land use, population, economic assets, critical facilities, and infrastructure) with coastal-hazard zones that estimate inundation extents and water depth for three time periods. Estimates of community exposure are based on the presence of people, businesses, and assets in hazard zones that are calculated from geospatial datasets using geographic-information-system (GIS) tools. Results are based on current distributions of people and assets in hazard zones and do not take into account projections of human population, asset, or land-use changes over time. Results are not loss estimates based on engineering analysis or field surveys for any particular facility and do not take into account aspects of individual and household preparedness before an extreme event, adaptive capacity of a community during an event, or long-term resilience of individuals and communities after an event. Potential losses would match reported inventories only if all residents, business owners, public managers, and elected officials were unaware of what to do if warned of an imminent threat, failed to take protective measures during an extreme event, or failed to implement any long-term strategies to mitigate potential impacts. This analysis is intended to serve as a foundation for additional risk-related studies, plans, and mitigation efforts that are tailored to local needs. After a summary of the geospatial methods used in the analysis, results are organized by community so that local officials can easily use them in their local adaptation planning efforts.

  1. The national biennial RCRA hazardous waste report (based on 1999 data) : state detail analysis

    DOT National Transportation Integrated Search

    2001-06-01

    The State Detail Analysis is a detailed look at each State's waste handling practices, including overall totals for generation, management, and shipments and receipts, as well as totals for the largest fifty facilities.

  2. Safety analysis report for the SR-101 inert reservoir package

    DOT National Transportation Integrated Search

    1998-11-01

    Department of Energy (DOE) AL Weapons Surety Division (WSD) requires the SR-101 Inert Reservoir Package to meet applicable hazardous material transportation requirements. This Safety Analysis Report (SAR) is based on requirements in place at the ...

  3. Clinical and multiple gene expression variables in survival analysis of breast cancer: Analysis with the hypertabastic survival model

    PubMed Central

    2012-01-01

    Background We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496
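
    As background to the parametric proportional hazards models compared above, the following minimal sketch uses the Weibull baseline named in the abstract (the hypertabastic form itself is omitted here); the covariates standing in for gene-expression scores and all parameter values are illustrative assumptions.

      # Sketch: Weibull-baseline proportional hazards, h(t|x) = h0(t) * exp(beta . x).
      import numpy as np

      def weibull_hazard(t, shape, scale):
          return (shape / scale) * (t / scale) ** (shape - 1.0)

      def ph_hazard(t, x, beta, shape=1.5, scale=10.0):
          return weibull_hazard(t, shape, scale) * np.exp(np.dot(beta, x))

      def ph_survival(t, x, beta, shape=1.5, scale=10.0):
          # Weibull cumulative baseline hazard H0(t) = (t / scale) ** shape.
          return np.exp(-((t / scale) ** shape) * np.exp(np.dot(beta, x)))

      t = np.linspace(0.1, 20.0, 5)
      print(ph_hazard(t, x=[1.0, 0.3], beta=[0.8, -0.5]))
      print(ph_survival(t, x=[1.0, 0.3], beta=[0.8, -0.5]))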

  4. Trajectory-Based Performance Assessment for Aviation Weather Information

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful visualization techniques for weather information.
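
    A minimal sketch of the 4-D intersection idea, approximating a trajectory hypertube and a weather hypervolume by axis-aligned bounds in (x, y, z, t); real geometries would be far richer, and the bounds below are illustrative assumptions.

      # Sketch: axis-aligned 4-D overlap test between trajectory and weather volumes.
      def boxes_intersect(a_min, a_max, b_min, b_max):
          # Boxes overlap iff their intervals overlap on every axis.
          return all(lo1 <= hi2 and lo2 <= hi1
                     for lo1, hi1, lo2, hi2 in zip(a_min, a_max, b_min, b_max))

      trajectory = ((0, 0, 9_000, 0), (400, 50, 11_000, 3_600))     # x, y, z, t
      storm_cell = ((350, 20, 0, 1_800), (500, 80, 45_000, 5_400))
      print(boxes_intersect(*trajectory, *storm_cell))  # True -> reroute or delay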

  5. Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment

    NASA Astrophysics Data System (ADS)

    Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.

    2012-04-01

    Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvement of detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; in recent years their study has emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis, and determination of the risk level. The presented research focuses on drought, which is at present the most widespread and still unpredictable natural hazard. Its primary aims were to assess the frequency and the consequences of droughts in Slovenia based on drought events in the past, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to prepare guidelines to reduce the vulnerability of the crops. Using the amounts of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought of 2006. Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented in the same way as the vulnerability, as a GIS-based map. Risk maps show geographic regions in Slovenia where droughts pose a major threat to agriculture and, together with the vulnerability maps, provide the basis for drought management, in particular for appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
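
    A minimal sketch of the GIS-style weighted linear combination used for the vulnerability map: normalized input layers combined with weights that would, as in the study, be tuned against assessed crop damage. The layer names and weights are placeholders.

      # Sketch: weighted linear combination of normalized raster layers.
      import numpy as np

      def vulnerability_map(layers, weights):
          # layers: name -> 2-D array scaled to [0, 1]; weights sum to 1.
          return sum(w * layers[name] for name, w in weights.items())

      rng = np.random.default_rng(3)
      names = ("available_water", "slope", "solar_radiation", "land_use", "irrigation")
      layers = {n: rng.random((100, 100)) for n in names}
      weights = {"available_water": 0.35, "slope": 0.15, "solar_radiation": 0.15,
                 "land_use": 0.20, "irrigation": 0.15}
      vuln = vulnerability_map(layers, weights)
      print(vuln.min(), vuln.max())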

  6. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis on the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall precipitation, and normalized difference vegetation index (NDVI), were extracted from the spatial database and the logistic regression coefficient of each factor was computed. Then the landslide hazard was analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were then compared with the field-verified landslide locations. Among the three cases of applying the logistic regression coefficients in the same study area, Selangor based on the Selangor coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest (86%). Similarly, among the six cases of cross-applying the logistic regression coefficients to the other two areas, Selangor based on the Cameron coefficients showed the highest prediction accuracy (90%), whereas Penang based on the Selangor coefficients showed the lowest (79%). Qualitatively, the cross-application model yields reasonable results which can be used for preliminary landslide hazard mapping.
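
    A minimal sketch of the cross-application idea: fit a multivariate logistic regression on one area's factor grids and landslide labels, then apply the fitted coefficients to another area's factors. The synthetic data and scikit-learn usage are illustrative assumptions, not the paper's datasets or code.

      # Sketch: logistic regression landslide susceptibility, cross-applied.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      X_train = rng.random((5000, 10))   # 10 factors: slope, aspect, NDVI, ...
      p = 1.0 / (1.0 + np.exp(-(3.0 * X_train[:, 0] - 2.0)))  # synthetic truth
      y_train = (rng.random(5000) < p).astype(int)

      model = LogisticRegression().fit(X_train, y_train)

      X_other_area = rng.random((2000, 10))  # factors extracted for a second area
      susceptibility = model.predict_proba(X_other_area)[:, 1]
      print(model.coef_.round(2))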

  7. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase mainly due to poor understanding, among the majority of the scientific community, decision makers, and the public, of the three components of Risk: Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes in Exposures and their Vulnerability inflicted by growing population and its concentration, which result in a steady increase of Losses from Natural Hazards. Scientists owe Society for this lack of knowledge, education, and communication; in fact, Contemporary Science can do a better job of disclosing Natural Hazards, assessing Risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of USLE are used to estimate, first, the expected maximum magnitude in a time interval at a seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground shaking parameters, including macro-seismic intensity. After rigorous testing against the available seismic evidence of the past (e.g., the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on USLE is illustrated by application to the seismic region of the Greater Caucasus.

  8. 16 CFR 1000.26 - Directorate for Epidemiology.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... things, incidents associated with consumer products, based on news clips, medical examiner reports, hotline reports, Internet complaints, and referrals. The Hazard Analysis Division conducts statistical...

  9. 78 FR 15894 - Draft Qualitative Risk Assessment of Risk of Activity/Food Combinations for Activities (Outside...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-13

    ... science-based risk analysis of those activity/food combinations that would be considered low risk. We... proposed requirements of the Federal Food, Drug, and Cosmetic Act for hazard analysis and risk-based... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 117 [Docket No...

  10. 78 FR 64735 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ...The Food and Drug Administration (FDA) is proposing regulations for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish requirements for current good manufacturing practice in manufacturing, processing, packing, and holding of animal food. FDA also is proposing regulations to require that certain facilities establish and implement hazard analysis and risk-based preventive controls for food for animals. FDA is taking this action to provide greater assurance that animal food is safe and will not cause illness or injury to animals or humans and is intended to build an animal food safety system for the future that makes modern, science- and risk-based preventive controls the norm across all sectors of the animal food system.

  11. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    PubMed

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times, and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians, and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and the total number of events alongside KM curves.
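
    A heavily simplified sketch of the inversion idea: given survival probabilities digitised at times where numbers at risk are reported, the Kaplan-Meier recursion S_i = S_{i-1} * (1 - d_i / n_i) can be solved for approximate event counts per interval. The full algorithm also apportions censoring within intervals; the numbers below are illustrative.

      # Sketch: approximate event counts from a digitised Kaplan-Meier curve.
      def events_from_km(surv, n_at_risk):
          # surv: S_0 = 1, S_1, ...; n_at_risk: n_i at the start of each interval.
          return [round(n_at_risk[i - 1] * (1.0 - surv[i] / surv[i - 1]))
                  for i in range(1, len(surv))]

      surv = [1.00, 0.92, 0.81, 0.70]   # read off a published curve
      n_at_risk = [100, 88, 74]         # reported beneath the curve
      print(events_from_km(surv, n_at_risk))  # approximate events per interval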

  12. The price of safety: costs for mitigating and coping with Alpine hazards

    NASA Astrophysics Data System (ADS)

    Pfurtscheller, C.; Thieken, A. H.

    2013-10-01

    Due to limited public budgets and the need to economize, the analysis of the costs of hazard mitigation and emergency management is becoming increasingly important for public natural hazard and risk management. In recent years there has been a growing body of literature on the estimation of losses, which has helped to determine the benefits of measures in terms of prevented losses. By contrast, the costs of mitigation are hardly addressed. This paper thus aims to shed some light on expenses for mitigation and emergency services. For this, we analysed the annual costs of mitigation efforts in four regions/countries of the Alpine Arc: Bavaria (Germany), Tyrol (Austria), South Tyrol (Italy) and Switzerland. On the basis of PPP values (purchasing power parities), annual expenses on public safety ranged from EUR 44 per capita in the Free State of Bavaria to EUR 216 in the Autonomous Province of South Tyrol. To analyse the (variable) costs of emergency services in case of an event, we used detailed data from the 2005 floods in the Federal State of Tyrol (Austria) as well as aggregated data from the 2002 floods in Germany. The analysis revealed that multi-hazards, the occurrence and intermixture of different natural hazard processes, contribute to increasing emergency costs. Based on these findings, research gaps and recommendations for costing Alpine natural hazards are discussed.

  13. HACCP-Based Programs for Preventing Disease and Injury from Premise Plumbing: A Building Consensus

    PubMed Central

    McCoy, William F.; Rosenblatt, Aaron A.

    2015-01-01

    Thousands of preventable injuries and deaths are annually caused by microbial, chemical and physical hazards from building water systems. Water is processed in buildings before use; this can degrade the quality of the water. Processing steps undertaken on-site in buildings often include conditioning, filtering, storing, heating, cooling, pressure regulation and distribution through fixtures that restrict flow and temperature. Therefore, prevention of disease and injury requires process management. A process management framework for buildings is the hazard analysis and critical control point (HACCP) adaptation of failure mode effects analysis (FMEA). It has been proven effective for building water system management. Validation is proof that hazards have been controlled under operating conditions and may include many kinds of evidence including cultures of building water samples to detect and enumerate potentially pathogenic microorganisms. However, results from culture tests are often inappropriately used because the accuracy and precision are not sufficient to support specifications for control limit or action triggers. A reliable negative screen is based on genus-level Polymerase Chain Reaction (PCR) for Legionella in building water systems; however, building water samples with positive results from this test require further analysis by culture methods. PMID:26184325

  14. HACCP-Based Programs for Preventing Disease and Injury from Premise Plumbing: A Building Consensus.

    PubMed

    McCoy, William F; Rosenblatt, Aaron A

    2015-07-09

    Thousands of preventable injuries and deaths are annually caused by microbial, chemical and physical hazards from building water systems. Water is processed in buildings before use; this can degrade the quality of the water. Processing steps undertaken on-site in buildings often include conditioning, filtering, storing, heating, cooling, pressure regulation and distribution through fixtures that restrict flow and temperature. Therefore, prevention of disease and injury requires process management. A process management framework for buildings is the hazard analysis and critical control point (HACCP) adaptation of failure mode effects analysis (FMEA). It has been proven effective for building water system management. Validation is proof that hazards have been controlled under operating conditions and may include many kinds of evidence including cultures of building water samples to detect and enumerate potentially pathogenic microorganisms. However, results from culture tests are often inappropriately used because the accuracy and precision are not sufficient to support specifications for control limit or action triggers. A reliable negative screen is based on genus-level Polymerase Chain Reaction (PCR) for Legionella in building water systems; however, building water samples with positive results from this test require further analysis by culture methods.

  15. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    NASA Astrophysics Data System (ADS)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2018-04-01

    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  16. Tsunami Hazard Assessment of Coastal South Africa Based on Mega-Earthquakes of Remote Subduction Zones

    NASA Astrophysics Data System (ADS)

    Kijko, Andrzej; Smit, Ansie; Papadopoulos, Gerassimos A.; Novikova, Tatyana

    2017-11-01

    After the mega-earthquakes and concomitant devastating tsunamis in Sumatra (2004) and Japan (2011), we launched an investigation into the potential risk of tsunami hazard to the coastal cities of South Africa. This paper presents the analysis of the seismic hazard of seismogenic sources that could potentially generate tsunamis, as well as the analysis of the tsunami hazard to coastal areas of South Africa. The subduction zones of Makran, South Sandwich Island, Sumatra, and the Andaman Islands were identified as possible sources of mega-earthquakes and tsunamis that could affect the African coast. Numerical tsunami simulations were used to investigate the realistic and worst-case scenarios that could be generated by these subduction zones. The simulated tsunami amplitudes and run-up heights calculated for the coastal cities of Cape Town, Durban, and Port Elizabeth are relatively small and therefore pose no real risk to the South African coast. However, only distant tsunamigenic sources were considered and the results should therefore be viewed as preliminary.

  17. Rapid SAR and GPS Measurements and Models for Hazard Science and Situational Awareness

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Moore, A. W.; Rosen, P. A.; Simons, M.; Webb, F.; Linick, J.; Fielding, E. J.; Lundgren, P.; Sacco, G. F.; Polet, J.; Manipon, G.

    2016-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating higher level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR), Differential Global Positioning System (DGPS), SAR-based change detection, and image pixel tracking have recently become critical additions to our toolset for understanding and mapping the damage caused by earthquakes, volcanic eruptions, landslides, and floods. Analyses of these data sets are still largely handcrafted following each event and are not generated rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition, the ARIA project is developing the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the imminent increase in raw data from geodetic imaging missions planned for launch by NASA, as well as international space agencies. We will present the progress we have made on automating the analysis of SAR data for hazard monitoring and response using data from Sentinel 1a/b as well as continuous GPS stations. Since the beginning of our project, our team has imaged events and generated response products for events around the world. These response products have enabled many conversations with those in the disaster response community about the potential usefulness of rapid SAR and GPS-based information. We will present progress on our data system technology that enables rapid and reliable production of imagery, as well as lessons learned from our engagement with FEMA and others in the hazard response community on the important actionable information that they need.

  18. An Extreme Meteorological Events Analysis For Nuclear Power Plant (NPP) Siting Project at Bangka Island, Indonesia

    NASA Astrophysics Data System (ADS)

    Septiadi, Deni; S, Yarianto Sugeng B.; Sriyana; Anzhar, Kurnia; Suntoko, Hadi

    2018-03-01

    The potential sources of meteorological phenomena in the Nuclear Power Plant (NPP) area of interest are identified, and the extreme values of the possible resulting hazards associated with such phenomena are evaluated to derive the appropriate design bases for the NPP. The appropriate design bases shall be determined according to the applicable regulations of the Nuclear Energy Regulatory Agency (Bapeten), which presently do not indicate quantitative criteria for determining the design bases for meteorological hazards. These meteorological investigations are also carried out to evaluate the regional and site-specific meteorological parameters which affect the transport and dispersion of radioactive effluents in the environment of the region around the NPP site. The meteorological hazards are to be monitored and assessed periodically over the lifetime of the plant to ensure that consistency with the design assumptions is maintained throughout the full lifetime of the facility.

  19. The national biennial RCRA hazardous waste report (based on 1997 data) : national analysis

    DOT National Transportation Integrated Search

    1999-09-01

    National Analysis presents a detailed look at waste-handling practices in the EPA Regions, States, and largest facilities nationally, including (1) the quantity of waste generated, managed, shipped and received, and imported and exported between Stat...

  20. Using expert opinion surveys to rank threats to endangered species: a case study with sea turtles.

    PubMed

    Donlan, C Josh; Wingfield, Dana K; Crowder, Larry B; Wilcox, Chris

    2010-12-01

    Little is known about how specific anthropogenic hazards affect the biology of organisms. Quantifying the effect of regional hazards is particularly challenging for species such as sea turtles because they are migratory, difficult to study, long lived, and face multiple anthropogenic threats. Expert elicitation, a technique used to synthesize opinions of experts while assessing uncertainty around those views, has been in use for several decades in the social science and risk assessment sectors. We conducted an internet-based survey to quantify expert opinion on the relative magnitude of anthropogenic hazards to sea turtle populations at the regional level. Fisheries bycatch and coastal development were most often ranked as the top hazards to sea turtle species in a geographic region. Nest predation and direct take followed as the second and third greatest threats, respectively. Survey results suggest most experts believe sea turtles are threatened by multiple factors, including substantial at-sea threats such as fisheries bycatch. Resources invested by the sea turtle community, however, appear biased toward terrestrial-based impacts. Results from the survey are useful for conservation planning because they provide estimates of relative impacts of hazards on sea turtles and a measure of consensus on the magnitude of those impacts among researchers and practitioners. Our survey results also revealed patterns of expert bias, which we controlled for in our analysis. Respondents with no experience with respect to a sea turtle species tended to rank hazards affecting that species higher than respondents with experience. A more striking pattern emerged with hazard-based expertise: the more experience a respondent had with a specific hazard, the higher the respondent scored the impact of that hazard on sea turtle populations. Bias-controlled expert opinion surveys focused on threatened species and their hazards can help guide and expedite species recovery plans. © 2010 Society for Conservation Biology.

  1. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  2. ORNL necessary and sufficient standards for environment, safety, and health. Final report of the Identification Team for other industrial, radiological, and non-radiological hazard facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-07-01

    This Necessary and Sufficient (N and S) set of standards is for Other Industrial, Radiological, and Non-Radiological Hazard Facilities at Oak Ridge National Laboratory (ORNL). These facility classifications are based on a laboratory-wide approach to classifying facilities by hazard category. An analysis of the hazards associated with the facilities at ORNL was conducted in 1993. To identify standards appropriate for these Other Industrial, Radiological, and Non-Radiological Hazard Facilities, the activities conducted in these facilities were assessed, and the hazards associated with the activities were identified. A preliminary hazards list was distributed to all ORNL organizations. The hazards identified in prior hazard analyses are contained in the list, and a category of "other" was provided in each general hazard area. A workshop was held to assist organizations in properly completing the list. Completed hazard screening lists were compiled for each ORNL division, and a master list was compiled for all Other Industrial, Radiological Hazard, and Non-Radiological facilities and activities. The master list was compared against the results of prior hazard analyses by research and development and environment, safety, and health personnel to ensure completeness. This list, which served as a basis for identifying applicable environment, safety, and health standards, appears in Appendix A.

  3. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
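
    A minimal sketch of the life-table link stated above: over an age interval of width n, the average force of mortality can be recovered from survivorship l_x as mu = -ln(l_{x+n} / l_x) / n. The l_x values below are illustrative.

      # Sketch: average hazard rate (force of mortality) from life-table survivorship.
      import math

      def force_of_mortality(l_x, l_x_plus_n, n=1.0):
          return -math.log(l_x_plus_n / l_x) / n

      survivors = {60: 90_000, 61: 89_100, 62: 88_050}  # l_x by age
      for age in (60, 61):
          mu = force_of_mortality(survivors[age], survivors[age + 1])
          print(f"hazard at age {age}: {mu:.4f} per year")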

  4. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  5. Pacific Basin conference on hazardous waste: Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This conference was held November 4--8, 1996 in Kuala Lumpur, Malaysia. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the problems of hazardous waste. Topics of discussion deal with pollution prevention, waste treatment technology, health and ecosystem effects research, analysis and assessment, and regulatory management techniques. Individual papers have been processed separately for inclusion in the appropriate databases.

  6. Safety issues of high-concentrated hydrogen peroxide production used as rocket propellant

    NASA Astrophysics Data System (ADS)

    Romantsova, O. V.; Ulybin, V. B.

    2015-04-01

    The article addresses the possibility of producing high-concentration hydrogen peroxide with the Russian isopropyl alcohol autoxidation technology. An analysis of fire/explosion hazards and the causes of insufficient product quality is presented for this technology, and a modified technology is described. Non-standard fire/explosion characteristics required for an integrated fire/explosion hazard rating of the modified hydrogen peroxide production process based on isopropyl alcohol autoxidation are defined.

  7. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.

  8. [Design of a Hazard Analysis and Critical Control Points (HACCP) plan to assure the safety of a bologna product produced by a meat processing plant].

    PubMed

    Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar

    2004-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a systematic, integral program used to identify and estimate the hazards (microbiological, chemical and physical) and the risks generated during the primary production, processing, storage, distribution, sale and consumption of foods. Establishing an HACCP program has several advantages, among them: emphasizing prevention over detection, reducing costs, minimizing the risk of manufacturing faulty products, allowing greater confidence for management, and strengthening national and international competitiveness. The present work proposes the design of an HACCP program to guarantee the safety of the Bologna Special Type produced by a meat processing plant, through the determination of hazards (microbiological, chemical or physical), the identification of critical control points (CCP), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology was based on the application of the seven basic principles established by the Codex Alimentarius, resulting in the design of this program. Given that meat products have recently been linked with pathogens such as E. coli O157:H7 and Listeria monocytogenes, these were considered as microbiological hazards in establishing the HACCP plan, whose application will guarantee a safe product.

  9. Two models for evaluating landslide hazards

    USGS Publications Warehouse

    Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.

    2006-01-01

    Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards.
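
    To make the two approaches concrete, the following is a hedged sketch on synthetic data: an empirical likelihood-ratio model built from binned univariate frequencies under conditional independence, and a logistic model fitted to the same predictors. The variable names, distributions, and coefficients are invented; in the study the predictors were DEM-derived elevation, slope, aspect, and coded bedrock lithology.

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 2000
      slope = rng.uniform(0, 40, n)          # slope angle (degrees)
      lith = rng.integers(0, 3, n)           # coded bedrock lithology
      # Synthetic occurrence: steeper slopes and lithology 2 fail more often.
      p_true = 1 / (1 + np.exp(-(0.12 * slope + 1.0 * (lith == 2) - 4)))
      slid = rng.random(n) < p_true

      # Empirical likelihood ratio under conditional independence:
      # LR(cell) = prod_k f_k(x_k | slide) / f_k(x_k | no slide),
      # with Laplace smoothing to avoid empty bins.
      idx = np.digitize(slope, np.linspace(0, 40, 9)) - 1   # slope bin (0..7)
      f1 = (np.bincount(idx[slid], minlength=8) + 1) / (slid.sum() + 8)
      f0 = (np.bincount(idx[~slid], minlength=8) + 1) / ((~slid).sum() + 8)
      g1 = (np.bincount(lith[slid], minlength=3) + 1) / (slid.sum() + 3)
      g0 = (np.bincount(lith[~slid], minlength=3) + 1) / ((~slid).sum() + 3)
      lr = (f1 / f0)[idx] * (g1 / g0)[lith]  # per-cell likelihood ratio

      # Multivariate logistic discriminant on the same predictors.
      X = np.column_stack([slope, lith == 1, lith == 2])
      prob = LogisticRegression().fit(X, slid).predict_proba(X)[:, 1]

      # Cell-by-cell agreement of the two relative hazard rankings.
      print("rank correlation:", round(spearmanr(lr, prob)[0], 3))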

  10. Evaluation of biological methods for a future methodological implementation of the Hazard criterion H14 'ecotoxic' in the European waste list (2000/532/EC).

    PubMed

    Moser, Heidrun; Roembke, Joerg; Donnevert, Gerhild; Becker, Roland

    2011-02-01

    The ecotoxicological characterization of waste is part of its assessment as hazardous or non-hazardous according to the European Waste List. For this classification, 15 hazard criteria are derived from the Council Directive 91/689/EEC on hazardous waste. Some of the hazard criteria are based on the content of dangerous substances. The criterion H14 'ecotoxic' lacks an assessment and testing strategy, and no specific threshold values have been defined so far. Based on the recommendations of CEN guideline 14735 (2005), an international round robin test (ring test) was organized by the German Federal Environment Agency in order to define suitable test methods for the biological assessment of waste and waste eluates. A basic test battery, consisting of three aquatic and three terrestrial tests, was compiled. In addition, data were submitted for ten additional tests (five aquatic, including a genotoxicity test, and five terrestrial). The tests were performed with three representative waste types: an ash from an incineration plant, a soil containing high concentrations of organic contaminants (polycyclic aromatic hydrocarbons) and a preserved wood waste. The results of this ring test confirm that a combination of a battery of biological tests and chemical residual analysis is needed for an ecotoxicological characterization of wastes. With small modifications, the basic test battery is considered well suited for the hazard and risk assessment of wastes and waste eluates. All results and documents are accessible via a web-based database application.

  11. Modeling Compound Flood Hazards in Coastal Embayments

    NASA Astrophysics Data System (ADS)

    Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.

    2017-12-01

    Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors threaten increasing flood hazards (e.g. sea level rise and river flooding). Quantitative risk assessment is required for the administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels, such as 100- and 500-year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), the preferred marginal scenario, and reproduced time series of ensembles based on Monte Carlo sampling of the bivariate hazard domain. The comparison between the resulting extreme water dynamics under the compound hazard scenarios explained above provides insight into the strengths/weaknesses of each approach and helps modelers choose the appropriate scenario that best fits the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk, by cautiously reducing the dimensionality of the hazard analysis.
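
    The copula step can be sketched as follows. This is a hedged illustration using a Gaussian copula with invented marginals (lognormal discharge, Gumbel sea level); the study does not prescribe these particular families.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Synthetic joint observations, e.g. paired annual maxima.
      discharge = stats.lognorm.rvs(0.5, scale=800, size=200, random_state=rng)
      sea_level = 0.002 * discharge + stats.gumbel_r.rvs(
          loc=1.0, scale=0.3, size=200, random_state=rng)

      # Transform to the unit joint probability domain via empirical ranks.
      u = stats.rankdata(discharge) / (len(discharge) + 1)
      v = stats.rankdata(sea_level) / (len(sea_level) + 1)

      # Gaussian copula: correlation of the normal scores.
      z = stats.norm.ppf(np.column_stack([u, v]))
      rho = np.corrcoef(z.T)[0, 1]

      # Monte Carlo sampling of the bivariate hazard domain.
      zs = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=1000)
      us = stats.norm.cdf(zs)
      # Map back through (assumed) marginals to physical boundary forcings.
      q = stats.lognorm.ppf(us[:, 0], 0.5, scale=800)       # discharge (m3/s)
      h = stats.gumbel_r.ppf(us[:, 1], loc=1.0, scale=0.3)  # sea level (m)
      print("example compound scenarios:", q[:3].round(0), h[:3].round(2))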

  12. SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CARRO CA

    2010-03-09

    This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 µm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.

  13. European Natural Disaster Coordination and Information System for Aviation (EUNADICS-AV)

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Hirtl, Marcus; Arnold, Delia; Katzler-Fuchs, Susanne; Pappalardo, Gelsomina; Mona, Lucia; Sofiev, Mikhail; de Leeuw, Gerrit; Theys, Nicolas; Brenot, Hugues; Plu, Matthieu; Rockitansky, Carl-Herbert; Eschbacher, Kurt; Apituley, Arnoud; Som de Cerff, Wim

    2017-04-01

    Commercial aviation is one of the key infrastructures of our modern world. Even short interruptions can cause economic damage running into the billions of euros. As past events have shown, aviation is vulnerable to natural hazards. Safe flight operations, air traffic management and air traffic control are a shared responsibility of EUROCONTROL, national authorities, airlines and pilots. All stakeholders have one common goal, namely to warrant and maintain the safety of flight crews and passengers. Currently, however, there is a significant gap in the Europe-wide availability of real-time hazard measurement and monitoring information for airborne hazards describing "what, where, how much" in 3 dimensions, combined with a near-real-time European data analysis and assimilation system. This gap creates circumstances where various stakeholders in the system may base their decisions on different data and information. The H2020 project EUNADICS-AV ("European Natural Disaster Coordination and Information System for Aviation"), started in October 2016, intends to close this gap in data and information availability, enabling all stakeholders in the aviation system to obtain fast, coherent and consistent information. The project intends to combine and harmonize data from satellite earth observation, ground-based and airborne platforms, and to integrate them into state-of-the-art data assimilation and analysis systems. Besides operational data sources, data from the research community are integrated as well. Hazards considered in the project include volcanic eruptions, nuclear accidents and events, and forest fires. The availability of consistent and coherent data analysis fields based on all available measurements will greatly enhance our capability to respond to disasters effectively and efficiently, minimizing system downtimes and thus economic damage while maintaining the safety of millions of passengers.

  14. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, A; Samost, A; Viswanathan, A

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.
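
    As a toy illustration of the STPA step described above, the snippet below enumerates candidate unsafe control actions by crossing each control action with STPA's four standard guidewords. The controllers and actions are hypothetical stand-ins for the paper's 10-controller model.

      # STPA's four ways a control action can be unsafe.
      GUIDEWORDS = [
          "not provided when needed",
          "provided when unsafe",
          "provided too early or too late",
          "stopped too soon or applied too long",
      ]

      # Hypothetical slice of a hierarchical control model for HDR brachytherapy.
      control_actions = {
          "physician": ["insert applicator", "approve treatment plan"],
          "physicist": ["load plan to afterloader", "initiate treatment delivery"],
      }

      unsafe_control_actions = [
          (controller, action, guideword)
          for controller, actions in control_actions.items()
          for action in actions
          for guideword in GUIDEWORDS
      ]

      for uca in unsafe_control_actions[:4]:
          print(uca)
      print(len(unsafe_control_actions), "candidate UCAs to screen for hazards")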

  15. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
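
    A minimal sketch of the kernel-smoothing idea, assuming a synthetic epicentre catalogue: a Gaussian kernel density estimate replaces polygonal source zones with a smooth spatial activity-rate surface. Coordinates, rates, and the bandwidth choice are illustrative only.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(2)
      # Synthetic epicentres (lon, lat) scattered along a notional fault trend.
      t = rng.uniform(0, 1, 300)
      lon = 13.0 + 0.8 * t + rng.normal(0, 0.08, 300)
      lat = 42.5 + 0.5 * t + rng.normal(0, 0.08, 300)

      kde = gaussian_kde(np.vstack([lon, lat]))   # Gaussian kernel smoothing

      # Evaluate the relative activity-rate density on a hazard grid.
      glon, glat = np.meshgrid(np.linspace(12.8, 14.0, 60),
                               np.linspace(42.3, 43.2, 60))
      density = kde(np.vstack([glon.ravel(), glat.ravel()])).reshape(glon.shape)

      # Scale the surface to the observed annual event rate (300 events, 50 yr).
      rate_surface = density * (300 / 50.0)
      print("peak smoothed rate density:", round(float(rate_surface.max()), 2))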

  16. Determination of eye safety filter protection factors associated with retinal thermal hazard and blue light photochemical hazard for intense pulsed light sources.

    PubMed

    Clarkson, D McG

    2006-02-21

    An assessment is provided of protection factors afforded for retinal thermal hazard and blue light photochemical hazard for a range of filters used with intense pulsed light sources (IPLs). A characteristic IPL spectrum based on black body radiation at 5000 K with a low cut filter at 515 nm was identified as suitable for such estimations. Specific filters assessed included types with idealized transmission properties and also a range of types whose transmission characteristics were measured by means of a Bentham DMc150 spectroradiometer. Predicted behaviour based on these spectra is outlined which describes both the effectiveness of protection and the level of luminous transmittance afforded. The analysis showed it was possible to describe a figure of merit for a particular filter material relating the degree of protection provided and the corresponding value of luminous transmittance. This consideration is important for providing users of IPL equipment with safety eyewear that has an adequate level of visual transmittance.

  18. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also tells us, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.

  19. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes.

    PubMed

    Oyarzabal, Omar A; Rowe, Ellen

    2017-04-01

    The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear to some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module comprises a questionnaire; group playing of a dice game that we have previously introduced in the teaching of HACCP; the discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. Of 71 adult participants who completed this module, 40 participants (56%) provided the most appropriate definition of hazard, 19 participants (27%) provided the most appropriate definition of risk, 14 participants (20%) provided the most appropriate definitions of both hazard and risk, and 23 participants (32%) did not provide an appropriate definition for hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they had not attended similar presentations in the past. The fact that fewer than one third of the participants provided appropriate definitions of both hazard and risk at baseline is not surprising. However, these results highlight the need to incorporate modules that discuss these important food safety terms and to include more active learning in food safety classes. This study suggests that active learning helps food personnel better understand important food safety terms that serve as building blocks for the understanding of more complex food safety topics.

  20. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  1. 77 FR 66486 - Applications and Amendments to Facility Operating Licenses and Combined Licenses Involving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    .... The analyses for LSCS, Unit 2, Cycle 15 have concluded that a two-loop MCPR SL of >= 1.14, based on... safety. The NRC staff has reviewed the licensee's analysis and, based on this review, it appears that the... has provided its analysis of the issue of no significant hazards consideration, which is presented...

  2. The Typhoon Disaster Analysis Based on Weibo Topic Heat

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Gong, A.; Wang, J.; Li, J.

    2018-04-01

    Can social media data be utilized in hazard evaluation? The typhoon, one of the costliest disaster types, has become a devastating threat to humans. Moreover, social media have changed the way people communicate, and citizens can turn to these platforms to express disaster-related information in real time. Social media thus improve situational awareness and widen the methods of acquiring hazard information. More and more studies have investigated the relationship between social media response and degree of damage, and a strong correlation has been demonstrated. Weibo, one of the most popular social media platforms in China, provides data with posted text, location, user identification and other additional information. Combining data for 10 tropical cyclones with Weibo data from 2013, we perform a quantitative analysis of the relationship between the grade of the hazard situation and the heat of related Weibo topics at the province scale. We provide a new model of Weibo topic heat to evaluate Weibo activity in the study area, and we derive the hazard assessment formula H = 1.8845 ln(α) + 15.636 for tropical cyclone disasters. The high goodness of fit of the curve suggests that this equation can be used for rapid assessment of the hazard caused by tropical cyclones.
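
    A worked sketch of the reported assessment equation H = 1.8845 ln(α) + 15.636, taking α to be the province-scale Weibo topic heat; the topic-heat inputs below are invented for illustration.

      import math

      def hazard_grade(topic_heat: float) -> float:
          """Rapid tropical-cyclone hazard estimate from Weibo topic heat."""
          return 1.8845 * math.log(topic_heat) + 15.636

      for alpha in (10, 100, 1000):
          print(f"topic heat {alpha:5d} -> hazard {hazard_grade(alpha):.1f}")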

  3. Hazardous geology zoning and influence factors in the near-shore shallow strata and seabed surface of the modern Yellow River Delta, China

    NASA Astrophysics Data System (ADS)

    Li, P.

    2016-12-01

    In this study, on the basis of 3,200 km of shallow stratigraphic sections and sidescan sonar data from the coastal area of the Yellow River Delta, we delineated and interpreted a total of seven types of typical hazardous geology, comprising hazardous geology in the shallow strata (buried ancient channels and strata disturbance) and hazardous geology in the seabed surface strata (pits, erosive residual bodies, sand patches, sand waves and scour channels). We selected eight parameters representing the development scale of the hazardous geology as zoning indexes, namely the number of hazardous geology types, pit depth, height of the erosive residual body, length of the scour channel, area of the sand patch, length of the sand wave, width of the buried ancient channel and depth of the strata disturbance, and implemented grid processing of the research area to calculate the arithmetic sum of the zoning indexes of each unit grid one by one. We then adopted the clustering analysis method to divide the near-shore waters of the Yellow River Delta into five hazardous geology areas, namely the serious erosion disaster area controlled by Diaokou lobe waves, the multi-disaster hazardous geology area under the combined action of the Shenxiangou lobe river-wave flow, the accumulation-type hazardous geology area controlled by the current estuary river, the single-disaster hazardous geology area in the deep water area and the potential hazardous geology area of the Chengdao Oilfield. All four of the main factors affecting the development of hazardous geology, namely the diffusion and movement of the sediment flux of the Yellow River water entering the sea, seabed stability, bottom sediment type and distribution, and the marine hydrodynamic characteristics, show significant regional differentiation characteristics and laws. These characteristics and laws are consistent with the above-mentioned zoning results, in which the distribution, scale and genetic mechanism of the hazardous geology are considered comprehensively. This indicates that hazardous geology zoning based on cluster analysis is a new attempt in research on the hazardous geology of the near-shore waters of the modern Yellow River Delta, and that this type of zoning is highly reasonable.
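
    A hedged sketch of the zoning workflow, assuming synthetic per-grid index values and using k-means as a stand-in for the study's (unspecified) clustering algorithm: sum the eight development-scale indexes per unit grid, then cluster the grids into five hazardous-geology areas.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_cells = 500
      # Eight zoning indexes per unit grid (hazard-type count, pit depth, ...).
      indexes = rng.gamma(shape=2.0, scale=1.0, size=(n_cells, 8))

      X = StandardScaler().fit_transform(indexes)  # common scale for all indexes
      zones = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

      for z in range(5):
          members = indexes[zones == z]
          print(f"zone {z}: {len(members)} cells, "
                f"mean index sum {members.sum(axis=1).mean():.2f}")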

  4. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the values of the coefficients of restitution. In order to address this problem, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of the results to basic assumptions, e.g. the assessment of variograms and the choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were computed and compared, and those showing the lowest errors were adopted. The data sets that were statistically analysed relate to both kinetic energy and the rock blocks surveyed in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of S. Quirico Village has the highest level of hazard according to both the probabilistic and deterministic methods.

  5. Proton pump inhibitor use and risk of adverse cardiovascular events in aspirin treated patients with first time myocardial infarction: nationwide propensity score matched study

    PubMed Central

    Grove, Erik L; Hansen, Peter Riis; Olesen, Jonas B; Ahlehoff, Ole; Selmer, Christian; Lindhardsen, Jesper; Madsen, Jan Kyst; Køber, Lars; Torp-Pedersen, Christian; Gislason, Gunnar H

    2011-01-01

    Objective To examine the effect of proton pump inhibitors on adverse cardiovascular events in aspirin treated patients with first time myocardial infarction. Design Retrospective nationwide propensity score matched study based on administrative data. Setting All hospitals in Denmark. Participants All aspirin treated patients surviving 30 days after a first myocardial infarction from 1997 to 2006, with follow-up for one year. Patients treated with clopidogrel were excluded. Main outcome measures The risk of the combined end point of cardiovascular death, myocardial infarction, or stroke associated with use of proton pump inhibitors was analysed using Kaplan-Meier analysis, Cox proportional hazard models, and propensity score matched Cox proportional hazard models. Results 3366 of 19 925 (16.9%) aspirin treated patients experienced recurrent myocardial infarction, stroke, or cardiovascular death. The hazard ratio for the combined end point in patients receiving proton pump inhibitors based on the time dependent Cox proportional hazard model was 1.46 (1.33 to 1.61; P<0.001) and for the propensity score matched model based on 8318 patients it was 1.61 (1.45 to 1.79; P<0.001). A sensitivity analysis showed no increase in risk related to use of H2 receptor blockers (1.04, 0.79 to 1.38; P=0.78). Conclusion In aspirin treated patients with first time myocardial infarction, treatment with proton pump inhibitors was associated with an increased risk of adverse cardiovascular events. PMID:21562004
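
    A minimal sketch of the survival analysis described above, using the lifelines library on a synthetic data frame standing in for the registry data (follow-up time, combined endpoint, PPI exposure); the simulated effect size is only loosely matched to the reported hazard ratio.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(4)
      n = 5000
      ppi = rng.random(n) < 0.3                  # proton pump inhibitor use
      # Exponential event times; PPI users get ~1.5x the baseline hazard.
      monthly_hazard = 0.02 * np.where(ppi, 1.5, 1.0)
      time = rng.exponential(1 / monthly_hazard)
      df = pd.DataFrame({
          "months": np.minimum(time, 12.0),      # censor at one year
          "event": (time < 12.0).astype(int),    # combined endpoint observed?
          "ppi": ppi.astype(int),
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months", event_col="event")
      print(cph.summary[["coef", "exp(coef)", "p"]])  # exp(coef) ~ hazard ratio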

  6. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and the sampling of probability density functions. For short return periods (100 years), the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
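
    The Monte Carlo core of such an assessment can be sketched as follows, with a toy magnitude-to-coastal-height relation standing in for the numerical tsunami propagation that a real PTHA performs; all rates and coefficients are illustrative.

      import numpy as np

      rng = np.random.default_rng(5)
      years, rate_m5 = 100_000, 0.5      # catalogue length; annual rate of M>=5
      b, mmin, mmax = 1.0, 5.0, 9.0      # truncated Gutenberg-Richter source

      n_events = rng.poisson(rate_m5 * years)
      # Truncated G-R magnitudes by inverse-transform sampling.
      u = rng.random(n_events)
      mags = mmin - np.log10(1 - u * (1 - 10 ** (-b * (mmax - mmin)))) / b

      # Toy magnitude-to-height scaling with lognormal aleatory scatter.
      height = 10 ** (0.8 * (mags - 7.0)) * rng.lognormal(0.0, 0.5, n_events)

      for h in (0.5, 3.0):
          annual_p = (height > h).sum() / years
          print(f"annual P(height > {h} m) ~ {annual_p:.4f}")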

  7. GIS-based probability assessment of natural hazards in forested landscapes of Central and South-Eastern Europe.

    PubMed

    Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F

    2010-12-01

    We assessed the probability of three major natural hazards (windthrow, drought, and forest fire) for Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified, and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.

  8. Passive imaging based multi-cue hazard detection spacecraft safe landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Cheng, Yang; Madison, Richard

    2006-01-01

    Accurate assessment of potentially damaging ground hazards during the spacecraft EDL (Entry, Descent and Landing) phase is crucial to ensure a high probability of safe landing. A lander that encounters a large rock, falls off a cliff, or tips over on a steep slope can sustain mission-ending damage. Guided entry is expected to shrink landing ellipses from 100-300 km to ~10 km radius for the second generation of landers as early as 2009. Regardless of size and location, however, landing ellipses will almost always contain hazards such as craters, discontinuities, steep slopes, and large rocks. It is estimated that an MSL (Mars Science Laboratory)-sized lander should detect and avoid 16-150 m diameter craters and vertical drops similar to the edges of 16 m or 3.75 m diameter craters, for high- and low-altitude HDA (Hazard Detection and Avoidance) respectively. It should also be able to detect slopes 20° or steeper, and rocks 0.75 m or taller. In this paper we present a passive-imaging-based, multi-cue hazard detection and avoidance (HDA) system suitable for Martian and other lander missions. This is the first passively imaged HDA system that seamlessly integrates multiple algorithms (crater detection, slope estimation, rock detection and texture analysis) and multiple cues (crater morphology, rock distribution) to detect these hazards in real time.
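
    An illustrative fusion of the multiple cues named above: per-pixel outputs from the crater, slope, rock, and texture detectors are combined into a single landing hazard map using the stated thresholds (slopes of 20° or steeper, rocks 0.75 m or taller). All maps are synthetic placeholders for real detector outputs.

      import numpy as np

      rng = np.random.default_rng(9)
      shape = (64, 64)                            # pixels in the landing image
      slope_deg = rng.uniform(0, 35, shape)       # slope-estimation output
      rock_height = rng.exponential(0.2, shape)   # rock-detection output (m)
      crater = rng.random(shape) < 0.02           # crater-detection output
      rough = rng.random(shape) < 0.05            # texture-analysis output

      hazard = (slope_deg >= 20.0) | (rock_height >= 0.75) | crater | rough
      print(f"safe landing pixels: {100 * (~hazard).mean():.1f}%")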

  9. A comprehensive review of the implementation of hazard analysis critical control point (HACCP) to the production of flour and flour-based products.

    PubMed

    Arvanitoyannis, Ioannis S; Traikou, Athina

    2005-01-01

    The production of flour and semolina and their ensuing products, such as bread, cake, spaghetti, noodles, and corn flakes, is of major importance, because these products constitute some of the main ingredients of the human diet. The Hazard Analysis Critical Control Point (HACCP) system aims at ensuring the safety of these products. Within the framework of this study, HACCP was applied to various products of both Asian and European origin; the hazards, critical control limits (CCLs), observation practices, and corrective actions are summarized in comprehensive tables. Furthermore, the various production steps, packaging included, were thoroughly analyzed, and reference was made to both traditional and new methodologies in an attempt to pinpoint the differences (advantages and disadvantages) per process.

  10. An assessment of the crash fire hazard of liquid hydrogen fueled aircraft

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The crash fire hazards of liquid hydrogen fueled aircraft were evaluated relative to those of mission-equivalent aircraft fueled either with conventional fuel or with liquefied methane. The aircraft evaluated were based on a Lockheed Corporation design for a 400 passenger, Mach 0.85, 5500 n. mile aircraft. Four crash scenarios were considered, ranging from a minor incident causing some loss of fuel system integrity to a catastrophic crash. Major tasks included a review of the hazardous properties of the alternate fuels and of historic crash fire data; a comparative hazard evaluation for each of the three fuels under the four crash scenarios; a comprehensive review and analysis; and an identification of areas for further development work. The conclusion was that the crash fire hazards of the three fuels are not significantly different when compared in general, although some fuels showed minor advantages in one respect or another.

  11. Pyrotechnic hazards classification and evaluation program. Phase 2, segment 3: Test plan for determining hazards associated with pyrotechnic manufacturing processes

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A comprehensive test plan for determining the hazards associated with pyrotechnic manufacturing processes is presented. The rationale for each test is based on a systematic analysis of historical accounts of accidents and a detailed study of the characteristics of each manufacturing process. The most hazardous manufacturing operations have been determined to be pressing, mixing, reaming, and filling. The hazard potential of a given situation is evaluated in terms of the probabilities of initiation, communication, and transition to detonation (ICT). The characteristics which affect the ICT probabilities include the ignition mechanisms which are present either in normal or abnormal operation, the condition and properties of the pyrotechnic material, and the configuration of the processing equipment. Analytic expressions are derived which describe the physical conditions of the system, thus permitting a variety of processes to be evaluated in terms of a small number of experiments.
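
    A worked sketch of the ICT framing: the probability that an upset in a given operation ends in detonation is the product of the initiation, communication, and transition probabilities. The per-operation values below are hypothetical, not from the test plan.

      # (P_initiation, P_communication given initiation, P_transition given
      # communication) for the four operations identified as most hazardous.
      operations = {
          "pressing": (1e-3, 0.5, 0.10),
          "mixing":   (5e-4, 0.4, 0.05),
          "reaming":  (8e-4, 0.3, 0.08),
          "filling":  (2e-4, 0.2, 0.02),
      }

      for op, (p_i, p_c, p_t) in operations.items():
          print(f"{op:8s} P(detonation) = {p_i * p_c * p_t:.2e}")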

  12. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives (synthetic rubber, polyolefin, and linoleum) currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures.

  13. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding involving hillslope deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, so the interest of the scientific community in this process has grown over the last three decades. One of the main aims of research on this topic is to perform robust shallow landslide hazard assessments for wide areas (regional assessments) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field surveys, in situ tests and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian networks was proposed and validated. Different models, such as intervals, convex models and fuzzy sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of the input parameters into the final shallow landslide hazard estimate was evaluated. The outcomes of the analysis are compared and discussed in terms of the discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) regions where the hazard assessment is robust to be discriminated from areas where more data are necessary to increase the confidence level, and ii) the reliability of the procedure used for hazard assessment to be evaluated.
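
    The Monte Carlo uncertainty-propagation step can be sketched with the infinite-slope stability model that underlies tools like SINMAP and SHALSTAB: sample the HD geotechnical parameters from assumed PDFs and compute the distribution of the factor of safety. The distributions and slope geometry below are illustrative, not the study's fitted values.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 50_000
      phi = np.radians(rng.normal(32, 3, n))   # friction angle (deg -> rad)
      c = rng.lognormal(np.log(4.0), 0.4, n)   # effective cohesion (kPa)
      gamma = rng.normal(18.0, 1.0, n)         # unit weight (kN/m3)
      m = rng.uniform(0.0, 1.0, n)             # relative saturation of the HD
      gamma_w, depth, beta = 9.81, 1.5, np.radians(30)

      # Infinite-slope factor of safety:
      # FS = [c + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi)]
      #      / [gamma * z * sin(beta) * cos(beta)]
      fs = (c + (gamma - m * gamma_w) * depth * np.cos(beta) ** 2 * np.tan(phi)) \
           / (gamma * depth * np.sin(beta) * np.cos(beta))

      print(f"P(FS < 1) = {np.mean(fs < 1.0):.3f}, mean FS = {fs.mean():.2f}")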

  14. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies was used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all the grid points using a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at a rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
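
    The per-grid-point computation can be illustrated with a classical Cornell-style hazard integral: a truncated Gutenberg-Richter recurrence model combined with a toy ground-motion model gives the annual rate of exceeding a PGA level, converted to a 50-year probability under the Poisson assumption. The recurrence values, distance distribution, and ground-motion coefficients are placeholders, not those of the Ethiopian study.

      import numpy as np
      from scipy.stats import norm

      nu, b = 2.0, 1.0                     # annual rate of M>=4.5; G-R b-value
      mmin, mmax = 4.5, 7.5
      mags = np.linspace(mmin, mmax, 61)
      beta = b * np.log(10)                # truncated-exponential magnitude PDF
      pdf_m = beta * np.exp(-beta * (mags - mmin)) \
              / (1 - np.exp(-beta * (mmax - mmin)))
      dm = mags[1] - mags[0]

      dists = np.linspace(10, 150, 29)     # source-to-site distances (km)
      w_r = np.full(dists.size, 1 / dists.size)  # uniform distance weights

      def p_exceed(a_g, m, r, sigma=0.6):
          """P(PGA > a | m, r) from a toy lognormal ground-motion model."""
          ln_median = -3.5 + 0.9 * m - 1.1 * np.log(r)   # ln PGA in g
          return norm.sf((np.log(a_g) - ln_median) / sigma)

      def annual_rate(a_g):
          mm, rr = np.meshgrid(mags, dists, indexing="ij")
          return nu * np.sum(p_exceed(a_g, mm, rr)
                             * (pdf_m * dm)[:, None] * w_r[None, :])

      for a in (0.05, 0.18, 0.35):
          lam = annual_rate(a)
          print(f"PGA > {a:.2f} g: {lam:.4f}/yr, "
                f"50-yr PoE = {1 - np.exp(-lam * 50):.1%}")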

  15. The Long Range Reconnaissance and Observation System (LORROS) with the Kollsman, Inc. Model LH-40, Infrared (Erbium) Laser Rangefinder hazard analysis and safety assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser hazard analysis and safety assessment was performed for the LH-40 IR Laser Rangefinder based on the 2000 version of the American National Standard Institute's Standard Z136.1, for the Safe Use of Lasers and Z136.6, for the Safe Use of Lasers Outdoors. The LH-40 IR Laser is central to the Long Range Reconnaissance and Observation System (LORROS). The LORROS is being evaluated by the Department 4149 Group to determine its capability as a long-range assessment tool. The manufacture lists the laser rangefinder as 'eye safe' (Class 1 laser classified under the CDRH Compliance Guide for Laser Products and 21more » CFR 1040 Laser Product Performance Standard). It was necessary that SNL validate this prior to its use involving the general public. A formal laser hazard analysis is presented for the typical mode of operation.« less

  16. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

    In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for the evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for the evaluation of seismic hazard were identified, and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a and b values) and the concepts of local probability and recurrence time. The application of the proposed technique in the Syrian region indicates that this method provides a good estimate of seismic hazard compared with maps developed using traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, numerous remote sensing and GIS parameters have been used in the preparation of a seismic hazard map, which is found to be very realistic.

  17. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    There are many technical methods for integrating the various factors involved in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi-Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and each element unit is considered as an alternative. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore, TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, because the simulation results vary with the flood scenario and topographical conditions. This ambiguity in the indices can cause uncertainty in the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which is able to handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the resulting integrated flood hazard map, and the produced flood hazard map can be compared with existing flood risk maps. We also expect that if the flood hazard mapping methodology suggested in this paper is applied to the production of current flood risk maps, it will be possible to make new flood hazard maps that also consider the priorities of hazard areas and include more varied and important information than before. Keywords: flood hazard map; levee breach analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
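
    The crisp TOPSIS core can be sketched as follows over the three criteria named above; the fuzzy variant of the study replaces the crisp scores with triangular fuzzy numbers before ranking. Cell values and weights are invented.

      import numpy as np

      # Rows: element units (alternatives); columns: max depth (m),
      # max velocity (m/s), max travel time (min). Depth and velocity increase
      # hazard; a SHORTER travel time means higher hazard (cost criterion).
      X = np.array([[2.1, 1.5, 30.0],
                    [0.4, 0.3, 120.0],
                    [1.2, 0.9, 60.0],
                    [3.0, 2.2, 15.0]])
      w = np.array([0.4, 0.3, 0.3])
      larger_is_hazard = np.array([True, True, False])

      V = (X / np.linalg.norm(X, axis=0)) * w        # weighted normalized matrix
      ideal = np.where(larger_is_hazard, V.max(axis=0), V.min(axis=0))
      anti = np.where(larger_is_hazard, V.min(axis=0), V.max(axis=0))

      d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to most hazardous
      d_neg = np.linalg.norm(V - anti, axis=1)
      closeness = d_neg / (d_pos + d_neg)            # 1 = highest hazard grade

      for i, cc in enumerate(closeness):
          print(f"element unit {i}: hazard closeness = {cc:.3f}")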

  18. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  19. A conflict model for the international hazardous waste disposal dispute.

    PubMed

    Hu, Kaixian; Hipel, Keith W; Fang, Liping

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  20. Occupational-level interactions between physical hazards and cognitive ability and skill requirements in predicting injury incidence rates.

    PubMed

    Ford, Michael T; Wiggins, Bryan K

    2012-07-01

    Interactions between occupational-level physical hazards and cognitive ability and skill requirements were examined as predictors of injury incidence rates as reported by the U. S. Bureau of Labor Statistics. Based on ratings provided in the Occupational Information Network (O*NET) database, results across 563 occupations indicate that physical hazards at the occupational level were strongly related to injury incidence rates. Also, as expected, the physical hazard-injury rate relationship was stronger among occupations with high cognitive ability and skill requirements. In addition, there was an unexpected main effect such that occupations with high cognitive ability and skill requirements had lower injury rates even after controlling for physical hazards. The main effect of cognitive ability and skill requirements, combined with the interaction with physical hazards, resulted in unexpectedly high injury rates for low-ability and low-skill occupations with low physical hazard levels. Substantive and methodological explanations for these interactions and their theoretical and practical implications are offered. Results suggest that organizations and occupational health and safety researchers and practitioners should consider the occupational level of analysis and interactions between physical hazards and cognitive requirements in future research and practice when attempting to understand and prevent injuries.
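
    A hedged moderated-regression sketch of the reported pattern, on synthetic stand-ins for the O*NET ratings and BLS injury rates: a negative main effect of cognitive requirements plus a positive hazard-by-cognitive interaction reproduces both findings (steeper hazard-injury slopes in high-cognitive occupations, lower injury rates at high cognitive requirements).

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      n = 563                              # number of occupations in the study
      hazard = rng.normal(0, 1, n)         # standardized physical-hazard rating
      cognitive = rng.normal(0, 1, n)      # standardized ability/skill rating
      injury = (2.0 + 1.0 * hazard - 0.4 * cognitive
                + 0.3 * hazard * cognitive + rng.normal(0, 0.5, n))

      df = pd.DataFrame({"injury": injury, "hazard": hazard,
                         "cognitive": cognitive})
      model = smf.ols("injury ~ hazard * cognitive", data=df).fit()
      print(model.summary().tables[1])     # main effects + interaction term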

  1. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance, as well as other underlying factors (e.g. remoteness), can explain the magnitude of past disasters. An analysis was also performed to highlight the role of future trends in population and climate change, and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk from the local to the global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard, were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments, such as Average Annual Losses (AAL) and Probable Maximum Losses (PML), in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.

  2. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lorenz R.; Most, Wm. A.

    The Waste Isolation Pilot Plant (WIPP) Hazardous Waste Facility Permit (HWFP) limits the allowable emissions of volatile organic compounds (VOCs) from contact handled (CH) transuranic (TRU) waste. The environmental performance standard within the HWFP, Module IV, Table IV.D.1, prescribes the allowed VOC emissions from the waste to ensure protection of human health and the environment. Compliance with the performance standard to ensure control of VOC emissions is based on VOC concentrations and monitoring in the underground. One of the mechanisms used to ensure compliance with the emissions standards is measuring the VOC concentration in the headspace gas of waste containers prior to disposal. Headspace gas sampling and analysis is the waste characterization activity used to determine the concentration of VOCs in the headspace of waste containers. In addition to being used to demonstrate compliance with the emissions standards of Module IV, Table IV.D.1, the results of the headspace gas sampling and analysis are used to confirm the hazardous wastes identified in the acceptable knowledge (AK) process. Headspace gas sampling and analysis has been an ongoing part of the CH TRU waste characterization program and therefore data are now available concerning its use and applicability. The information from approved Waste Stream Profile Forms (WSPFs) and the headspace gas sampling and analysis results for over 16,000 containers of CH TRU waste were considered as part of this study. The headspace gas sampling and analysis results are based on data from the WIPP Waste Information System (WWIS). These results were evaluated to determine the usefulness of headspace gas sampling and analysis for confirming AK information. The evaluation shows that the reliability of using the results of headspace gas sampling and analysis to confirm AK information can be grouped by mixed and non-mixed waste streams. In general, for mixed waste streams due to VOCs (i.e., carrying VOC-related hazardous waste numbers), there is no reliable comparison that can be made for the detection of a particular target analyte and its associated hazardous waste number(s) based on the AK information on a compound by compound basis. However, for non-mixed waste streams, the results of headspace gas sampling and analysis show a better correlation to the AK information.

  4. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 2-Hazard Modeling.

    PubMed

    Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia

    2018-04-25

    Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.
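
    A small sketch of how an ensemble of simulated hazard fields can be reduced to the "best-case"/"worst-case" envelopes mentioned above. It assumes each ensemble member is a NumPy array of maximum inundation depth on a common grid; the reduction shown is a generic envelope calculation, not the study's workflow.

```python
# Reduce an ensemble of hazard fields to best-case/worst-case envelopes.
import numpy as np

def ensemble_envelope(members: list) -> dict:
    """members: list of 2-D arrays (ny, nx) of max inundation depth (m)."""
    stack = np.stack(members)                     # shape: (n_members, ny, nx)
    return {
        "best_case": np.nanmin(stack, axis=0),    # lowest depth at each cell
        "worst_case": np.nanmax(stack, axis=0),   # highest depth at each cell
        "mean": np.nanmean(stack, axis=0),
    }
```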

  5. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis

    PubMed Central

    Gong, Xiajing; Hu, Meng

    2018-01-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time‐to‐event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high‐dimensional data featured by a large number of predictor variables. Our results showed that ML‐based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high‐dimensional data. The prediction performances of ML‐based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML‐based methods provide a powerful tool for time‐to‐event analysis, with a built‐in capacity for high‐dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
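
    The Cox-versus-ML comparison by concordance index can be sketched as below. This is a minimal illustration assuming the scikit-survival package; the specific estimators and settings are our choices, not the paper's.

```python
# Compare a Cox model and a random survival forest by concordance index.
from sksurv.util import Surv
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored

def compare_models(X, event, time):
    """X: numeric feature matrix; event: boolean indicator; time: floats."""
    y = Surv.from_arrays(event=event, time=time)  # structured survival target
    scores = {}
    for name, model in [
        ("cox", CoxPHSurvivalAnalysis()),
        ("rsf", RandomSurvivalForest(n_estimators=200, random_state=0)),
    ]:
        risk = model.fit(X, y).predict(X)         # higher score = higher risk
        scores[name] = concordance_index_censored(event, time, risk)[0]
    return scores
```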

  6. Prioritization of reproductive toxicants in unconventional oil and gas operations using a multi-country regulatory data-driven hazard assessment.

    PubMed

    Inayat-Hussain, Salmaan H; Fukumura, Masao; Muiz Aziz, A; Jin, Chai Meng; Jin, Low Wei; Garcia-Milian, Rolando; Vasiliou, Vasilis; Deziel, Nicole C

    2018-08-01

    Recent trends have witnessed the global growth of unconventional oil and gas (UOG) production. Epidemiologic studies have suggested associations between proximity to UOG operations with increased adverse birth outcomes and cancer, though specific potential etiologic agents have not yet been identified. To perform effective risk assessment of chemicals used in UOG production, the first step of hazard identification followed by prioritization specifically for reproductive toxicity, carcinogenicity and mutagenicity is crucial in an evidence-based risk assessment approach. To date, there is no single hazard classification list based on the United Nations Globally Harmonized System (GHS), with countries applying the GHS standards to generate their own chemical hazard classification lists. A current challenge for chemical prioritization, particularly for a multi-national industry, is inconsistent hazard classification which may result in misjudgment of the potential public health risks. We present a novel approach for hazard identification followed by prioritization of reproductive toxicants found in UOG operations using publicly available regulatory databases. GHS classification for reproductive toxicity of 157 UOG-related chemicals identified as potential reproductive or developmental toxicants in a previous publication was assessed using eleven governmental regulatory agency databases. If there was discordance in classifications across agencies, the most stringent classification was assigned. Chemicals in the category of known or presumed human reproductive toxicants were further evaluated for carcinogenicity and germ cell mutagenicity based on government classifications. A scoring system was utilized to assign numerical values for reproductive health, cancer and germ cell mutation hazard endpoints. Using a Cytoscape analysis, both qualitative and quantitative results were presented visually to readily identify high priority UOG chemicals with evidence of multiple adverse effects. We observed substantial inconsistencies in classification among the 11 databases. By adopting the most stringent classification within and across countries, 43 chemicals were classified as known or presumed human reproductive toxicants (GHS Category 1), while 31 chemicals were classified as suspected human reproductive toxicants (GHS Category 2). The 43 reproductive toxicants were further subjected to analysis for carcinogenic and mutagenic properties. Calculated hazard scores and Cytoscape visualization yielded several high priority chemicals including potassium dichromate, cadmium, benzene and ethylene oxide. Our findings reveal diverging GHS classification outcomes for UOG chemicals across regulatory agencies. Adoption of the most stringent classification with application of hazard scores provides a useful approach to prioritize reproductive toxicants in UOG and other industries for exposure assessments and selection of safer alternatives. Copyright © 2018 Elsevier Ltd. All rights reserved.
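
    The "most stringent classification wins" rule and the scoring step described above can be sketched as follows. The agency columns, category labels and numeric weights are made-up assumptions; the paper's actual scoring scheme is not reproduced here.

```python
# Take the most stringent GHS reproductive-toxicity category across agency
# columns, then attach a simple numeric hazard score for prioritization.
import pandas as pd

GHS_SCORE = {"1": 3, "1A": 3, "1B": 3, "2": 2}   # assumed weights

def most_stringent(row: pd.Series):
    cats = [c for c in row if pd.notna(c)]
    # Category 1/1A/1B outranks Category 2; lexicographic min on the first
    # character suffices for these labels. None means never classified.
    return min(cats, key=lambda c: c[0]) if cats else None

def score_chemicals(df: pd.DataFrame, agency_cols: list) -> pd.DataFrame:
    out = df.copy()
    out["repro_class"] = out[agency_cols].apply(most_stringent, axis=1)
    out["repro_score"] = out["repro_class"].map(lambda c: GHS_SCORE.get(c, 0))
    return out.sort_values("repro_score", ascending=False)
```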

  7. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards, along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazard warning systems.
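
    The basic dual-frequency relation behind GPS-derived TEC can be shown in a few lines. This illustrates the general measurement principle only, not JPL's GAIM processing chain; differential code biases are deliberately neglected.

```python
# Slant TEC from dual-frequency GPS pseudoranges (simplified, bias-free form).
F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies, Hz

def slant_tec(p1: float, p2: float) -> float:
    """Slant TEC in TECU (1 TECU = 1e16 el/m^2) from pseudoranges p1, p2 (m).

    STEC = f1^2 f2^2 / (40.3 (f1^2 - f2^2)) * (P2 - P1)
    """
    k = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
    return k * (p2 - p1) / 1e16

# Example: a 1.0 m L2-L1 pseudorange difference corresponds to ~9.5 TECU.
```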

  8. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. This creates important decision problems for public-sector decision makers and calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can be identified by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed for risk analysis, it is possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects; because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users. Our tool visualises the data obtained from an analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area for an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations.
Today our GIS application mainly centralises the documentation of natural hazards and additionally offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. It can also help technical management to decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES: Bründl, M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern, 416 pp. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL), Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds) 2002: Colloque Arche de la Défense, 22-24 octobre 2002, in Risques naturels et aménagement en Europe, 108-120. Maggi, R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405440, Final Scientific Report, Lugano.
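
    The core GIS operation in a tool like the one described in the preceding entry, intersecting hazard footprints with an infrastructure line, can be sketched as follows. The sketch uses geopandas for illustration (the original tool used ArcObjects/VBA), and the column names are assumptions.

```python
# Total route length exposed to mapped hazard footprints.
import geopandas as gpd

def exposed_length_m(hazards: gpd.GeoDataFrame,
                     route: gpd.GeoDataFrame) -> float:
    """hazards: polygon footprints; route: line segments; both in a
    metric CRS so that .length is in metres."""
    hazard_union = hazards.unary_union          # merge overlapping footprints
    return route.geometry.intersection(hazard_union).length.sum()
```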

  9. Expert systems in civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostem, C.N.; Maher, M.L.

    1986-01-01

    This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.

  10. Use of cloud computing technology in natural hazard assessment and emergency management

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2015-12-01

    During a natural hazard event, the most up-to-date data need to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data, as well as ground and airborne data generated in real time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud compute system and researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing a cloud computing system and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, thereby increasing their understanding of the ongoing hazard and helping mitigate its impact on the surrounding region.

  11. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
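
    The performance-based combination at the core of the Kramer and Mayfield approach can be sketched in a few lines: the annual rate of liquefaction is the sum, over the ground-motion/magnitude bins of the PSHA disaggregation, of the conditional probability of triggering times the bin's annual rate. The conditional-probability model (Cetin et al., Idriss-Boulanger, Boulanger-Idriss) is abstracted as a callable; bin structure and names are illustrative.

```python
# Annual rate of liquefaction from a PSHA disaggregation matrix.
def annual_rate_of_liquefaction(pga_bins, mag_bins, rate_matrix, p_liq):
    """rate_matrix[i][j]: annual rate of (pga_bins[i], mag_bins[j]) shaking;
    p_liq(pga, mag): conditional probability of liquefaction triggering."""
    rate = 0.0
    for i, a in enumerate(pga_bins):
        for j, m in enumerate(mag_bins):
            rate += p_liq(a, m) * rate_matrix[i][j]
    return rate  # return period of liquefaction = 1.0 / rate
```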

  12. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonics. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
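
    Simulating a synthetic catalog from a magnitude-frequency law, as done above, can be illustrated with a short inverse-CDF sampler. For clarity the sketch uses a doubly truncated Gutenberg-Richter distribution rather than the tapered form the study employs, and the b-value and magnitude bounds are placeholders.

```python
# Sample earthquake magnitudes from a doubly truncated Gutenberg-Richter law.
import numpy as np

def sample_gr_magnitudes(n, b=1.0, m_min=4.0, m_max=8.5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    beta = b * np.log(10.0)
    # Inverse CDF of the truncated exponential magnitude distribution:
    # F(m) = (1 - exp(-beta (m - m_min))) / (1 - exp(-beta (m_max - m_min)))
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta
```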

  13. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will initially be applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than those in the historical record, and should properly be based on the underlying seismicity of subduction zones. Therefore, Probabilistic Tsunami Hazard Analysis (PTHA) consistent with source seismicity must be performed in addition to consideration of historical event scenarios. A method of Probabilistic Tsunami Hazard Analysis has been established that is generally consistent with Probabilistic Seismic Hazard Analysis in its treatment of uncertainty. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. Structural member acceptability criteria will be based on performance objectives for a 2,500-year Maximum Considered Tsunami. The approach developed by the ASCE Tsunami Loads and Effects Subcommittee would result in the first national unification of tsunami hazard criteria for design codes, reflecting the modern approach of performance-based engineering.

  14. Place-classification analysis of community vulnerability to near-field tsunami threats in the U.S. Pacific Northwest (Invited)

    NASA Astrophysics Data System (ADS)

    Wood, N. J.; Jones, J.; Spielman, S.

    2013-12-01

    Near-field tsunami hazards are credible threats to many coastal communities throughout the world. Along the U.S. Pacific Northwest coast, low-lying areas could be inundated by a series of catastrophic tsunami waves that begin to arrive in a matter of minutes following a Cascadia subduction zone (CSZ) earthquake. This presentation summarizes analytical efforts to classify communities with similar characteristics of community vulnerability to tsunami hazards. This work builds on past state-focused inventories of community exposure to CSZ-related tsunami hazards in northern California, Oregon, and Washington. Attributes used in the classification, or cluster analysis, include demography of residents, spatial extent of the developed footprint based on mid-resolution land cover data, distribution of the local workforce, and the number and type of public venues, dependent-care facilities, and community-support businesses. Population distributions are also characterized by a function of travel time to safety, based on anisotropic, path-distance, geospatial modeling. We used an unsupervised, model-based clustering algorithm and a v-fold cross-validation procedure (v=50) to identify the appropriate number of community types, selecting class solutions that provided the appropriate balance between parsimony and model fit. The goal of the vulnerability classification is to provide emergency managers with a general sense of the types of communities in tsunami hazard zones based on similar characteristics, instead of only providing an exhaustive list of attributes for individual communities. This classification scheme can then be used to target and prioritize risk-reduction efforts that address common issues across multiple communities. The presentation will include a discussion of the utility of the proposed place classifications to support regional preparedness and outreach efforts.
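
    Model-based clustering with selection of the number of community types can be sketched with scikit-learn. Note the sketch selects the class count by BIC in place of the paper's v-fold cross-validation, a substitution on our part; the feature matrix X would hold the standardized community attributes described above.

```python
# Model-based (Gaussian mixture) clustering with BIC model selection.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

def classify_communities(X: np.ndarray, k_max: int = 10):
    Xs = StandardScaler().fit_transform(X)
    models = [GaussianMixture(n_components=k, random_state=0).fit(Xs)
              for k in range(2, k_max + 1)]
    best = min(models, key=lambda m: m.bic(Xs))  # parsimony vs. fit trade-off
    return best.n_components, best.predict(Xs)   # class count, labels
```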

  15. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, A.

    2016-12-01

    The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories, and we include the effects of all earthquakes of considerable magnitude. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For the frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), i.e., synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all earthquakes of considerable magnitude throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all magnitudes to obtain ground-motion parameters. PSHA results are produced at the 2%, 10% and 50% hazard levels for all studied sites in the Marmara region.
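
    The hybrid broadband combination implied above, merging a low-frequency synthetic (SGF) trace with a high-frequency EGF-based trace at the 0.5 Hz crossover, can be sketched with complementary filters. Real broadband codes use more careful matched filtering; this is illustrative only.

```python
# Merge low- and high-frequency traces with complementary Butterworth filters.
from scipy.signal import butter, filtfilt

def hybrid_broadband(lf_trace, hf_trace, fs, fc=0.5, order=4):
    """lf_trace, hf_trace: aligned 1-D arrays sampled at fs (Hz);
    fc: crossover frequency (Hz)."""
    b_lo, a_lo = butter(order, fc, btype="lowpass", fs=fs)
    b_hi, a_hi = butter(order, fc, btype="highpass", fs=fs)
    # Zero-phase filtering preserves the relative timing of both bands.
    return filtfilt(b_lo, a_lo, lf_trace) + filtfilt(b_hi, a_hi, hf_trace)
```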

  16. 'Worst case' methodology for the initial assessment of societal risk from proposed major accident installations.

    PubMed

    Carter, D A; Hirst, I L

    2000-01-07

    This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology, described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis that will be necessary to enable HSE to give appropriate advice.
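
    For illustration, a generic frequency-consequence "risk integral" of the form sum(f * N**alpha), with alpha > 1 expressing societal risk aversion, can be computed as below. This sketches the concept only; MHAU's exact Risk Integral definition is not reproduced here, and the alpha value is a placeholder.

```python
# Generic societal-risk screening integral over event scenarios.
def risk_integral(events, alpha=1.4):
    """events: iterable of (annual_frequency, n_people_affected) pairs;
    alpha > 1 weights high-consequence events disproportionately."""
    return sum(f * n**alpha for f, n in events)

# Single "worst case" screening uses one pair, e.g. risk_integral([(1e-5, 300)])
```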

  17. An approach to trial design and analysis in the era of non-proportional hazards of the treatment effect.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2014-08-07

    Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
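
    The two ingredients of the joint test can be sketched with the lifelines package: a logrank test and a Schoenfeld-residual (Grambsch-Therneau type) test of proportional hazards. Combining the two statistics into a single chi-square with 2 degrees of freedom is an assumption of this sketch, not the paper's exact construction, and the column names are illustrative.

```python
# Joint assessment of a treatment effect and non-proportional hazards.
import pandas as pd
from scipy.stats import chi2
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test, proportional_hazard_test

def joint_test(df: pd.DataFrame) -> float:
    """df columns assumed: 'time', 'event', 'arm' (0/1 treatment indicator)."""
    a, b = df[df.arm == 0], df[df.arm == 1]
    lr = logrank_test(a.time, b.time,
                      event_observed_A=a.event, event_observed_B=b.event)
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    ph = proportional_hazard_test(cph, df, time_transform="rank")
    # Sum the two chi-square(1) statistics (assumes approximate independence).
    stat = lr.test_statistic + ph.summary["test_statistic"].iloc[0]
    return 1.0 - chi2.cdf(stat, df=2)   # joint p-value
```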

  18. Risk assessment based on a combination of historical analysis, a detailed field study and numerical modeling on the alluvial fan Gadeinerbach as a basis for a risk management concept

    NASA Astrophysics Data System (ADS)

    Moser, M.

    2009-04-01

    The Gadeinerbach catchment in the District of Lungau, Salzburg, Austria is prone to debris flows; large debris flow events date back to the years 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment looks like a "sleeping torrential giant". To design mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, a detailed field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the alluvial fan Gadeinerbach, but nevertheless some important hazard indicators could be found. With the hazard indicators and photo analysis from the large 1934 debris flow event, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amounts, etc.) it is possible to calibrate the numerical models and to gain useful knowledge of their pros and cons and their applicability. The results were used to simulate the design event and, furthermore, to derive mitigation measures. The most effective protection against debris was found to be a reduction of the flow's high energy level to a lower level, combined with a debris/bedload deposition area. Expert opinion, the study of historical data and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.

  19. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and description of current practice to assess seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Relevant importance is given on the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  20. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment, as for seismic design, to evaluate phenomena beyond the design basis: even once we set the design basis tsunami height, there remains a possibility that tsunami heights may exceed the determined design height, owing to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structures and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure of system analysis are still being developed. (authors)
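
    Reducing logic-tree branches to the mean and fractile hazard curves mentioned above can be sketched as a weighted-quantile calculation across branches, height by height. Array shapes and the fractile set are assumptions matching the percentiles quoted in the abstract.

```python
# Mean and fractile hazard curves from weighted logic-tree branches.
import numpy as np

def weighted_fractile(values, weights, f):
    """Smallest value whose cumulative branch weight reaches fractile f."""
    order = np.argsort(values)
    cw = np.cumsum(weights[order])        # weights must sum to 1
    return values[order][np.searchsorted(cw, f)]

def fractile_curves(curves, weights, fractiles=(0.05, 0.16, 0.5, 0.84, 0.95)):
    """curves: (n_branches, n_heights) exceedance probabilities per branch."""
    curves, weights = np.asarray(curves), np.asarray(weights)
    out = {"mean": weights @ curves}
    for f in fractiles:
        out[f] = np.array([weighted_fractile(curves[:, j], weights, f)
                           for j in range(curves.shape[1])])
    return out
```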

  1. Physically-based extreme flood frequency with stochastic storm transposition and paleoflood data on large watersheds

    NASA Astrophysics Data System (ADS)

    England, John F.; Julien, Pierre Y.; Velleux, Mark L.

    2014-03-01

    Traditionally, deterministic flood procedures such as the Probable Maximum Flood have been used for critical infrastructure design. Some Federal agencies now use hydrologic risk analysis to assess potential impacts of extreme events on existing structures such as large dams. Extreme flood hazard estimates and distributions are needed for these efforts, with very low annual exceedance probabilities (≤ 10^-4, i.e., return periods > 10,000 years). An integrated data-modeling hydrologic hazard framework for physically-based extreme flood hazard estimation is presented. Key elements include: (1) a physically-based runoff model (TREX) coupled with a stochastic storm transposition technique; (2) hydrometeorological information from radar and an extreme storm catalog; and (3) streamflow and paleoflood data for independently testing and refining runoff model predictions at internal locations. This new approach requires full integration of collaborative work in hydrometeorology, flood hydrology and paleoflood hydrology. An application on the 12,000 km2 Arkansas River watershed in Colorado demonstrates that the size and location of extreme storms are critical factors in the analysis of basin-average rainfall frequency and flood peak distributions. Runoff model results are substantially improved by the availability and use of paleoflood nonexceedance data spanning the past 1000 years at critical watershed locations.

  2. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  3. Framework for Automation of Hazard Log Management on Large Critical Projects

    NASA Astrophysics Data System (ADS)

    Vinerbi, Lorenzo; Babu, Arun P.

    2016-08-01

    A hazard log is a database of all risk management activities in a project. Maintaining its correctness and consistency on large safety- or mission-critical projects involving multiple vendors, suppliers, and partners is critical and challenging. IBM DOORS is one of the popular tools used for hazard management in space applications. However, not all stakeholders are familiar with it, and it is not always feasible to expect all stakeholders to provide correct and consistent hazard data. The current work describes the process and tools to simplify hazard data collection on large projects. It demonstrates how the collected data from all stakeholders are merged to form the hazard log while ensuring data consistency and correctness. The data provided by all parties are collected using a template containing scripts. The scripts check for mistakes based on the internal standards of the company in charge of hazard management. The collected data are then merged in DOORS, which also contains scripts to check and import data to form the hazard log. The proposed tool has been applied to a mission-critical project and has been found to save time and reduce the number of mistakes while creating the hazard log. The use of automatic checks paves the way for correct tracking of risk and hazard analysis activities on large critical projects.
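
    The kind of consistency checks such template scripts perform before records are merged into the central log can be illustrated as below. The field names and rules are assumptions for illustration, not the project's actual internal standard.

```python
# Validate a hazard record before it is merged into the hazard log.
REQUIRED = ("hazard_id", "description", "severity", "likelihood", "owner")
SEVERITIES = {"catastrophic", "critical", "marginal", "negligible"}

def validate_record(rec: dict) -> list:
    """Return a list of problems; an empty list means the record may merge."""
    errors = [f"missing field: {f}" for f in REQUIRED if not rec.get(f)]
    if rec.get("severity") and rec["severity"] not in SEVERITIES:
        errors.append(f"unknown severity: {rec['severity']}")
    return errors
```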

  4. K Basin Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PECH, S.H.

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  5. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  6. 40 CFR 745.223 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Abatement means any measure or set of measures designed to permanently eliminate lead-based paint hazards... elimination of lead-based paint hazards; or (B) Are designed to permanently eliminate lead-based paint hazards..., when such activities are not designed to permanently eliminate lead-based paint hazards, but, instead...

  7. 40 CFR 745.223 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Abatement means any measure or set of measures designed to permanently eliminate lead-based paint hazards... elimination of lead-based paint hazards; or (B) Are designed to permanently eliminate lead-based paint hazards..., when such activities are not designed to permanently eliminate lead-based paint hazards, but, instead...

  8. 40 CFR 745.223 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Abatement means any measure or set of measures designed to permanently eliminate lead-based paint hazards... elimination of lead-based paint hazards; or (B) Are designed to permanently eliminate lead-based paint hazards..., when such activities are not designed to permanently eliminate lead-based paint hazards, but, instead...

  9. 40 CFR 745.223 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Abatement means any measure or set of measures designed to permanently eliminate lead-based paint hazards... elimination of lead-based paint hazards; or (B) Are designed to permanently eliminate lead-based paint hazards..., when such activities are not designed to permanently eliminate lead-based paint hazards, but, instead...

  10. 40 CFR 745.223 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Abatement means any measure or set of measures designed to permanently eliminate lead-based paint hazards... elimination of lead-based paint hazards; or (B) Are designed to permanently eliminate lead-based paint hazards..., when such activities are not designed to permanently eliminate lead-based paint hazards, but, instead...

  11. Deterministic seismic hazard macrozonation of India

    NASA Astrophysics Data System (ADS)

    Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.

    2012-10-01

    Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of a seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well-recognized attenuation relations covering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed for the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each of these grid cells by considering all the seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods 0.1 and 1 s have been calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in hazard definition has been tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. The hazard evaluation was also performed without the logic-tree approach for comparison of the results. Contour maps showing the spatial variation of hazard values are presented in the paper.

  12. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.
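
    The frequency-ratio computation underlying the "inherent probability" above can be sketched directly: for each class of a terrain attribute, it is the ratio of the class's share of landslide cells to its share of all cells. The sketch assumes co-registered rasters flattened to 1-D arrays.

```python
# Frequency ratio per terrain-attribute class for landslide susceptibility.
import numpy as np

def frequency_ratio(attr_class: np.ndarray, landslide: np.ndarray) -> dict:
    """attr_class: integer class per cell; landslide: boolean per cell."""
    fr = {}
    n_cells, n_slides = attr_class.size, landslide.sum()
    for c in np.unique(attr_class):
        in_class = attr_class == c
        pct_slides = landslide[in_class].sum() / n_slides
        pct_cells = in_class.sum() / n_cells
        fr[int(c)] = pct_slides / pct_cells   # >1 marks a landslide-prone class
    return fr
```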

  13. Designsafe-Ci a Cyberinfrastructure for Natural Hazard Simulation and Data

    NASA Astrophysics Data System (ADS)

    Dawson, C.; Rathje, E.; Stanzione, D.; Padgett, J.; Pinelli, J. P.

    2017-12-01

    DesignSafe is the web-based research platform of the Natural Hazards Engineering Research Infrastructure (NHERI) network that provides the computational tools needed to manage and analyze critical data for natural hazards research, with wind and storm surge related hazards being a primary focus. One of the simulation tools under DesignSafe is the Advanced Circulation (ADCIRC) model, a coastal ocean model used in storm surge analysis. ADCIRC is an unstructured, finite element model with high resolution capabilities for studying storm surge impacts, and has long been used in storm surge hind-casting and forecasting. In this talk, we will demonstrate the use of ADCIRC within the DesignSafe platform and its use for forecasting Hurricane Harvey. We will also demonstrate how to analyze, visualize and archive critical storm surge related data within DesignSafe.

  14. DTREEv2, a computer-based support system for the risk assessment of genetically modified plants.

    PubMed

    Pertry, Ine; Nothegger, Clemens; Sweet, Jeremy; Kuiper, Harry; Davies, Howard; Iserentant, Dirk; Hull, Roger; Mezzetti, Bruno; Messens, Kathy; De Loose, Marc; de Oliveira, Dulce; Burssens, Sylvia; Gheysen, Godelieve; Tzotzos, George

    2014-03-25

    Risk assessment of genetically modified organisms (GMOs) remains a contentious area and a major factor influencing the adoption of agricultural biotech. Methodologically, in many countries, risk assessment is conducted by expert committees with little or no recourse to databases and expert systems that can facilitate the risk assessment process. In this paper we describe DTREEv2, a computer-based decision support system for the identification of hazards related to the introduction of GM-crops into the environment. DTREEv2 structures hazard identification and evaluation by means of an Event-Tree type of analysis. The system produces an output flagging identified hazards and potential risks. It is intended to be used for the preparation and evaluation of biosafety dossiers and, as such, its usefulness extends to researchers, risk assessors and regulators in government and industry. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    PubMed

    Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang

    2017-12-01

    Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked in first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
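
    The idea of turning right-censored survival data into hazard ranks can be conveyed with a naive pairwise-comparability sketch. This is NOT the GuanRank algorithm, whose weighting of ambiguous pairs is more refined; it only illustrates the concept of ranking under censoring.

```python
# Naive hazard ranking of right-censored subjects via comparable pairs.
import numpy as np

def naive_hazard_rank(time: np.ndarray, event: np.ndarray) -> np.ndarray:
    """time: observed/censoring times; event: 1 if death observed, else 0."""
    n = len(time)
    score = np.zeros(n)
    for i in range(n):
        for j in range(n):
            # i demonstrably failed before j's last follow-up time, so i is
            # higher hazard in this pair; ambiguous pairs contribute nothing.
            if i != j and event[i] and time[i] < time[j]:
                score[i] += 1.0
    return score / (n - 1)   # higher score = higher inferred hazard
```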

  16. Using Integrated Earth and Social Science Data for Disaster Risk Assessment

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; Yetman, G.

    2016-12-01

    Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominates. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied based on traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and how human and environmental systems respond to hazard events, as in the case of the Fukushima nuclear disaster that followed from the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support the use case of disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure. The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate the population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.

  17. The Advanced Rapid Imaging and Analysis (ARIA) Project: Status of SAR products for Earthquakes, Floods, Volcanoes and Groundwater-related Subsidence

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.

    2017-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques, including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection, have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano monitoring and mapping of groundwater-related subsidence.

  18. A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids.

    PubMed

    Yost, Erin E; Stanek, John; Burgoon, Lyle D

    2017-01-01

    Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA's analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n=37) or cancer-specific toxicity values (n=10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n=31; Pennsylvania, n=18; and North Dakota, n=20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of most concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one possible method for integrating data to explore potential public health impacts.
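
    The weighted-sum step of such an MCDA can be sketched in a few lines: each chemical is scored on min-max-normalised criteria and ranked by the weighted aggregate. The chemical names, criteria, weights and values below are hypothetical stand-ins, not the EPA data used in the study.

```python
import numpy as np

chemicals = ["chem_A", "chem_B", "chem_C"]
# Columns: toxicity, frequency of use, mobility in water, persistence
# (all oriented so that larger = greater hazard potential).
X = np.array([
    [0.9, 0.2, 0.5, 0.7],
    [0.4, 0.8, 0.9, 0.3],
    [0.6, 0.5, 0.2, 0.9],
])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # must sum to 1

# Min-max normalise each criterion, then aggregate into one score.
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
scores = Xn @ weights

for name, s in sorted(zip(chemicals, scores), key=lambda t: -t[1]):
    print(f"{name}: hazard score {s:.3f}")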

  19. Integration of Aquifer Storage Transfer and Recovery and HACCP for Ensuring Drinking Water Quality

    NASA Astrophysics Data System (ADS)

    Lee, S. I.; Ji, H. W.

    2015-12-01

    The integration of ASTR (Aquifer Storage Transfer and Recovery) and HACCP (Hazard Analysis and Critical Control Point) is being attempted to ensure drinking water quality in a delta area. ASTR is a water supply system in which surface water is injected into a well for storage and recovered from a different well. During the process, natural water treatment is achieved in the aquifer. ASTR has advantages over surface reservoirs in that the water is protected from external contaminants and free from water loss by evaporation. HACCP, which originated in the food industry, can efficiently manage hazards and reduce risks when it is introduced to drinking water production. The study area is located in the Nakdong River Delta, South Korea. Water quality in this region has deteriorated due to increased pollution loads from upstream cities and industrial complexes. ASTR equipped with a HACCP system is suggested as a means to heighten public trust in drinking water. After the drinking water supply system using ASTR was decomposed into ten processes, the principles of HACCP were applied. Hazardous event analysis was conducted for 114 hazardous events, and nine major hazardous events were identified based on likelihood and severity assessment. The potential risk of chemical hazards, as a function of amounts, travel distance and toxicity, was evaluated, and the result shows the relative threat a city poses to the drinking water supply facility. Next, critical control points were determined using decision tree analysis. Critical limits, the maximum and/or minimum values to which biological, chemical or physical parameters must be controlled, were established. Other procedures, such as monitoring and corrective actions, will also be presented.
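
    The likelihood/severity screening step is essentially a risk matrix. A minimal sketch follows; the events, rating scales and the cut-off of 6 are hypothetical, not the ratings used for the Nakdong Delta system.

```python
# HACCP-style hazardous event screening: score = likelihood x severity,
# and events above a threshold are flagged as "major". All values invented.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
SEVERITY = {"minor": 1, "moderate": 2, "major": 3}

events = [
    ("pump failure during injection", "possible", "moderate"),
    ("chemical spill upstream", "rare", "major"),
    ("pathogen breakthrough in aquifer", "possible", "major"),
]

def risk_score(likelihood, severity):
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

major = [(e, risk_score(l, s)) for e, l, s in events if risk_score(l, s) >= 6]
print("major hazardous events:", major)
```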

  20. Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations

    ERIC Educational Resources Information Center

    O'Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John

    2009-01-01

    Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…

  1. Industrial Accidents Triggered by Natural Hazards: an Emerging Risk Issue

    NASA Astrophysics Data System (ADS)

    Renni, Elisabetta; Krausmann, Elisabeth; Basco, Anna; Salzano, Ernesto; Cozzani, Valerio

    2010-05-01

    Natural disasters such as earthquakes, tsunamis, floods or hurricanes have recently and dramatically hit several countries worldwide. Both direct and indirect consequences affected the population, causing on the one hand a high number of fatalities and on the other hand economic losses severe enough to affect the gross national product for many years. Loss of critical industrial infrastructure (electricity generation and distribution, gas pipelines, oil refineries, etc.) also occurred, causing further indirect damage to the population. In several cases, accident scenarios with large releases of hazardous materials were triggered by these natural events, causing so-called "Natech events", in which the overall damage resulted from the simultaneous consequences of the natural event and of the release of hazardous substances. Toxic releases, large fires and explosions, as well as possible long-term environmental pollution, economic losses, and overloading of emergency systems were recognised by post-event studies as the main issues of these Natech scenarios. In recent years, the increasing frequency and severity of some natural hazards due to climate change has slowly raised awareness of Natech risk as an emerging risk among stakeholders. Indeed, the iNTeg-Risk project, co-funded by the European Commission within the 7th Framework Programme, specifically addresses these scenarios among new technological issues in public safety. The present study, in part carried out within the iNTeg-Risk project, was aimed at the analysis and further development of methods and tools for the assessment and mitigation of Natech accidents. Available tools and knowledge gaps in the assessment of Natech scenarios were highlighted. The analysis mainly addressed the potential impact of flood, lightning and earthquake events on industrial installations where hazardous substances are present. Preliminary screening methodologies and more detailed methods based on quantitative risk analysis were developed. Strategies based on the use of multiple information layers, aiming at the identification of mitigation and early warning systems, were also explored. A case study in the Emilia-Romagna region is presented.

  2. Microbiological Analysis of the Food Preparation and Dining Facilities at Fort Myer and Bolling Air Force Base

    DTIC Science & Technology

    1975-02-01

    From the viewpoint of microbiological safety, one would be tempted to conclude that Fort Myer had a much lower risk hazard than Bolling Air Force Base.

  3. 75 FR 58423 - Notice of Funding Availability for HUD's Fiscal Year (FY) 2010 Lead-Based Paint Hazard Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... Availability for HUD's Fiscal Year (FY) 2010 Lead-Based Paint Hazard Control Grant Program and Lead Hazard Reduction Demonstration Grant Program; Technical Correction AGENCY: Office of Healthy Homes and Lead Hazard...://www.Grants.gov its Notice of Funding Availability (NOFA) for HUD's FY2010 Lead-Based Paint Hazard...

  4. Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    Time‐independent probabilistic seismic‐hazard analysis treats each source as being temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time‐independent cluster, each cluster being temporally and spatially independent from any other. The cluster has the recurrence time of the mainshock, and by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high‐hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10% but could be as high as 20% if variations in aftershock productivity can be accounted for reliably.
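
    The union-of-events idea reduces to elementary probability: if each earthquake in a cluster has some chance of exceeding a ground-motion threshold at a site, the per-cluster exceedance probability is one minus the product of the complements. The sketch below uses invented numbers, not USGS model values.

```python
import numpy as np

def cluster_exceedance_prob(p_events):
    """p_events: per-event probabilities of exceeding the threshold."""
    p_events = np.asarray(p_events)
    return 1.0 - np.prod(1.0 - p_events)

p_mainshock = 0.05            # mainshock alone (illustrative)
p_aftershocks = [0.02, 0.01]  # dependent events: smaller, but nonzero
p_cluster = cluster_exceedance_prob([p_mainshock] + p_aftershocks)

cluster_rate = 1.0 / 500.0    # clusters per year (mainshock recurrence)
print(f"per-cluster exceedance: {p_cluster:.4f}")
print(f"annual exceedance rate: mainshock-only {cluster_rate * p_mainshock:.2e},"
      f" full cluster {cluster_rate * p_cluster:.2e}")
```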

  5. Robust inference in discrete hazard models for randomized clinical trials.

    PubMed

    Nguyen, Vinh Q; Gillen, Daniel L

    2012-10-01

    Time-to-event data in which failures are only assessed at discrete time points are common in many clinical trials. Examples include oncology studies where events are observed through periodic screenings such as radiographic scans. When the survival endpoint is acknowledged to be discrete, common methods for the analysis of observed failure times include the discrete hazard models (e.g., the discrete-time proportional hazards and the continuation ratio model) and the proportional odds model. In this manuscript, we consider estimation of a marginal treatment effect in discrete hazard models where the constant treatment effect assumption is violated. We demonstrate that the estimator resulting from these discrete hazard models is consistent for a parameter that depends on the underlying censoring distribution. An estimator that removes the dependence on the censoring mechanism is proposed and its asymptotic distribution is derived. Basing inference on the proposed estimator allows for conclusions that are scientifically meaningful and reproducible. Simulation is used to assess the performance of the presented methodology in finite samples.
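
    Discrete hazard models of this kind are commonly fit by expanding each subject into person-period records and applying logistic regression to the conditional failure indicator. The sketch below shows that expansion on hypothetical data using scikit-learn; it enters the interval index linearly for brevity (real discrete-time models typically use per-interval intercepts), and it is not the authors' proposed estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# (last observed interval, event indicator, treatment arm); invented data.
subjects = [(3, 1, 0), (5, 0, 0), (2, 1, 1), (5, 1, 1), (4, 0, 1)]

rows, labels = [], []
for last_t, event, arm in subjects:
    for t in range(1, last_t + 1):
        rows.append([t, arm])                    # interval index + covariate
        labels.append(1 if (event and t == last_t) else 0)

X, y = np.array(rows), np.array(labels)
model = LogisticRegression().fit(X, y)
# The treatment coefficient approximates a log hazard-odds ratio.
print("treatment coefficient:", model.coef_[0][1])
```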

  6. Coastal vulnerability: climate change and natural hazards perspectives

    NASA Astrophysics Data System (ADS)

    Romieu, E.; Vinchon, C.

    2009-04-01

    Introduction. Studying coastal zones as a territorial concept (integrated coastal zone management) is an essential issue for managers, as they have to consider many different topics (natural hazards, resources management, tourism, climate change…). The recent approach in terms of "coastal vulnerability" studies (since the 1990s) is the main tool used nowadays to help them evaluate the impacts of natural hazards on coastal zones, especially considering climate change. This communication aims to highlight the difficulties in integrating this concept into risk analysis as it is usually practiced in the natural hazards sciences.

    1) Coastal vulnerability as a recent issue. The concept of coastal vulnerability mainly appears in the Intergovernmental Panel on Climate Change works of 1992 (IPCC, 2001), where it is presented as essential for climate change adaptation. The concept has been defined by a common methodology which proposes the assessment of seven indicators, with regard to a sea level rise of 1 m by 2100: people affected, people at risk, capital value at loss, land at loss, wetland at loss, potential adaptation costs, and people at risk assuming this adaptation. Many national assessments have been implemented (Nicholls et al., 1995) and a global assessment was proposed for three indicators (Nicholls et al., 1999). The DINAS-COAST project reuses this methodology to produce the DIVA tool for coastal managers (Vafeidis et al., 2004). Besides, many other methodologies for national or regional coastal vulnerability assessments have been developed (reviewed by UNFCCC, 2008). The use of aggregated vulnerability indicators (including geomorphology, hydrodynamics, climate change…) is widespread: the USGS coastal vulnerability index is used worldwide and was complemented by a social vulnerability index (Boruff et al., 2005). These index-based methods propose a vulnerability mapping which visualises indicators of erosion, submersion and/or socio-economic sensitivity in coastal zones. This concept is a useful tool for policy makers, helping them manage their actions and take climate change into account (McFadden et al., 2006). However, in these approaches, vulnerability is the output itself (cost of effective impacts, geomorphological impacts…) and is not integrated into a risk analysis. Furthermore, these studies emerged from a climatic perspective, which leads to considering climate change as a hazard or pressure, whereas risk studies commonly consider hazards such as erosion and flooding, where climate change modifies the drivers of the hazard.

    2) The natural hazards and socio-economic perspectives. In order to reduce the impacts of natural hazards, decision makers need a complete risk assessment (probability of losses). Past studies on natural risks (landslide, earthquake…) highlighted the pertinence of defining risk as a combination of (1) hazard occurrence and intensity, (2) exposure and (3) vulnerability of assets and population to this hazard (e.g. Douglas, 2007; Sarewitz et al., 2003). Following the Renn and Klinke risk assessment framework, the high uncertainties associated with coastal risks under climatic and anthropogenic change highlight the importance of working on the concept of "vulnerability" (Klinke and Renn, 2002). Past studies on vulnerability assessment showed a frequently mentioned gap between "impact-based" and "human-based" points of view, which is nowadays a major issue for the natural risk sciences. Many research efforts in FP7 projects such as MOVE and ENSURE focus on integrating the different dimensions of vulnerability (Turner et al., 2003; Birkmann, 2006). Coastal risk studies highlight another issue of concern. We previously detailed the use of the term "vulnerability" in the coastal context, which is quite different from its use in natural risk studies. The interaction of social, economic and physical sciences is considered within two French research projects (Vulsaco, Miseeva), in order to identify the vulnerability of a system to flooding or erosion (i.e. its characteristics that create potential harm) and to integrate it into a risk assessment. Global change is considered through modifications of hazard, anthropogenic pressure and exposure, in order to point out possible modifications of vulnerabilities.

    3) Learning from both perspectives. Coastal vulnerability in its "end in itself" and climate change dimension is a widespread tool for decision makers, but it can be inadequate when vulnerability is a component of risk. This is mainly due to the consideration of climate change as a "hazard", so that coastal vulnerability is seen as the possible adverse impacts of climate change. As a matter of fact, this concept is clearly well received by managers, who feel deeply concerned by climate change. However, coastal risk managers would benefit from considering climate change as a driver able to modify existing hazards rather than as the pressure in itself. Using this concept could lead to new perspectives of coastal risk mitigation for decision makers (social vulnerability, risk perception…), learning from other disciplines and sciences thanks to research projects such as MOVE (FP7).

    Acknowledgements. The authors would like to thank the BRGM coastal team for rich discussions and fruitful collaborations in coastal vulnerability studies, especially Déborah Idier for leading the Vulsaco project and Manuel Garcin for his work on tsunamis in Sri Lanka. They are also grateful to the MISEEVA and MOVE teams for their trans-disciplinary work.

    References:
    Birkmann, J., 2006. Measuring Vulnerability to Natural Hazards: Towards Disaster Resilient Societies. United Nations University Press.
    Boruff, B. J., Emrich, C., Cutter, S. L., 2005. Erosion hazard vulnerability of US coastal counties. Journal of Coastal Research, 21, 932-942.
    Douglas, J., 2007. Physical vulnerability modelling in natural hazard risk assessment. Natural Hazards and Earth System Sciences, 7, 283-288.
    IPCC, 2001. Climate Change 2001: Synthesis Report. A contribution of Working Groups I, II and III to the Third Assessment Report of the Intergovernmental Panel on Climate Change.
    Klinke, A. and Renn, O., 2002. A new approach to risk evaluation and management: risk-based, precaution-based and discourse-based strategies. Risk Analysis, 22, 1071-1094.
    McFadden, L., Nicholls, R. J., Penning-Rowsell, E. (Eds.), 2006. Managing Coastal Vulnerability. Elsevier Science.
    Nicholls, R. J., Hoozemans, F. M. J., Marchand, M., 1999. Increasing flood risk and wetland losses due to global sea-level rise: regional and global analyses. Global Environmental Change, Part A: Human and Policy Dimensions, 9, S69-S87.
    Nicholls, R. J., Leatherman, S. P., Volonte, C. R., 1995. Impacts and responses to sea-level rise; qualitative and quantitative assessments; potential impacts of accelerated sea-level rise on developing countries. Journal of Coastal Research, Special Issue 14, 26-43.
    Sarewitz, D., Pielke, R., Keykhah, M., 2003. Vulnerability and risk: some thoughts from a political and policy perspective. Risk Analysis, 23, 805-810.
    Turner, B. L., II, Kasperson, R. E., Matson, P. A., McCarthy, J. J., Corell, R. W., Christensen, L., Eckley, N., Kasperson, J. X., Luers, A., Martello, M. L., Polsky, C., Pulsipher, A., Schiller, A., 2003. A framework for vulnerability analysis in sustainability science. Proceedings of the National Academy of Sciences, USA, 100, 8074-8079.
    UNFCCC, 2008. Compendium on Methods and Tools to Evaluate Impacts of, Vulnerability and Adaptation to, Climate Change.
    Vafeidis, A., Nicholls, R., McFadden, L., 2004. Developing a database for global vulnerability analysis of coastal zones: the DINAS-COAST project and the DIVA tool.

  7. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests.

    PubMed

    Yang, Changwei; Zhang, Jianjing; Liu, Feicheng; Bi, Junwei; Jun, Zhang

    2015-08-06

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) terrain with multiple free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering these disaster phenomena, the reasons are analyzed based on shaking table tests of one-sided, two-sided and four-sided slopes. The analysis results show that: (1) the amplification of peak accelerations on four-sided slopes is stronger than on two-sided slopes, while that on one-sided slopes is the weakest, which indirectly explains why such terrain was damaged most seriously; (2) the amplification of peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, which explains the observation that landslide hazards mainly occurred on slopes steeper than 45°. Amplification along the slope strike direction is essentially uniform and varies smoothly.

  8. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    NASA Astrophysics Data System (ADS)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

    Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and is the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
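
    Two of the parameters such tools report, PGA and Arias intensity, are straightforward to compute directly from an accelerogram. A sketch on a synthetic record follows (the waveform and constants are illustrative, and this is not SIGMA code); Arias intensity is taken as Ia = pi/(2g) times the integral of the squared acceleration.

```python
import numpy as np

g = 9.81
dt = 0.01                                    # sample interval, s
t = np.arange(0, 20, dt)
# Synthetic decaying-sinusoid accelerogram standing in for real data, m/s^2.
acc = 0.3 * g * np.exp(-0.3 * t) * np.sin(2 * np.pi * 2.0 * t)

pga = np.max(np.abs(acc))                    # peak ground acceleration
arias = np.pi / (2 * g) * np.sum(acc ** 2) * dt  # rectangle-rule integral

print(f"PGA: {pga / g:.3f} g")
print(f"Arias intensity: {arias:.3f} m/s")
```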

  9. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of >0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
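
    The Monte Carlo logic of such an assessment can be sketched in a few lines: sample a synthetic catalogue from a truncated Gutenberg-Richter distribution, map magnitudes to coastal wave heights with lognormal aleatory scatter, and count exceedances. Every relation and parameter below is invented for illustration; a real PTHA uses calibrated sources and tsunami propagation modelling, not a one-line height scaling.

```python
import numpy as np

rng = np.random.default_rng(42)
years = 100_000
rate = 0.1                         # tsunamigenic earthquakes per year
b, m_min, m_max = 1.0, 7.0, 9.0    # truncated Gutenberg-Richter parameters

n = rng.poisson(rate * years)
u = rng.random(n)
beta = b * np.log(10)
# Inverse-CDF sampling of the truncated exponential magnitude distribution.
mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

median_h = 10 ** (0.7 * (mags - 8.0))            # assumed scaling, metres
heights = median_h * rng.lognormal(0.0, 0.5, n)  # aleatory variability

for thresh in (0.5, 3.0):
    p_annual = np.count_nonzero(heights > thresh) / years
    print(f"annual P(height > {thresh} m) ~ {p_annual:.2%}")
```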

  10. Development of an expert analysis tool based on an interactive subsidence hazard map for urban land use in the city of Celaya, Mexico

    NASA Astrophysics Data System (ADS)

    Alloy, A.; Gonzalez Dominguez, F.; Nila Fonseca, A. L.; Ruangsirikulchai, A.; Gentle, J. N., Jr.; Cabral, E.; Pierce, S. A.

    2016-12-01

    Land subsidence as a result of groundwater extraction began in central Mexico's larger urban centers in the 1980s, driven by population and economic growth. The city of Celaya has undergone subsidence for a few decades, and a consequence is the development of an active normal fault system that affects its urban infrastructure and residential areas. To facilitate its analysis and a land-use decision-making process, we created an online interactive map enabling users to easily obtain information associated with land subsidence. Geological and socioeconomic data on the city were collected, including fault locations and population data; other important infrastructure and structural data were obtained from fieldwork as part of a study abroad interchange undergraduate course. The subsidence and associated faulting hazard map was created using an InSAR-derived subsidence velocity map and population data from INEGI, identifying hazard zones through a spatial analysis based on a subsidence gradient and population risk matrix. This interactive map provides a simple perspective on different vulnerable urban elements. As an accessible visualization tool, it will enhance communication between scientific and socio-economic disciplines. Our project also lays the groundwork for a future expert analysis system with an open-source and easily accessible Python-coded, SQLite-database-driven website which archives fault and subsidence data along with visual documentation of damage to civil structures. This database takes field notes and provides an entry form for uniform datasets, which are used to generate a JSON. Such a database is useful because it allows geoscientists to have a centralized repository and access to their observations over time. Because of the widespread presence of the subsidence phenomenon throughout cities in central Mexico, the spatial analysis has been automated using the open-source software R. The raster, rgeos, shapefiles, and rgdal libraries have been used to develop the script, which produces raster maps of horizontal gradient and population density. An advantage is that this analysis can be automated for periodic updates or repurposed for similar analyses in other cities, providing an easily accessible tool for land subsidence hazard assessments.
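
    The gradient-and-matrix idea can be sketched as follows: take the horizontal gradient of a subsidence-velocity raster and cross it with population density through a simple hazard/exposure matrix. The grids, thresholds and class breaks below are synthetic stand-ins for the InSAR and INEGI data (and the sketch is in Python rather than the project's R script).

```python
import numpy as np

cell = 30.0                                                     # cell size, m
subsidence = np.random.default_rng(0).random((50, 50)) * 60    # mm/yr
population = np.random.default_rng(1).random((50, 50)) * 5000  # persons/km^2

gy, gx = np.gradient(subsidence, cell)   # mm/yr per metre, per axis
grad = np.hypot(gx, gy)                  # horizontal gradient magnitude

def classify(a, thresholds):
    """Bin a raster into classes 1..len(thresholds)+1."""
    return np.digitize(a, thresholds) + 1

hazard = classify(grad, [0.2, 0.6])          # illustrative breaks
exposure = classify(population, [1000, 3000])
risk = hazard * exposure                     # simple 3x3 risk matrix, 1..9
print("cells at highest risk (score >= 6):", np.count_nonzero(risk >= 6))
```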

  11. Using human factors engineering to improve patient safety in the cardiovascular operating room.

    PubMed

    Gurses, Ayse P; Martinez, Elizabeth A; Bauer, Laura; Kim, George; Lubomski, Lisa H; Marsteller, Jill A; Pennathur, Priyadarshini R; Goeschel, Chris; Pronovost, Peter J; Thompson, David

    2012-01-01

    Despite significant medical advances, cardiac surgery remains a high-risk procedure. Sub-optimal work system design characteristics can contribute to the risks associated with cardiac surgery. However, hazards due to work system characteristics have not been identified in the cardiovascular operating room (CVOR) in sufficient detail to guide improvement efforts. The purpose of this study was to identify and categorize hazards (anything that has the potential to cause a preventable adverse patient safety event) in the CVOR. An interdisciplinary research team used prospective hazard identification methods, including direct observations, contextual inquiry, and photographing, to collect data in 5 hospitals for a total of 22 cardiac surgeries. We performed thematic analysis of the qualitative data guided by a work system model. Sixty categories of hazards were found, such as practice variations, high workload, non-compliance with evidence-based guidelines, and not including clinicians in medical device purchasing decisions. Results indicated that hazards are common in cardiac surgery and should be eliminated or mitigated to improve patient safety. To improve patient safety in the CVOR, efforts should focus on creating a culture of safety, increasing compliance with evidence-based infection control practices, improving communication and teamwork, and designing better tools and technologies through partnership among all stakeholders.

  12. Development of structural vulnerability curve associated with high magnitude torrent occurrences in Switzerland

    NASA Astrophysics Data System (ADS)

    Wing-Yuen Chow, Candace; Bründl, Michael; Keiler, Margreth

    2017-04-01

    In mountain regions, economic losses due to severe hazard processes have increased significantly in the past decades, in spite of notable investments in hazard management. Assessing the vulnerability of built structures to high-magnitude torrent events is a part of consequence analysis, in which hazard intensity is related to the degree of loss sustained. While vulnerability curves have been developed for different countries, the presented work contributes new data from Swiss case studies that address a known gap associated with the consequences of high-magnitude events. Data for this stage of the investigation communicate the degree of loss associated with affected structures and have been provided by local authorities dealing with natural hazards (e.g. the Amt für Wald des Kantons Bern (KAWA) and cantonal insurance providers). Information used for the empirical quantification of vulnerability to torrent processes is derived from detailed post-event documentation and the loss database, and is verified with field visits. Building the initial database supports data sharing and the systematic inclusion of additional case studies as they become available. The collection of this new data is fundamental to the development of a local vulnerability curve based on observed sediment deposition heights, a proxy for hazard intensity. The result will then be compared to curves derived from Austrian and Italian datasets.
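
    Fitting such a vulnerability curve amounts to regressing degree of loss on deposition height. A sketch with a Weibull-type curve and invented observation pairs follows (the functional form is one common choice, not necessarily the one adopted for the Swiss data).

```python
import numpy as np
from scipy.optimize import curve_fit

def vulnerability(h, a, b):
    """Monotone curve from 0 (no damage) towards 1 (total loss)."""
    return 1.0 - np.exp(-(h / a) ** b)

# (deposition height in m, observed degree of loss); hypothetical records.
h_obs = np.array([0.2, 0.5, 0.8, 1.2, 1.8, 2.5, 3.0])
dol_obs = np.array([0.02, 0.08, 0.18, 0.35, 0.55, 0.80, 0.90])

(a, b), _ = curve_fit(vulnerability, h_obs, dol_obs, p0=[2.0, 1.5])
print(f"fitted scale a = {a:.2f} m, shape b = {b:.2f}")
print(f"predicted loss at 1 m deposition: {vulnerability(1.0, a, b):.2f}")
```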

  13. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, damage to vehicles and potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time, or, if applied at small (for example national) scale, they do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of analysis permits accounting for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making for the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
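
    Expressing every hazard as an annual probability of occurrence makes combination straightforward: assuming independence between processes, the probability that a section experiences at least one event is one minus the product of the complements. The section names and probabilities below are hypothetical.

```python
sections = {
    "km 12.4": {"rockfall": 0.020, "debris_flow": 0.005, "wall_failure": 0.001},
    "km 17.9": {"rockfall": 0.002, "debris_flow": 0.030, "wall_failure": 0.004},
}

for name, hazards in sections.items():
    p_none = 1.0
    for p in hazards.values():
        p_none *= (1.0 - p)
    p_any = 1.0 - p_none  # annual P(at least one hazard occurs on section)
    print(f"{name}: P(any landslide-related event) = {p_any:.4f}")
```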

  14. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
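
    The structure of such a monetary risk model can be sketched as expected annual loss: the sum over hazard scenarios of occurrence probability times exposed value times vulnerability (degree of loss). All figures below are hypothetical.

```python
scenarios = [
    # (annual probability, {element: (exposed value in EUR, degree of loss)})
    (0.10, {"road": (2e6, 0.01), "houses": (30e6, 0.002)}),
    (0.01, {"road": (2e6, 0.15), "houses": (30e6, 0.05)}),
]

# Expected annual loss = sum_s P(s) * sum_e value_e * vulnerability_e(s)
eal = sum(p * sum(v * dol for v, dol in elems.values())
          for p, elems in scenarios)
print(f"expected annual loss: EUR {eal:,.0f}")
```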

  15. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  16. Meta-analysis for aggregated survival data with competing risks: a parametric approach using cumulative incidence functions.

    PubMed

    Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido

    2016-09-01

    Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure the treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if the all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analyses, better reporting of competing risks data is needed.
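
    Under the constant-CSH assumption the CIF has a closed form, CIF_k(t) = (h_k / H) * (1 - exp(-H t)) with H the all-cause hazard, which also makes the follow-up dependence of CIF ratios easy to see. A sketch with illustrative hazard values (not the paper's example data):

```python
import numpy as np

def cif(h_k, h_all, t):
    """Cumulative incidence of cause k under constant cause-specific hazards."""
    return (h_k / h_all) * (1.0 - np.exp(-h_all * t))

# Two treatment arms, two competing event types (hazards per person-year).
h_treat = {"relapse": 0.05, "death": 0.02}
h_ctrl = {"relapse": 0.09, "death": 0.02}
H_t, H_c = sum(h_treat.values()), sum(h_ctrl.values())

# The cause-specific HR for relapse is constant (0.05/0.09), yet the CIF
# ratio drifts with follow-up time, illustrating why follow-up matters.
for t in (1.0, 5.0, 20.0):
    r = cif(h_treat["relapse"], H_t, t) / cif(h_ctrl["relapse"], H_c, t)
    print(f"t = {t:>4} y: CIF ratio (relapse) = {r:.3f}")
```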

  17. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District

    DOT National Transportation Integrated Search

    1985-10-01

    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  18. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  19. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  20. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  1. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-07-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥10⁶ m³) Glacial Lake Outburst Floods (GLOFs), damaging inhabited areas. GLOF hazard studies in Patagonia have been mainly based on the analysis of short-term series (≤50 years) of flood data, and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytic Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin, of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (>8°) to steep (>15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream of inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
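
    The Analytic Hierarchy Process step can be sketched as principal-eigenvector weighting of a pairwise comparison matrix, with a consistency check on the judgements. The three factors and the comparison values below are hypothetical, not the paper's calibrated scheme.

```python
import numpy as np

# Saaty-style reciprocal matrix comparing, e.g., glacier contact,
# dam outlet slope, and lake area as outburst-susceptibility factors.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority weights for the factors

# Consistency index and ratio (random index RI = 0.58 for n = 3).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```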

  3. Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes

    NASA Astrophysics Data System (ADS)

    Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.

    2012-07-01

    Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
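
    As a much simpler stand-in for the numerical slope-stability modelling described above, the classical infinite-slope equation shows how geometry, strength and pore pressure combine into a factor of safety. Parameter values are illustrative, and this is not the paper's model.

```python
import math

def factor_of_safety(slope_deg, z, c, phi_deg, gamma=19.0, gamma_w=9.81, m=0.5):
    """
    Infinite-slope FS. slope_deg: slope angle; z: failure plane depth (m);
    c: effective cohesion (kPa); phi_deg: friction angle; gamma: soil unit
    weight (kN/m^3); m: fraction of z below the water table.
    """
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(b) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(b) * math.cos(b)
    return resisting / driving

for slope in (20, 30, 40):
    print(f"{slope} deg slope: FS = {factor_of_safety(slope, 2.0, 5.0, 32.0):.2f}")
```

    A regression of FS values like these against remotely sensed slope variables is the kind of functional model the abstract describes, with FS < 1 indicating instability.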

  4. Explosion impacts during transport of hazardous cargo: GIS-based characterization of overpressure impacts and delineation of flammable zones for ammonia.

    PubMed

    Inanloo, Bahareh; Tansel, Berrin

    2015-06-01

    The aim of this research was to investigate accidental releases of ammonia following an en-route incident, in an attempt to better predict the consequences of hazardous cargo accidents. The air dispersion model Areal Locations of Hazardous Atmospheres (ALOHA) was employed to track the probable outcomes of a hazardous material release from a tanker truck under different explosion scenarios. The significance of identifying the flammable zones was taken into consideration, in case the flammable vapor causes an explosion. The impacted areas and the severity of the probable destruction were evaluated for an explosion by considering the overpressure waves. ALOHA in conjunction with ArcGIS was used to delineate the flammable and overpressure impact zones for different scenarios. Based on the results, flammable fumes formed oval shapes with their major axis along the wind direction at the time of release. The downwind extent of the impact areas exceeding the overpressure value that can lead to property damage was estimated at around 1708 and 1206 feet for a 2-ton release, and 3742 and 3527 feet for a 20-ton release, under very stable and unstable atmospheric conditions, respectively. A sensitivity analysis was performed to assess the significance of wind speed on the impact zones. The insight provided by this study can be utilized by decision makers in the transportation of hazardous materials as a guide for possible rerouting, rescheduling, or limiting the quantity of hazardous cargo to reduce the possible impacts of hazardous cargo accidents during transport.

  5. Regional earthquake loss estimation in the Autonomous Province of Bolzano - South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Winter, Benjamin

    2013-04-01

    Besides storm events, geophysical events cause a majority of natural hazard losses on a global scale. However, in alpine regions with a moderate earthquake risk potential, like the study area, and a correspondingly weak presence in the collective memory, this source of risk is often neglected in contrast to gravitational and hydrological hazard processes. In this context, the comparative analysis of potential disasters and emergencies on a national level in Switzerland (Katarisk study) has shown that earthquakes are the most serious source of risk in general. In order to estimate the potential losses from earthquake events for different return periods and the loss dimensions of extreme events, the following study was conducted in the Autonomous Province of Bolzano - South Tyrol (Italy). The applied methodology follows the generally accepted risk concept based on the risk components hazard, elements at risk and vulnerability, whereby risk is not defined holistically (direct, indirect, tangible and intangible) but with the risk category of losses on buildings and inventory as a general risk proxy. The hazard analysis is based on a regional macroseismic scenario approach. The settlement centre of each of the 116 communities is defined as a potential epicentre. For each epicentre, four epicentral scenarios (return periods of 98, 475, 975 and 2475 years) are calculated based on the simple but approved and generally accepted attenuation law according to Sponheuer (1960). The relevant input parameters for calculating the epicentral scenarios are (i) the macroseismic intensity and (ii) the focal depth. The considered macroseismic intensities are based on a probabilistic seismic hazard analysis (PSHA) of the Italian earthquake catalogue on a community level (Dipartimento della Protezione Civile). The relevant focal depths are taken as a mean within a defined buffer from the focal depths of the harmonized earthquake catalogues of Italy and Switzerland as well as earthquake data of the US Geological Survey (USGS). The asset database used to identify the elements at risk is developed from an address dataset, the land-use plan, official building footprints, building heights based on a normalized digital surface model, official construction costs for different building types (gross building cubatures), official statistical data on households at the community level, and mean inventory values based on insurance data. To analyse the structural vulnerability and consequently the potential structural losses, community-specific mean damage ratios based on the EMS-98 approach and the historic development of the building stock within the individual communities are estimated. Inventory losses are assumed to be 30 percent of the structural losses. Thus, for each epicentre a loss-frequency relationship can be calculated and the most severe epicentral scenarios identified.
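
    Attenuation relations of the Sponheuer (1960) type are commonly written as an intensity decay with hypocentral distance, I = I0 - 3*log10(r/h) - 3*alpha*(r - h)*log10(e) with r = sqrt(d^2 + h^2), focal depth h and an absorption coefficient alpha. The sketch below uses this commonly cited form with illustrative parameter values; the study's calibrated inputs are not reproduced here.

```python
import math

def intensity(i0, epi_dist_km, depth_km, alpha=0.001):
    """Macroseismic intensity at epicentral distance d for a given I0."""
    r = math.hypot(epi_dist_km, depth_km)     # hypocentral distance, km
    return (i0 - 3.0 * math.log10(r / depth_km)
            - 3.0 * alpha * (r - depth_km) * math.log10(math.e))

for d in (0, 10, 30, 60):
    print(f"{d:>3} km: I = {intensity(8.0, d, 10.0):.1f}")
```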

  6. Tsunami hazard and risk assessment in El Salvador

    NASA Astrophysics Data System (ADS)

    González, M.; González-Riancho, P.; Gutiérrez, O. Q.; García-Aguilar, O.; Aniel-Quiroga, I.; Aguirre, I.; Alvarez, J. A.; Gavidia, F.; Jaimes, I.; Larreynaga, J. A.

    2012-04-01

    Tsunamis are relatively infrequent phenomena that nevertheless represent a greater threat than earthquakes, hurricanes and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Several works have attempted to study these phenomena in order to understand their origin, causes, evolution, consequences, and the magnitude of their damage, and finally to propose mechanisms to protect coastal societies. Advances in the understanding and prediction of tsunami impacts allow the development of adaptation and mitigation strategies to reduce risk in coastal areas. This work, Tsunami Hazard and Risk Assessment in El Salvador, funded by AECID during the period 2009-12, examines the state of the art and presents a comprehensive methodology for assessing tsunami risk at any coastal area worldwide, applying it to the coast of El Salvador. The conceptual framework is based on the definition of risk as the probability of harmful consequences or expected losses resulting from a given hazard to a given element at danger or peril, over a specified time period (European Commission, Schneiderbauer et al., 2004). The HAZARD assessment (Phase I of the project) is based on propagation models for earthquake-generated tsunamis, developed through the characterization of tsunamigenic sources (seismotectonic faults) and other dynamics under study (tsunami waves, sea level, etc.). The study area lies in a zone of high seismic activity and was hit by 11 tsunamis between 1859 and 1997, nine of them recorded in the twentieth century and all generated by earthquakes. Simulations of historical and potential tsunamis with greater or lesser effects on the country's coast have been performed, including distant, intermediate and near-field sources. Deterministic analyses of the threats under study (coastal flooding) have been carried out, resulting in different hazard maps (maximum wave height elevation, maximum water depth, minimum tsunami arrival time, maximum flooding level or run-up, and hazard degree for people based on the incipient velocity for human instability) along the coast of El Salvador and at some relevant locations (high-resolution analysis). The VULNERABILITY assessment of the exposed elements (Phase II of the project) is based on an integrated approach, which is essential given the complexity of coastal areas. A set of indices and indicators has been developed, supported by a Geographic Information System that allows graphical representation of the physical, environmental, social, economic and infrastructure characteristics of the coast. Different spatial and temporal scales have also been considered in this project when calculating the risk, since both factors change the amount and type of exposed elements and their vulnerability. A final global RISK analysis (hazard, exposure and vulnerability analysis for each dimension: human, environmental, socioeconomic and infrastructure; and for both temporal and spatial scales) allows identifying weaknesses, gaps and special needs for coping with a tsunami event and, therefore, results in a set of risk reduction measures, including adaptation and mitigation measures.

  7. Environmental Assessment (EA) for Construct Base Civil Engineering Complex at McConnell AFB

    DTIC Science & Technology

    2003-07-14

    Only report-documentation fragments are available: performing organization 22d Civil Engineer Squadron (22 CES/CEVA), 53000 Hutchinson Street, Suite 109, McConnell AFB, KS 67221-3617. The abbreviation list includes Air Refueling Group, Air Refueling Squadron, Air Refueling Wing, Bird Aircraft Strike Hazard, Base Civil Engineer, and Best Management Practices. The text fragments reference an "Economic Analysis" performed by Wilson and Company, dated 14 October 1993, and a repeat "Economic Analysis" dated 10 February 2000.

  8. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of the different hazards and risks was taken into account in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information common to the assessment of all the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for, and the risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground-motion model, and a mathematical model to calculate the seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based lava flow hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100-year extreme monthly rainfall) and earthquake (475-year return period) was used; topography, lithology, soil moisture and LU/LC were also accounted for. Soil erosion risk was assessed with the empirical model RUSLE (Renard et al. 1991b), with rainfall erosivity, topography and vegetation cover as the main parameters used to predict proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated using the Okada code (Okada, 1985). Tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification), and the wave height at the forecast points was estimated with the Green's Law function (JRC Tsunami Analysis Tool). Historical storm-track data indicate return periods of 17 and 41 years for H1 and H2 hurricane categories, respectively. The NOAA WAVEWATCH III hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms; the associated storm-surge risk assessment also accounted for the coastline morphology. Seven empirical (independent) indicators were used to express the erosion susceptibility of the coasts, each evaluated according to a semi-quantitative score representing a low, medium or high level of erosion risk or impact. The coastal erosion hazard was derived by aggregating the indicators on a grid scale.
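
    Of the components above, Green's Law is simple enough to illustrate directly: under slowly varying bathymetry, a wave of height H1 at depth d1 shoals to H2 = H1·(d1/d2)^(1/4) at depth d2. The sketch below is a minimal stand-in for that step, not the JRC Tsunami Analysis Tool itself; the depths and wave height are invented.

        def greens_law_height(h_offshore, d_offshore, d_forecast):
            """Shoal a tsunami wave height from an offshore depth to a shallower
            forecast-point depth via Green's Law: H2 = H1 * (d1 / d2) ** 0.25."""
            return h_offshore * (d_offshore / d_forecast) ** 0.25

        # Illustrative: a 0.5 m wave at 100 m depth reaching a 5 m forecast point
        print(f"{greens_law_height(0.5, 100.0, 5.0):.2f} m")  # -> 1.06 m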

  9. 77 FR 19306 - Announcement of Funding Awards: Office of Healthy Homes and Lead Hazard Control Grant Programs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... Federal Register notice published on May 18, 2011 for Lead Based Paint Hazard Control and Lead Hazard... Based Paint Hazard Control Grant Program A total of $43,206,000 was awarded to 22 grantees for the Lead Based Paint Hazard Control Grant Program and an additional $1,999,971 was awarded to 20 out of the 29...

  10. 77 FR 19305 - Announcement of Funding Awards: Office of Healthy Homes and Lead Hazard Control Grant Programs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... Federal Register notice published on September 9, 2010 for Lead Based Paint Hazard Control and Lead Hazard... follows: 1. Lead Based Paint Hazard Control Grant Program A total of $69,700,000 was awarded to 29 grantees for the Lead Based Paint Hazard Control Grant Program and an additional $2,388,637 was awarded to...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sesigur, Haluk; Cili, Feridun

    Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Gökçen Airport in Istanbul, Turkey. A seismic isolation system with the isolation layer arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has vertical steel trusses on the facades and reinforced-concrete H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a base shear approximately one quarter of that of the non-isolated structure.

  12. Modelling the changing cumulative vulnerability to climate-related hazards for river basin management using a GIS-based multicriteria decision approach

    NASA Astrophysics Data System (ADS)

    Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan

    2017-04-01

    1. Background The Asia-Pacific region is one of the areas of the world most vulnerable to climate-related hazards and extremes, due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked, which is especially critical from the perspective of integrated river basin management. A range of studies has targeted existing vulnerability assessments; however, limited attention has been paid to the cumulative effects of multiple vulnerability factors and their dynamics as faced by local communities. This study proposes a novel methodology to assess the changing cumulative vulnerability to climate-related hazards, and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability change, with a focus on a river basin management unit. 2. Methods and data The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built with the goal of characterizing and comparing vulnerability to climate-related hazards within river basin regions based on a composition of multiple indicators. We organize these indicator metrics into three categories: (1) hazard exposure, (2) socioeconomic sensitivity, and (3) adaptive capacity. Second, the CVAF is applied by combining a geographical information system (GIS)-based spatial statistics technique with a multicriteria decision analysis (MCDA) to assess and map the changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors driving vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how urban development leads to changes in vulnerability: transport networks, land uses, production values of industries, and infrastructures. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as from the National Land Use Investigation and official census statistics. 3. Results and policy implications The results of the CVAF analysis demonstrate heterogeneous patterns of vulnerability in the region and highlight trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in the densely populated downstream reaches of the Danshui River (such as Taipei City) in both time periods. The VCM indicates that upstream and more remote areas generally show low vulnerability, although increases are observed in some areas between 1996 and 2006 due to land-use intensification and industrial and infrastructure expansion. These findings suggest that land-use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl, and should address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and to contribute towards improving river basin management.
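
    The indicator-composition step lends itself to a short sketch: min-max normalise each indicator category and combine them so that exposure and sensitivity raise the composite score while adaptive capacity lowers it. The equal weights and district values below are hypothetical placeholders, not the study's calibrated MCDA scheme.

        import numpy as np

        def minmax(x):
            """Min-max normalise an indicator array to [0, 1]."""
            x = np.asarray(x, dtype=float)
            return (x - x.min()) / (x.max() - x.min())

        def cumulative_vulnerability(exposure, sensitivity, adaptive_capacity,
                                     weights=(1/3, 1/3, 1/3)):
            """Composite vulnerability per spatial unit: exposure and sensitivity
            increase it, adaptive capacity reduces it."""
            w_e, w_s, w_a = weights
            return (w_e * minmax(exposure) + w_s * minmax(sensitivity)
                    + w_a * (1.0 - minmax(adaptive_capacity)))

        # Hypothetical indicators for five districts
        print(cumulative_vulnerability([10, 40, 5, 60, 25],        # hazard exposure
                                       [0.2, 0.5, 0.1, 0.9, 0.4],  # sensitivity
                                       [0.8, 0.3, 0.9, 0.2, 0.5])) # adaptive capacity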

  13. The development of mountain risk governance: challenges for application

    NASA Astrophysics Data System (ADS)

    Link, S.; Stötter, J.

    2015-01-01

    The complexity of the management of mountain risks in the Alps has increased considerably since its institutionalisation in the late nineteenth century. In the history of approaches to dealing with mountain risks, four successive paradigms can be distinguished on the basis of key indicators such as guiding principles, characteristic elements and typical instruments: "hazard protection", "hazard management", "risk management", and "risk governance". In this contribution, special attention is paid to the development of hazard zone planning and the growing importance of communication and participation over the course of this transformation. At present, the risk management paradigm has reached maturity, but risk governance frameworks are not yet applied to risks from natural hazards in the Alps. Based on a historical analysis, the suitability and applicability of general risk governance frameworks in the context of mountain risks are discussed, and necessary adaptations (e.g., in administrative, educational, and legal structures) are proposed for the upcoming transformation towards mountain risk governance.

  14. Bioaerosols, Noise, and Ultraviolet Radiation Exposures for Municipal Solid Waste Handlers

    PubMed Central

    Ncube, Esper Jacobeth; Voyi, Kuku

    2017-01-01

    Few studies have investigated the occupational hazards of municipal solid waste workers, particularly in developing countries. As a result, these workers are currently exposed to unknown and unabated occupational hazards that may endanger their health. We determined municipal solid waste workers' work-related hazards and the associated adverse health endpoints. A multifaceted approach was utilised, comprising bioaerosol sampling, occupational noise and thermal condition measurements, and field-based waste compositional analysis. Results from our study showed the highest exposure concentrations for Gram-negative bacteria (6.8 × 10³ cfu/m³) and fungi (12.8 × 10³ cfu/m³) in the truck cabins. Significant proportions of toxic, infectious, and surgical waste were observed. In conclusion, municipal solid waste workers are exposed to diverse work-related risks requiring urgent, sound interventions. A framework for assessing the occupational risks of these workers must prioritize exposure assessment with regard to the physical, biological, and chemical hazards of the job. PMID:28167969

  15. Bioaerosols, Noise, and Ultraviolet Radiation Exposures for Municipal Solid Waste Handlers.

    PubMed

    Ncube, France; Ncube, Esper Jacobeth; Voyi, Kuku

    2017-01-01

    Few studies have investigated the occupational hazards of municipal solid waste workers, particularly in developing countries. As a result, these workers are currently exposed to unknown and unabated occupational hazards that may endanger their health. We determined municipal solid waste workers' work-related hazards and the associated adverse health endpoints. A multifaceted approach was utilised, comprising bioaerosol sampling, occupational noise and thermal condition measurements, and field-based waste compositional analysis. Results from our study showed the highest exposure concentrations for Gram-negative bacteria (6.8 × 10³ cfu/m³) and fungi (12.8 × 10³ cfu/m³) in the truck cabins. Significant proportions of toxic, infectious, and surgical waste were observed. In conclusion, municipal solid waste workers are exposed to diverse work-related risks requiring urgent, sound interventions. A framework for assessing the occupational risks of these workers must prioritize exposure assessment with regard to the physical, biological, and chemical hazards of the job.

  16. Exploiting Synoptic-Scale Climate Processes to Develop Nonstationary, Probabilistic Flood Hazard Projections

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Brown, C.; Doss-Gollin, J.

    2016-12-01

    Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation and are instead more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high-flow episodes have been found to correlate with tropical moisture exports associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated from the historical record. We then use a Bayesian framework to combine a trend analysis of the dipole index in the historical record with a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions. The resulting nonstationary posterior distribution of the dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns related to flooding in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
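
    A minimal sketch of the covariate-conditioned peaks-over-threshold idea: exceedances of a flow threshold arrive as a Poisson process whose rate is log-linear in the dipole index, and exceedance magnitudes follow a generalized Pareto distribution. All parameter values below are illustrative placeholders, not fitted Iowa River values.

        import numpy as np
        from scipy.stats import genpareto

        threshold = 1000.0                # flow threshold in m^3/s (assumed)
        beta0, beta1 = np.log(0.8), 0.5   # log-linear Poisson rate coefficients
        shape, scale = 0.1, 300.0         # GPD parameters for exceedance sizes

        def annual_exceedance_prob(q, dipole_index):
            """P(at least one annual peak exceeds q | dipole index), i.e.
            1 - exp(-lambda(dipole) * P(GPD exceedance > q - threshold))."""
            lam = np.exp(beta0 + beta1 * dipole_index)
            p_mag = genpareto.sf(q - threshold, c=shape, scale=scale)
            return 1.0 - np.exp(-lam * p_mag)

        # The flood hazard shifts with the covariate: neutral vs. strong dipole
        print(annual_exceedance_prob(2000.0, 0.0))   # ~0.044
        print(annual_exceedance_prob(2000.0, 1.5))   # ~0.091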

  17. Identification and apportionment of hazardous elements in the sediments in the Yangtze River estuary.

    PubMed

    Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao

    2015-12-01

    In this study, positive matrix factorization (PMF) and principal component analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments of the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to them, eight sources were identified with PMF: mixed agricultural/industrial sewage (18.6 %), mining wastewater (15.9 %), agricultural fertilizer (14.5 %), atmospheric deposition (12.8 %), agricultural nonpoint (10.6 %), industrial wastewater (9.8 %), marine activity (9.0 %), and the nickel plating industry (8.8 %). Overall, the hazardous element content appears to be connected more to anthropogenic activity than to natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources; PMF resolved more factors with a higher explained variance than PCA and provided both internal and quantitative analyses. The combination of the two methods can provide more reasonable and reliable results.
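
    The PCA half of such a workflow reduces to a few lines: standardise the sample-by-element concentration matrix, extract components, and read elements that load strongly on the same component as a candidate common source. The synthetic matrix below merely stands in for the measured YRE data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Rows = sediment samples, columns = element concentrations
        # (synthetic stand-in for the measured data).
        rng = np.random.default_rng(0)
        X = rng.lognormal(mean=0.0, sigma=0.5, size=(50, 12))
        elements = ["Al", "Fe", "Mn", "Cr", "Ni", "As",
                    "Cu", "Zn", "Pb", "Sb", "Cd", "Hg"]

        pca = PCA(n_components=3)
        pca.fit(StandardScaler().fit_transform(X))

        # Elements loading strongly on one component suggest a common source
        for i, comp in enumerate(pca.components_, start=1):
            top = sorted(zip(elements, comp), key=lambda t: abs(t[1]),
                         reverse=True)[:4]
            print(f"PC{i} ({pca.explained_variance_ratio_[i - 1]:.0%}):", top)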

  18. Determination of the rockfall source in an urban settlement area by using a rule-based fuzzy evaluation

    NASA Astrophysics Data System (ADS)

    Aksoy, H.; Ercanoglu, M.

    2006-10-01

    The evaluation of the rockfall initiation mechanism and the simulation of runout behavior are important issues for prevention and remedial measures against potential rockfall hazards in highway protection, forest preservation, and especially urban settlement areas. In most studies in the literature, the extent of the rockfall hazard has been determined by various techniques based on the selection of a rockfall source, generally defined as zones of rock bodies with slope angles higher than a certain value proposed by general practice. The present study carried out a rule-based fuzzy analysis of the discontinuity data of andesites in the city of Ankara, Turkey, in order to provide a different and more systematic approach to determining the source areas for rockfall hazard in an urban settlement, based on discontinuity and natural slope features. First, to obtain rock source areas (RSAs), data from the field studies were combined in a rule-based fuzzy evaluation incorporating the altitude difference, the number of discontinuities, the number of wedges and the number of potential slides as the parameters of the fuzzy sets. After processing the outputs of the rule-based fuzzy system and producing the linguistic definitions, potential RSAs could be obtained. According to the RSA maps, 1.7% of the study area was found to have "high RSA", and 5.8% of the study area was assigned "medium RSA". A potential rockfall hazard map was then prepared. At the final stage, based upon the high and medium RSAs, 3.6% of the study area showed "high rockfall potential", while the areal distribution of "medium rockfall potential" was found to be 7.9%. Both the RSA map and the potential rockfall hazard map were in accordance with observations performed in the field.
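
    The rule-based fuzzy step can be illustrated with a toy Mamdani-style evaluation: triangular membership functions convert the raw inputs into degrees of truth, rule strength is the minimum over a rule's antecedents, and rules are aggregated with a maximum. The breakpoints, rules and weights below are invented for illustration and are not the paper's calibrated system.

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def rsa_score(altitude_diff_m, n_discontinuities, n_wedges):
            """Degree to which a slope unit qualifies as a rock source area."""
            high_alt = tri(altitude_diff_m, 20, 60, 100)
            many_disc = tri(n_discontinuities, 5, 15, 25)
            many_wedge = tri(n_wedges, 2, 6, 10)
            rule_high = min(high_alt, many_disc, many_wedge)      # all factors high
            rule_med = min(high_alt, max(many_disc, many_wedge))  # height + one factor
            return max(rule_high, 0.6 * rule_med)

        print(rsa_score(70, 18, 7))  # -> 0.7, "high RSA" under these toy rules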

  19. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands, where several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for such studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with large amounts of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic hazard analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000-scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, and (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database, the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information for long-term volcanic hazard analysis. HADA will provide quality information with which to map volcanic hazards and to run more reliable volcanic hazard models; in addition, it aims to become a sharing system that improves communication between researchers, reduces redundant work, and serves as the reference for geological research in the Canary Islands.

  20. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (AD-CIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.

  1. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can potentially be created by applying big data tools to address pharmacometric problems. The performance of machine learning (ML) methods and of the Cox regression model was evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featuring a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by the concordance index, and in identifying the preset influential variables for high-dimensional data. The prediction performance of ML-based methods is also less sensitive to data size and censoring rates than that of the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
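
    The concordance index used as the headline metric is straightforward to compute directly. The sketch below implements Harrell's C-index over all comparable pairs (pairs whose earlier time is an observed event), counting tied risk scores as half-concordant; the data values are invented.

        from itertools import combinations

        def concordance_index(times, events, risk_scores):
            """Fraction of comparable pairs in which the subject with the
            higher risk score fails first (ties in score count as 0.5)."""
            concordant, comparable = 0.0, 0
            for i, j in combinations(range(len(times)), 2):
                if times[j] < times[i]:
                    i, j = j, i  # make i the earlier time
                if not events[i] or times[i] == times[j]:
                    continue     # earlier time censored, or tied times: skip
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
            return concordant / comparable

        # times, event indicator (1 = event observed), predicted risk scores
        print(concordance_index([5, 8, 12, 3], [1, 0, 1, 1], [0.9, 0.4, 0.2, 0.95]))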

  2. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared with those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2 and 10 % POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. Among the ten selected provinces, Kanchanaburi and Tak had comparatively high seismic hazard, and effective mitigation plans should therefore be made for these areas. Although Bangkok was assessed as a low-seismic-hazard area in this PSHA, further study of seismic wave amplification by the soft soil beneath the city is required.
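
    The two POE levels quoted above correspond to the standard PSHA return periods under a Poisson occurrence model, via T = -t / ln(1 - POE); a quick check in Python:

        import math

        def poisson_return_period(poe, horizon_years):
            """Return period implied by a probability of exceedance over a
            design horizon, assuming Poisson occurrence: T = -t / ln(1 - POE)."""
            return -horizon_years / math.log(1.0 - poe)

        print(round(poisson_return_period(0.10, 50)))  # 10% in 50 y -> 475 years
        print(round(poisson_return_period(0.02, 50)))  # 2% in 50 y -> 2475 years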

  3. Sensemaking of patient safety risks and hazards.

    PubMed

    Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S

    2006-08-01

    In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate the risks and hazards that threaten patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning, and sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to the assessment of risks and hazards: (1) at the single-event level using root cause analysis (RCA), (2) at the process level using failure modes and effects analysis (FMEA), and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without the ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced.

  4. Sensemaking of Patient Safety Risks and Hazards

    PubMed Central

    Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S

    2006-01-01

    In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate the risks and hazards that threaten patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning, and sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to the assessment of risks and hazards: (1) at the single-event level using root cause analysis (RCA), (2) at the process level using failure modes and effects analysis (FMEA), and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without the ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced. PMID:16898979

  5. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios to which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.

  6. Simulating Social and Political Influences on Hazard Analysis through a Classroom Role Playing Exercise

    NASA Astrophysics Data System (ADS)

    Hales, T. C.; Cashman, K. V.

    2006-12-01

    Geological hazard mitigation is a complicated process that involves both detailed scientific research and negotiation between community members with competing interests in the solution. Geological hazards classes based on traditional lecture methods have difficulty conveying the decision-making processes that go into these negotiations. To address this deficiency, we have spent five years developing and testing a role-playing exercise based on the mitigation of a dam-outburst hazard on Ruapehu volcano, New Zealand. In our exercise, students are asked to take on one of five different roles and decide the best way to mitigate the hazard. Over the course of their discussion, students are challenged to reach a consensus decision despite the presence of strongly opposed positions. Key to the success of the exercise are (1) the presence of a facilitator and recorder for each meeting, (2) the provision of unique information to each interested party, and (3) the division of the class into multiple meeting groups, such that everyone is required to participate and individual groups can reach different conclusions. The exercise can be completed in a single eighty-minute classroom session divided into four parts: an introduction, a meeting between members of the same interested party to discuss strategy, a meeting between different interested parties, and a debriefing session. This framework can be readily translated to any classroom hazard problem. In our experience, students have responded positively to the use of role-playing to supplement lectures.

  7. Vulnerability and risk of deltaic social-ecological systems exposed to multiple hazards.

    PubMed

    Hagenlocher, Michael; Renaud, Fabrice G; Haas, Susanne; Sebesvari, Zita

    2018-08-01

    Coastal river deltas are hotspots of global change impacts. Sustainable delta futures are increasingly threatened by rising hazard exposure combined with the high vulnerability of deltaic social-ecological systems. While the need for integrated multi-hazard approaches has been clearly articulated, studies on vulnerability and risk in deltas either focus on local case studies or on single hazards, and do not apply a social-ecological systems perspective. As a result, vulnerabilities and risks in areas with strong social and ecological coupling, such as coastal deltas, are not fully understood, and the identification of risk reduction and adaptation strategies is often based on incomplete assumptions. To overcome these limitations, we propose an innovative modular, indicator-library-based approach for the assessment of the multi-hazard risk of social-ecological systems across and within coastal deltas globally, and apply it to the Amazon, Ganges-Brahmaputra-Meghna (GBM), and Mekong deltas. Results show that multi-hazard risk is highest in the GBM delta and lowest in the Amazon delta. The analysis reveals major differences between social and environmental vulnerability across the three deltas, notably in the Mekong and GBM deltas, where environmental vulnerability is significantly higher than social vulnerability. Hotspots and drivers of risk vary spatially, thus calling for spatially targeted risk reduction and adaptation strategies within the deltas. Ecosystems have been identified both as an important element at risk and as an entry point for risk reduction and adaptation strategies. Copyright © 2018. Published by Elsevier B.V.

  8. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  9. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-16

    Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.

  10. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructure, chemical plants and nuclear power plants. Beyond design, for many applications it is of interest to assess the efficiency of the design measures taken, which requires a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed as a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of the events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing the use of different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
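
    The class-based structure suggests a simple aggregation: each magnitude class contributes its initiating-event frequency times the conditional probability that its bounding scenario exceeds the critical design parameter, and the lifetime risk follows from the summed annual frequency. The class boundaries, frequencies and probabilities below are illustrative stand-ins, not values from the plant study.

        import math

        # class -> (initiating-event frequency per year,
        #           P(exceed critical design parameter | bounding scenario))
        classes = {
            "M5.0-5.9": (1e-2, 0.001),
            "M6.0-6.9": (2e-3, 0.02),
            "M7.0+":    (1e-4, 0.30),
        }

        annual_freq = sum(f * p for f, p in classes.values())
        lifetime_years = 20.0  # residual lifetime of the critical structure
        p_lifetime = 1.0 - math.exp(-annual_freq * lifetime_years)
        print(f"annual exceedance frequency: {annual_freq:.2e}")
        print(f"P(exceedance within {lifetime_years:.0f} years): {p_lifetime:.2e}")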

  11. Landslide activity as a threat to infrastructure in river valleys - An example from outer Western Carpathians (Poland)

    NASA Astrophysics Data System (ADS)

    Łuszczyńska, Katarzyna; Wistuba, Małgorzata; Malik, Ireneusz

    2017-11-01

    The intensive development of the Polish Carpathians is increasing the scale of landslide risk; detecting landslide hazards and risks has therefore become an important issue for spatial planning in the area. We applied dendrochronological methods and GIS analysis for a better understanding of landslide activity and related hazards in a test area (3.75 km2): the Salomonka valley and nearby slopes in the Beskid Żywiecki Mts., Outer Western Carpathians, southern Poland. We used an eccentricity index of the radial growth of trees to date past landslide events. The dendrochronological results allowed us to determine the mean frequency of landsliding at each sampling point, which was then interpolated into a landslide hazard map. In total we sampled at 46 points, taking 3 coniferous trees at each point. The landslide hazard map shows medium (23 sampling points) and low (20 sampling points) levels of landslide activity for most of the area; the highest level of activity was recorded for the largest landslide. The results of the dendrochronological study suggest that all landslides reaching down to the Salomonka valley floor are active, and LiDAR-based analysis of the relief shows an active coupling between those landslides and the river channel. Channel damming and the formation of an episodic lake are therefore probable. The hazard of flooding of the valley floor upstream of active landslides should be included in the local spatial planning and crisis management systems.

  12. POHCS AND PICS SCREENING PROTOCOL

    EPA Science Inventory

    The report describes risk-driven analysis strategies and a tiered survey approach of analyses that should be useful for building data bases related to other waste combustion processes. NOTE: The need to characterize hazardous waste incinerator emissions for multiple organic compo...

  13. Evaluating the influence of gully erosion on landslide hazard analysis triggered by heavy rainfall

    NASA Astrophysics Data System (ADS)

    Ruljigaljig, Tjuku; Tsai, Ching-Jun; Peng, Wen-Fei; Yu, Teng-To

    2017-04-01

    During rainstorm periods, such as typhoons or heavy rain, gully development can induce large-scale landslides. The purpose of this study is to assess and quantify the existence and development of gullies and their role in triggering landslides within a landslide hazard analysis. First, based on multi-scale DEM data, this study uses the wavelet transform to construct an automatic algorithm: the 1-meter DEM is used to evaluate the location and type of gullies, and to establish an evaluation model for predicting erosion development. Routes in the Chai-Yi area were studied to clarify the damage potential to roadways from local gullies. The location of a gully is treated as a factor that reduces the strength parameters, and the resulting distribution of the factor of safety (F.S.) is compared with the landslide inventory map. The results of this research could be used to increase the prediction accuracy of landslide hazard analyses for heavy rainfall events.
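
    One common way to express the strength-reduction idea is through the infinite-slope factor of safety, with a mapped gully represented by reduced cohesion and elevated pore pressure. The formula is standard; the parameter values and the way gullying is represented here are assumptions for illustration, not the study's calibration.

        import math

        def infinite_slope_fs(c_eff, phi_deg, gamma, depth, beta_deg, pore_u):
            """Infinite-slope factor of safety (effective-stress form):
            FS = [c' + (gamma*z*cos^2(beta) - u) * tan(phi')]
                 / [gamma*z*sin(beta)*cos(beta)].
            Units: kPa, degrees, kN/m^3, m, degrees, kPa."""
            beta, phi = math.radians(beta_deg), math.radians(phi_deg)
            resisting = c_eff + (gamma * depth * math.cos(beta) ** 2
                                 - pore_u) * math.tan(phi)
            driving = gamma * depth * math.sin(beta) * math.cos(beta)
            return resisting / driving

        # Gullying modelled as lost cohesion and raised pore pressure
        print(infinite_slope_fs(5.0, 30.0, 19.0, 3.0, 35.0, 0.0))   # ~1.0, marginal
        print(infinite_slope_fs(2.0, 30.0, 19.0, 3.0, 35.0, 15.0))  # ~0.6, unstable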

  14. Application of Hazard Analysis and Critical Control Points (HACCP) to the Cultivation Line of Mushroom and Other Cultivated Edible Fungi.

    PubMed

    Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo

    2013-09-01

    Hazard Analysis and Critical Control Points (HACCP) is a preventive system that seeks to ensure food safety and security. It allows product protection and the correction of errors, reduces the costs derived from quality defects, and reduces final over-control. In this paper, the system is applied to the cultivation line of mushrooms and other cultivated edible fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and of compost (stage 3), pre-fruiting and induction (stage 6), and harvest (stage 7) were identified as critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products, or of doses above the permitted limits (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). Implementing this knowledge will allow any plant dedicated to the cultivation of mushrooms or other edible fungi to run HACCP-based self-control of its production.

  15. Empirical calibration of a roadside hazardousness index for Spanish two-lane rural roads.

    PubMed

    Pardillo-Mayora, José M; Domínguez-Lira, Carlos A; Jurado-Piña, Rafael

    2010-11-01

    Crash records and roadside data from Spanish two-lane rural roads were analyzed to study the effect of roadside configuration on safety. Four indicators were used to characterize the main roadside features that influence the consequences of roadway departures: roadside slope, the distance of non-traversable obstacles from the roadway edge, safety barrier installation, and alignment. Based on the analysis of the effect of roadside configuration on the frequency and severity of run-off-road injury crashes, a categorical roadside hazardousness scale was defined. Cluster analysis was applied to group the combinations of the four indicators into categories with homogeneous effects on run-off-road injury crash frequency and severity. As a result, a 5-level Roadside Hazardousness Index (RHI) was defined. The RHI can be used as a reference to normalize the collection of roadside safety information, and as a variable for including roadside condition information in multivariate crash prediction models. 2010 Elsevier Ltd. All rights reserved.

  16. Evaluation of external hazards to nuclear power plants in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.; Budnitz, R.J.

    1987-12-01

    The Lawrence Livermore National Laboratory (LLNL) has performed a study of the risk of core damage to nuclear power plants in the United States due to externally initiated events. The broad objective has been to gain an understanding of whether or not each external initiator is among the major potential accident initiators that may pose a threat of severe reactor core damage or of a large radioactive release to the environment from the reactor. Four external hazards were investigated in this report: internal fires, high winds/tornadoes, external floods, and transportation accidents. The analysis was based on two figures-of-merit, one based on core damage frequency and the other based on the frequency of large radioactive releases. Using these two figures-of-merit as evaluation criteria, it has been feasible to ascertain whether the risk from externally initiated accidents is, or is not, an important contributor to overall risk for the US nuclear power plants studied. This has been accomplished for each initiator separately. 208 refs., 17 figs., 45 tabs.

  17. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground-motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map representing the seismic hazard for that region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission, and show how this approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership-class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face in applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
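
    The curve-assembly step can be sketched independently of the HPC machinery: for each rupture, the exceedance probability at an intensity-measure level is estimated from its simulated seismograms and weighted by the rupture's annual rate from the earthquake rupture forecast. The two-rupture example below is a toy illustration, not CyberShake data.

        import numpy as np

        def hazard_curve(im_levels, rupture_rates, im_samples):
            """Annual exceedance rate at each intensity-measure level:
            lambda(x) = sum_k rate_k * P(IM > x | rupture k), with the
            conditional probability estimated from simulated seismograms."""
            lam = np.zeros(len(im_levels))
            for rate, samples in zip(rupture_rates, im_samples):
                samples = np.asarray(samples)
                for i, x in enumerate(im_levels):
                    lam[i] += rate * np.mean(samples > x)
            return lam

        # Two hypothetical ruptures with simulated spectral accelerations (g)
        levels = np.array([0.1, 0.2, 0.4])
        rates = [1e-3, 5e-4]  # annual occurrence rates from the forecast
        sims = [[0.15, 0.25, 0.08, 0.3], [0.5, 0.45, 0.2, 0.6]]
        print(hazard_curve(levels, rates, sims))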

  18. TU-CD-BRB-08: Radiomic Analysis of FDG-PET Identifies Novel Prognostic Imaging Biomarkers in Locally Advanced Pancreatic Cancer Patients Treated with SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Y; Shirato, H; Song, J

    2015-06-15

    Purpose: This study aims to identify novel prognostic imaging biomarkers in locally advanced pancreatic cancer (LAPC) using quantitative, high-throughput image analysis. Methods: 86 patients with LAPC receiving chemotherapy followed by SBRT were retrospectively studied. All patients had a baseline FDG-PET scan prior to SBRT. For each patient, we extracted 435 PET imaging features of five types: statistical, morphological, textural, histogram, and wavelet. These features went through redundancy checks and robustness analysis, as well as a prescreening process based on their concordance indices with respect to the relevant outcomes. We then performed principal component analysis on the remaining features (the number ranged from 10 to 16), and fitted a Cox proportional hazards regression model using the first 3 principal components. Kaplan-Meier analysis was used to assess the ability to distinguish high- versus low-risk patients separated by median predicted survival. To avoid overfitting, all evaluations were based on leave-one-out cross validation (LOOCV), in which each holdout patient was assigned to a risk group according to the model obtained from a separate training set. Results: For predicting overall survival (OS), the most dominant imaging features were wavelet coefficients. There was a statistically significant difference in OS between patients with predicted high and low risk based on LOOCV (hazard ratio: 2.26, p<0.001). Similar imaging features were also strongly associated with local progression-free survival (LPFS) (hazard ratio: 1.53, p=0.026) on LOOCV. In comparison, neither SUVmax nor TLG was associated with LPFS (p=0.103, p=0.433) (Table 1). Results for progression-free survival and distant progression-free survival showed similar trends. Conclusion: Radiomic analysis identified novel imaging features with improved prognostic value over conventional methods. These features characterize the degree of intra-tumor heterogeneity reflected on FDG-PET images, and their biological underpinnings warrant further investigation. If validated in large, prospective cohorts, this method could be used to stratify patients based on individualized risk.

  19. Integrated Safety Analysis Tiers

    NASA Technical Reports Server (NTRS)

    Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon

    2009-01-01

    Commercial partnerships and organizational constraints, combined with complex systems, may lead to the division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface, sufficiently to comprehend the integrated hazards. This paper discusses various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort is used to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.

  20. Mesh versus bathtub - effects of flood models on exposure analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth

    2016-04-01

    In Switzerland, two main types of maps indicating potential flood zones are available for flood exposure analyses: 1) Aquaprotect, a nationwide overview provided by the Federal Office for the Environment, and 2) communal flood hazard maps available from the 26 cantons. The model used to produce Aquaprotect can be described as a bathtub approach, or linear superposition method, with three main parameters: the horizontal and vertical distance of a point to water features and the size of the river sub-basin. Whereas the determination of flood zones in Aquaprotect is based on a uniform, nationwide model, the communal flood hazard maps are less homogeneous, as they have been elaborated at either communal or cantonal level. Their basic content (i.e. indication of potential flood zones for three recurrence periods, with differentiation of at least three inundation depths) is nevertheless described in national directives, and the vast majority of communal flood hazard maps are based on mesh-based 2D inundation simulations. Apart from the methodical differences between Aquaprotect and the communal flood hazard maps (and among different communal flood hazard maps), all of these maps include a layer with a similar recurrence period (i.e. Aquaprotect 250 years, flood hazard maps 300 years) beyond the intended protection level of installed structural systems. In our study, we compare the resulting exposure by overlaying the two types of flood maps with a complete, harmonized, nationwide dataset of building polygons. We assess exposure at the national level and also consider differences among the 26 cantons and the six biogeographical regions. While the nationwide exposure rates for the two types of flood maps are similar, the differences within certain cantons and biogeographical regions are remarkable. We conclude that flood maps based on bathtub models are appropriate for assessments at the national level, while maps based on 2D simulations are preferable at sub-national levels.
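
    A bathtub-style exposure test of the kind described for Aquaprotect reduces to a few comparisons per building. The thresholds below, and the linear scaling of the allowable horizontal distance with sub-basin size, are invented for illustration and are not Aquaprotect's actual parameters.

        def bathtub_exposed(h_dist_m, v_dist_m, subbasin_km2,
                            h_max=500.0, v_max=10.0):
            """Flag a building as potentially flood-exposed: close to a water
            course horizontally and not far above it vertically, with the
            allowable horizontal distance growing with sub-basin size."""
            h_allow = h_max * min(subbasin_km2 / 100.0, 2.0)
            return h_dist_m <= h_allow and v_dist_m <= v_max

        print(bathtub_exposed(300.0, 4.0, 80.0))   # True: low and near the river
        print(bathtub_exposed(300.0, 12.0, 80.0))  # False: sits too high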

  1. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and on some faults, multiple large earthquakes occur in clusters. Such clusters are followed by periods of quiescence in which only small-to-moderate earthquakes take place. The clustering of large earthquakes is the most distinguishable departure from the assumption, made in conventional seismic hazard analysis, of a constant hazard of random earthquake occurrence. In the present study, a time-dependent recurrence model is proposed to account for series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard for the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
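
    The three-part hazard function can be sketched with placeholder functional forms: an exponentially decaying term after the last cluster, a saturating term rising toward the next cluster, and a constant background rate. The shapes and constants below are illustrative only; the paper's recurrence model will differ in detail.

        import math

        def hazard_rate(t_since_cluster, lam_bg=0.01, a_dec=0.05, tau_dec=50.0,
                        b_inc=0.08, tau_inc=300.0):
            """Toy three-part hazard (events/year) versus time since the last
            large-earthquake cluster: decaying post-cluster term, rising
            pre-cluster term, constant background of smaller events."""
            decreasing = a_dec * math.exp(-t_since_cluster / tau_dec)
            increasing = b_inc * (1.0 - math.exp(-t_since_cluster / tau_inc))
            return decreasing + increasing + lam_bg

        for t in (0, 50, 200, 500):  # years after the last cluster
            print(t, round(hazard_rate(t), 4))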

  2. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity of the Constantine region through a statistical treatment of a seismicity catalogue covering the period from 1357 to 2014 and containing 7007 seismic events. Our research contributes to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study extends the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault, integrating PSHA with a high-frequency deterministic technique for predicting peak and spectral ground-motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We proposed five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes; (ii) assessment of their geological, geophysical and geometric characteristics; (iii) identification of the attenuation pattern of seismic motion; (iv) calculation of the hazard at a site; and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.

  3. Rockfall hazard analysis using LiDAR and spatial modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information on the source areas of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity, and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls, and the spatial attribution of their frequency and energy.
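
    The energy attribute mentioned above follows from basic kinematics. The snippet below is a generic illustration with an assumed block mass and example velocities, not the internals of Rockfall Analyst:

    ```python
    # Illustrative rockfall kinetic energy along a simulated path
    # (generic kinematics; mass and velocities are made-up values).
    import numpy as np

    block_mass = 2500.0                                   # kg, assumed block mass
    velocities = np.array([3.0, 8.5, 14.2, 11.7, 6.3])    # m/s at path vertices

    kinetic_energy = 0.5 * block_mass * velocities**2     # joules
    print("Peak energy along path: %.0f kJ" % (kinetic_energy.max() / 1e3))
    ```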

  4. The Handling of Hazard Data on a National Scale: A Case Study from the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Royse, Katherine R.

    2011-11-01

    This paper reviews how hazard data and geological map data have been combined by the British Geological Survey (BGS) to produce a set of GIS-based national-scale hazard susceptibility maps for the UK. This work has been carried out over the last 9 years and as such reflects the combined outputs of a large number of researchers at BGS. The paper details the inception of these datasets from the development of the seamless digital geological map in 2001 through to the deterministic 2D hazard models produced today. These datasets currently include landslides, shrink-swell, soluble rocks, compressible and collapsible deposits, groundwater flooding, geological indicators of flooding, radon potential and potentially harmful elements in soil. These models have been created using a combination of expert knowledge (from both within BGS and from outside bodies such as the Health Protection Agency), national databases (which contain data collected over the past 175 years), multi-criteria analysis within geographical information systems and a flexible rule-based approach for each individual geohazard. By using GIS in this way, it has been possible to model the distribution and degree of geohazards across the whole of Britain.

  5. A procedure to select ground-motion time histories for deterministic seismic hazard analysis from the Next Generation Attenuation (NGA) database

    NASA Astrophysics Data System (ADS)

    Huang, Duruo; Du, Wenqi; Zhu, Hong

    2017-10-01

    In performance-based seismic design, ground-motion time histories are needed for analyzing the dynamic responses of nonlinear structural systems. However, the number of recorded ground motions at the design level is often limited. In order to analyze the seismic performance of structures, ground-motion time histories need to be either selected from recorded strong-motion databases or numerically simulated using stochastic approaches. In this paper, a detailed procedure for selecting appropriate acceleration time histories from the Next Generation Attenuation (NGA) database for several cities in Taiwan is presented. Target response spectra are first determined based on a local ground-motion prediction equation under representative deterministic seismic hazard analyses. Then several suites of ground motions are selected for these cities using the Design Ground Motion Library (DGML), a recently proposed interactive ground-motion selection tool. The selected time histories are representative of the regional seismic hazard and should be beneficial to earthquake studies when comprehensive seismic hazard assessments and site investigations are unavailable. Note that the method is also applicable to site-specific motion selection, with target spectra defined near the ground surface to account for site effects.
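
    A common core of such selection tools is ranking candidate records by misfit between the (scaled) record spectrum and the target spectrum. The sketch below uses a least-squares scale factor and mean squared error in linear amplitude space on synthetic data; the DGML works differently in detail (e.g., in log space, with scale-factor limits), so treat this only as an illustration of the principle.

    ```python
    import numpy as np

    def select_records(target_sa, record_sa, n_select=7):
        """Rank candidate records by mean squared error between each
        optimally scaled record spectrum and the target spectrum.
        target_sa: (n_periods,); record_sa: (n_records, n_periods).
        Returns indices of the best-matching records and their scale factors."""
        # Least-squares scale factor per record: s = (r . t) / (r . r)
        scales = (record_sa @ target_sa) / np.einsum("ij,ij->i", record_sa, record_sa)
        mse = np.mean((scales[:, None] * record_sa - target_sa) ** 2, axis=1)
        best = np.argsort(mse)[:n_select]
        return best, scales[best]

    rng = np.random.default_rng(0)
    periods = np.linspace(0.05, 4.0, 40)
    target = np.exp(-periods) + 0.1                         # toy target spectrum
    candidates = target * rng.lognormal(0.0, 0.3, (200, periods.size))
    idx, sf = select_records(target, candidates)
    print("Selected record indices:", idx)
    ```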

  6. Automatic Hazard Detection for Landers

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Cheng, Yang; Matthies, Larry H.

    2008-01-01

    Unmanned planetary landers to date have landed 'blind'; that is, without the benefit of onboard landing hazard detection and avoidance systems. This constrains landing site selection to very benign terrain, which in turn constrains the scientific agenda of missions. State-of-the-art Entry, Descent, and Landing (EDL) technology can land a spacecraft on Mars somewhere within a 20-100 km landing ellipse. Landing ellipses are very likely to contain hazards such as craters, discontinuities, steep slopes, and large rocks that can cause mission-fatal damage. We briefly review sensor options for landing hazard detection and identify a perception approach based on stereo vision and shadow analysis that addresses the broadest set of missions. Our approach fuses stereo vision and monocular shadow-based rock detection to maximize spacecraft safety. We summarize performance models for slope estimation and rock detection within this approach and validate those models experimentally. Instantiating our model of rock detection reliability for Mars predicts that this approach can reduce the probability of a failed landing by at least a factor of 4 in any given terrain. We also describe a rock detector/mapper applied to large, high-resolution images from the Mars Reconnaissance Orbiter (MRO) for landing site characterization and selection for Mars missions.

  7. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters, and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  8. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  9. A methodology for physically based rockfall hazard assessment

    NASA Astrophysics Data System (ADS)

    Crosta, G. B.; Agliardi, F.

    Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity, and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at regional and local scales, both along linear features and within exposed areas. An objective approach based on three-dimensional matrices providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrices has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to make the assessment criteria as objective as possible.
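
    The GIS combination step can be illustrated with a toy raster example. The reclassification thresholds and the product rule below are stand-ins for the paper's three-dimensional matrices, chosen only to show the mechanics:

    ```python
    import numpy as np

    def classify(raster, thresholds):
        """Map raster values to classes 1..len(thresholds)+1."""
        return np.digitize(raster, thresholds) + 1

    rng = np.random.default_rng(1)
    frequency = rng.random((100, 100)) * 10   # passages per cell (toy data)
    velocity = rng.random((100, 100)) * 30    # m/s (toy data)
    height = rng.random((100, 100)) * 5       # m (toy data)

    # Three-class reclassification of each layer (illustrative thresholds)
    f_cls = classify(frequency, [2.0, 6.0])
    v_cls = classify(velocity, [10.0, 20.0])
    h_cls = classify(height, [1.0, 3.0])

    # A positional hazard index: here simply the product of class values,
    # standing in for the paper's three-dimensional lookup matrix.
    hazard_index = f_cls * v_cls * h_cls
    print("Index range:", hazard_index.min(), "-", hazard_index.max())
    ```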

  10. Using Multi-Scenario Tsunami Modelling Results combined with Probabilistic Analyses to provide Hazard Information for the South-WestCoast of Indonesia

    NASA Astrophysics Data System (ADS)

    Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.

    2009-04-01

    Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent undersea earthquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis are likely to occur in the near future as well, due to increased tectonic stresses leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For a tsunami impact, the hazard assessment is mostly covered by numerical modelling, because the model results normally offer the most precise database for a hazard analysis: they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach: the location and magnitude that are likely to occur and assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German-Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus, a multi-scenario tsunami modelling approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian early warning system, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap each other along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast using the multi-scenario approach is to overlay all scenario inundation results and determine how often a point on land is significantly inundated in the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence, a statistical analysis of historical data and geophysical investigation results based on numerical modelling is added to the hazard assessment, which clearly improves its significance. For this purpose the present method was developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability for every point on land of being hit by a tsunami.
    The values are combined by a logical tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before. This results in a tsunami inundation probability map covering the southwest coast of Indonesia which nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies.
    Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
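
    The logical combination of scenario results with source probabilities can be sketched as follows, assuming (for illustration only) independent sources and boolean per-scenario inundation masks:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_scenarios, ny, nx = 50, 80, 120

    # Boolean inundation masks from the scenario runs (toy stand-ins)
    inundated = rng.random((n_scenarios, ny, nx)) > 0.7
    # Annual probability that each tsunamigenic source produces its scenario
    # (would come from the statistical analysis of historical seismicity)
    p_source = rng.uniform(0.001, 0.01, n_scenarios)

    # Per-cell probability of inundation by at least one source, assuming
    # independent sources: 1 - prod_i(1 - p_i * I_i)
    p_map = 1.0 - np.prod(1.0 - p_source[:, None, None] * inundated, axis=0)
    print("Max combined inundation probability:", p_map.max())
    ```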

  11. An integrated risk assessment of the environmental hazards influence on Kazakhstan section (from Khorgas and Dostyk dry ports to Aktau seaport) of the international transport corridor

    NASA Astrophysics Data System (ADS)

    Akiyanova, F. Zh; Arykbayeva, Z. K.; Atalikhova, A. M.; Dauilbayev, B. A.; Zinabdin, N. B.; Kubeyev, A. B.; Tkach, K. A.

    2018-01-01

    The article outlines research results on the assessment of the risk posed by natural hazards to the functioning of the Kazakhstan section of the international transport corridor (from the Khorgas and Dostyk dry ports to the seaport of Aktau). Based on a component-by-stage analysis of physical and geographical conditions using a qualimetric approach, areas with different risk levels of natural disasters were identified. To minimize exposure to natural hazards, a set of environmental recommendations has been developed.

  12. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing.
    Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain.
    Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  13. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperature of liquid and slush hydrogen brings other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  14. Regional landslide susceptibility assessment using multi-stage remote sensing data along the coastal range highway in northeastern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Fang; Huang, Wei-Kai; Chang, Yu-Lin; Chi, Shu-Yeong; Liao, Wu-Chang

    2018-01-01

    Typhoons Megi (2010) and Saola (2012) brought torrential rainfall that triggered regional landslides and flooding hazards along Provincial Highway No. 9 in northeastern Taiwan. To reduce property loss and save lives, this study combines multi-hazard susceptibility assessment with environmental geology maps, a rock mass rating (RMR) system, remote sensing analysis, and micro-topography interpretation to develop an integrated landslide hazard assessment approach that reflects the intrinsic state of the slopeland from the past toward the future. First, the degree of hazard indicated by historical landslides was used to delineate past landslide regions. Secondly, geo-mechanical classification of rock outcroppings was performed by in-situ investigation along the vulnerable road sections. Finally, a high-resolution digital elevation model extracted from airborne LiDAR, together with multi-temporal remote sensing images, was analyzed to identify potential catastrophic landslide hotspots in the near future. The results of the analysis showed that 37% of the road sections in the study area were highly susceptible to landslide hazards. The spatial distribution of the road sections revealed that those characterized by high susceptibility were located near the boundaries of fault zones and in areas of lithologic dissimilarity. Headward erosion of gullies and concave-shaped topographic features had an adverse effect and were the dominant factors triggering landslides. Hazard statistics show that regional landslide reactivation along this coastal highway is closely related to past landslide regions. The final results of field validation demonstrated that an accuracy of 91% could be achieved in forecasting geohazards following intense rainfall events and typhoons.

  15. Southern Dietary Pattern is Associated With Hazard of Acute Coronary Heart Disease in the Reasons for Geographic and Racial Differences in Stroke (REGARDS) Study.

    PubMed

    Shikany, James M; Safford, Monika M; Newby, P K; Durant, Raegan W; Brown, Todd M; Judd, Suzanne E

    2015-09-01

    The association of overall diet, as characterized by dietary patterns, with risk of incident acute coronary heart disease (CHD) has not been studied extensively in samples including sociodemographic and regional diversity. We used data from 17 418 participants in Reasons for Geographic and Racial Differences in Stroke (REGARDS), a national, population-based, longitudinal study of white and black adults aged ≥45 years, enrolled from 2003 to 2007. We derived dietary patterns with factor analysis and used Cox proportional hazards regression to examine hazard of incident acute CHD events - nonfatal myocardial infarction and acute CHD death - associated with quartiles of consumption of each pattern, adjusted for various levels of covariates. Five primary dietary patterns emerged: Convenience, Plant-based, Sweets, Southern, and Alcohol and Salad. A total of 536 acute CHD events occurred over a median (interquartile range) 5.8 (2.1) years of follow-up. After adjustment for sociodemographics, lifestyle factors, and energy intake, highest consumers of the Southern pattern (characterized by added fats, fried food, eggs, organ and processed meats, and sugar-sweetened beverages) experienced a 56% higher hazard of acute CHD (comparing quartile 4 with quartile 1: hazard ratio, 1.56; 95% confidence interval, 1.17-2.08; P for trend across quartiles=0.003). Adding anthropometric and medical history variables to the model attenuated the association somewhat (hazard ratio, 1.37; 95% confidence interval, 1.01-1.85; P=0.036). A dietary pattern characteristic of the southern United States was associated with greater hazard of CHD in this sample of white and black adults in diverse regions of the United States. © 2015 American Heart Association, Inc.
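
    For readers unfamiliar with this type of analysis, the snippet below shows the general shape of a Cox proportional hazards fit in Python using the lifelines package and a small, entirely hypothetical data frame. It is not the REGARDS analysis itself; the study's covariate structure is far richer, and the software used there may differ.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical analysis dataset: follow-up time (years), CHD event flag,
    # quartile-4 Southern-pattern indicator, and two example covariates.
    df = pd.DataFrame({
        "time": [5.8, 4.2, 6.1, 2.9, 5.0, 3.7, 6.0, 1.8],
        "chd_event": [0, 1, 0, 1, 0, 1, 0, 1],
        "southern_q4": [0, 1, 0, 1, 0, 1, 1, 1],  # quartile 4 vs quartile 1
        "age": [52, 63, 47, 70, 55, 66, 49, 72],
        "energy_kcal": [2100, 2600, 1900, 2800, 2200, 2500, 2000, 2700],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="chd_event")
    cph.print_summary()  # hazard ratios = exp(coef), with 95% CIs
    ```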

  16. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes, as they are not tied to large fault zones. While landslides have historically produced mostly local tsunamis, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and the methodologies to deal with this hazard, we present results and methods of a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The literature on LPTHA is sparse, and studies have so far been separated into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope-versus-volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn put into a regional tsunami model for computing the tsunami propagation. We devote attention to discussing the epistemic uncertainty and sensitivity of the landslide input parameters and how these may affect the hazard assessment. Because the full variability of the landslide parameters cannot be explored exhaustively, we show that there is a considerable challenge related to multiple-parameter variability. Finally, we discuss some logical next steps in the analysis, as well as possible sources of error.
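
    The MFD-based ingredient of an LPTHA can be illustrated by inverse-CDF sampling of landslide volumes from a truncated power law. The exponent and truncation bounds below are illustrative assumptions, not values fitted to the Gulf of Cadiz data:

    ```python
    import numpy as np

    def sample_volumes(n, v_min=1e6, v_max=1e10, beta=1.5, rng=None):
        """Inverse-CDF sampling from a truncated power-law (Pareto) MFD:
        P(V > v) proportional to v**(-beta) between v_min and v_max (m^3).
        Exponent and truncation values are illustrative assumptions."""
        rng = rng or np.random.default_rng()
        u = rng.random(n)
        a, b = v_min**-beta, v_max**-beta
        return (a - u * (a - b)) ** (-1.0 / beta)

    vols = sample_volumes(10_000, rng=np.random.default_rng(3))
    print("Median volume: %.2e m^3" % np.median(vols))
    ```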

  17. Towards an integrated approach to natural hazards risk assessment using GIS: with reference to bushfires.

    PubMed

    Chen, Keping; Blong, Russell; Jacobson, Carol

    2003-04-01

    This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.

  18. An evaluation of treatment strategies for head and neck cancer in an African American population.

    PubMed

    Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A

    2013-07-01

    This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at the Howard University Hospital. The Kaplan-Meier method was used for survival analysis, and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis revealed that stage II patients had a poorer outcome than stage IV (p = 0.4662), while Cox proportional hazards regression analysis showed that stage I had a significantly lower hazard of death than stage IV (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol use, while body mass index (BMI) was inversely related to the hazard of death. There was no difference in survival across treatment modalities for African Americans.

  19. Pathway-Based Concentration Response Profiles from Toxicogenomics Data

    EPA Science Inventory

    Microarray analysis of gene expression of in vitro systems could be a powerful tool for assessing chemical hazard. Differentially expressed genes specific to cells, chemicals, and concentrations can be organized into molecular pathways that inform mode of action. An important par...

  20. 7 CFR 210.13 - Facilities management.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...

  1. 7 CFR 210.13 - Facilities management.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...

  2. 7 CFR 210.13 - Facilities management.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...

  3. 7 CFR 210.13 - Facilities management.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...

  4. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Islands Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50% hazard levels for all sites studied in the Marmara region.

  5. 78 FR 45938 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ...] Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard Area (SFHA) boundaries or zone designations, or...

  6. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can be appropriately used for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in identifying the limits and advantages of the different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which underpins the reliability of the analysis. By analysing seismicity in the Vrancea region in some detail, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  7. A Cascading Storm-Flood-Landslide Guidance System: Development and Application in China

    NASA Astrophysics Data System (ADS)

    Zeng, Ziyue; Tang, Guoqiang; Long, Di; Ma, Meihong; Hong, Yang

    2016-04-01

    Flash floods and landslides, triggered by storms, often interact and cause cascading effects on human lives and property. Satellite remote sensing data have significant potential in the analysis of these natural hazards. As one of the regions continuously affected by severe flash floods and landslides, Yunnan Province, located in Southwest China, has a complex mountainous hydrometeorology and suffers from frequent heavy rainfall from May through late September. Taking Yunnan as a test-bed, this study proposes a Cascading Storm-Flood-Landslide Guidance System to progressively analyze and evaluate the risks of these multi-hazards based on multisource satellite remote sensing data. First, three standardized rainfall amounts (average daily amount in flood seasons, maximum 1 h amount, and maximum 6 h amount) from the products of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) were used as rainfall indicators to derive the StorM Hazard Index (SMHI). In this process, an integrated approach combining the Analytic Hierarchy Process (AHP) and information-entropy theory was adopted to determine the weight of each indicator. Then, land cover and vegetation cover data from the Moderate Resolution Imaging Spectroradiometer (MODIS) products, soil type from the Harmonized World Soil Database (HWSD) soil map, and slope from the Shuttle Radar Topography Mission (SRTM) data were added as semi-static geo-topographical indicators to derive the Flash Flood Hazard Index (FFHI). Furthermore, three more landslide-controlling indicators, namely elevation, slope angle, and soil texture, were included to derive the LandSlide Hazard Index (LSHI). Further inclusion of GDP, population, and prevention measures as vulnerability indicators enabled consecutive prediction of the risks of storms, flash floods, and landslides, respectively. Consequently, the spatial patterns of the hazard indices show that the southeast of Yunnan is more likely to encounter storms than other parts, while the northeast of Yunnan is most susceptible to floods and landslides, which agrees with the distribution of observed flood and landslide events. Moreover, risks for the multi-hazards were classified into four categories. Results show a strong correlation between the distributions of flash-flood-prone and landslide-prone regions and also highlight the counties with high risk of storms (e.g., Funing and Malipo), flash floods (e.g., Gongshan and Yanjing), and landslides (e.g., Zhaotong and Luxi). Compared to other approaches, the Cascading Storm-Flood-Landslide Guidance System uses a straightforward yet useful indicator-based weighted linear combination method and could be a useful prototype for mapping characteristics of storm-triggered hazards for users at different administrative levels (e.g., catchment, town, county, province, and even nation) in China.
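
    The information-entropy part of the weighting scheme is compact enough to sketch. The indicator matrix below is toy data; only the entropy-weight formula itself is standard:

    ```python
    import numpy as np

    def entropy_weights(X):
        """Information-entropy weights for an (n_samples, n_indicators)
        matrix of non-negative indicator values (larger = more hazardous)."""
        P = X / X.sum(axis=0)                        # column-wise proportions
        n = X.shape[0]
        with np.errstate(divide="ignore", invalid="ignore"):
            logP = np.where(P > 0, np.log(P), 0.0)
        E = -(P * logP).sum(axis=0) / np.log(n)      # entropy per indicator, in [0, 1]
        d = 1.0 - E                                  # degree of diversification
        return d / d.sum()

    # Toy rainfall indicators: daily mean, max 1 h, max 6 h (rows = grid cells)
    X = np.array([[4.1, 22.0, 48.0],
                  [3.5, 35.0, 60.0],
                  [6.2, 18.0, 41.0],
                  [5.0, 28.0, 55.0]])
    w = entropy_weights(X)
    smhi = (X / X.max(axis=0)) @ w   # weighted linear combination -> SMHI-like index
    print("weights:", w.round(3), "index:", smhi.round(3))
    ```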

  8. Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase

    NASA Astrophysics Data System (ADS)

    Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki

    2013-09-01

    In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied to system development; however, they can only be used after a detailed system has been created, because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis technique that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and attempted safety-guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers, and the design has been made more sophisticated from a safety viewpoint. The results also show that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety-guided design approach with the systems engineering process, based on the experience gained in this project.

  9. Estimating hazard ratios in cohort data with missing disease information due to death.

    PubMed

    Binder, Nadine; Herrnböck, Anne-Sophie; Schumacher, Martin

    2017-03-01

    In clinical and epidemiological studies information on the primary outcome of interest, that is, the disease status, is usually collected at a limited number of follow-up visits. The disease status can often only be retrieved retrospectively in individuals who are alive at follow-up, but will be missing for those who died before. Right-censoring the death cases at the last visit (ad-hoc analysis) yields biased hazard ratio estimates of a potential risk factor, and the bias can be substantial and occur in either direction. In this work, we investigate three different approaches that use the same likelihood contributions derived from an illness-death multistate model in order to more adequately estimate the hazard ratio by including the death cases into the analysis: a parametric approach, a penalized likelihood approach, and an imputation-based approach. We investigate to which extent these approaches allow for an unbiased regression analysis by evaluating their performance in simulation studies and on a real data example. In doing so, we use the full cohort with complete illness-death data as reference and artificially induce missing information due to death by setting discrete follow-up visits. Compared to an ad-hoc analysis, all considered approaches provide less biased or even unbiased results, depending on the situation studied. In the real data example, the parametric approach is seen to be too restrictive, whereas the imputation-based approach could almost reconstruct the original event history information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Using prospective hazard analysis to assess an active shooter emergency operations plan.

    PubMed

    Card, Alan J; Harrison, Heidi; Ward, James; Clarkson, P John

    2012-01-01

    Most risk management activity in the healthcare sector is retrospective, based on learning from experience. This is feasible where the risks are routine, but emergency operations plans (EOP) guide the response to events that are both high risk and rare. Under these circumstances, it is important to get the response right the first time, but learning from experience is usually not an option. This case study presents the rationale for taking a proactive approach to improving healthcare organizations' EOP. It demonstrates how the Prospective Hazard Analysis (PHA) Toolkit can drive organizational learning and argues that this toolkit may lead to more efficient improvement than drills and exercises. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.

  11. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500-year intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of a probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the corresponding 10 percent probability of exceedance of these intensities in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., about earthquake sources and ground motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
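
    The per-cell hazard curve and the 50-year probabilities follow from the intensity-frequency relationship and the Poisson model. A sketch using the study's b value together with an assumed placeholder a value:

    ```python
    import math

    # Illustrative intensity-frequency relationship for one 0.1 x 0.1 degree cell:
    # log10 N(I) = a - b*I, with N the annual rate of intensity >= I.
    a, b = 1.0, 0.39      # b from the study; a is an assumed placeholder

    def annual_rate(I):
        return 10.0 ** (a - b * I)

    for I in (7, 8, 9):
        lam = annual_rate(I)
        p50 = 1.0 - math.exp(-lam * 50.0)    # Poisson occurrence model
        print(f"I >= {I}: rate = {lam:.4f}/yr, P(50 yr) = {p50:.1%}")
    ```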

  12. Correlating regional natural hazards for global reinsurance risk assessment

    NASA Astrophysics Data System (ADS)

    Steptoe, Hamish; Maynard, Trevor; Economou, Theo; Fox, Helen; Wallace, Emily; Maisey, Paul

    2016-04-01

    Concurrent natural hazards represent an uncertainty in assessing exposure for the insurance industry. The recently implemented Solvency II Directive requires EU insurance companies to fully understand and justify their capital reserving and portfolio decisions. Lloyd's, the London insurance and reinsurance market, commissioned the Met Office to investigate the dependencies between different global extreme weather events (known to the industry as perils), and the mechanisms for these dependencies, with the aim of helping them assess their compound exposure to multiple simultaneous hazards. In this work, we base the analysis of hazard-to-hazard dependency on the interaction of different modes of global and regional climate variability. Lloyd's defined 16 key hazard regions, including Australian wildfires, flooding in China, and EU windstorms, and we investigate the impact of 10 key climate modes on these areas. We develop a statistical model that facilitates rapid risk assessment whilst allowing for both temporal auto-correlation and, crucially, interdependencies between drivers. The simulator itself is built conditionally, using autoregressive models for each driver conditional on the others. Whilst the baseline assumption within the (re)insurance industry is that different natural hazards are independent of each other, the assumption of independence of meteorological risks requires greater justification. Although our results suggest that most of the 120 hazard-hazard connections considered are likely to be independent of each other, 13 show significant dependence arising from one or more global modes of climate variability. This allows us to create a matrix of linkages describing the hazard dependency structure that Lloyd's can use to inform their understanding of risk.
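
    A two-driver toy version of such a conditionally specified autoregressive simulator is sketched below; the coefficients are arbitrary and stand in for dependence fitted to real climate-mode indices:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_years = 1000

    # Toy conditional autoregressive simulator for two climate-mode drivers
    # (e.g., ENSO-like and NAO-like indices); coefficients are arbitrary.
    x = np.zeros(n_years)   # driver 1
    y = np.zeros(n_years)   # driver 2
    for t in range(1, n_years):
        # each driver regressed on its own past and on the other driver's past
        x[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(0, 1)
        y[t] = 0.1 * x[t - 1] + 0.5 * y[t - 1] + rng.normal(0, 1)

    print("Simulated driver correlation:", np.corrcoef(x, y)[0, 1].round(2))
    ```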

  13. Values of Flood Hazard Mapping for Disaster Risk Assessment and Communication

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2015-12-01

    Flood plains provide tremendous benefits for human settlements. Since olden days people have lived with floods and attempted to control them when necessary. Modern engineering works such as embankments have enabled people to live even in flood-prone areas, and over time population and economic assets have concentrated in these areas. In developing countries too, rapid land use change alters exposure and vulnerability to floods and consequently increases disaster risk. Flood hazard mapping is an essential step for any countermeasure. It serves various objectives, including raising the awareness of residents, finding effective evacuation routes, and estimating potential damages through flood risk mapping. Depending on the objectives and data availability, there are also many possible approaches to hazard mapping, including simulation-based, community-based, and remote-sensing-based approaches. In addition to traditional paper-based hazard maps, Information and Communication Technology (ICT) enables more interactive hazard mapping, such as movable hazard maps that demonstrate scenario simulations for risk communication, and real-time hazard mapping for effective disaster response and safe evacuation. This presentation first summarizes recent advances in flood hazard mapping, focusing on Japanese experiences and other examples from Asian countries. Then it introduces a flood simulation tool suitable for hazard mapping at the river basin scale even in data-limited regions. In the past few years, the tool has been used by local officers responsible for disaster management in Asian countries. Through the training activities of hazard mapping and risk assessment, we conduct comparative analysis to identify the similarity and uniqueness of estimated economic damages depending on topographic and land use conditions.

  14. Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni

    2017-04-01

    A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Process (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of the sites are without any documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards at the European scale are: fire (wildfire), storm, flooding, earthquake, and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, a technique for multi-attribute decision making that enables the decomposition of a problem into a hierarchy, based on the opinion of different experts about the dominance of risks. The weights are obtained by rescaling between 0 and 1 the eigenvector relative to the maximum eigenvalue of the matrix of coefficients. The internal coherence of the experts' attributions is checked through the calculation of the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, in which the site most at risk is the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results depends on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, a comparison of the information in the Periodic Reports with available high-quality datasets (earthquake, volcano, and landslide) at the Italian scale has been performed. Sites properly classified by the Periodic Reports range from 65% (earthquake hazard) to 98% (volcano hazard), with a strong underestimation of landslide hazard. Due to this high level of uncertainty, we developed a new methodology to identify and rank the most critical UNESCO Heritage sites on the basis of three natural hazards (landslide, earthquake, and volcano) for which reliable European-scale hazard maps are available. For each UNESCO site, a potential risk was calculated as the product of hazard (from the available maps) and potential vulnerability. The latter is obtained considering the typology of the site (e.g. monument, cultural landscape, cultural road), the presence or absence of residents and/or tourists, and the position of the site (underground/over-ground). Through this methodology, a new ranking of the European UNESCO sites has been obtained. In this ranking, the historic centre of Naples emerges as the most endangered site on the European continent.
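
    The AHP weighting step (principal eigenvector plus consistency check) is easy to reproduce. The pairwise comparison matrix below contains made-up judgments for three hazards; Saaty's random-index values are standard:

    ```python
    import numpy as np

    def ahp_weights(M, ri=None):
        """Weights and consistency ratio from a pairwise comparison matrix M
        (Saaty's AHP): principal eigenvector, normalized to sum to 1."""
        vals, vecs = np.linalg.eig(M)
        k = int(np.argmax(vals.real))
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = M.shape[0]
        ci = (vals[k].real - n) / (n - 1)            # consistency index
        ri = ri or {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random indices
        return w, ci / ri                            # weights, consistency ratio

    # Toy 3x3 comparison of three hazards (values are illustrative judgments)
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(M)
    print("weights:", w.round(3), "CR:", round(cr, 3))   # CR < 0.1 => consistent
    ```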

  15. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability, and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact, and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine, and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts, and environmental degradation as the priority disaster hazards.

  16. Estimating the effect of selected predictors on agricultural confined-space hazard perceptions of Utah farm owner/operators.

    PubMed

    Pate, M L; Dai, X

    2014-04-01

    The purpose of this study was to assess how selected variables affect the confined-space hazard perceptions of farmers in Utah. A confined space was defined as "any space found in an agricultural workplace that was not designed or intended as a regular workstation, has limited or restricted means of entry or exit, and contains potential physical and toxic hazards to workers who intentionally or unintentionally enter the space" (proposed by NCERA-197, 18 May 2011, draft copy). A total of 303 out of 327 farm owner/operators provided complete surveys that were used in the analysis. The state of Utah was grouped into five regions in this study: central, east, northeast, northwest, and southwest. Grain and dairy production comprised 48.7% of the operations responding to the survey. The general linear modeling (GLM) procedure in SAS 9.3 was used to select the models on hazard perception scores for the five studied regions. Predictors of interest included response type, production type, safety planning, and injury concerns. Animal production operations had the highest average number of confined spaces (μ = 4, SD = 2.7). Regionally, the northwest region had the highest average number of confined spaces (μ = 4, SD = 2.5). The variables contributing most to confined-space hazard perceptions were injury and death concerns while working alone in confined spaces. Three factors were generated using principal factor analysis (PFA) with orthogonal varimax rotation. Results suggested that these factors affect hazard perceptions differently by region. We conclude that outreach and educational efforts to change safety behaviors regarding confined-space hazards should be strategically targeted for each region based on the predicting factors. The results can assist agricultural safety and health professionals in targeting agricultural producers' social networks to address human factors, such as worker attitudes and/or lack of skills or knowledge, that affect hazard perceptions of confined spaces in agriculture.

  17. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests

    PubMed Central

    Yang, Changwei; Zhang, Jianjing; Liu, Feicheng; Bi, Junwei; Jun, Zhang

    2015-01-01

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) multi-aspect terrain such as isolated mountains and thin ridges facing open space reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. Considering the above disaster phenomena, the reasons are analyzed based on shaking table tests of one-sided, two-sided, and four-sided slopes. The analysis results show that: (1) the amplification of peak accelerations on four-sided slopes is stronger than on two-sided slopes, while that on one-sided slopes is the weakest, which can indirectly explain why the damage to such terrain is most serious; (2) the amplification of peak accelerations gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, respectively, which can explain the seismic phenomenon whereby landslide hazards mainly occur on slopes steeper than 45°. The amplification along the slope strike direction is basically consistent, and the step is smooth.

  18. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (González et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); and far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D. L. George, R. J. LeVeque, and K. T. Mandli, The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34 (2011), pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C. A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E. L., and T. Parsons (2005): Probabilistic Analysis of Tsunami Hazards, Nat. Hazards, 37 (3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (Available at http://pubs.usgs.gov/pp/pp1661f/). González, F.I., E.L. Geist, B. Jaffe, U. Kânoglu, H. Mofjeld, C.E. Synolakis, V.V. Titov, D. Arcas, D. Bellomo, D. Carlton, T. Horning, J. Johnson, J. Newman, T. Parsons, R. Peters, C. Peterson, G. Priest, A. Venturato, J. Weber, F. Wong, and A. Yalciner (2009): Probabilistic Tsunami Hazard Assessment at Seaside, Oregon, for Near- and Far-Field Seismic Sources, J. Geophys. Res., 114, C11023, doi:10.1029/2008JC005132. Japan Earthquake Research Committee (2011): http://www.jishin.go.jp/main/p_hyoka02.htm. Mai, P. M., and G. C. Beroza (2002): A spatial random field model to characterize complexity in earthquake slip, J. Geophys. Res., 107(B11), 2308, doi:10.1029/2001JB000588. SSHAC (Senior Seismic Hazard Analysis Committee) (1997): Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, Main Report, Rep. NUREG/CR-6372, UCRL-ID-122160, Vol. 1, 256 pp, U.S. Nuclear Regulatory Commission.
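
    To make the probabilistic machinery concrete, here is a minimal, hedged sketch of how PTHA-style exceedance probabilities can be assembled from per-source annual rates under a Poisson assumption; the rates and modeled depths below are invented placeholders, not Crescent City results:

    import numpy as np

    # annual_rate[i]: mean annual rate of tsunami source i (Poisson model).
    # depth[i, j]: modeled inundation depth at the site for source i,
    # stochastic realization j. All values are invented placeholders.
    annual_rate = np.array([1 / 500, 1 / 1000, 1 / 250])
    depth = np.array([[2.1, 1.8], [3.5, 3.0], [0.7, 0.9]])

    def exceedance_probability(z, horizon_years=50.0):
        """P(depth > z at least once in horizon_years), realizations equally weighted."""
        rate_z = np.sum(annual_rate * np.mean(depth > z, axis=1))
        return 1.0 - np.exp(-rate_z * horizon_years)

    for z in (0.5, 1.0, 2.0, 3.0):
        print(f"P(depth > {z} m in 50 yr) = {exceedance_probability(z):.3f}")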

  19. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making in aspects such as prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected to carry out the hazard assessment for coastal erosion based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability is likewise composed of three main components for its evaluation: i) exposure, ii) fragility and iii) resilience, which in turn are evaluated across 6 dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, wave exposure of the coast, and occurrence, among others. Quantitative data on coastal retreat were measured through the use of DSAS (Digital Shoreline Analysis System), an ArcGIS application, as well as through digital elevation models of the beach and 6 beach profiles strategically located on the coast and obtained with GNSS technology. Sediment samples collected from these beaches, together with mean wave height and wave direction, were used as complementary data. The information was integrated along the coastline into segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were evaluated for these populations, together with population density for exposure, which was analyzed through a multi-array method that includes variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, along with their respective resilience indicators. The hazard analysis results indicate that Vistahermosa is under very high threat, while Downtown and Pianguita are under medium hazard. These two sectors have the highest population density and the largest hotel development and services infrastructure; Shangay, meanwhile, was scored as low hazard because the wave action has no direct impact on it. The vulnerability analysis suggests that the sector of Shangay has a very high vulnerability status because it lacks basic services and has low levels of schooling, while Downtown, Vistahermosa and Pianguita show average vulnerability. Additionally, it was determined that in recent years erosion rates in the Vistahermosa sector have reached up to -xx m yr-1, while in other sectors the retreat of the coastline can be associated with local tidal peaks that occur during April and October, the other months of the year being typically characterized by recovery and stability processes.
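
    As an illustration of the shoreline-change statistics that DSAS-type analyses report per transect, here is a small sketch of the end-point rate and a linear-regression rate; the survey dates and baseline distances are invented:

    import numpy as np

    # Shoreline position (m from a fixed baseline) at each survey date for
    # one transect; numbers are invented. Decreasing position = retreat.
    years = np.array([2005.0, 2008.5, 2011.2, 2014.7])
    position = np.array([52.0, 49.5, 47.8, 44.1])

    epr = (position[-1] - position[0]) / (years[-1] - years[0])  # end-point rate
    lrr = np.polyfit(years, position, deg=1)[0]                  # regression rate
    print(f"EPR = {epr:.2f} m/yr, LRR = {lrr:.2f} m/yr")         # negative = erosion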

  20. Safety analysis and review system (SARS) assessment report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, E.T.

    1981-03-01

    Under DOE Order 5481.1, Safety Analysis and Review System for DOE Operations, safety analyses are required for DOE projects in order to ensure that: (1) potential hazards are systematically identified; (2) potential impacts are analyzed; (3) reasonable measures have been taken to eliminate, control, or mitigate the hazards; and (4) there is documented management authorization of the DOE operation based on an objective assessment of the adequacy of the safety analysis. This report is intended to provide the DOE Office of Plans and Technology Assessment (OPTA) with an independent evaluation of the adequacy of the ongoing safety analysis effort. As part of this effort, a number of site visits and interviews were conducted, and FE SARS documents were reviewed. The latter included SARS Implementation Plans for a number of FE field offices, as well as safety analysis reports completed for certain FE operations. This report summarizes SARS related efforts at the DOE field offices visited and evaluates the extent to which they fulfill the requirements of DOE 5481.1.

  1. Poster - 30: Use of a Hazard-Risk Analysis for development of a new eye immobilization tool for treatment of choroidal melanoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prooijen, Monique van; Breen, Stephen

    Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis which resulted in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients as well as the lower costs for the hospital, we will implement this new tool.

  2. The prevalence of lead-based paint hazards in U.S. housing.

    PubMed Central

    Jacobs, David E; Clickner, Robert P; Zhou, Joey Y; Viet, Susan M; Marker, David A; Rogers, John W; Zeldin, Darryl C; Broene, Pamela; Friedman, Warren

    2002-01-01

    In this study we estimated the number of housing units in the United States with lead-based paint and lead-based paint hazards. We included measurements of lead in intact and deteriorated paint, interior dust, and bare soil. A nationally representative, random sample of 831 housing units was evaluated in a survey between 1998 and 2000; the units and their occupants did not differ significantly from nationwide characteristics. Results indicate that 38 million housing units had lead-based paint, down from the 1990 estimate of 64 million. Twenty-four million had significant lead-based paint hazards. Of those with hazards, 1.2 million units housed low-income families (< 30,000 US dollars/year) with children under 6 years of age. Although 17% of government-supported, low-income housing had hazards, 35% of all low-income housing had hazards. For households with incomes greater than or equal to 30,000 US dollars/year, 19% had hazards. Fourteen percent of all houses had significantly deteriorated lead-based paint, and 16% and 7%, respectively, had dust lead and soil lead levels above current standards of the U.S. Department of Housing and Urban Development and the U.S. Environmental Protection Agency. The prevalence of lead-based paint and hazards increases with age of housing, but most painted surfaces, even in older housing, do not have lead-based paint. Between 2% and 25% of painted building components were coated with lead-based paint. Housing in the Northeast and Midwest had about twice the prevalence of hazards compared with housing in the South and West. The greatest risk occurs in older units with lead-based paint hazards that either will be or are currently occupied by families with children under 6 years of age and are low-income and/or are undergoing renovation or maintenance that disturbs lead-based paint. This study also confirms projections made in 2000 by the President's Task Force on Environmental Health Risks and Safety Risks to Children of the number of houses with lead-based paint hazards. Public- and private-sector resources should be directed to units posing the greatest risk if future lead poisoning is to be prevented. PMID:12361941

  3. Physically-based failure analysis of shallow layered soil deposits over large areas

    NASA Astrophysics Data System (ADS)

    Cuomo, Sabatino; Castorino, Giuseppe Claudio; Iervolino, Aniello

    2014-05-01

    In the last decades, the analysis of slope stability conditions over large areas has become popular among scientists and practitioners (Cascini et al., 2011; Cuomo and Della Sala, 2013). This is due to the availability of new computational tools (Baum et al., 2002; Godt et al., 2008; Baum and Godt, 2012; Salciarini et al., 2012), implemented in GIS (Geographic Information System) platforms, which allow taking into account the major hydraulic and mechanical issues related to slope failure, even for unsaturated soils, as well as the spatial variability of both topography and soil properties. However, the effectiveness of the above methods (Sorbino et al., 2010) is still controversial for landslide forecasting, depending especially on the accuracy of the DTM (Digital Terrain Model) and on the chance that distinct triggering mechanisms may occur over large areas. Among the major uncertainties, the layering of soil deposits is of primary importance due to soil layer conductivity contrasts and differences in shear strength. This work deals with the hazard analysis of shallow landslides over large areas, considering two distinct schematizations of soil stratigraphy, i.e. homogeneous or layered. To this purpose, the physically-based model TRIGRS (Baum et al., 2002) is first used, then extended to the case of layered deposits: specifically, a unique set of hydraulic properties is assumed while distinct soil unit weights and shear strengths are considered for each soil layer. Both models are applied to a significant study area of Southern Italy, about 4 km2 in extent, where shallow deposits of air-fall volcanic (pyroclastic) soils have been affected by several landslides, causing casualties, damage and economic losses. The achieved results highlight that the soil volume mobilized over the study area depends strongly on the local stratigraphy of the shallow deposits. This relates to the depth of the critical slip surface, which rarely corresponds to the bedrock contact where cohesionless coarse materials lie on deeper soil layers with small effective cohesion. It is also shown that, due to a more realistic assessment of soil stratigraphy, the success of the model may increase when performing a back-analysis of a recent real event. References Baum, R. L., W. Z. Savage, and J. W. Godt (2002), TRIGRS-A Fortran program for transient rainfall infiltration and grid-based regional slope-stability analysis. U.S. Geological Survey, Open-file report 02-424, 35 p. Baum, R.L., Godt, J.W. (2012) Assessment of shallow landslide potential using 1-D and 3-D slope stability analysis. Landslides and Engineered Slopes: Protecting Society through Improved Understanding - Eberhardt et al. (eds), 2012 Taylor & Francis Group, London, ISBN 978-0-415-62123-6, 1667-1672. Cascini L., Cuomo S., Della Sala M. (2011). Spatial and temporal occurrence of rainfall-induced shallow landslides of flow type: A case of Sarno-Quindici, Italy. Geomorphology, 126(1-2), 148-158. Cuomo S., Della Sala M. (2013). Spatially distributed analysis of shallow landslides and soil erosion induced by rainfall. (submitted to Natural Hazards). Godt, J.W., Baum, R.L., Savage, W.Z., Salciarini, D., Schulz, W.H., Harp, E.L. (2008). Transient deterministic shallow landslide modeling: requirements for susceptibility and hazard assessments in a GIS framework. Engineering Geology 102, 214-226. Salciarini, D., Tamagnini, C., Conversini, P., Rapinesi, S. (2012). Spatially distributed rainfall thresholds for the initiation of shallow landslides. Natural Hazards 61, 229-245.
Sorbino G., Sica C., Cascini L. (2010). Susceptibility analysis of shallow landslides source areas using physically based models. Natural Hazards, 53(2), 313-332.
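
    For readers unfamiliar with TRIGRS, its core stability measure is an infinite-slope factor of safety that decreases as the transient pressure head grows during infiltration. The sketch below implements that standard formula with illustrative parameter values, not those of the study area:

    import math

    def factor_of_safety(c, phi_deg, delta_deg, psi, Z,
                         gamma_s=19e3, gamma_w=9.81e3):
        """Infinite-slope FS as in TRIGRS: cohesion c [Pa], friction angle
        phi and slope angle delta [deg], pressure head psi [m], depth Z [m],
        soil and water unit weights [N/m3]. FS < 1 indicates instability."""
        phi, delta = math.radians(phi_deg), math.radians(delta_deg)
        return (math.tan(phi) / math.tan(delta)
                + (c - psi * gamma_w * math.tan(phi))
                / (gamma_s * Z * math.sin(delta) * math.cos(delta)))

    # A rising pressure head during infiltration progressively lowers FS:
    for psi in (0.0, 0.5, 1.0):
        fs = factor_of_safety(c=2e3, phi_deg=35, delta_deg=40, psi=psi, Z=2.0)
        print(f"psi = {psi:.1f} m -> FS = {fs:.2f}")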

  4. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    USGS Publications Warehouse

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.

  5. Analysis and design of randomised clinical trials involving competing risks endpoints.

    PubMed

    Tai, Bee-Choo; Wee, Joseph; Machin, David

    2011-05-19

    In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation, are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25-0.72) and subdistribution (unadjusted subHR = 0.43, 95% CI 0.25-0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
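
    A minimal sketch of the nonparametric cumulative incidence estimation discussed above (the quantity compared by Gray's test); the data are synthetic and the implementation is illustrative, not the trial's actual analysis code:

    import numpy as np

    def cumulative_incidence(time, event, cause):
        """Aalen-Johansen-type CIF; event: 0 = censored, 1, 2, ... = causes."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        at_risk, surv, cif = len(time), 1.0, 0.0
        out_t, out_c = [], []
        for t in np.unique(time):
            here = time == t
            d_all = np.count_nonzero(event[here] != 0)
            d_cause = np.count_nonzero(event[here] == cause)
            cif += surv * d_cause / at_risk      # increment uses overall S(t-)
            surv *= 1.0 - d_all / at_risk        # all-cause survival update
            at_risk -= np.count_nonzero(here)    # events and censorings leave
            out_t.append(t)
            out_c.append(cif)
        return np.array(out_t), np.array(out_c)

    rng = np.random.default_rng(1)
    t = rng.exponential(10.0, 200)
    e = rng.choice([0, 1, 2], 200, p=[0.2, 0.5, 0.3])   # 1 = event of interest
    times, cif1 = cumulative_incidence(t, e, cause=1)
    print(f"CIF of cause 1 at last event time: {cif1[-1]:.3f}")  # below 1 - KM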

  6. On the importance of accounting for competing risks in pediatric cancer trials designed to delay or avoid radiotherapy: I. Basic concepts and first analyses.

    PubMed

    Tai, Bee-Choo; Grundy, Richard G; Machin, David

    2010-04-01

    In trials designed to delay or avoid irradiation among children with a malignant brain tumor, although irradiation after disease progression is an important event, patients who have disease progression may decline radiotherapy (RT), while those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We describe the summary of competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results obtained are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and the log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed a reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference (p = 0.331) between histologic subtypes, the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of a reduction in hazard among those with ependymoma (p = 0.005) than Gray's test (p = 0.086). When evaluating treatment differences, failing to account for competing risks using appropriate methodology may lead to incorrect interpretations.

  7. 40 CFR 745.61 - Scope and applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Lead-Based Paint Hazards § 745.61 Scope and applicability. (a) This subpart identifies lead-based paint hazards. (b) The standards for lead-based paint hazards apply to target housing and child-occupied facilities. (c) Nothing in...

  8. 40 CFR 745.61 - Scope and applicability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Lead-Based Paint Hazards § 745.61 Scope and applicability. (a) This subpart identifies lead-based paint hazards. (b) The standards for lead-based paint hazards apply to target housing and child-occupied facilities. (c) Nothing in...

  9. 40 CFR 745.61 - Scope and applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Lead-Based Paint Hazards § 745.61 Scope and applicability. (a) This subpart identifies lead-based paint hazards. (b) The standards for lead-based paint hazards apply to target housing and child-occupied facilities. (c) Nothing in...

  10. 40 CFR 745.61 - Scope and applicability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Lead-Based Paint Hazards § 745.61 Scope and applicability. (a) This subpart identifies lead-based paint hazards. (b) The standards for lead-based paint hazards apply to target housing and child-occupied facilities. (c) Nothing in...

  11. 40 CFR 745.61 - Scope and applicability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Lead-Based Paint Hazards § 745.61 Scope and applicability. (a) This subpart identifies lead-based paint hazards. (b) The standards for lead-based paint hazards apply to target housing and child-occupied facilities. (c) Nothing in...

  12. Multi-hazard risk assessment applied to hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Garcia-Aristizabal, Alexander; Gasparini, Paolo; Russo, Raffaella; Capuano, Paolo

    2017-04-01

    Without exception, the exploitation of any energy resource produces impacts and intrinsically bears risks. Therefore, to make sound decisions about future energy resource exploitation, it is important to clearly understand the potential environmental impacts in the full life-cycle of an energy development project, distinguishing between the specific impacts intrinsically related to exploiting a given energy resource and those shared with the exploitation of other energy resources. Technological advances such as directional drilling and hydraulic fracturing have led to a rapid expansion of unconventional resources (UR) exploration and exploitation; as a consequence, both public health and environmental concerns have risen. The main objective of a multi-hazard risk assessment applied to the development of UR is to assess the rate (or the likelihood) of occurrence of incidents and the potential impacts on the surrounding environment, considering different hazards and their interactions. Such analyses have to be performed considering the different stages of development of a project; however, the discussion in this paper is mainly focused on the analysis applied to the hydraulic fracturing stage of a UR development project. The multi-hazard risk assessment applied to the development of UR poses a number of challenges, making this a particularly complex problem. First, a number of external hazards might be considered as potential triggering mechanisms. Such hazards can be either of natural origin or anthropogenic events caused by the same industrial activities. Second, failures might propagate through the industrial elements, leading to complex scenarios according to the layout of the industrial site. Third, there is a number of potential risk receptors, ranging from environmental elements (such as the air, soil, surface water, or groundwater) to local communities and ecosystems. The multi-hazard risk approach for this problem is set by considering multiple hazards (and their possible interactions) as possible sources of system perturbation that might lead to the development of an incidental event. Given the complexity of the problem, we adopt a multi-level approach: first, we perform a qualitative analysis oriented to the identification of a wide range of possible scenarios; this process is based on a review of the potential impacts on different risk receptors reported in the literature, which is condensed into a number of causal diagrams created for different stages of a UR development project. Second, the most important scenarios for quantitative multi-hazard risk analyses are selected for further quantification. This selection is based on the identification of major risks, i.e., those related to the occurrence of low-probability/high-impact extreme events. The general framework for the quantitative multi-hazard risk analysis is represented using a so-called bow-tie structure. It is composed of a fault tree on the left-hand side of the graphic plot, identifying the possible events causing the critical (or top) event, and an event tree on the right-hand side, showing the possible consequences of the critical event. This work was supported under SHEER: "Shale Gas Exploration and Exploitation Induced Risks", project n. 640896, funded by the Horizon 2020 R&I Framework Programme, call H2020-LCE-2014-1.
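
    As a toy illustration of the quantitative side of such a bow-tie analysis, the sketch below combines basic-event probabilities through independent AND/OR gates into a top-event probability; all event names and numbers are invented placeholders:

    from functools import reduce

    def p_and(*ps):  # all independent basic events must occur
        return reduce(lambda a, b: a * b, ps, 1.0)

    def p_or(*ps):   # at least one independent basic event occurs
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

    # Hypothetical basic-event probabilities (per year of operation):
    p_casing_failure = 1e-3
    p_cement_defect = 5e-3
    p_seismic_trigger = 2e-4
    p_barrier_loss = p_and(p_casing_failure, p_cement_defect)
    p_top_event = p_or(p_barrier_loss, p_seismic_trigger)  # e.g. fluid leakage
    print(f"P(top event) = {p_top_event:.2e}")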

  13. Evaluating Alternatives for Drinking Water at Deployed Locations

    DTIC Science & Technology

    2006-03-01

    Tucker and Sands, 1999; Beering, 2002). In 1986, plutonium was found in the New York City drinking water system. Though the concentrations were...based approach called Hazard Analysis and Critical Control Point (HACCP). This approach holds that avoidance is practical and effective where other

  14. The Production of Corporate Research to Manufacture Doubt About the Health Hazards of Products: An Overview of the Exponent Bakelite® Simulation Study.

    PubMed

    Egilman, David

    2018-01-01

    Although corporate sponsorship of research does not necessarily lead to biased results, in some industries it has resulted in the publication of inaccurate and misleading data. Some companies have hired scientific consulting firms to retrospectively calculate exposures to hazardous products, no longer manufactured or sold, during their use. As an example, this paper reviews one such study: a litigation-generated study of Union Carbide Corporation's asbestos-containing product, Bakelite®. This analysis is based on previously secret documents produced as a result of litigation. The study generated asbestos fiber exposure measurements that underestimated actual exposures, creating doubt about the hazards associated with the manufacture and manipulation of Bakelite®.

  15. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    PubMed Central

    Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-01-01

    In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is well suited to handling geospatial data related to mining-induced hazards, the application of GIS-based modeling and assessment of such hazards within the mining industry could be expanded further. PMID:29186922

  16. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest.

    PubMed

    Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-11-27

    In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is well suited to handling geospatial data related to mining-induced hazards, the application of GIS-based modeling and assessment of such hazards within the mining industry could be expanded further.

  17. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  18. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  19. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  20. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  1. Hazard Analysis of Commercial Space Transportation: Volume 1: Operations. Volume 2: Hazards. Volume 3: Risk Analysis

    DOT National Transportation Integrated Search

    1988-05-01

    The report is devoted to the review and discussion of generic hazards associated with the ground, launch, orbital and re-entry phases of space operations. Since the DOT Office of Commercial Space Transportation (OCST) has been charged with protecting...

  2. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  3. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  4. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  5. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  6. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous area should be excluded; on the other, it is important not to extend the surface subject to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays makes it possible to face this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed one by one. Each segment is split into many reaches, so that within any of them the slope of the piezometric line can be approximated as zero. As a consequence, the hydraulic profile (open channel flow) in every reach is assumed horizontal, both downslope and on the cross-section. Each reach can be seen as a polygon, delimited laterally by the hazard mapping boundaries and longitudinally by two successive cross-sections, usually orthogonal to the talweg line. Simulating the progressive increase of the river stage, with a horizontal piezometric line, allows the definition of the stage-area and stage-volume relationships. Such relationships are obtained exclusively from the geometric information provided by the high-resolution elevation model. The maximum flooded area resulting from the simulation is finally compared to the potentially floodable area described by the hazard maps, to give a flooding index for every reach. Index values lower than 100% show that the mapped hazard area exceeds the maximum floodable area. Very low index values identify spots where there is a significant incongruity between the hazard map and the topography, and where a specific verification is probably needed. The procedure was successfully used for the validation of many hazard maps across Italy.
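
    A minimal sketch of the stage-area and stage-volume computation described above, assuming a synthetic elevation grid in place of the DTM+DBM for one reach:

    import numpy as np

    cell = 1.0  # grid resolution [m]; synthetic stand-in for the DTM+DBM
    z = np.random.default_rng(0).uniform(100.0, 104.0, size=(200, 200))

    def stage_curves(z, stages, cell):
        """Flooded area and stored volume for a horizontal water surface."""
        areas, volumes = [], []
        for s in stages:
            wet = z < s                                     # cells below stage
            areas.append(wet.sum() * cell ** 2)             # area [m2]
            volumes.append(np.sum(s - z[wet]) * cell ** 2)  # volume [m3]
        return np.array(areas), np.array(volumes)

    stages = np.arange(100.0, 104.5, 0.5)
    area, vol = stage_curves(z, stages, cell)
    for s, a, v in zip(stages, area, vol):
        print(f"stage {s:6.1f} m: area {a:9.0f} m2, volume {v:10.0f} m3")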

  7. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  8. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and the prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and the authorities responsible for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the interactive integration and visualization of spatial information through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application integrates the different layers necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as the 30-, 100- and 300-year return periods used in the Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify the input information and parameters to create different risk scenarios. Based on the resulting risk scenarios, users can propose and visualize (preliminary) risk reduction measures before finalizing the actual design and dimensions of such protective measures in the area. After designing measures, users can re-calculate risk by updating the hazard intensity and object layers; this is achieved by interactive manual editing of the shape (vector) layers in the web-GIS interface. A cost-benefit analysis tool is also integrated within the application to support the decision-making process for the selection of different protection measures. Finally, the resulting risk information (vector layers and data) can be exported in the form of shapefiles and Excel sheets. A prototype application was realized using open-source geospatial software and technologies. The Boundless framework, with its client-side SDK environment, was applied for rapid prototyping. Free and open-source components such as the PostGIS spatial database, GeoServer and GeoWebCache, GeoExt and OpenLayers were used for the development of the platform. The prototype is demonstrated with a case study area located in Les Diablerets, Switzerland. This research work was carried out within a project funded by the Canton of Vaud, Switzerland. References: Bründl, M., Romang, H. E., Bischof, N., and Rheinberger, C. M.: The risk concept and its application in natural hazard risk management in Switzerland, Nat. Hazards Earth Syst. Sci., 9, 801-813, 2009. DGE: Valdorisk - Direction Générale de l'Environnement, www.vd.ch, accessed 9 January 2016, 2016. OFEV: EconoMe - Office fédéral de l'environnement, www.econome.admin.ch, accessed 9 January 2016, 2016.
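
    A simplified sketch of the risk and cost-benefit logic such an application implements: annualized risk is approximated by summing scenario damages weighted by their annual frequencies, and a measure's benefit is the risk reduction it buys. All figures below are invented:

    # (return period [yr], damage without measure [CHF], damage with measure)
    scenarios = [
        (30, 2.0e6, 0.5e6),
        (100, 5.0e6, 2.0e6),
        (300, 9.0e6, 6.0e6),
    ]

    def annual_risk(damages):
        # Each scenario contributes its damage weighted by annual frequency.
        return sum(d / rp for rp, d in damages)

    risk_before = annual_risk([(rp, d0) for rp, d0, _ in scenarios])
    risk_after = annual_risk([(rp, d1) for rp, _, d1 in scenarios])
    annual_cost = 60_000.0  # hypothetical annualized cost of the measure
    print(f"risk before: {risk_before:,.0f}/yr, after: {risk_after:,.0f}/yr")
    print(f"benefit-cost ratio: {(risk_before - risk_after) / annual_cost:.2f}")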

  9. 78 FR 3646 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ...The Food and Drug Administration (FDA) is proposing to amend its regulation for Current Good Manufacturing Practice In Manufacturing, Packing, or Holding Human Food (CGMPs) to modernize it and to add requirements for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish and implement hazard analysis and risk-based preventive controls for human food. FDA also is proposing to revise certain definitions in FDA's current regulation for Registration of Food Facilities to clarify the scope of the exemption from registration requirements provided by the FD&C Act for "farms." FDA is taking this action as part of its announced initiative to revisit the CGMPs since they were last revised in 1986 and to implement new statutory provisions in the FD&C Act. The proposed rule is intended to build a food safety system for the future that makes modern, science-, and risk-based preventive controls the norm across all sectors of the food system.

  10. Device for Detection of Explosives, Nuclear and Other Hazardous Materials in Luggage and Cargo Containers

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Andrey; Evsenin, Alexey; Gorshkov, Igor; Osetrov, Oleg; Vakhtin, Dmitry

    2009-12-01

    A device for the detection of explosives, radioactive and heavily shielded nuclear materials in luggage and cargo containers, based on the Nanosecond Neutron Analysis/Associated Particles Technique (NNA/APT), is under construction. The detection module consists of a small neutron generator with a built-in position-sensitive detector of associated alpha-particles, and several scintillator-based gamma-ray detectors. Explosives and other hazardous chemicals are detected by analyzing secondary high-energy gamma-rays from reactions of fast neutrons with materials inside a container. The same gamma-ray detectors are used to detect unshielded radioactive and nuclear materials. An array of several neutron detectors is used to detect fast neutrons from induced fission of nuclear materials. Coincidence and timing analysis allows one to discriminate between fission neutrons and scattered probing neutrons. Mathematical modeling with the MCNP5 and MCNP-PoliMi codes was used to estimate the sensitivity of the device and its optimal configuration. A comparison of the features of three gamma-ray detector types, based on BGO, NaI and LaBr3 crystals, is presented.

  11. Urbanization factors associated with childhood asthma and prematurity: a population-based analysis aged from 0 to 5 years in Taiwan by using Cox regression within a hospital cluster model.

    PubMed

    Lin, Sheng-Chieh; Lin, Hui-Wen

    2015-04-01

    Childhood asthma and premature birth are both common; however, no studies have reported on the association between urbanization, asthma, and prematurity, or on whether the duration of prematurity affects asthma development. We used the Taiwan Longitudinal Health Insurance Database (LHID) to explore the association between asthma and prematurity among children in a population-based analysis. This is a retrospective cohort study with registration data derived from the Taiwan LHID. We evaluated prematurely born infants and children aged <5 years (n = 532) and age-matched control patients (n = 60,505) using Cox proportional hazards regression analysis within a hospital cluster model. Of the 61,037 examinees, 14,012 experienced asthma during the 5-year follow-up, including 161 (72.26 per 1000 person-years) infants and children born prematurely and 13,851 (40.27 per 1000 person-years) controls. The hazard ratio for asthma during the 5-year follow-up period was 1.95 (95% confidence interval = 1.67-2.28) among children born prematurely. Among children aged 0-2 years, boys born prematurely had higher asthma rates than girls in both the non-premature and premature groups. Among those living in urban areas, premature birth was associated with higher rates of asthma. Those born prematurely who lived in the northern region had a higher asthma hazard ratio than those in other regions. Our analyses indicated that sex, age, urbanization level, and geographic region are significantly associated with prematurity and asthma. Based on the cumulative asthma-free survival curve generated using the Kaplan-Meier method, infants born prematurely should be closely monitored for asthma development until the age of 6 years.
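
    A hedged sketch of a Cox proportional hazards fit of the kind reported above, using the Python lifelines package; the file and column names are hypothetical, and the hospital-cluster adjustment used in the study is omitted for brevity:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical cohort table; expected columns: time_to_asthma [yr],
    # asthma (0/1), premature (0/1), male (0/1), urban_level, region.
    df = pd.read_csv("asthma_cohort.csv")

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_to_asthma", event_col="asthma",
            formula="premature + male + urban_level + region")
    cph.print_summary()  # hazard ratios with 95% confidence intervals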

  12. Risk of infective endocarditis in patients with systemic lupus erythematosus in Taiwan: a nationwide population-based study.

    PubMed

    Chang, Y S; Chang, C C; Chen, Y H; Chen, W S; Chen, J H

    2017-10-01

    Objectives Patients with systemic lupus erythematosus are considered vulnerable to infective endocarditis, and prophylactic antibiotics are recommended before an invasive dental procedure. However, the evidence is insufficient. This nationwide population-based study evaluated the risk and related factors of infective endocarditis in systemic lupus erythematosus. Methods We identified 12,102 systemic lupus erythematosus patients from the National Health Insurance Research Database and compared the incidence rate of infective endocarditis with that among 48,408 non-systemic lupus erythematosus controls. A Cox multivariable proportional hazards model was employed to evaluate the risk of infective endocarditis in the systemic lupus erythematosus cohort. Results After a mean follow-up of more than six years, the systemic lupus erythematosus cohort had a significantly higher incidence rate of infective endocarditis (42.58 vs 4.32 per 100,000 person-years, incidence rate ratio = 9.86, p < 0.001) than the control cohort. Within the systemic lupus erythematosus cohort, older patients had a lower risk (adjusted hazard ratio 11.64) than patients younger than 60 years (adjusted hazard ratio 15.82). Cox multivariate proportional hazards analysis revealed that heart disease (hazard ratio = 5.71, p < 0.001), chronic kidney disease (hazard ratio = 2.98, p = 0.034), receiving a dental procedure within 30 days (hazard ratio = 36.80, p < 0.001), and intravenous steroid therapy within 30 days (hazard ratio = 39.59, p < 0.001) were independent risk factors for infective endocarditis in systemic lupus erythematosus patients. Conclusions A higher risk of infective endocarditis was observed in systemic lupus erythematosus patients. Risk factors for infective endocarditis in the systemic lupus erythematosus cohort included heart disease, chronic kidney disease, steroid pulse therapy within 30 days, and a recent invasive dental procedure within 30 days.

  13. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.
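
    The final step described above, combining intensity measures from all rupture variations with rupture rates into a hazard curve at one site, can be sketched as follows; the rupture rates and spectral accelerations are invented placeholders, not CyberShake outputs:

    import numpy as np

    rate = np.array([1e-3, 4e-4, 2e-4])        # annual rate of each rupture
    # sa[i, j]: spectral acceleration (g) at the site for rupture i,
    # rupture variation j; all numbers are invented placeholders.
    sa = np.array([[0.12, 0.18, 0.15],
                   [0.35, 0.28, 0.41],
                   [0.55, 0.60, 0.48]])

    def hazard_curve(levels, rate, sa):
        """Annual rate of exceeding each SA level, variations equally weighted."""
        frac = (sa[:, :, None] > levels).mean(axis=1)   # P(SA > x | rupture)
        return (rate[:, None] * frac).sum(axis=0)

    levels = np.array([0.1, 0.2, 0.4, 0.6])
    for x, lam in zip(levels, hazard_curve(levels, rate, sa)):
        print(f"annual rate of SA > {x:.1f} g: {lam:.2e}")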

  14. Hazard assessment of substances produced from the accidental heating of chemical compounds.

    PubMed

    Lunghi, A; Gigante, L; Cardillo, P; Stefanoni, V; Pulga, G; Rota, R

    2004-12-10

    Accidental events in the process industries can affect not only the staff working in a plant but also the environment and the people living near it. For this reason, regulation is imposed by the European Community to prevent accidents that could represent a risk for the population and the environment. In particular, Directive 96/82/CE, the so-called 'Seveso II directive', requires a risk analysis that also considers the hazardous materials generated in accidental events. It is therefore necessary to develop simple and economical procedures to predict the hazardous materials that can be produced in the case of major accidents, among which the accidental heating of a chemical due to a fire or a runaway reaction is one of the most frequent. The procedure proposed in this work is based on an evolved gas analysis methodology that consists of coupling two instruments: a thermogravimetric analyzer or a flash pyrolyzer, employed to simulate accident conditions, and an FTIR spectrometer, used to detect the evolved gas composition. More than 40 materials have been examined in various accident scenarios, and the obtained data have been statistically analyzed in order to identify meaningful correlations between the presence of a chemical group in a molecule and the presence of a given hazardous species in the fumes produced.
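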

  15. Antenatal Steroid Therapy for Fetal Lung Maturation and the Subsequent Risk of Childhood Asthma: A Longitudinal Analysis

    PubMed Central

    Pole, Jason D.; Mustard, Cameron A.; To, Teresa; Beyene, Joseph; Allen, Alexander C.

    2010-01-01

    This study was designed to test the hypothesis that fetal exposure to corticosteroids in the antenatal period is an independent risk factor for the development of asthma in early childhood with little or no effect in later childhood. A population-based cohort study of all pregnant women who resided in Nova Scotia, Canada, and gave birth to a singleton fetus between 1989 and 1998 was undertaken. After a priori specified exclusions, 80,448 infants were available for analysis. Using linked health care utilization records, incident asthma cases developed after 36 months of age were identified. Extended Cox proportional hazards models were used to estimate hazard ratios while controlling for confounders. Exposure to corticosteroids during pregnancy was associated with a risk of asthma in childhood between 3–5 years of age: adjusted hazard ratio of 1.19 (95% confidence interval: 1.03, 1.39), with no association noted after 5 years of age: adjusted hazard ratio for 5–7 years was 1.06 (95% confidence interval: 0.86, 1.30) and for 8 or greater years was 0.74 (95% confidence interval: 0.54, 1.03). Antenatal steroid therapy appears to be an independent risk factor for the development of asthma between 3 and 5 years of age. PMID:21490744

  16. National assessment of hurricane-induced coastal erosion hazards--Gulf of Mexico

    USGS Publications Warehouse

    Stockdon, Hilary F.; Doran, Kara S.; Thompson, David M.; Sopkin, Kristin L.; Plant, Nathaniel G.; Sallenger, Asbury H.

    2012-01-01

    Sandy beaches provide a natural barrier between the ocean and inland communities, ecosystems, and resources. However, these dynamic environments move and change in response to winds, waves, and currents. During a hurricane, these changes can be large and sometimes catastrophic. High waves and storm surge act together to erode beaches and inundate low-lying lands, putting inland communities at risk. A decade of USGS research on storm-driven coastal change hazards has provided the data and modeling capabilities to identify areas of our coastline that are likely to experience extreme and potentially hazardous erosion during a hurricane. This report defines hurricane-induced coastal erosion hazards for sandy beaches along the U.S. Gulf of Mexico coastline. The analysis is based on a storm-impact scaling model that uses observations of beach morphology combined with sophisticated hydrodynamic models to predict how the coast will respond to the direct landfall of category 1-5 hurricanes. Hurricane-induced water levels, due to both surge and waves, are compared to beach and dune elevations to determine the probabilities of three types of coastal change: collision (dune erosion), overwash, and inundation. As new beach morphology observations and storm predictions become available, this analysis will be updated to describe how coastal vulnerability to storms will vary in the future.
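    The regime logic of the storm-impact scaling model (after Sallenger's storm-impact scale) can be sketched as a comparison of storm water levels with dune elevations; the function below is a simplified illustration, not the USGS implementation:

    ```python
    def storm_impact_regime(r_high, r_low, dune_toe, dune_crest):
        """Classify the expected coastal-change regime for one beach profile.

        r_high / r_low -- extreme high and low water levels (surge plus wave
                          runup), in the same vertical datum as the dune toe
                          and crest elevations.
        """
        if r_low > dune_crest:
            return "inundation"  # still water persistently above the crest
        if r_high > dune_crest:
            return "overwash"    # runup intermittently overtops the crest
        if r_high > dune_toe:
            return "collision"   # waves attack the dune face (dune erosion)
        return "swash"           # runup confined to the beach face
    ```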

  17. 75 FR 14361 - Notification, Documentation, and Recordkeeping Requirements for Inspected Establishments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...

  18. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... determine whether there are food hazards that are reasonably likely to occur for each type of juice...

  19. Worker-specific exposure monitor and method for surveillance of workers

    DOEpatents

    Lovejoy, Michael L.; Peeters, John P.; Johnson, A. Wayne

    2000-01-01

    A person-specific monitor that provides sensor information regarding hazards to which the person is exposed and means to geolocate the person at the time of the exposure. The monitor also includes means to communicate with a remote base station. Information from the monitor can be downloaded at the base station for long term storage and analysis. The base station can also include means to recharge the monitor.

  20. Flood Impacts on People: from Hazard to Risk Maps

    NASA Astrophysics Data System (ADS)

    Arrighi, C.; Castelli, F.

    2017-12-01

    The mitigation of the adverse consequences of floods on people is crucial for civil protection and public authorities. According to several studies, in developed countries the majority of flood-related fatalities occur due to inappropriate high-risk behaviours such as driving and walking in floodwaters. In this work, the loss of stability of both vehicles and pedestrians in floodwaters is analysed. Flood hazard is evaluated based on (i) a 2D inundation model of an urban area, (ii) 3D hydrodynamic simulations of water flows around vehicles and the human body, and (iii) a dimensional analysis of experimental activity. Exposure and vulnerability of vehicles and population are assessed by exploiting several sources of open GIS data in order to produce risk maps for a test case study. The results show that a significant hazard to vehicles and pedestrians exists in the study area. Particularly high is the hazard to vehicles, which are likely to be swept away by the flood flow, possibly aggravating damage to structures and infrastructure and locally altering flood propagation. The exposure and vulnerability analysis identifies structures such as schools and public facilities that may attract many people. Moreover, some shopping facilities in the area, which attract both vehicular and pedestrian circulation, are located in the highest flood hazard zone. The application of the method demonstrates that, at the municipal level, such risk maps can support civil defence strategies and education for active citizenship, thus contributing to reducing flood impacts on the population.
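    Loss-of-stability criteria of the kind derived from such experiments are often expressed as thresholds on flow depth and the depth-velocity product. A minimal sketch with illustrative threshold values only; the paper's dimensional analysis yields its own criteria:

    ```python
    def vehicle_unstable(depth_m, velocity_ms,
                         dv_crit=0.35, buoyancy_depth_m=0.5):
        """True where a parked vehicle may be swept away or float.
        Both thresholds are placeholders; experimentally reported values
        vary with vehicle geometry and mass."""
        return depth_m * velocity_ms >= dv_crit or depth_m >= buoyancy_depth_m
    ```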

  1. Climate-Related Hazards: A Method for Global Assessment of Urban and Rural Population Exposure to Cyclones, Droughts, and Floods

    PubMed Central

    Christenson, Elizabeth; Elliott, Mark; Banerjee, Ovik; Hamrick, Laura; Bartram, Jamie

    2014-01-01

    Global climate change (GCC) has led to increased focus on the occurrence of, and preparation for, climate-related extremes and hazards. Population exposure, the relative likelihood that a person in a given location was exposed to a given hazard event(s) in a given period of time, was the outcome for this analysis. Our objectives were to develop a method for estimating the population exposure at the country level to the climate-related hazards cyclone, drought, and flood; develop a method that readily allows the addition of better datasets to an automated model; differentiate population exposure of urban and rural populations; and calculate and present the results of exposure scores and ranking of countries based on the country-wide, urban, and rural population exposures to cyclone, drought, and flood. Gridded global datasets on cyclone, drought and flood occurrence as well as population density were combined and analysis was carried out using ArcGIS. Results presented include global maps of ranked country-level population exposure to cyclone, drought, flood and multiple hazards. Analyses by geography and human development index (HDI) are also included. The results and analyses of this exposure assessment have implications for country-level adaptation. It can also be used to help prioritize aid decisions and allocation of adaptation resources between countries and within a country. This model is designed to allow flexibility in applying cyclone, drought and flood exposure to a range of outcomes and adaptation measures. PMID:24566046
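    The overlay behind such exposure scores can be sketched in a few lines; the study used ArcGIS, so this NumPy version is purely illustrative:

    ```python
    import numpy as np

    def population_exposure(hazard_freq, pop_density, cell_area_km2):
        """Aggregate population exposure from co-registered grids.

        hazard_freq  -- 2-D grid of hazard event frequency (events/year)
        pop_density  -- 2-D grid of population per km^2 on the same grid
        Returns expected person-events per year over the grid extent.
        """
        return float(np.nansum(hazard_freq * pop_density * cell_area_km2))
    ```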

  2. Assessment of a Tsunami Hazard for Mediterranean Coast of Egypt

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Babeyko, Andrey; Yalciner, Ahmet; Pelinovsky, Efim

    2017-04-01

    An analysis of tsunami hazard for Egypt, based on historical data and numerical modelling of historical and prognostic events, is given. There are 13 historical events over 4,000 years, including one instrumental record (1956). The tsunami database includes 12 earthquake-generated tsunamis and 1 event of volcanic origin (the Santorini eruption). The tsunami intensity of the events of 365, 881, 1303, and 1870 is estimated as I = 3, corresponding to tsunami wave heights of more than 6 m. Numerical simulation of several possible scenarios of tsunamis of seismic and landslide origin was carried out with the NAMI-DANCE software, which solves the shallow-water equations. The PTHA (Probabilistic Tsunami Hazard Assessment) method for the Mediterranean Sea developed in (Sorensen M.B., Spada M., Babeyko A., Wiemer S., Grunthal G. Probabilistic tsunami hazard in the Mediterranean Sea. J Geophysical Research, 2012, vol. 117, B01305) is used to evaluate the probability of tsunami occurrence on the Egyptian coast. The synthetic catalogue of prognostic tsunamis of seismic origin with magnitude above 6.5 includes 84,920 events over 100,000 years. For wave heights above 1 m, the exceedance probability versus tsunami height curve can be approximated by an exponential Gumbel function with two parameters, which are determined for each coastal location in Egypt (24 points in total). Prognostic extreme events with probability below 10^-4 (approximately 10 events) do not fit the Gumbel function and require special analysis. Acknowledgements: This work was supported by the EU FP7 ASTARTE Project [603839] and, for EP, by NS6637.2016.5.
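    As a sketch of the Gumbel step, the two parameters can be fitted to the per-site maxima from the synthetic catalogue and read back as exceedance probabilities; the input file and values are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic maximum tsunami wave heights (m) at one coastal point,
    # one value per catalogue epoch -- placeholder input.
    heights = np.loadtxt("site_max_heights.txt")

    loc, scale = stats.gumbel_r.fit(heights)      # the two Gumbel parameters
    h = 2.0                                       # height of interest, metres
    p_exceed = stats.gumbel_r.sf(h, loc, scale)   # exceedance probability
    print(f"P(height > {h} m) = {p_exceed:.3e}")
    ```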

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kienhuis, Anne S., E-mail: anne.kienhuis@rivm.nl; RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen; Netherlands Toxicogenomics Centre

    Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, to progress from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic in vivo and in vitro studies for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim to extrapolate animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to quantitative dose-response analysis needed for hazard characterization, in hepatic systems toxicology studies, generation of toxicogenomic data of multiple doses/concentrations and time points is required. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for induction of the apical effect and whether perturbations are sufficient for the induction of downstream events, eventually causing toxicity.

  4. Stormwater Characterization and Lagoon Sediment Analysis, Grand Forks Air Force Base, North Dakota

    DTIC Science & Technology

    1990-08-01

    Report excerpts: analyses detected tetrachloroethylene and 0.0026 mg/l ethyl benzene, and showed no pesticides. An AFOEHL contractor performed Extraction Procedure (EP) extraction analysis. Stormwater runoff met North Dakota state stream standards, and lagoon sediment did not contain Extraction Procedure hazardous chemicals; a further fragment, "Stormwater runoff exceeded...", is truncated in the source. (The remainder of the excerpt consists of appendix listings: Standards for Water Quality for the State of North Dakota (Extracts); Site/Analysis Summary; Lift Station Flow Records; Wastewater.)

  5. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system. PMID:24467813

  6. Discrete mathematical data analysis approach: a valuable assessment method for sustainable chemistry.

    PubMed

    Voigt, Kristina; Scherb, Hagen; Bruggemann, Rainer; Schramm, Karl-Werner

    2013-06-01

    Sustainable/Green Chemistry is a chemical philosophy encouraging the design of products and processes that reduce or eliminate the use and generation of hazardous substances. In this respect, metrical scientific disciplines such as chemometrics are important because they provide criteria for whether chemicals are hazardous or not. We demonstrate that sustainability principles in the disciplines of Green Chemistry, Green Engineering, and Sustainability in Information Technology have main aspects in common; the use of non-hazardous chemicals, or the more efficient use of chemical substances, is one of these aspects. We take a closer look at the hazards of chemical substances. Our research focuses on data analyses concerning environmental chemicals known as Persistent Organic Pollutants (POPs), which are found all over the world and pose a large risk to the environment as well as to humans. Evaluation of the data is a major step in elucidating the danger of these chemicals. The data analysis method demonstrated here is based on the theory of partially ordered sets and provides a generalized ranking. In our approach we investigate data sets of breast milk samples from women in Denmark, Finland, and Turkey that contained measurable levels of 20 POPs. The goal is twofold: on the one hand, the hazardous chemicals are to be identified; on the other hand, possible differences among the three nations should be detected, since such differences would suggest different uptake mechanisms. The data analysis is performed with the freely available software package PyHasse, written by the third author. We conclude that the data analysis method can be applied to distinguish between more and less dangerous existing chemicals. Furthermore, it can be used in sustainable chemistry in the same manner to identify more and less sustainable chemicals. Copyright © 2013 Elsevier B.V. All rights reserved.
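    The core of a partial-order (Hasse diagram) ranking is a component-wise dominance test: one chemical ranks above another only if it is at least as bad in every indicator. A minimal sketch with invented indicator tuples; PyHasse itself offers far richer analyses:

    ```python
    def dominates(a, b):
        """a is above b in the product order: a >= b in every indicator."""
        return all(x >= y for x, y in zip(a, b)) and a != b

    # Invented indicator tuples, e.g. (concentration, persistence, toxicity):
    chems = {"POP-1": (3.1, 0.8, 2.0),
             "POP-2": (1.2, 0.5, 1.1),
             "POP-3": (2.0, 0.9, 0.7)}

    pairs = [(p, q) for p in chems for q in chems
             if dominates(chems[p], chems[q])]
    print(pairs)  # comparable pairs; incomparable pairs are simply omitted
    ```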

  7. Economic Impact Analyses of Interdisciplinary Multi-hazard Scenarios: ShakeOut and ARkStorm

    NASA Astrophysics Data System (ADS)

    Wein, A. M.; Rose, A.; Sue Wing, I.; Wei, D.

    2011-12-01

    U. S. Geological Survey (USGS) scientists are using an interdisciplinary strategy to develop and analyze multi-hazard scenarios to help communities enhance resilience to natural hazard disasters. Two such scenarios are the southern California ShakeOut earthquake and the California ARkStorm winter storm. Both scenarios are multi-hazard: ShakeOut ground motions trigger landslides and liquefaction, and ARkStorm involves wind, flood, landslide, and coastal hazards. A collaborative scenario process engages partners and stakeholders throughout the development and use of the scenarios. In doing so, community resilience is enhanced by educating communities about hazards and hazard interdependencies, building networks from scientists to decision makers, exercising emergency management strategies, identifying emergency management issues, and motivating solutions prior to an event. In addition, interdisciplinary scenarios stimulate research on the various steps of analysis (e.g., natural hazard processes, physical damages, societal consequences, and policy connections). In particular, USGS scientists have collaborated with economists to advance methods to estimate the economic impacts (business interruption losses) of disasters. Our economic impact analyses evolved from the economic module in the Federal Emergency Management Agency's loss-estimation tool, HAZUS-MH, to a more encompassing input-output analysis for ShakeOut, to a more sophisticated Computable General Equilibrium model for ARkStorm. The analyses depend on physical damage and restoration time estimates from engineers and on geographic analyses of economic assets in hazard zones. Economic resilience strategies are incorporated to represent the resourcefulness and ingenuity that avoid potential losses during and after an event. Such strategies operate at three levels of the economy: micro (e.g., the ability to catch up on lost production time), meso (e.g., coordination within a sector to share resources), and macro (e.g., price adjustments to redistribute scarce resources). A sensitivity analysis of the ARkStorm economic impact model explores the effects of 1) the magnitude of the shocks (e.g., flood damages to buildings and infrastructure, agricultural productivity, and lifeline service disruptions), 2) the sustainability of the economic resilience strategies, and 3) the amount, timing, and source of reconstruction funds. The inclusion of an economic analysis in ShakeOut and ARkStorm broadens the range of interest in the scenario results. For example, the relative contribution of ShakeOut economic shocks to business interruption losses emphasized the need to reduce the impacts of fire following earthquake and of water service disruption. Based on the magnitude and duration of the economic impacts for the ARkStorm scenario, policy experts surmised that business interruption policy time elements would be exceeded and business interruptions would be largely unfunded, calling attention to the need for innovative funding solutions. Finally, economic impact analyses inform the question of paying now to mitigate or paying more later to recover.
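    The input-output stage of such loss estimation rests on the Leontief relation x = (I - A)^-1 d, which propagates a final-demand shock through inter-industry linkages. A minimal sketch with an invented three-sector economy; coefficients and shocks are illustrative, not ShakeOut values:

    ```python
    import numpy as np

    # Invented technical-coefficient matrix A and final-demand vector d.
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.05, 0.10],
                  [0.05, 0.10, 0.15]])
    d = np.array([100.0, 80.0, 60.0])        # pre-event final demand

    x = np.linalg.solve(np.eye(3) - A, d)    # Leontief: x = (I - A)^-1 d

    d_shock = d * np.array([0.9, 0.7, 1.0])  # hypothetical post-event demand
    x_shock = np.linalg.solve(np.eye(3) - A, d_shock)
    print("output loss by sector:", x - x_shock)
    ```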

  8. Using SAR and GPS for Hazard Management and Response: Progress and Examples from the Advanced Rapid Imaging and Analysis (ARIA) Project

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Simons, M.; Hua, H.; Yun, S. H.; Agram, P. S.; Milillo, P.; Sacco, G. F.; Webb, F.; Rosen, P. A.; Lundgren, P.; Milillo, G.; Manipon, G. J. M.; Moore, A. W.; Liu, Z.; Polet, J.; Cruz, J.

    2014-12-01

    ARIA is a joint JPL/Caltech project to automate synthetic aperture radar (SAR) and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. We have built a prototype SAR and GPS data system that forms the foundation for hazard monitoring and response capability, as well as providing imaging capabilities important for science studies. Together, InSAR and GPS have the ability to capture surface deformation in high spatial and temporal resolution. For earthquakes, this deformation provides information that is complementary to seismic data on location, geometry and magnitude of earthquakes. Accurate location information is critical for understanding the regions affected by damaging shaking. Regular surface deformation measurements from SAR and GPS are useful for monitoring changes related to many processes that are important for hazard and resource management such as volcanic deformation, groundwater withdrawal, and landsliding. Observations of SAR coherence change have a demonstrated use for damage assessment for hazards such as earthquakes, tsunamis, hurricanes, and volcanic eruptions. These damage assessment maps can be made from imagery taken day or night and are not affected by clouds, making them valuable complements to optical imagery. The coherence change caused by the damage from hazards (building collapse, flooding, ash fall) is also detectable with intelligent algorithms, allowing for rapid generation of damage assessment maps over large areas at fine resolution, down to the spatial scale of single family homes. We will present the progress and results we have made on automating the analysis of SAR data for hazard monitoring and response using data from the Italian Space Agency's (ASI) COSMO-SkyMed constellation of X-band SAR satellites. Since the beginning of our project with ASI, our team has imaged deformation and coherence change caused by many natural hazard events around the world. We will present progress on our data system technology that enables rapid and reliable production of imagery. Lastly, we participated in the March 2014 FEMA exercise based on a repeat of the 1964 M9.2 Alaska earthquake, providing simulated data products for use in this hazards response exercise. We will present lessons learned from this and other simulation exercises.

  9. An operational-oriented approach to the assessment of low probability seismic ground motions for critical infrastructures

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose

    2018-01-01

    Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, forecasting capability and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework to manage these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. The main interest lies in ground motions exceeding the original design values, which should correspond to low-probability occurrences. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows long-duration synthetic earthquake catalogs to be built to derive low-probability amplitudes. This approach does not affect the mean hazard values and yields a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probabilities of exceedance from unlikely combinations, for the development of stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs, including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPEs), hazard levels, and fractiles of extreme ground motion. The results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can provide timely insight for making informed risk-management or regulatory decisions on the required level of detail or on the adoption of measures, whose cost can be balanced against the benefits of the measures in question.
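    The Monte Carlo core of the approach can be sketched as: simulate a long synthetic catalog, record annual maxima, and fit an extreme-value distribution to read off low-probability amplitudes. The event rate and amplitude model below are toy assumptions, not the paper's seismicity or GMPE inputs:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    years, rate = 100_000, 0.2          # catalog length; mean events/year

    annual_max = np.zeros(years)
    counts = rng.poisson(rate, size=years)
    for i, n in enumerate(counts):
        if n:
            # Toy log-normal ground-motion amplitudes for each event.
            annual_max[i] = rng.lognormal(mean=-1.0, sigma=0.9, size=n).max()

    # Fit a generalized extreme-value distribution to the non-zero maxima
    # and extract an amplitude with ~1e-4 annual exceedance probability.
    c, loc, scale = stats.genextreme.fit(annual_max[annual_max > 0])
    a_rare = stats.genextreme.isf(1e-4, c, loc, scale)
    print(f"~1e-4/yr amplitude: {a_rare:.2f} (toy units)")
    ```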

  10. Forensic and homeland security applications of modern portable Raman spectroscopy.

    PubMed

    Izake, Emad L

    2010-10-10

    Modern detection and identification of chemical and biological hazards within the forensic and homeland security contexts may well require conducting the analysis in the field while adopting a non-contact approach to the hazard. Technological advances in both surface- and resonance-enhanced Raman scattering have re-established Raman spectroscopy as the most adaptable spectroscopic technique for stand-off and non-contact analysis of hazards. In addition, spatially offset Raman spectroscopy has proved very valuable for non-invasive chemical analysis of hazards concealed within non-transparent containers and packaging. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  11. Food Safety Programs Based on HACCP Principles in School Nutrition Programs: Implementation Status and Factors Related to Implementation

    ERIC Educational Resources Information Center

    Stinson, Wendy Bounds; Carr, Deborah; Nettles, Mary Frances; Johnson, James T.

    2011-01-01

    Purpose/Objectives: The objectives of this study were to assess the extent to which school nutrition (SN) programs have implemented food safety programs based on Hazard Analysis and Critical Control Point (HACCP) principles, as well as factors, barriers, and practices related to implementation of these programs. Methods: An online survey was…

  12. Guess-Work and Reasonings on Centennial Evolution of Surface Air Temperature in Russia. Part III: Where is the Joint Between Norms and Hazards from a Bifurcation Analysis Viewpoint?

    NASA Astrophysics Data System (ADS)

    Kolokolov, Yury; Monovskaya, Anna

    2016-06-01

    The paper continues the application of bifurcation analysis to research on local climate dynamics, based on processing historically observed data on the daily average land surface air temperature. Since the analyzed data come from instrumental measurements, this is an experimental bifurcation analysis. In particular, we focus on the question: where is the joint between the normal dynamics of local climate systems (norms) and situations with the potential to create damage (hazards)? We illustrate that the criteria for hazards (violent and unfavorable weather factors) appear to derive mainly from empirical human judgment, not from natural qualitative changes of climate dynamics. To build the bifurcation diagrams, we rely on an unconventional conceptual model (the HDS-model), which originates from the hysteresis regulator with double synchronization. The HDS-model is characterized by a variable structure with competition between amplitude quantization and time quantization. Intermittency between three periodic processes is then considered the typical behavior of local climate systems, instead of chaos or quasi-periodicity, to account for the variety of local climate dynamics. Using the known regularities of HDS-model dynamics, we seek a way to decompose local behaviors into homogeneous units within time sections of homogeneous dynamics. Here we present the first results of such a decomposition, in which quasi-homogeneous sections (QHS) are determined on the basis of modified bifurcation diagrams and the units are reconstructed within limits connected with the problem of shape defects. The proposed analysis of local climate dynamics (QHS-analysis) shows how comparatively modest temperature differences between the units on an annual scale can expand, step by step, into large temperature differences in daily variability on a centennial scale. The norms and the hazards then correspond to fundamentally different viewpoints, in which time sections of months, and especially seasons, distort the causal effects of natural dynamical processes. The specific circumstances under which qualitative changes of local climate dynamics occur are summarized by the notion of a likely periodicity. This, in particular, explains why 30-year averaging remains the most common rule so far, while decadal averaging is beginning to replace it. We believe that the QHS-analysis can be considered the joint between the norms and the hazards from a bifurcation analysis viewpoint, where the causal effects of local climate dynamics are projected onto the customary timescale only at the last step. These results could be of interest for fields connected with climate change and risk assessment.

  13. Multi -risk assessment at a national level in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Amiranashvili, Avtandil; Tsereteli, Emili; Elizbarashvili, Elizbar; Saluqvadze, Manana; Dolodze, Jemal

    2013-04-01

    The work presented here was initiated by the national GNSF project "Reducing natural disasters multiple risk: a positive factor for Georgia development" and two international projects: NATO SFP 983038 "Seismic hazard and risk assessment for Southern Caucasus-Eastern Turkey energy corridors" and EMME "Earthquake Model for Middle East Region". A methodology for estimating "general" vulnerability, hazards, and multiple risk from natural hazards (namely earthquakes, landslides, snow avalanches, flash floods, mudflows, drought, hurricanes, frost, and hail) was developed for Georgia. Detailed electronic databases of natural disasters were created; these databases contain the parameters of the hazardous phenomena that caused natural disasters. The magnitude and intensity scales of the mentioned disasters are reviewed, and new magnitude and intensity scales are suggested for disasters for which the corresponding formalization has not yet been performed. The associated economic losses were evaluated and presented in monetary terms for these hazards. Based on the hazard inventory, an approach was developed that allows the calculation of an overall vulnerability value for each individual hazard type, using Gross Domestic Product per unit area (applied to population) as the indicator for the exposed elements at risk. The correlation between estimated economic losses, physical exposure, and magnitude for each of the six types of hazards was investigated in detail using multiple linear regression analysis. Economic losses for all past events and historical vulnerability were estimated. Finally, the spatial distribution of general vulnerability was assessed, the expected maximum economic loss was calculated, and a multi-risk map was produced.

  14. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later, on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion that led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  15. Household perceptions of coastal hazards and climate change in the Central Philippines.

    PubMed

    Combest-Friedman, Chelsea; Christie, Patrick; Miles, Edward

    2012-12-15

    As a tropical archipelagic nation, the Philippines is particularly susceptible to coastal hazards, which are likely to be exacerbated by climate change. To improve coastal hazard management and adaptation planning, it is imperative that climate information be provided at relevant scales and that decision-makers understand the causes and nature of risk in their constituencies. Focusing on a municipality in the Central Philippines, this study examines local meteorological information and explores household perceptions of climate change and coastal hazard risk. First, meteorological data and local perceptions of changing climate conditions are assessed. Perceived changes in climate include an increase in rainfall and rainfall variability, an increase in intensity and frequency of storm events and sea level rise. Second, factors affecting climate change perceptions and perceived risk from coastal hazards are determined through statistical analysis. Factors tested include social status, economic standing, resource dependency and spatial location. Results indicate that perceived risk to coastal hazards is most affected by households' spatial location and resource dependency, rather than socio-economic conditions. However, important differences exist based on the type of hazard and nature of risk being measured. Resource dependency variables are more significant in determining perceived risk from coastal erosion and sea level rise than flood events. Spatial location is most significant in determining households' perceived risk to their household assets, but not perceived risk to their livelihood. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Geological risk assessment for the rapid development area of the Erhai Basin

    NASA Astrophysics Data System (ADS)

    Yang, Liu; Wang, Zhanqi; Jin, Gui; Chen, Dongdong; Wang, Zhan

    Development of low-slope hilly land can provide new land space in a watershed, and it is particularly important for coordinating the sharply increasing conflicts between mountainous and urban land use in a city. However, development of low-slope hilly land can easily induce potential geologic hazards such as landslides and landslips, which may lead to further environmental losses in the watershed. Hence, it is necessary to study the potential risks of geo-hazards in low-slope hilly land development in urban areas. Based on GIS spatial analysis techniques, we selected as a study area Dali City in the Erhai Basin, located in the watershed belt of the Jinsha, Lancang, and Red Rivers in Yunnan Province, China. By studying the relevant key indexes and parameters for monitoring potential risks of geo-hazards, we establish a composite index model for zoning areas with potential geo-hazard risk in the development of low-slope hilly land in the study area. Our findings indicate that the potential risk of geo-hazards in eastern Dali City is relatively low, while that on hills with gentle slopes in the western area is relatively high. The maps generated by this zoning method show the spatial distribution of potential geo-hazard risks on low-slope hilly land, providing important guidance for guarding against natural geo-hazards and potential environmental losses in the watershed.
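    The composite index model described here amounts to a weighted overlay of normalized factor grids. A minimal sketch; the layer names and weights are hypothetical, not the study's calibration:

    ```python
    import numpy as np

    def composite_risk(layers, weights):
        """Weighted overlay of hazard-factor grids (slope, lithology,
        rainfall, ...), each pre-scaled to [0, 1]; weights sum to 1.
        Higher output values indicate higher potential geo-hazard risk."""
        stack = np.stack(layers)
        w = np.asarray(weights)[:, None, None]
        return (w * stack).sum(axis=0)
    ```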

  17. Geotechnical applications of LiDAR pertaining to geomechanical evaluation and hazard identification

    NASA Astrophysics Data System (ADS)

    Lato, Matthew J.

    Natural hazards related to ground movement that directly affect the safety of motorists and highway infrastructure include, but are not limited to, rockfalls, rockslides, debris flows, and landslides. This thesis specifically deals with the evaluation of rockfall hazards through the evaluation of LiDAR data. Light Detection And Ranging (LiDAR) is an imaging technology that can be used to delineate and evaluate geomechanically-controlled hazards. LiDAR has been adopted to conduct hazard evaluations pertaining to rockfall, rock avalanches, debris flows, and landslides. Characteristics of LiDAR surveying, such as rapid data acquisition rates, mobile data collection, and high data densities, pose problems to traditional CAD- or GIS-based mapping methods. New analysis methods, including tools specifically oriented to geomechanical analyses, are needed. The research completed in this thesis supports the development of new methods, including improved survey techniques, innovative software workflows, and processing algorithms to aid in the detection and evaluation of geomechanically controlled rockfall hazards. The scientific research conducted between 2006 and 2010, as presented in this thesis, is divided into five chapters, each of which has been published by or is under review by an international journal. The five research foci are: (i) geomechanical feature extraction and analysis using LiDAR data in active mining environments; (ii) engineered monitoring of rockfall hazards along transportation corridors using mobile terrestrial LiDAR; (iii) optimization of LiDAR scanning and processing for automated structural evaluation of discontinuities in rockmasses; (iv) location orientation bias when using static LiDAR data for geomechanical analysis; and (v) evaluating roadside rockmasses for rockfall hazards from LiDAR data: optimizing data collection and processing protocols. The research conducted pertaining to this thesis has direct and significant implications for numerous engineering projects that are affected by geomechanical stability issues. The ability to efficiently and accurately map discontinuities, detect changes, and standardize roadside geomechanical stability analyses from remote locations will fundamentally change the state of practice of geotechnical investigation workflows and repeatable monitoring. This, in turn, will lead to earlier detection and definition of potential zones of instability, will allow for progressive monitoring and risk analysis, and will indicate the need for pro-active slope improvement and stabilization.

  18. Predictive Analytics for Safer Food Supply

    USDA-ARS?s Scientific Manuscript database

    Science based risk analysis improves the USDA Food Safety Inspection Service’s ability to combat threats to public health from food-borne illness by allowing the Agency to focus resources on hazards that pose the greatest risk. Innovative algorithms enable detection and containment of threat by an...

  19. 40 CFR 63.11502 - What definitions apply to this subpart?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...

  20. 40 CFR 63.11502 - What definitions apply to this subpart?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...

  1. Introduction to HACCP.

    USDA-ARS?s Scientific Manuscript database

    The Hazard Analysis and Critical Control Point (HACCP) food safety inspection program is utilized by both USDA Food Safety Inspection Service (FSIS) and FDA for many of the products they regulate. This science-based program was implemented by the USDA FSIS to enhance the food safety of meat and pou...

  2. PINS Spectrum Identification Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.J. Caffrey

    2012-03-01

    The Portable Isotopic Neutron Spectroscopy—PINS, for short—system identifies the chemicals inside munitions and containers without opening them, a decided safety advantage if the fill chemical is a hazardous substance like a chemical warfare agent or an explosive. The PINS Spectrum Identification Guide is intended as a reference for technical professionals responsible for the interpretation of PINS gamma-ray spectra. The guide is divided into two parts. The three chapters that constitute Part I cover the science and technology of PINS. Neutron activation analysis is the focus of Chapter 1. Chapter 2 explores PINS hardware, software, and related operational issues. Gamma-ray spectral analysis basics are introduced in Chapter 3. The six chapters of Part II cover the identification of PINS spectra in detail. Like the PINS decision tree logic, these chapters are organized by chemical element: phosphorus-based chemicals, chlorine-based chemicals, etc. These descriptions of hazardous, toxic, and/or explosive chemicals conclude with a chapter on the identification of the inert chemicals, e.g. sand, used to fill practice munitions.

  3. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, which is causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilímek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics of glacial lakes and their hazard potential. This phase of glacial lake hazard assessment aims to be geographically comprehensive in order to identify potentially dangerous lakes that may have previously been ignored. A second phase of analysis that includes site visits will be necessary for a thorough analysis at each lake to determine the potential hazard for downstream communities. The objective of the work presented here is to identify potentially dangerous lakes that warrant further study rather than provide a final hazard assessment for each lake of the glacial lake inventory in the Cordillera Blanca. References: Emmer, A. and Vilímek, V.: New method for assessing the potential hazardousness of glacial lakes in the Cordillera Blanca, Peru, Hydrol. Earth Syst. Sci. Discuss., 11, 2391-2439, 2014. UGRH - Unidad de Glaciologia y Recursos Hidricos. Inventario de Lagunas Glaciares del Peru. Ministerio de Agricultura y Riego, Autoridad Nacional del Agua, Direccion de Conservacion y Planeamiento de Recursos Hidricos, Huaraz, Peru, 2014. Wang, W., Yao, T., Gao, Y., Yang, X., and Kattel, D. B.: A first-order method to identify potentially dangerous glacial lakes in a region of the southeastern Tibetan Plateau, Mountain Res. Develop., 31, 122-130, 2011.
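    A first-pass regional screening of this kind reduces to scaling each lake's parameters to a common range and combining them with weights. The sketch below is only a schematic of that idea; the parameter set and weighting equation in the study differ:

    ```python
    # Placeholder indicators and weights, each indicator pre-scaled to [0, 1].
    WEIGHTS = {"lake_growth": 0.30, "glacier_steepness": 0.25,
               "moraine_dam": 0.25, "downstream_exposure": 0.20}

    def lake_hazard_score(params):
        """Weighted sum of scaled hazard indicators for one glacial lake."""
        return sum(WEIGHTS[k] * params[k] for k in WEIGHTS)

    print(lake_hazard_score({"lake_growth": 0.8, "glacier_steepness": 0.6,
                             "moraine_dam": 0.9, "downstream_exposure": 0.4}))
    ```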

  4. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of the 154 hazardous conditions could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Applying the TPRM methodology also identified process risks in the hazard analysis process itself that may lead to software safety risk.

  5. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls in the field.

  6. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to a lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytical Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose, six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index, and peak ground acceleration. Weight and rank values are determined during the AHP and assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived on a GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as a high hazard zone compared to the surrounding areas. The microzonation map produced in the current study is envisaged as a first-level assessment of site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in future land-use planning and environmental hazard management.
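    In AHP, the layer weights are conventionally taken as the principal eigenvector of a pairwise-comparison matrix, with a consistency check on the judgments. A minimal sketch with an invented 3x3 matrix; the study's actual six-layer judgments are not reproduced here:

    ```python
    import numpy as np

    # Invented pairwise-comparison matrix (Saaty scale): entry [i, j] is the
    # judged importance of layer i relative to layer j.
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                   # AHP layer weights
    ci = (vals.real[k] - len(M)) / (len(M) - 1)    # consistency index
    print("weights:", w, "consistency index:", ci)
    ```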

  7. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not include risk analyses of natural hazards - a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources such as personnel, vehicles and equipment in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was hard hit by the 2005 floods. Our approach is primarily based on a qualitative risk analysis combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation and technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map and a list of risks and, second, to check whether the local emergency forces can cope with the different risk scenarios using locally available resources; the emergency response plans identify possible resource deficiencies in personnel, vehicles and equipment. As qualitative methods and data are used, uncertainties emerged in defining safety targets, in constructing the different risk scenarios, in the inherent uncertainty in the probability of occurrence and intensity of natural hazards, and in the expected losses. Finally, we used available studies and expert interviews to develop objective rules for investment decisions by the fire departments and the Red Cross, providing an empirically sound basis for the efficient provision of intervention in the case of natural hazards for the municipality of Au. Again, these rules for objective provision were developed in close collaboration with the emergency services.

  8. Risk analysis of landslide disaster in Ponorogo, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Koesuma, S.; Saido, A. P.; Fukuda, Y.

    2016-11-01

    Ponorogo is a regency in the southwest of East Java Province, Indonesia, located in the subduction zone between the Eurasian and Australian tectonic plates. It has extensive mountainous areas that are prone to landslides. We collected landslide data for 305 villages in Ponorogo and converted them into a Hazard Index. We also calculated a Vulnerability Index, an Economic Loss Index, an Environmental Damage Index, and a Capacity Index. The risk analysis map is composed of three components: H (Hazard), V (Vulnerability, comprising the Economic Loss Index and Environmental Damage Index), and C (Capacity Index). The method is based on regulations number 02/2012 and number 03/2012 of the National Disaster Management Authority (BNPB). It uses three risk index classes: Low, Medium and High. Ponorogo city has a medium landslide risk index.
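    A common formulation in BNPB-style guidance combines the three components as R = H × V / C; treating that as an assumption here, the classification step can be sketched as:

    ```python
    def risk_index(h, v, c):
        """Composite risk from hazard h, vulnerability v and capacity c,
        each scaled to (0, 1]. R = H * V / C is assumed, following common
        BNPB-style practice; the paper's exact operators may differ."""
        return h * v / c

    def risk_class(r, low=0.33, high=0.66):
        """Map a risk value to the three classes used in the paper.
        The class breaks are illustrative placeholders."""
        return "Low" if r < low else ("Medium" if r < high else "High")
    ```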

  9. Association between Recurrent Metastasis from Stage II and III Primary Colorectal Tumors and Moderate Microsatellite Instability

    PubMed Central

    Garcia, Melissa; Choi, Chan; Kim, Hyeong-Rok; Daoud, Yahya; Toiyama, Yuji; Takahashi, Masanobu; Goel, Ajay; Boland, C Richard; Koi, Minoru

    2012-01-01

    Colorectal cancer (CRC) cells frequently have low levels of microsatellite instability (MSI-L) and elevated microsatellite alterations at tetranucleotide repeats (EMAST), but little is known about the clinicopathological significance of these features. We observed that patients with stage II or III CRC with MSI-L and/or EMAST had shorter recurrence-free survival times than patients with high levels of MSI (MSI-H) (P=.0084) or with highly stable microsatellites (H-MSS) (P=.0415), based on Kaplan-Meier analysis. MSI-L and/or EMAST were independent predictors of recurrent distant metastasis from primary stage II or III colorectal tumors (Cox proportional hazards analysis: hazard ratio, 1.83; 95% confidence interval, 1.06-3.15; P=.0301). PMID:22465427

  10. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  11. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  12. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  13. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  14. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  15. Re-evaluation and updating of the seismic hazard of Lebanon

    NASA Astrophysics Data System (ADS)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and for the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT lies in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon, along with their newly established characteristics, were integrated into an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
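
    Under the time-independent Poisson occurrence model standard in probabilistic seismic hazard analysis, these two design levels correspond to fixed return periods via T = -t / ln(1 - P); a quick check (my arithmetic, not figures from the paper):

    ```python
    import math

    def return_period(p_exceed: float, t_years: float) -> float:
        """Return period implied by P(exceedance) = 1 - exp(-t/T)."""
        return -t_years / math.log(1.0 - p_exceed)

    print(return_period(0.10, 50))  # ~475 years  (UBC 1997 design level)
    print(return_period(0.02, 50))  # ~2475 years (IBC 2012 design level)
    ```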

  16. Potential of Organic Matrix Composites for Liquid Oxygen Tank

    NASA Technical Reports Server (NTRS)

    Davis, Samuel E.; Herald, Stephen D.; Stolzfus, Joel M.; Engel, Carl D.; Bohlen, James W.; Palm, Tod; Robinson, Michael J.

    2005-01-01

    Composite materials are being considered for cryogenic propellant tankage in access-to-space vehicles because of their potentially lower structural weights. A major hurdle for composites is an inherent concern about the safety of using flammable structural materials in contact with liquid and gaseous oxygen. A hazards analysis approach frames a series of specific concerns that must be resolved with test data. Under the 2nd Generation Reusable Launch Vehicle contracts, testing was begun for a variety of organic matrix composite materials, both to aid in the selection of materials and to provide the test data needed to support hazards analyses. The work has continued at NASA MSFC and NASA WSTF to provide information on the potential for using composite materials in oxygen systems. Appropriate methods for oxygen compatibility testing of structural materials, and data for a range of composite materials from impact, friction, flammability and electrostatic discharge testing, are presented. Remaining concerns, conclusions about composite tank structures, and recommendations for additional testing are discussed. Requirements for system-specific hazards analysis are identified.

  17. Aircraft Loss of Control: Problem Analysis for the Development and Validation of Technology Solutions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Newman, Richard L.; Crider, Dennis A.; Klyde, David H.; Foster, John V.; Groff, Loren

    2016-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes. LOC can result from a wide spectrum of precursors (or hazards), often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and the validation process must provide a means of assessing system effectiveness and coverage of these hazards. This paper provides a detailed description of a methodology for analyzing LOC as a dynamics and control problem for the purpose of developing effective technology solutions. The paper includes a definition of LOC based on several recent publications, a detailed description of a refined LOC accident analysis process that is illustrated via selected example cases, and a description of planned follow-on activities for identifying future potential LOC risks and the development of LOC test scenarios. Some preliminary considerations for LOC of Unmanned Aircraft Systems (UAS) and for their safe integration into the National Airspace System (NAS) are also discussed.

  18. “Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data

    PubMed Central

    Zhang, Min; Davidian, Marie

    2008-01-01

    Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813
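
    To make the idea concrete, here is a toy sketch in which the log baseline hazard of a proportional hazards model is represented by a truncated polynomial series and fit by ordinary maximum likelihood on right-censored data. The polynomial basis, the simulated data and the restriction to right censoring are my simplifications; the paper's construction is more general.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)                            # one covariate
    t = rng.weibull(1.5, size=n) * np.exp(-0.5 * x)   # event times (toy PH data)
    c = rng.uniform(0.5, 3.0, size=n)                 # censoring times
    obs, event = np.minimum(t, c), (t <= c).astype(float)

    def neg_loglik(params, degree=2):
        beta, gamma = params[: degree + 1], params[degree + 1]
        grid = np.linspace(1e-6, obs.max(), 400)
        log_h0 = np.polyval(beta[::-1], grid)         # truncated series for log h0
        h0 = np.exp(np.clip(log_h0, -20, 20))
        # cumulative baseline hazard by trapezoidal integration
        H0 = np.concatenate([[0.0], np.cumsum(np.diff(grid) * 0.5 * (h0[1:] + h0[:-1]))])
        H0_i = np.interp(obs, grid, H0)
        log_h_i = np.polyval(beta[::-1], obs) + gamma * x
        risk = np.exp(np.clip(gamma * x, -20, 20))
        return -(np.sum(event * log_h_i) - np.sum(risk * H0_i))

    fit = minimize(neg_loglik, x0=np.zeros(4), method="Nelder-Mead",
                   options={"maxiter": 5000})
    print("series coefficients:", fit.x[:3], "covariate effect:", fit.x[3])
    ```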

  19. [Design of a HACCP Plan for the Gouda-type cheesemaking process in a milk processing plant].

    PubMed

    Dávila, Jacqueline; Reyes, Genara; Corzo, Otoniel

    2006-03-01

    Hazard Analysis and Critical Control Point (HACCP) is a preventive, systematic method used to identify, assess and control hazards related to raw materials, ingredients, processing, marketing and the intended consumer, in order to assure food safety. The aim of this study was to design a HACCP plan to be implemented in a Gouda-type cheese-making process in a dairy processing plant. The methodology was based on the application of the seven HACCP principles, plant records on compliance with the prerequisite programs (70-80%), the experience of the HACCP team, and the sequence of stages established by COVENIN standard 3802 for implementing the HACCP system. A HACCP plan was proposed comprising the scope, the selection of the HACCP team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis, and the control table of the plan with the critical control points (CCPs). The following CCPs were identified in the process: pasteurization, coagulation and ripening.
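
    CCPs such as these are conventionally identified with the four-question Codex decision tree. A minimal sketch of that logic follows; the example answers are illustrative, not taken from the paper.

    ```python
    # Codex-style CCP decision tree, encoded as four yes/no questions.

    def is_ccp(control_measures_exist: bool, step_eliminates_hazard: bool,
               contamination_possible: bool, later_step_controls: bool) -> bool:
        if not control_measures_exist:
            return False                 # modify the step, then re-evaluate
        if step_eliminates_hazard:
            return True                  # step is designed to control the hazard
        if not contamination_possible:
            return False
        return not later_step_controls   # CCP only if nothing downstream controls it

    # Pasteurization is designed to eliminate vegetative pathogens -> CCP
    print(is_ccp(True, True, True, False))  # True
    ```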

  20. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1Hz deterministic simulation results with 10Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new PSHA hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites including simulations that use the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid-computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA maps that contained more than 1.6 million hazard curves.
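
    At its core, each such hazard curve combines rupture probabilities from an earthquake rupture forecast with per-rupture ground-motion exceedance probabilities (the quantity CyberShake derives from full 3D waveforms). A toy sketch of that aggregation; all numbers are invented, not UCERF2.0 data.

    ```python
    import numpy as np
    from scipy.stats import norm

    im_levels = np.linspace(0.05, 1.0, 20)   # spectral acceleration levels (g)

    ruptures = [        # (annual occurrence probability, median SA (g), sigma_ln)
        (0.01,   0.15, 0.6),
        (0.002,  0.40, 0.6),
        (0.0005, 0.80, 0.7),
    ]

    def hazard_curve(levels):
        """Annual probability of exceeding each IM level at the site."""
        p_no_exceed = np.ones_like(levels)
        for p_rup, med, sig in ruptures:
            p_gm = norm.sf(np.log(levels / med) / sig)  # P(SA > level | rupture)
            p_no_exceed *= 1.0 - p_rup * p_gm           # independent sources
        return 1.0 - p_no_exceed

    for im, p in zip(im_levels, hazard_curve(im_levels)):
        print(f"Sa {im:.2f} g : {p:.2e} / yr")
    ```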

  1. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. On the other hand, this application also demonstrates the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis enables the estimation of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services according to users' workflows and to retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services and user communities would be implemented based on typical use cases. In the future, extension of the earthquake wave propagation work to tsunami mitigation would be feasible once the user community support is in place.

  2. Manned space flight nuclear system safety. Volume 3: Reactor system preliminary nuclear safety analysis. Part 3: Nuclear Safety Analysis Document (NSAD)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Nuclear safety analysis as applied to a space base mission is presented. The nuclear safety analysis document summarizes the mission and the credible accidents/events which may lead to nuclear hazards to the general public. The radiological effects and associated consequences of the hazards are discussed in detail. The probability of occurrence is combined with the potential number of individuals exposed at or above guideline values to provide a measure of accident and total mission risk. The overall mission risk has been determined to be low, with potential exposure at or above 25 rem limited to fewer than 4 individuals per 1000 missions performed. No radiological risk to the general public occurs during the prelaunch phase at KSC. The most significant risks arise from prolonged exposure to reactor debris following land impact, generally associated with the disposal phase of the mission, where fission product inventories can be high.

  3. Survival analysis using inverse probability of treatment weighted methods based on the generalized propensity score.

    PubMed

    Sugihara, Masahiro

    2010-01-01

    In survival analysis, treatment effects are commonly evaluated based on survival curves and hazard ratios as causal treatment effects. In observational studies, these estimates may be biased due to confounding factors. The inverse probability of treatment weighted (IPTW) method based on the propensity score is one of the approaches utilized to adjust for confounding factors between binary treatment groups. As a generalization of this methodology, we developed an exact formula for an IPTW log-rank test based on the generalized propensity score for survival data. This makes it possible to compare the group differences of IPTW Kaplan-Meier estimators of survival curves using an IPTW log-rank test for multi-valued treatments. As causal treatment effects, the hazard ratio can be estimated using the IPTW approach. If the treatments correspond to ordered levels of a treatment, the proposed method can be easily extended to the analysis of treatment effect patterns with contrast statistics. In this paper, the proposed method is illustrated with data from the Kyushu Lipid Intervention Study (KLIS), which investigated the primary preventive effects of pravastatin on coronary heart disease (CHD). The results of the proposed method suggested that pravastatin treatment reduces the risk of CHD and that compliance to pravastatin treatment is important for the prevention of CHD. (c) 2009 John Wiley & Sons, Ltd.
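
    A sketch of the mechanics for a multi-valued treatment: estimate the generalized propensity score with multinomial logistic regression, weight each subject by the inverse probability of the treatment actually received, and compare weighted Kaplan-Meier curves. The simulated data and the scikit-learn/lifelines usage are illustrative, not the author's code.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(1)
    n = 600
    X = rng.normal(size=(n, 2))                        # confounders
    treat = rng.integers(0, 3, size=n)                 # three treatment levels
    time = rng.exponential(scale=1 + 0.3 * treat)      # toy survival times
    event = rng.uniform(size=n) < 0.8                  # ~20% censoring

    gps = LogisticRegression(max_iter=1000).fit(X, treat)
    p_own = gps.predict_proba(X)[np.arange(n), treat]  # P(own treatment | X)
    w = 1.0 / p_own                                    # IPTW weights

    km = KaplanMeierFitter()
    for level in range(3):
        m = treat == level
        km.fit(time[m], event[m], weights=w[m], label=f"treatment {level}")
        print(level, km.median_survival_time_)
    ```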

  4. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  5. Application of Bioassays in Toxicological Hazard, Risk and Impact Assessment of Dredged Sediments

    EPA Science Inventory

    Given the potential environmental consequences of dumped dredged harbour sediments it is vital to establish the potential risks from exposure before disposal at sea. Currently, European legislation for disposal of contaminated sediments at sea is based on chemical analysis of a l...

  6. 14 CFR 35.38 - Lightning strike.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS: PROPELLERS Tests and Inspections § 35.38 Lightning strike. The applicant must demonstrate, by tests, analysis based on tests, or experience on similar designs, that the propeller can withstand a lightning strike without causing a major or hazardous propeller effect. The limit to which the propeller has...

  7. 14 CFR 35.38 - Lightning strike.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS: PROPELLERS Tests and Inspections § 35.38 Lightning strike. The applicant must demonstrate, by tests, analysis based on tests, or experience on similar designs, that the propeller can withstand a lightning strike without causing a major or hazardous propeller effect. The limit to which the propeller has...

  8. 14 CFR 35.38 - Lightning strike.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... STANDARDS: PROPELLERS Tests and Inspections § 35.38 Lightning strike. The applicant must demonstrate, by tests, analysis based on tests, or experience on similar designs, that the propeller can withstand a lightning strike without causing a major or hazardous propeller effect. The limit to which the propeller has...

  9. 14 CFR 35.38 - Lightning strike.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS: PROPELLERS Tests and Inspections § 35.38 Lightning strike. The applicant must demonstrate, by tests, analysis based on tests, or experience on similar designs, that the propeller can withstand a lightning strike without causing a major or hazardous propeller effect. The limit to which the propeller has...

  10. 14 CFR 35.38 - Lightning strike.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... STANDARDS: PROPELLERS Tests and Inspections § 35.38 Lightning strike. The applicant must demonstrate, by tests, analysis based on tests, or experience on similar designs, that the propeller can withstand a lightning strike without causing a major or hazardous propeller effect. The limit to which the propeller has...

  11. The principles of HACCP.

    USDA-ARS's Scientific Manuscript database

    The Hazard Analysis and Critical Control Point (HACCP) food safety inspection program is utilized by both USDA Food Safety Inspection Service (FSIS) and FDA for many of the products they regulate. This science-based program was implemented by the USDA FSIS to enhance the food safety of meat and pou...

  12. Construction safety monitoring based on the project's characteristic with fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Winanda, Lila Ayu Ratna; Adi, Trijoko Wahyu; Anwar, Nadjadji; Wahyuni, Febriana Santi

    2017-11-01

    Construction has the highest number of worker accidents of any industry, and falls are the main cause of fatal and serious injuries on high-rise projects. Generally, construction worker accidents are caused by unsafe acts and unsafe conditions, which can occur separately or together, so a safety monitoring system based on these influencing factors is needed to achieve zero accidents in the construction industry. The dynamic character of construction makes workers highly mobile while performing their tasks, so a continuous monitoring system is required to detect unsafe conditions and to protect workers from potential hazards. Given the unique nature of each project, a fuzzy logic approach is an appropriate method for on-site worker safety monitoring. In this study, the discussion focuses on the characteristics of construction projects in analyzing the potential hazards and the protection planning to be used in accident prevention. The data were collected from a literature review, expert opinion and safety and health institutions, and were used for hazard identification. An application model was then created using Delphi programming. The fuzzy process is divided into fuzzification, inference and defuzzification, following the data collection. The input and final output data were then given back to the experts for assessment as a validation of the application model. The results showed that the potential hazards of construction worker accidents can be analysed based on the characteristics of the project and the on-site protection system, and that a fuzzy logic approach can be used for construction worker accident analysis. Based on the case study and the experts' feedback assessment, the application model can be used as one of the safety monitoring tools.
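
    The fuzzification-inference-defuzzification pipeline can be sketched compactly. Below is a Mamdani-style toy in Python; the input variables, membership functions and three rules are my assumptions, not the paper's rule base.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    u = np.linspace(0, 10, 201)                 # output universe: hazard level
    low, med, high = tri(u, -5, 0, 5), tri(u, 0, 5, 10), tri(u, 5, 10, 15)

    def hazard_level(work_height_m, exposure_hours):
        # Fuzzification of the crisp inputs (simple saturating ramps)
        height_high = float(np.clip((work_height_m - 5) / 15, 0, 1))
        exposure_long = float(np.clip((exposure_hours - 2) / 6, 0, 1))

        # Inference: min for AND, clip each consequent, max-aggregate
        r1 = np.minimum(min(height_high, exposure_long), high)     # both -> high
        r2 = np.minimum(max(height_high, exposure_long), med)      # either -> medium
        r3 = np.minimum(1 - max(height_high, exposure_long), low)  # neither -> low
        agg = np.maximum.reduce([r1, r2, r3])

        # Centroid defuzzification
        return float((u * agg).sum() / agg.sum())

    print(hazard_level(work_height_m=15, exposure_hours=6))  # upper-middle of 0-10
    ```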

  13. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes highlighted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers in Eastern Canada, such as Old Quebec City, are built of stone masonry and represent architectural and cultural heritage of immeasurable value. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km, with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
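
    The damage-state fragility functions central to this kind of vulnerability modelling are conventionally lognormal in the intensity measure, P(DS >= ds | IM) = Phi(ln(IM/theta)/beta). A short sketch follows; the medians and dispersions are illustrative, not the study's fitted values.

    ```python
    import numpy as np
    from scipy.stats import norm

    DAMAGE_STATES = {      # damage state: (median Sa theta in g, dispersion beta)
        "slight":    (0.15, 0.50),
        "moderate":  (0.30, 0.55),
        "extensive": (0.55, 0.60),
        "complete":  (0.90, 0.60),
    }

    def fragility(im, theta, beta):
        """P(reaching or exceeding a damage state) at intensity im (g)."""
        return norm.cdf(np.log(im / theta) / beta)

    sa = 0.4  # scenario spectral acceleration at the site, in g
    for ds, (theta, beta) in DAMAGE_STATES.items():
        print(f"P(>= {ds}) at Sa={sa} g: {fragility(sa, theta, beta):.2f}")
    ```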

  14. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  15. Hazard-Ranking of Agricultural Pesticides for Chronic Health Effects in Yuma County, Arizona

    PubMed Central

    Sugeng, Anastasia J.; Beamer, Paloma I.; Lutz, Eric A.; Rosales, Cecilia B.

    2013-01-01

    With thousands of pesticides registered by the United States Environmental Protection Agency, it is not feasible to sample for all pesticides applied in agricultural communities. Hazard-ranking pesticides based on use, toxicity, and exposure potential can help prioritize community-specific pesticide hazards. This study applied hazard-ranking schemes for cancer, endocrine disruption, and reproductive/developmental toxicity in Yuma County, Arizona. An existing cancer hazard-ranking scheme was modified, and novel schemes for endocrine disruption and reproductive/developmental toxicity were developed to rank pesticide hazards. The hazard-ranking schemes accounted for pesticide use, toxicity, and exposure potential based on the chemical properties of each pesticide. Pesticides were ranked as hazards with respect to each health effect, as well as overall chronic health effects. The highest hazard-ranked pesticides for overall chronic health effects were maneb, metam sodium, trifluralin, pronamide, and bifenthrin. The relative pesticide rankings were unique for each health effect. The highest hazard-ranked pesticides differed from those most heavily applied, as well as from those previously detected in Yuma homes over a decade ago. The most hazardous pesticides for cancer in Yuma County, Arizona also differed from those in a previous hazard-ranking applied in California. Hazard-ranking schemes that take into account pesticide use, toxicity, and exposure potential can help prioritize pesticides of greatest health risk in agricultural communities. This study is the first to provide pesticide hazard-rankings for endocrine disruption and reproductive/developmental toxicity based on use, toxicity, and exposure potential. These hazard-ranking schemes can be applied to other agricultural communities to prioritize community-specific pesticide hazards and target reductions in health risk. PMID:23783270
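
    A use/toxicity/exposure ranking of this kind reduces to a composite score per pesticide. The sketch below shows one plausible form; the weighting scheme and all numbers are invented for illustration and are not the study's scheme.

    ```python
    import pandas as pd

    pesticides = pd.DataFrame({
        "pesticide": ["maneb", "metam sodium", "trifluralin", "bifenthrin"],
        "use_kg":    [12000, 30000, 8000, 1500],   # reported annual use (invented)
        "toxicity":  [3, 2, 3, 2],                 # ordinal toxicity weight
        "exposure":  [2, 3, 2, 3],                 # exposure potential weight
    })

    # Composite hazard score: dampen use with a square root so a heavily
    # applied, low-toxicity pesticide does not dominate automatically.
    pesticides["score"] = (pesticides["use_kg"] ** 0.5
                           * pesticides["toxicity"] * pesticides["exposure"])
    print(pesticides.sort_values("score", ascending=False))
    ```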

  16. Using Remotely Sensed Information for Near Real-Time Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Peters-Lidard, Christa

    2013-01-01

    The increasing availability of remotely sensed precipitation and surface products provides a unique opportunity to explore how landslide susceptibility and hazard assessment may be approached at larger spatial scales with higher resolution remote sensing products. A prototype global landslide hazard assessment framework has been developed to evaluate how landslide susceptibility and satellite-derived precipitation estimates can be used to identify potential landslide conditions in near real time. Preliminary analysis of this algorithm suggests that forecasting errors are geographically variable due to the resolution and accuracy of the current susceptibility map and the application of satellite-based rainfall estimates. This research is currently working to improve the algorithm by considering higher spatial and temporal resolution landslide susceptibility information and by testing different rainfall triggering thresholds, antecedent rainfall scenarios, and various surface products at regional and global scales.
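
    The triggering logic in such a framework can be summarized as a static susceptibility class combined with a rainfall criterion from the satellite feed. A minimal sketch with an exponentially decaying antecedent rainfall index; the decay constant and alert thresholds are assumptions, not the algorithm's calibrated values.

    ```python
    import numpy as np

    def antecedent_rainfall_index(daily_rain_mm, decay=0.8):
        """Exponentially weighted sum of daily rainfall (most recent day last)."""
        weights = decay ** np.arange(len(daily_rain_mm))[::-1]
        return float(np.sum(weights * np.asarray(daily_rain_mm, float)))

    def landslide_nowcast(susceptibility, daily_rain_mm):
        ari = antecedent_rainfall_index(daily_rain_mm)
        threshold = {"low": 120.0, "moderate": 80.0, "high": 50.0}[susceptibility]
        return "alert" if ari > threshold else "no alert"

    rain_last_week = [0, 5, 12, 30, 25, 40, 22]       # mm/day from a satellite feed
    print(landslide_nowcast("high", rain_last_week))  # -> "alert"
    ```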

  17. FLAMMABLE GAS TECHNICAL BASIS DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KRIPPS, L.J.

    2005-02-18

    This document describes the qualitative evaluation of frequency and consequences for double shell tank (DST) and single shell tank (SST) representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant SSCs and/or TSRs were required to prevent or mitigate flammable gas accidents. Discussion of the resulting control decisions is included. This technical basis document was developed to support the Tank Farms Documented Safety Analysis (DSA) and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition, based on an evaluation of the event frequency and consequence.

  18. Radiological Hazard of Spallation Products in Accelerator-Driven System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, M.; Stankovskii, A.; Artisyuk, V.

    The central issue underlying this paper is elucidating the hazard of radioactive spallation products, which may be an important factor affecting the design options for accelerator-driven systems (ADSs). Hazard analysis based on the concept of Annual Limit on Intake identifies alpha-emitting isotopes of rare earths (REs) (dysprosium, gadolinium, and samarium) as the dominant contributors to the overall toxicity of traditional (W, Pb, Pb-Bi) targets. The matter is addressed from several points of view: code validation to simulate their yields, choice of material for the neutron-producing targets, and challenging the beam type. The paper quantitatively determines the domain in which the toxicity of REs exceeds that of the polonium activation products now broadly discussed in connection with lead-bismuth technology for the needs of ADSs.
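
    An ALI-based toxicity comparison boils down to expressing each nuclide's activity in multiples of its Annual Limit on Intake and ranking the results. A sketch; the inventory and ALI values are placeholders, not results from the paper.

    ```python
    # Radiotoxicity ranking: hazard = activity / ALI per nuclide (values invented).
    inventory_bq = {"Gd-148": 2.0e9, "Sm-146": 5.0e7, "Po-210": 8.0e10}
    ali_bq = {"Gd-148": 2.0e3, "Sm-146": 3.0e3, "Po-210": 2.0e4}

    hazard = {n: inventory_bq[n] / ali_bq[n] for n in inventory_bq}
    for nuclide, h in sorted(hazard.items(), key=lambda kv: -kv[1]):
        print(f"{nuclide}: {h:.2e} ALI-equivalents")
    ```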

  19. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    EPA Pesticide Factsheets

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  20. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
