77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
NASA Astrophysics Data System (ADS)
Citraresmi, A. D. P.; Wahyuni, E. E.
2018-03-01
The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition; in addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making dried anchovy from receipt of raw materials to packaging of the final product. Data were analyzed by descriptive analysis. HACCP implementation at PT. KML, Lobuk unit, Sumenep was carried out by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial steps and 7 principles of HACCP. The results showed that CCPs were found in the boiling process, with the significant hazard of Listeria monocytogenes bacteria, and in the final sorting process, with the significant hazard of foreign material contamination of the product. The actions taken were controlling the boiling temperature at 100-105°C for 3-5 minutes and training the sorting process employees.
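The boiling-step control described above amounts to checking each monitoring record against the critical limits. A minimal sketch of that check (the function name and record structure are illustrative, not from the study):

```python
# Hedged sketch of CCP monitoring for the boiling step: critical limits
# of 100-105 degC for 3-5 minutes, as reported in the abstract.
def ccp_boiling_ok(temp_c, minutes):
    """Return True when a boiling-step reading is within critical limits."""
    return 100.0 <= temp_c <= 105.0 and 3.0 <= minutes <= 5.0

# Readings outside the limits would trigger corrective action.
readings = [(102.5, 4.0), (98.0, 4.0), (103.0, 6.0)]
deviations = [r for r in readings if not ccp_boiling_ok(*r)]
```

In a real HACCP plan each deviation would also be logged and tied to a documented corrective action.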
Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...
2017-08-23
A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵, and 1×10⁻⁵.
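The target performance goals quoted above are annual frequencies against which an estimated facility risk is compared. A minimal sketch of that comparison, convolving a hazard curve with a fragility curve; the hazard curve, fragility parameters, and every number below are purely illustrative, not INL's actual data:

```python
from math import erf, log, sqrt

def annual_failure_frequency(hazard, fragility, a_grid):
    """Convolve a hazard curve lambda(a) (annual exceedance frequency of
    ground-motion level a) with a fragility P(fail | a) to estimate the
    mean annual frequency of unacceptable performance (trapezoidal rule)."""
    freq = 0.0
    for a0, a1 in zip(a_grid[:-1], a_grid[1:]):
        dlam = hazard(a0) - hazard(a1)  # annual frequency falling in this bin
        freq += 0.5 * (fragility(a0) + fragility(a1)) * dlam
    return freq

# Toy hazard curve and lognormal fragility (median 0.8 g, beta 0.4).
hazard = lambda a: 1e-3 * a ** -2
fragility = lambda a: 0.5 * (1.0 + erf(log(a / 0.8) / (0.4 * sqrt(2.0))))
a_grid = [0.1 * 1.1 ** i for i in range(60)]
mafe = annual_failure_frequency(hazard, fragility, a_grid)
# mafe would then be compared against a goal such as 1e-4, 4e-5, or 1e-5.
```

This kind of risk integral is standard in performance-based seismic assessment; the SHPRM's actual criteria involve more than this single comparison.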
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports.
Popova, A Yu; Trukhina, G M; Mikailova, O M
The article considers the quality control and safety system implemented at one of the largest flight-catering plants producing food for airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Point (HACCP) principles and on the hygienic and anti-epidemic measures developed. Hazard factors were identified at each stage of the technical process. Monitoring data for 6 critical control points over a five-year period are analyzed. The quality control and safety system reduced the risk of food contamination during acceptance, preparation, and supply of in-flight meals, and the efficiency of the implemented system was demonstrated. Further ways to harmonize and implement HACCP principles at the plant are identified.
Cantley, Linda F; Taiwo, Oyebode A; Galusha, Deron; Barbour, Russell; Slade, Martin D; Tessier-Sherman, Baylah; Cullen, Mark R
2014-01-01
This study aimed to examine the effect of an ergonomic hazard control (HC) initiative, undertaken as part of a company ergonomics standard, on worker injury risk. Using the company's ergonomic hazards database to identify jobs with and without ergonomic HC implementation and linking to individual job and injury histories, injury risk among person-jobs with HC implementation (the HC group) was compared to those without HC (NoHC group) using random coefficient models. Further analysis of the HC group was conducted to determine the effect of additional ergonomic hazards controlled on injury risk. Among 123 jobs at 17 plant locations, 347 ergonomic hazards were quantitatively identified during the study period. HC were implemented for 204 quantified ergonomic hazards in 84 jobs, impacting 10 385 persons (12 967 person-jobs). No HC were implemented for quantified ergonomic hazards in the remaining 39 jobs affecting 4155 persons (5046 person-jobs). Adjusting for age, sex, plant origin, and year to control for any temporal trend in injury risk, the relative risk (RR) for musculoskeletal disorder (MSD) was 0.85 and the RR for any injury or MSD was 0.92 in the HC compared to NoHC group. Among the HC group, each ergonomic hazard controlled was associated with risk reduction for MSD and acute injury outcomes (RR 0.93). Systematic ergonomic HC through participatory ergonomics, as part of a mandatory company ergonomics standard, is associated with MSD and injury risk reduction among workers in jobs with HC implemented.
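The relative risks above compare injury rates between the HC and NoHC groups. A crude sketch of the calculation against the reported person-job denominators; the injury counts here are hypothetical, and the study itself used random coefficient models adjusting for age, sex, plant, and year:

```python
# Crude relative risk: (rate in exposed group) / (rate in comparison group).
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical MSD counts over the reported 12 967 HC and 5046 NoHC person-jobs.
rr = relative_risk(425, 12967, 194, 5046)
```

An RR below 1.0, as found in the study, indicates lower injury risk in the group with hazard controls implemented.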
Hazard detection and avoidance sensor for NASA's planetary landers
NASA Technical Reports Server (NTRS)
Lau, Brian; Chao, Tien-Hsin
1992-01-01
An optical terrain-analysis-based sensor system specifically designed for landing hazard detection, as required for NASA's autonomous planetary landers, is introduced. This optical hazard detection and avoidance (HDA) sensor utilizes an optoelectronic wedge-ring detector (WRD) filter for Fourier-transformed feature extraction and an electronic neural network processor for pattern classification. A fully implemented optical HDA sensor would assure safe landing of the planetary landers. Computer simulation results of a successful feasibility study are reported. Future research toward hardware system implementation is also outlined.
NASA Astrophysics Data System (ADS)
Widodo, L.; Adianto; Sartika, D. I.
2017-12-01
PT. XYZ is a large automotive manufacturing company that manufactures, assembles, and exports cars; its other products include spare parts, jigs, and dies. PT. XYZ has long implemented an Occupational Safety and Health Management System (OSHMS) to reduce the potential hazards that cause work accidents. This does not mean, however, that the implemented OSHMS needs no upgrading and improvement, because the potential hazard posed by the work remains quite high. This research was conducted in the Sunter 2 Plant, where production activities carry a high level of potential hazard. Based on Hazard Identification, Risk Assessment, and Risk Control (HIRARC), 10 potential hazards were found in the Stamping Production Plant: 4 very high risk (E), 5 high risk (H), and 1 moderate risk (M). In the Casting Production Plant, 22 potential hazards were found: 7 very high risk (E), 12 high risk (H), and 3 moderate risk (M). Based on the results of Fault Tree Analysis (FTA), the high risk (H) and very high risk (E) potential hazards are the main priority. The proposed improvements are to create visual displays on the importance of always using the correct Personal Protective Equipment (PPE), establish good working procedures, conduct regular OSH training for workers, and continue to conduct safety campaigns.
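HIRARC assigns each identified hazard a risk band by combining likelihood and severity. An illustrative risk matrix on 1-5 scales; the band cut-offs below are assumptions for the sketch, not the company's actual matrix:

```python
# Illustrative HIRARC-style risk rating: likelihood x severity mapped to
# E (very high), H (high), M (moderate), L (low) bands.
def hirarc_rating(likelihood, severity):
    score = likelihood * severity
    if score >= 15:
        return "E"  # very high risk
    if score >= 8:
        return "H"  # high risk
    if score >= 4:
        return "M"  # moderate risk
    return "L"      # low risk
```

Hazards rated E or H would then be prioritized for controls, as the study did using FTA.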
Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen
2015-09-01
To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... analyses and the development of other elements of the standard; developing a written action plan for..., revalidating and retaining the process hazard analysis; developing and implementing written operating [[Page 66639]] ...
Fielding, L M; Ellis, L; Beveridge, C; Peters, A C
2005-04-01
To reduce foodborne illness, hazard- and risk-based quality management systems are essential. Small and medium sized companies (SMEs) tend to have a poor understanding of such systems and limited adoption of the Hazard Analysis Critical Control Point system (HACCP). The requirement for full HACCP implementation by 2006 will place an even greater burden on these businesses. The aim of this project was to assess the current levels of understanding of hazards and risks among SMEs in the manufacturing sector. A questionnaire survey of 850 SMEs, including microbusinesses, determined the industry sector and processes carried out, whether the company operated hazard-based quality management, and the knowledge of the technical manager regarding the associated hazards and risks. Follow-up visits to the manufacturing plants observed the processes and the operatives to determine their level of understanding. A benchmarking audit was carried out and each company was rated. The results show that the majority of respondents stated that they operated hazard-analysis-based quality management. The ability of the respondents to correctly define a hazard or risk, or to identify different types of hazard, was, however, poor. There was no correlation between business type and audit score. The microbusinesses did, however, perform significantly less well than the larger SMEs.
NASA Astrophysics Data System (ADS)
Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun
2009-10-01
Although the Three Gorges Dam across the Yangtze River in China harnesses a huge source of hydroelectric power and reduces loss of life and damage from floods, it also causes environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. To prevent and predict geo-hazards, the establishment of a geo-hazard prediction system is necessary. To implement the functions of regional and urban geo-hazard prediction, single geo-hazard prediction, landslide-surge prediction, and risk evaluation, the logical layers of the system consist of a data capture layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because multi-source spatial data are involved, the paper also addresses the transformation and fusion of multi-source data. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of the historical record of past landslides, terrain parameters, geology, rainfall, and anthropogenic activity. The spatial distribution characteristics of landslide hazard in the new town of Badong are discussed in detail; these results can be used for risk evaluation. The system can be implemented as an early-warning and emergency management tool by the relevant authorities of the Three Gorges Reservoir in the future.
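The information value method mentioned above scores each factor class (e.g. a slope or lithology class) by how much denser past landslides are within it than in the study area overall. A minimal sketch, with made-up pixel counts:

```python
from math import log

def information_value(landslide_px_in_class, px_in_class,
                      total_landslide_px, total_px):
    """Information value of one factor class: log of the ratio of the
    landslide density in the class to the density in the whole area."""
    return log((landslide_px_in_class / px_in_class)
               / (total_landslide_px / total_px))

# A positive value marks a class more susceptible than average.
iv = information_value(120, 4000, 500, 100000)
```

Summing the information values of the classes present at each map cell yields a relative susceptibility index over the study area.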
Arvanitoyannis, Ioannis S; Traikou, Athina
2005-01-01
The production of flour and semolina and their ensuing products, such as bread, cake, spaghetti, noodles, and corn flakes, is of major importance, because these products constitute some of the main ingredients of the human diet. The Hazard Analysis Critical Control Point (HACCP) system aims at ensuring the safety of these products. HACCP has been implemented within the frame of this study on various products of both Asian and European origin; the hazards, critical control limits (CCLs), observation practices, and corrective actions have been summarized in comprehensive tables. Furthermore, the various production steps, packaging included, were thoroughly analyzed, and reference was made to both the traditional and new methodologies in an attempt to pinpoint the occurring differences (advantages and disadvantages) per process.
Performing a preliminary hazard analysis applied to administration of injectable drugs to infants.
Hfaiedh, Nadia; Kabiche, Sofiane; Delescluse, Catherine; Balde, Issa-Bella; Merlin, Sophie; Carret, Sandra; de Pontual, Loïc; Fontan, Jean-Eudes; Schlatter, Joël
2017-08-01
Errors during the preparation and administration of intravenous drugs to infants and children in hospitals have been reported at rates of 13% to 84%. This study aimed to investigate potentially hazardous events that may lead to an accident during the preparation and administration of injectable drugs in a pediatric department, and to describe a risk-reduction plan. The preliminary hazard analysis (PHA) method was implemented by a multidisciplinary working group over a period of 5 months (April-August 2014) for infants aged 28 days to 2 years. The group identified the required hazard controls and follow-up actions to reduce the risk of error. The STATCART APR software was used to analyze the results. The analysis identified 34 hazardous situations, 17 of which were rated very critical, and derived 69 risk scenarios. After the follow-up actions, the proportion of scenarios with unacceptable risk declined from 17.4% to 0%, and of those acceptable under control from 46.4% to 43.5%. The PHA can be used as an aid in prioritizing corrective actions and implementing control measures to reduce risk, and it complements the a posteriori risk management already in place. © 2017 John Wiley & Sons, Ltd.
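A PHA typically ranks each scenario by a criticality index such as severity × likelihood and maps it to an acceptability class. The scales and thresholds below are assumptions for illustration, not those of the STATCART APR tool used in the study:

```python
# Illustrative PHA criticality scoring and acceptability bands.
def criticality(severity, likelihood):
    return severity * likelihood

def acceptability(crit):
    if crit >= 12:
        return "unacceptable"
    if crit >= 6:
        return "acceptable under control"
    return "acceptable"

# The reported 17.4% share is consistent with 12 of 69 scenarios
# being classed unacceptable before the follow-up actions.
unacceptable_share = 100 * 12 / 69
```

Tracking how scenarios migrate between bands before and after corrective actions gives exactly the kind of percentage comparison the abstract reports.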
ERIC Educational Resources Information Center
Hilz, Christoph; Ehrenfeld, John R.
1991-01-01
Several policy frameworks for managing hazardous waste import/export are examined with respect to economic issues, environmental sustainability, and administrative feasibility and effectiveness. Several recommendations for improving the present instrument and implementing process are offered. (Author/CW)
Dinitz, Laura B.
2008-01-01
With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. 
HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
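The LUPM's core idea is comparing risk-reduction options by expected outcome under event uncertainty. A portfolio-style sketch of that comparison; the formula, probabilities, and costs below are illustrative only and are not the LUPM's actual model:

```python
# Expected cost of a mitigation option over one time horizon:
# upfront cost plus probability-weighted residual loss if the event occurs.
def expected_cost(p_event, loss_if_event, mitigation_cost, damage_reduction):
    return mitigation_cost + p_event * (1.0 - damage_reduction) * loss_if_event

# Hypothetical comparison: do nothing vs. a retrofit that cuts losses 60%.
do_nothing = expected_cost(0.10, 5_000_000, 0, 0.0)
retrofit = expected_cost(0.10, 5_000_000, 200_000, 0.6)
```

Under these made-up numbers the retrofit is the better investment; the LUPM extends this logic to spatial portfolios of parcels and multiple scenarios.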
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
ERIC Educational Resources Information Center
Stinson, Wendy Bounds; Carr, Deborah; Nettles, Mary Frances; Johnson, James T.
2011-01-01
Purpose/Objectives: The objectives of this study were to assess the extent to which school nutrition (SN) programs have implemented food safety programs based on Hazard Analysis and Critical Control Point (HACCP) principles, as well as factors, barriers, and practices related to implementation of these programs. Methods: An online survey was…
Validation of a heteroscedastic hazards regression model.
Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin
2002-03-01
A Cox-type regression model accommodating heteroscedasticity, with a power factor on the baseline cumulative hazard, is investigated for analyzing data with crossing hazards behavior. Since the partial likelihood approach cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model checking statistic, is presented to test the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial.
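The "power factor of the baseline cumulative hazard" lets group-specific hazards cross, which the proportional hazards model forbids. A numerical sketch under an assumed functional form matching that description (the notation and parameter values are illustrative, not taken from the paper):

```python
from math import exp

def cum_hazard(t, z, beta, gamma):
    """Heteroscedastic sketch: Lambda(t|z) = Lambda0(t)**exp(gamma*z) * exp(beta*z),
    here with baseline Lambda0(t) = t. Setting gamma = 0 recovers the
    proportional hazards (Cox) model."""
    return t ** exp(gamma * z) * exp(beta * z)

# With gamma != 0 the two groups' cumulative hazards cross over time:
early = cum_hazard(0.5, 1, -0.5, 0.7) - cum_hazard(0.5, 0, -0.5, 0.7)
late = cum_hazard(4.0, 1, -0.5, 0.7) - cum_hazard(4.0, 0, -0.5, 0.7)
```

The group with z = 1 starts at lower cumulative hazard but overtakes the z = 0 group later, i.e. the hazard curves cross, which is exactly the behavior the model is designed to capture.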
Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case
NASA Astrophysics Data System (ADS)
Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro
2015-04-01
The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web-services-based infrastructure for environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS, and GMES. Existing web services have been customized to provide functionalities supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements, with their specific vulnerability, in the pilot area. The landslide pilot and related scenario focus on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines and best practices, and production of thematic maps. The LRA has been implemented in the Liguria region, Italy, in two catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. During this event, over 600 landslides were triggered in the selected pilot area.
Most of the landslides affected the diffuse system of anthropogenic terraces, causing direct disruption of the walls as well as transport of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. A spatial analysis detected ca. 400 critical points along the road network, with an average length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved by the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve awareness of hazard, exposure, vulnerability, and landslide risk in the Cinque Terre National Park to the benefit of local authorities and the population. In addition, the results of the application will be used for i) updating the land planning process in order to improve the resilience of local communities, ii) implementing cost-benefit analyses aimed at the definition of guidelines for sustainable landslide risk mitigation strategies, and iii) suggesting a general road map for the implementation of a local adaptation plan.
Employee impact and attitude analysis for GHS implementation in Taiwan.
Chang, Yi-Kuo; Su, Teh-Sheng; Ouyang, Yun; Tseng, Jo-Ming
2013-01-01
The employee impact of, and attitudes toward, GHS implementation in Taiwan were investigated in this study. The impact of the new and changed regulations on government, the potential costs, benefits, and effects on global trade in chemicals for industries, and the hazard communication program for workers were assessed using questionnaire design and the Delphi expert method. A survey was conducted using questionnaires of 200 experts from the government's expert database and 500 selected respondents from the case company. The results revealed that although barriers to GHS implementation exist, they are feasible to overcome. Both experts and employees think that business entities are not equipped to test and classify chemicals on their own and that technical guidance from the government is needed. Logistic regression of the data revealed that the more hours an employee spends on education and training for the new GHS system, the more likely the employee is to think that GHS implementation will improve hazard awareness for transporters. Weak labeling ability hampers deployment of the new GHS system.
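The logistic regression relating training hours to attitude can be sketched with a tiny one-feature model; the data, learning rate, and fitting routine below are purely illustrative and differ from the study's actual survey data and software:

```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """One-feature logistic regression by stochastic gradient ascent
    on the log-likelihood; illustrative of the analysis type only."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = y - sigmoid(b0 + b1 * x)
            b0 += lr * err
            b1 += lr * err * x
    return b0, b1

# Hypothetical data: GHS training hours vs. agreement that GHS
# implementation improves hazard awareness (1 = agrees).
hours = [0, 1, 2, 3, 4, 5, 6, 7]
agrees = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(hours, agrees)
```

A positive slope b1 corresponds to the study's finding: more training hours, higher odds of a favorable attitude.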
NASA Astrophysics Data System (ADS)
Melzner, Sandra; Mölk, Michael; Schiffer, Michael; Gasperl, Wolfgang
2015-04-01
In times of decreasing financial resources, investment in protection measures with a positive return on investment is of high importance, and hazard and risk assessments are essential tools to ensure an economically justifiable use of money in the implementation of preventive measures. Many areas in the Eastern Alps are recurrently affected by rockfall processes, which pose a significant hazard to settlements and infrastructure. Complex tectonic, lithological, and geomorphologic settings require considerable effort to map and collect high-quality data for a reliable hazard and risk analysis. The present work summarizes the results of a detailed hazard and risk assessment performed for a community in the Northern Calcareous Alps (Upper Austroalpine Unit). The community of Hallstatt is exposed to very steep limestone cliffs that are highly susceptible to future, in many parts high-magnitude, rock failures. Analysis of the record of former events shows that since 1652 several rockfall events have damaged or destroyed houses and killed or injured people. Hallstatt, a UNESCO World Heritage Site, represents a very vulnerable settlement, the risk being elevated by high-frequency tourism with more than one million visitors per year. Discussion focuses on the methods applied to identify and map the rockfall hazard and risk, including a magnitude-frequency analysis of past events and its extrapolation into the future, as well as a vulnerability analysis for the existing infrastructure under the assumed events for the determined magnitude-frequency scenarios. Furthermore, challenges for decision making in terms of sustainable land use planning and implementation of preventive measures are discussed.
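Magnitude-frequency analyses of rockfall records are commonly summarized by a power law, N(≥V) = a·V⁻ᵇ, fitted in log-log space. A sketch of such a fit; the volumes and counts below are synthetic, not Hallstatt data:

```python
from math import exp, log

def fit_power_law(volumes, cum_counts):
    """Least-squares fit of log N = log a - b log V, a common form for
    cumulative magnitude-frequency relations of rockfall events."""
    xs = [log(v) for v in volumes]
    ys = [log(n) for n in cum_counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = -(sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    a = exp(my + b * mx)
    return a, b

# Synthetic record following N = 100 * V^-0.7 exactly.
vols = [1.0, 10.0, 100.0, 1000.0]
counts = [100.0 * v ** -0.7 for v in vols]
a, b = fit_power_law(vols, counts)
```

Extrapolating the fitted relation beyond the observed record is what allows frequency estimates for the rare, high-magnitude scenarios discussed in the abstract, with the usual caveats about extrapolation uncertainty.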
Dust Hazard Management in the Outer Solar System
NASA Technical Reports Server (NTRS)
Seal, David A.
2012-01-01
Most robotic missions to the outer solar system must grapple with the hazards posed by the dusty rings of the gas giants. Early assessments of these hazards led simply to ring avoidance, owing to insufficient data and high uncertainties about the dust population present in such rings. Recent approaches, principal among them the Cassini dust hazard management strategy, provide useful results from detailed modeling of spacecraft vulnerabilities and dust hazard regions, which along with the range of mission trajectories are used to assess the risks posed by each passage through a zone of potential hazard. This paper describes the general approach used to implement the analysis for Cassini, with recommendations for future outer-planet missions.
Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos
2013-01-01
The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of the ferries scored >100 points. Ferries with HACCP received higher inspection scores than those without HACCP (p value <0.001). All 34 microbiological water test results were negative and, of the 17 food samples, only one was positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.
Guide for Hydrogen Hazards Analysis on Components and Systems
NASA Technical Reports Server (NTRS)
Beeson, Harold; Woods, Stephen
2003-01-01
The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperature of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system-preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.
Making the Hubble Space Telescope servicing mission safe
NASA Technical Reports Server (NTRS)
Bahr, N. J.; Depalo, S. V.
1992-01-01
The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially with respect to completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever a significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, based on and complemented with hazard analysis. When and how to apply them, and the relation and similarities of these processes to industry standards and system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.
2017-08-25
The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe the integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.
Introducing a change in hospital policy using FMEA methodology as a tool to reduce patient hazards.
Ofek, Fanny; Magnezi, Racheli; Kurzweil, Yaffa; Gazit, Inbal; Berkovitch, Sofia; Tal, Orna
2016-01-01
Intravenous potassium chloride (IV KCl) solutions are widely used in hospitals for the treatment of hypokalemia. As ampoules of concentrated KCl must be diluted before use, critical incidents have been associated with its preparation and administration. We recently introduced ready-to-use diluted KCl infusion solutions to minimize the use of high-alert concentrated KCl. Since this process may be associated with considerable risks, we adopted proactive hazard analysis as a tool to implement a change in high-alert drug usage in a hospital setting. Failure mode and effects analysis (FMEA) is a systematic tool to analyze and identify risks in system operations. We used FMEA to examine the hazards associated with the implementation of the ready-to-use solutions. A multidisciplinary team analyzed the risks by identifying failure modes, conducting a hazard analysis and calculating the criticality index (CI) for each failure mode. A 1-day survey was performed as an evaluation step after a trial-run period of approximately 4 months. Six major possible risks were identified. The most severe risks were prioritized and specific recommendations were formulated. Of 28 patients receiving IV KCl on the day of the survey, 22 received the ready-to-use solutions and 6 received the concentrated solutions as instructed. Only 1 patient received inappropriate ready-to-use KCl. Using the FMEA tool in our study proved once again that by creating a gradient of severity of potentially vulnerable elements, we are able to proactively promote safer and more efficient processes in health care systems. This article presents a use of this method for implementing a change in hospital policy regarding the routine use of IV KCl.
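The criticality-index ranking described above can be illustrated with a minimal sketch; the failure modes and scores below are invented, not the hospital's actual data:

```python
# FMEA prioritization sketch: criticality index = severity x occurrence x detectability,
# each scored on an ordinal scale (here 1-10, hypothetical values).
failure_modes = [
    {"mode": "wrong concentration selected", "sev": 9, "occ": 3, "det": 4},
    {"mode": "ready-to-use bag unavailable", "sev": 6, "occ": 4, "det": 2},
    {"mode": "label misread at bedside",     "sev": 8, "occ": 2, "det": 6},
]

for fm in failure_modes:
    fm["ci"] = fm["sev"] * fm["occ"] * fm["det"]

# Rank failure modes so the team addresses the highest criticality first.
ranked = sorted(failure_modes, key=lambda fm: fm["ci"], reverse=True)
```

The multiplication of ordinal scores is a convention rather than a rigorous metric, which is why FMEA teams typically use the ranking only to prioritize discussion, as the study does.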
[Design of a HACCP Plan for the Gouda-type cheesemaking process in a milk processing plant].
Dávila, Jacqueline; Reyes, Genara; Corzo, Otoniel
2006-03-01
Hazard Analysis and Critical Control Point (HACCP) is a preventive and systematic method used to identify, assess and control hazards related to raw materials, ingredients, processing, marketing and the intended consumer, in order to assure food safety. The aim of this study was to design a HACCP plan for implementation in a Gouda-type cheese-making process at a dairy processing plant. The methodology was based on the application of the seven HACCP principles, information from the plant on compliance with the prerequisite programs (70-80%), the experience of the HACCP team, and the sequence of stages established by COVENIN standard 3802 for implementing the HACCP system. A HACCP plan was proposed, comprising the scope, the selection of the HACCP team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis, and the control table of the plan with the critical control points (CCPs). The following CCPs were identified in the process: pasteurization, coagulation and ripening.
Muinde, R K; Kiinyukia, C; Rombo, G O; Muoki, M A
2012-12-01
To determine the microbial load in food, examine safety measures, and assess the possibility of implementing a Hazard Analysis Critical Control Points (HACCP) system. The target population for this study consisted of restaurant owners in Thika Municipality (n = 30). Simple random samples of restaurants were selected using a systematic sampling method for microbial analysis of cooked food, non-cooked food, raw food and water sanitation in the selected restaurants. Two hundred and ninety-eight restaurants within Thika Municipality were surveyed; of these, 30 were sampled for microbiological testing. From the study, 221 (74%) of the restaurants were ready-to-eat establishments where food was prepared early enough to hold, and only in 77 (26%) of the restaurants did customers place an order for the food they wanted. 118 (63%) of the restaurant operators/staff had knowledge of quality control in food safety measures, 24 (8%) of the restaurants applied this knowledge, while 256 (86%) of the restaurant staff indicated that food contains ingredients that are hazardous if poorly handled. 238 (80%) of the restaurants used weighing and sorting of food materials, 45 (15%) used preservation methods, and the rest used dry foods as critical control points for food safety. The study showed the need for implementation of a Hazard Analysis Critical Control Points (HACCP) system to enhance food safety. Knowledge of HACCP was very low, with 89 (30%) of the restaurants applying some quality measures in their food production process systems. There was contamination with coliforms, Escherichia coli and Staphylococcus aureus, though at very low levels. The mean counts of coliforms, Escherichia coli and Staphylococcus aureus in sampled food were 9.7 × 10³ CFU/g, 8.2 × 10³ CFU/g and 5.4 × 10³ CFU/g respectively, with coliforms having the highest mean.
Risk management analysis for construction of Kutai Kartanegara bridge-East Kalimantan-Indonesia
NASA Astrophysics Data System (ADS)
Azis, Subandiyah
2017-11-01
Many sources of risk may impede the achievement of project objectives in terms of cost, quality or time, especially for bridges that have collapsed before, so possible high hazards during implementation must be addressed. The purpose of this research is to identify and analyse risks by classifying them using the Risk Breakdown Structure (RBS) method, and to manage the dominant risks of the steel-frame installation work in order to maximize positive outcomes and minimize the incidence of adverse events. The results of this study indicate 15 identified sources of risk, of which 6 are dominant. Mitigation was performed on the unacceptable dominant risks, i.e. the project factor of delays in the arrival of materials due to location: the material delivery schedule should be tailored to the needs of the field, and the quantity of material delivered should be evaluated against field requirements. The results are expected to serve as a guideline for identifying risks and mitigation measures in further research. Subsequent researchers should pay attention to the security factor so that implementation time does not affect work productivity in construction projects.
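A probability-impact screening of the kind used to isolate the dominant risks can be sketched as follows; the risk names, scores and acceptance threshold are hypothetical, not the study's values:

```python
# RBS-style risk screening sketch: score each identified risk by probability
# and impact on 1-5 scales (hypothetical values) and flag the dominant risks
# whose product exceeds an assumed acceptance threshold.
risks = [
    ("delay in material arrival",     4, 5),
    ("steel-frame lifting accident",  2, 5),
    ("design change during erection", 3, 3),
    ("minor weather interruption",    4, 1),
]

THRESHOLD = 12  # assumed acceptance limit on probability x impact

dominant = [(name, p * i) for name, p, i in risks if p * i >= THRESHOLD]
```

Real studies refine such a screening with stakeholder-elicited scales and treat the flagged risks with mitigation plans, as the abstract describes for the material-delivery risk.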
Managing risks and hazards in industrial operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almaula, S.C.
1996-12-31
The main objective of this paper is to demonstrate that it makes good business sense to identify the risks and hazards of an operation and take appropriate steps to manage them effectively. Developing and implementing an effective risk and hazard management plan also contributes to other industry requirements and standards. Development of a risk management system, key elements of a risk management plan, and hazard and risk analysis methods are outlined. Comparing potential risk to the cost of prevention is also discussed. It is estimated that the cost of developing and preparing a first risk management plan varies between $50,000 and $200,000.
Defining hazards of supplemental oxygen therapy in neonatology using the FMEA tool.
van der Eijk, Anne Catherine; Rook, Denise; Dankelman, Jenny; Smit, Bert Johan
2013-01-01
To prospectively evaluate hazards in the process of supplemental oxygen therapy in very preterm infants hospitalized in a Dutch NICU, a Failure Mode and Effects Analysis (FMEA) was conducted by a multidisciplinary team. This team identified, evaluated, and prioritized hazards of supplemental oxygen therapy in preterm infants. After assigning "hazard scores" to each step in the process, recommendations were formulated for the main hazards. Performing the FMEA took seven 2-hour meetings. The top 10 hazards could all be categorized into three main topics: incorrect adjustment of the fraction of inspired oxygen (FiO2), incorrect alarm limits for SpO2, and incorrect pulse-oximetry alarm limits on patient monitors for temporary use. The FMEA culminated in recommendations in both educational and technical directions. These included suggestions for (changes in) protocols on alarm limits and manual FiO2 adjustments, education of NICU staff on the hazards of supplemental oxygen, and technical improvements in respiratory devices and patient monitors. The FMEA prioritized flaws in the process of supplemental oxygen therapy in very preterm infants. Thanks to the structured approach of the analysis by a multidisciplinary team, several recommendations were made, which are currently being implemented in the study's center.
49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... understanding and analysis of the failure mechanisms or threats to integrity of each pipeline segment. (2) An... pipeline, information and data used for the information analysis; (13) results of the information analyses...
Mapping Natech risk due to earthquakes using RAPID-N
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2013-04-01
Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential for release of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, that estimates the overall risk of natural-hazard impact to industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps which can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry, rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of U.S. 
EPA's Risk Management Program (RMP) Guidance for Offsite Consequence Analysis methodology. This custom implementation is based on the property estimation framework and allows the easy modification of model parameters and the substitution of equations with alternatives. RAPID-N can be applied at different stages of the Natech risk management process: It allows on the one hand the analysis of hypothetical Natech scenarios to prevent or prepare for a Natech accident by supporting land-use and emergency planning. On the other hand, once a natural disaster occurs RAPID-N can be used for rapidly locating facilities with potential Natech accident damage based on actual natural-hazard information. This provides a means to warn the population in the vicinity of the facilities in a timely manner. This presentation will introduce the specific features of RAPID-N and show the use of the tool by application to a case-study area.
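The fragility curves at the heart of such damage-probability estimation are commonly lognormal in the hazard intensity. A minimal sketch follows; the lognormal form, median capacity and dispersion are illustrative assumptions, not RAPID-N's calibrated values:

```python
import math

def fragility(pga, median, beta):
    """Lognormal fragility curve: probability of reaching a damage state
    given peak ground acceleration (g). `median` is the intensity at which
    damage probability is 50%; `beta` is the lognormal dispersion."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

# Hypothetical storage-tank damage state: median capacity 0.5 g, dispersion 0.4.
p_at_median = fragility(0.5, 0.5, 0.4)   # 0.5 by construction
p_strong    = fragility(0.8, 0.5, 0.4)   # stronger shaking, higher probability
p_weak      = fragility(0.3, 0.5, 0.4)   # weaker shaking, lower probability
```

In a Natech chain such probabilities would then feed conditional release scenarios, mirroring the damage-state to risk-state relationships the abstract describes.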
Caballero Mesa, J M; Alonso Marrero, S; González Weller, D M; Afonso Gutiérrez, V L; Rubio Armendariz, C; Hardisson de la Torre, A
2006-01-01
To satisfactorily implement hazard analysis and critical control point auto-control systems on Tenerife Island. Fifteen advisory visits to gofio-manufacturing industries were carried out to give advice to employers and workers; thereafter, the intervention was assessed by verifying the hygiene and sanitary conditions of each industry and the correct application of the established auto-control system. After the advisory intervention, we observed that certain hygiene and sanitary parameters had been corrected, such as modifying facilities to adapt them to the regulations in force, or asking suppliers to certify raw materials. With regard to the food production process, the intervention was effective in that more than half of the industries reduced the duration of the phases most susceptible to contamination and kept the control registries that had been established. All industries implemented the auto-control system by means of registration charts for each of the elaboration phases. 86% of the industries introduced more hygienic materials, 60% reduced the intermediate times between production phases, and 26% replaced obsolete machinery, modernizing their facilities.
Connor, Thomas H; Smith, Jerome P
2016-09-01
At present, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis, in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodologies for the analysis of surface wipe samples for hazardous drugs are reviewed, including their purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous-drug safe-handling program. Surface wipe sampling may be used to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and provide a shorter response time than the classical analytical techniques now in use.
Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results
NASA Technical Reports Server (NTRS)
Aragon, Cecilia R.; Long, Kurtis R.
2005-01-01
Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.
Guide for Oxygen Hazards Analyses on Components and Systems
NASA Technical Reports Server (NTRS)
Stoltzfus, Joel M.; Dees, Jesse; Poe, Robert F.
1996-01-01
Because most materials, including metals, will burn in an oxygen-enriched environment, hazards are always present when using oxygen. Most materials will ignite at lower temperatures in an oxygen-enriched environment than in air, and once ignited, combustion rates are greater in the oxygen-enriched environment. Many metals burn violently in an oxygen-enriched environment when ignited. Lubricants, tapes, gaskets, fuels, and solvents can increase the possibility of ignition in oxygen systems. However, these hazards do not preclude the use of oxygen. Oxygen may be safely used if all the materials in a system are not flammable in the end-use environment or if ignition sources are identified and controlled. These ignition and combustion hazards necessitate a proper oxygen hazards analysis before introducing a material or component into oxygen service. The objective of this test plan is to describe the White Sands Test Facility oxygen hazards analysis to be performed on components and systems before oxygen is introduced and is recommended before implementing the oxygen component qualification procedure. The plan describes the NASA Johnson Space Center White Sands Test Facility method consistent with the ASTM documents for analyzing the hazards of components and systems exposed to an oxygen-enriched environment. The oxygen hazards analysis is a useful tool for oxygen-system designers, system engineers, and facility managers. Problem areas can be pinpointed before oxygen is introduced into the system, preventing damage to hardware and possible injury or loss of life.
AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potts, T. Todd; Hylko, James M.; Douglas, Terence A.
2003-02-27
WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. The AHA also recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library to research and use, while standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls in the field.
NASA Astrophysics Data System (ADS)
Berlin, Julian; Bogaard, Thom; Van Westen, Cees; Bakker, Wim; Mostert, Eric; Dopheide, Emile
2014-05-01
Cost-benefit analysis (CBA) is a well-known method widely used for the assessment of investments in both the private and public sectors. In the context of risk mitigation and the evaluation of risk reduction alternatives for natural hazards, it is very important for evaluating the effectiveness of such efforts in terms of avoided monetary losses. However, the current method has some disadvantages related to the spatial distribution of the costs and benefits, the geographical distribution of the avoided damage and losses, and the variation in the areas that benefit in terms of invested money and avoided monetary risk. Decision-makers are often interested in how the costs and benefits are distributed among the different administrative units of a large area or region, so that they can compare and analyse the costs and benefits per administrative unit resulting from the implementation of risk reduction projects. In this work we first examine the cost-benefit procedure for natural hazards and how the costs are assessed for several structural and non-structural risk reduction alternatives; we also examine the current problems of the method, such as the inclusion of cultural and social considerations that are complex to monetize, the problem of discounting future values using a defined interest rate, and the spatial distribution of costs and benefits. We also examine the additional benefits and the indirect costs associated with the implementation of risk reduction alternatives, such as the cost of an unattractive landscape (also called negative benefits). In the last part we examine the current tools and software used in natural hazard assessment with support for CBA, and we propose design considerations for the implementation of the CBA module for the CHANGES-SDSS platform, an initiative of the ongoing 7th Framework Programme "CHANGES" of the European Commission.
Keywords: Risk management, Economics of risk mitigation, EU Flood Directive, resilience, prevention, cost benefit analysis, spatial distribution of costs and benefits
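The discounting step flagged above as problematic can be sketched as a simple present-value comparison; all figures below are hypothetical:

```python
# CBA discounting sketch for a natural-hazard mitigation measure: compare the
# up-front cost with the present value of the annual expected losses it avoids.
def present_value(annual_benefit, rate, years):
    """Sum of annual benefits discounted at a fixed interest rate."""
    return sum(annual_benefit / (1.0 + rate) ** t for t in range(1, years + 1))

cost = 1_000_000          # hypothetical construction cost of the measure
avoided_loss = 120_000    # hypothetical avoided expected annual damage
pv_benefits = present_value(avoided_loss, rate=0.05, years=20)
bcr = pv_benefits / cost  # benefit-cost ratio; > 1 favours the investment
```

The sensitivity of `bcr` to the chosen rate and horizon is exactly the discounting problem the abstract raises: the same measure can pass or fail the test depending on those two parameters alone.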
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated quality risk management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. Hazard Analysis and Critical Control Point analysis, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each critical control point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values and greatest impact on the process were loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force and by starting the implementation of a cryopreservation management module.
NASA Astrophysics Data System (ADS)
Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.
2015-08-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is aggravated in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analysing either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. This study therefore aims at analysing fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for a 2-dimensional hydrodynamic inundation simulation of Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events.
This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation (median) shown alongside percentile maps quantifying the uncertainty. The results are critically discussed and ways to use them in flood risk management are outlined.
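The peak-over-threshold (POT) estimation mentioned above can be illustrated with a minimal sketch. The synthetic rainfall record, the 95th-percentile threshold, and the exponential excess model (a simplification of the generalized Pareto family used in practice) are all assumptions for illustration only.

```python
# Hedged sketch of a POT frequency analysis: values above a high threshold
# are treated as extreme events, their excesses are fitted with a simple
# exponential tail, and return levels follow from the exceedance rate.
import numpy as np

rng = np.random.default_rng(42)
rain = rng.exponential(scale=10.0, size=20 * 365)  # synthetic 20-year "daily rain" record (mm)

u = float(np.quantile(rain, 0.95))   # threshold: empirical 95th percentile
excess = rain[rain > u] - u          # excesses over the threshold
lam = len(excess) / 20.0             # average number of exceedances per year
beta = float(excess.mean())          # scale of the assumed exponential tail

def return_level(T_years: float) -> float:
    """Level exceeded on average once every T_years under the exponential excess model."""
    return u + beta * np.log(lam * T_years)

print(f"10-year return level: {return_level(10):.1f} mm")
```

In the study itself such return levels feed a stochastic storm generator; here the fit only shows how threshold, exceedance rate, and tail scale combine into a return level.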
Landslide and flood hazard assessment in urban areas of Levoča region (Eastern Slovakia)
NASA Astrophysics Data System (ADS)
Magulova, Barbora; Caporali, Enrica; Bednarik, Martin
2010-05-01
The case study presents the use of statistical methods and analysis tools, implemented in a Geographic Information Systems (GIS) environment, for the hazard assessment of "urbanization units". The Levoča region (Slovakia) is selected as a case study. The region, with a total area of about 351 km2, is widely affected by landslides and floods. For small urbanization areas the problem is nowadays particularly significant from the socio-economic point of view. It is also considered an increasing problem, mainly because of climate change and more frequent extreme rainfall events. The geo-hazards are evaluated using multivariate analysis. The landslide hazard assessment is based on the comparison and subsequent statistical elaboration of the territorial dependence among different input factors influencing the instability of the slopes. In particular, five factors influencing slope stability are evaluated: lithology, slope aspect, slope angle, hypsographic level and present land use. As a result, a new landslide susceptibility map is compiled and zones of stable, dormant and unstable areas are delineated. For the flood hazard map, a detailed digital elevation model is created and a composite index of flood hazard is derived from topography, land cover and pedology-related data. Time series of stream flow and precipitation measurements are used to estimate flood discharge. The results of the assessment are prognostic maps of landslide and flood hazard, which provide an optimal basis for urbanization planning.
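A multivariate factor overlay of the kind described above might look like the following sketch: each raster cell receives a weighted score from reclassified input layers and is binned into susceptibility zones. The factor weights, the random stand-in rasters, and the zone thresholds are invented for illustration and are not from the study.

```python
# Hedged sketch of a weighted-overlay susceptibility map: five factor layers
# (lithology, aspect, slope angle, elevation, land use), each reclassified to
# [0, 1], are combined with assumed weights and binned into three zones.
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of raster cells (flattened)
factors = {name: rng.random(n) for name in
           ("lithology", "aspect", "slope", "elevation", "land_use")}
weights = {"lithology": 0.30, "aspect": 0.10, "slope": 0.35,
           "elevation": 0.10, "land_use": 0.15}  # hypothetical, sum to 1

score = sum(w * factors[k] for k, w in weights.items())
# 0 = stable, 1 = dormant, 2 = unstable (assumed thresholds)
zones = np.digitize(score, [0.33, 0.66])
print(np.bincount(zones, minlength=3))
```

In a real assessment the weights would come from the statistical elaboration of observed landslides against each factor, not from assumed values as here.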
Henry, C Jeya K; Xin, Janice Lim Wen
2014-06-01
The local manufacture of ready-to-use therapeutic foods (RUTFs) is increasing, and there is a need to develop methods to ensure their safe production. We propose the application of Hazard Analysis Critical Control Point (HACCP) principles to achieve this goal. The basic principles of HACCP in the production of RUTFs are outlined. It is concluded that the implementation of an HACCP system in the manufacture of RUTFs is not only feasible but also attainable. The introduction of good manufacturing practices, coupled with an effective HACCP system, will ensure that RUTFs are produced in a cost-effective, safe, and hygienic manner.
16 CFR 1000.25 - Office of Hazard Identification and Reduction.
Code of Federal Regulations, 2014 CFR
2014-01-01
... has line authority over the Directorates for Epidemiology and Health Sciences, Economic Analysis, Engineering Sciences, and Laboratory Sciences. The Office develops strategies for and implements the agency's... social impacts of projects are comprehensively and objectively presented to the Commission for decision. ...
16 CFR 1000.25 - Office of Hazard Identification and Reduction.
Code of Federal Regulations, 2011 CFR
2011-01-01
... has line authority over the Directorates for Epidemiology and Health Sciences, Economic Analysis, Engineering Sciences, and Laboratory Sciences. The Office develops strategies for and implements the agency's... social impacts of projects are comprehensively and objectively presented to the Commission for decision. ...
16 CFR 1000.25 - Office of Hazard Identification and Reduction.
Code of Federal Regulations, 2012 CFR
2012-01-01
... has line authority over the Directorates for Epidemiology and Health Sciences, Economic Analysis, Engineering Sciences, and Laboratory Sciences. The Office develops strategies for and implements the agency's... social impacts of projects are comprehensively and objectively presented to the Commission for decision. ...
16 CFR 1000.25 - Office of Hazard Identification and Reduction.
Code of Federal Regulations, 2010 CFR
2010-01-01
... has line authority over the Directorates for Epidemiology and Health Sciences, Economic Analysis, Engineering Sciences, and Laboratory Sciences. The Office develops strategies for and implements the agency's... social impacts of projects are comprehensively and objectively presented to the Commission for decision. ...
Improving tsunami resiliency: California's Tsunami Policy Working Group
Real, Charles R.; Johnson, Laurie; Jones, Lucile M.; Ross, Stephanie L.; Kontar, Y.A.; Santiago-Fandiño, V.; Takahashi, T.
2014-01-01
California has established a Tsunami Policy Working Group to facilitate development of policy recommendations for tsunami hazard mitigation. The Tsunami Policy Working Group brings together government and industry specialists from diverse fields including tsunami, seismic, and flood hazards, local and regional planning, structural engineering, natural hazard policy, and coastal engineering. The group is acting on findings from two parallel efforts: The USGS SAFRR Tsunami Scenario project, a comprehensive impact analysis of a large credible tsunami originating from an M 9.1 earthquake in the Aleutian Islands Subduction Zone striking California’s coastline, and the State’s Tsunami Preparedness and Hazard Mitigation Program. The unique dual-track approach provides a comprehensive assessment of vulnerability and risk within which the policy group can identify gaps and issues in current tsunami hazard mitigation and risk reduction, make recommendations that will help eliminate these impediments, and provide advice that will assist development and implementation of effective tsunami hazard risk communication products to improve community resiliency.
Occupational safety and health status of medical laboratories in Kajiado County, Kenya.
Tait, Fridah Ntinyari; Mburu, Charles; Gikunju, Joseph
2018-01-01
Despite the increasing interest in Occupational Safety and Health (OSH), few studies on OSH in medical laboratories in developing countries are available, although a high number of injuries occur without proper documentation. It is estimated that every day 6,300 people die as a result of occupational accidents or work-related diseases, resulting in over 2.3 million deaths per year. Medical laboratories handle a wide range of materials, including potentially dangerous pathogenic agents, and expose health workers to numerous potential hazards. This study evaluated the status of OSH in medical laboratories in Kajiado County, Kenya. The objectives included establishing the biological, chemical and physical hazards; reviewing the control measures of medical laboratories; and enumerating the factors hindering the implementation of good OSH practices. The study used a cross-sectional descriptive research design. Observation checklists, interview schedules and structured questionnaires were used. The study was carried out in 108 medical laboratories among 204 sampled respondents. Data were analysed using the Statistical Package for the Social Sciences (SPSS) version 20. The commonest hazards in medical laboratories included bacteria (80%) among biological hazards; handling unlabelled and unmarked chemicals (38.2%) among chemical hazards; and dangerously placed laboratory equipment (49.5%) among physical hazards. According to Pearson's product-moment correlation analysis, not wearing personal protective equipment was statistically associated with exposure to hazards. Individual control measures were statistically significant at the 0.01 significance level. Only 65.1% of the factors influencing the implementation of OSH in medical laboratories were identified. Training has the highest contribution to good OSH practices.
NASA Astrophysics Data System (ADS)
Congi, Maria Pia; Campo, Valentina; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Spizzichino, Daniele; Trigila, Alessandro
2014-05-01
The increasing damage caused by natural disasters in the last decades points to the need for interoperable, added-value services supporting environmental safety and human protection, by reducing the vulnerability of exposed elements as well as improving the resilience of the involved communities. For this reason, providing access to harmonized and customized data is only one of several steps towards delivering adequate support to risk assessment, reduction and management. The scope of the present work is to illustrate a methodology under development for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement an infrastructure based on web services for environmental analysis, which integrates in its architecture specifications and results from INSPIRE, SEIS and GMES. Existing web services will be customized during the project to provide functionalities supporting integrated environmental management. The implemented infrastructure will be applied to landslide risk scenarios, to be developed in selected pilot areas, aiming at: i) the application of standard procedures to implement a landslide risk analysis; ii) the definition of a procedure for the assessment of potential environmental impacts, based on a set of indicators estimating the different exposed elements, with their specific vulnerability, in the pilot area. In more detail, the landslide pilot will aim at providing a landslide risk scenario through the implementation and analysis of: 1) a landslide inventory from available historical databases and maps; 2) landslide susceptibility and hazard maps; 3) an assessment of exposure and vulnerability for selected typologies of elements at risk; 4) a landslide risk scenario for different sets of exposed elements (e.g. population, road network, residential areas, cultural heritage).
The pilot will be implemented in Liguria, Italy, in two catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience, being highly vulnerable to landslides induced by heavy rainfall. The landslide risk impact analysis will be calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. Most of the landslides affected the diffuse system of anthropogenic terraces and caused the direct disruption of walls, as well as the transport of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. The final target of the landslide risk assessment scenario is to improve knowledge and awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park, to the benefit of local authorities and the population. In addition, the results of the application can have practical and positive effects, e.g. i) updating the land planning process in order to improve the resilience of local communities, ii) implementing a preliminary cost-benefit analysis aimed at defining guidelines for sustainable landslide risk mitigation strategies, and iii) suggesting a general road map for the implementation of a local adaptation plan.
Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino
2015-04-23
The important role of food and nutrition in public health is increasingly recognized as crucial for its potential impact on health-related quality of life and on the economy, both at the societal and at the individual level. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 93/43/EEC and later replaced by Regulation EC 178/2002 and Regulation EC 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information to consumers; iv) an ethical profit. There are three stages in the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate nutrient levels; 4) establishment and implementation of effective monitoring procedures for the critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claims Regulation EU 1924/2006; 10) start of a training program.
We calculate the risk assessment as follows: Risk (R) = probability (P) × damage (D). The NACCP process considers the entire food supply chain "from farm to consumer"; at each point of the chain it is necessary to implement tight monitoring in order to guarantee optimal nutritional quality.
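The quoted formula Risk (R) = probability (P) × damage (D) is simple enough to sketch directly. The control points along the chain and their scores below are hypothetical, chosen only to show how the formula ranks monitoring priorities.

```python
# Minimal sketch of the NACCP risk formula R = P x D. The chain stages and
# the probability/damage scores are illustrative assumptions.

def risk(probability: float, damage: float) -> float:
    """Risk = probability of occurrence x magnitude of damage."""
    return probability * damage

# Hypothetical control points along a "farm to consumer" chain
chain = {
    "harvest":   risk(0.2, 5),
    "transport": risk(0.4, 3),
    "retail":    risk(0.1, 8),
}

# The stage with the highest R would receive the tightest monitoring
worst = max(chain, key=chain.get)
print(worst, chain[worst])
```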
Williams, Michael S; Ebel, Eric D
2012-01-01
A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program was implemented in stages between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that, following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000, approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 than in 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. The illnesses prevented by the more modest 13% reduction in contamination between 2000 and 2007 were not statistically significant. An analysis of the magnitude of change in contamination required for detection via human surveillance is also provided.
NASA Technical Reports Server (NTRS)
1986-01-01
The status of the implementation of the recommendations of the Presidential Commission on the Space Shuttle Challenger Accident is reported. The implementation of recommendations in the following areas is detailed: (1) solid rocket motor design; (2) shuttle management structure, including the shuttle safety panel and astronauts in management; (3) critical item review and hazard analysis; (4) safety organization; (5) improved communication; (6) landing safety; (7) launch abort and crew escape; (8) flight rate; and (9) maintenance safeguards. Supporting memoranda and communications from NASA are appended.
14 CFR 417.111 - Launch plans.
Code of Federal Regulations, 2010 CFR
2010-01-01
... controls identified by a launch operator's ground safety analysis and implementation of the ground safety.... (ii) For each toxic propellant, any hazard controls and process constraints determined under the... classification and compatibility group as defined by part 420 of this chapter. (3) A graphic depiction of the...
16 CFR § 1000.25 - Office of Hazard Identification and Reduction.
Code of Federal Regulations, 2013 CFR
2013-01-01
... has line authority over the Directorates for Epidemiology and Health Sciences, Economic Analysis, Engineering Sciences, and Laboratory Sciences. The Office develops strategies for and implements the agency's... social impacts of projects are comprehensively and objectively presented to the Commission for decision. ...
Improving Food Safety in Meat and Poultry: Will New Regulations Benefit Consumers?
ERIC Educational Resources Information Center
Unnevehr, Laurian J.; Roberts, Tanya; Jensen, Helen H.
1997-01-01
The U.S. Department of Agriculture's Hazard Analysis and Critical Control Point System for meat and poultry processing will benefit consumers by reducing food-borne illnesses. The benefits are likely to exceed the additional costs from implementing the regulations. (SK)
USDA-ARS's Scientific Manuscript database
The Hazard Analysis and Critical Control Point (HACCP) food safety inspection program is utilized by both USDA Food Safety Inspection Service (FSIS) and FDA for many of the products they regulate. This science-based program was implemented by the USDA FSIS to enhance the food safety of meat and pou...
Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R
2004-01-01
The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Plant Protection Convention (IPPC), among others, contributes to ensuring food safety along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation, practices which are legislated in most countries. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential of the legislation and the difficulties of its enforcement, and suggests some policy implications for food safety.
USDA-ARS's Scientific Manuscript database
HACCP is an acronym for Hazard Analysis and Critical Control Point and was initially developed by the Pillsbury Company and NASA. They utilized this program to enhance the safety of the food for manned space flights. The USDA-FSIS implemented the HACCP approach to food safety in the meat and pou...
Gilling, S J; Taylor, E A; Kane, K; Taylor, J Z
2001-05-01
Hazard analysis critical control point (HACCP), a system of risk management designed to control food safety, has emerged over the last decade as the primary approach to securing the safety of the food supply. It is thus an important tool in combatting the worldwide escalation of foodborne disease. Yet despite the wide dissemination and scientific support of its principles, successful HACCP implementation has been limited. This report takes a psychological approach to this problem by examining processes and factors that could impede adherence to the internationally accepted HACCP guidelines and thus successful implementation of HACCP. Drawing on models of adherence to clinical guidelines in medicine and on practical experience of HACCP implementation problems, the potential advantages of applying a behavioral model to food safety management are highlighted. The model's applicability was investigated using telephone interviews with over 200 businesses in the United Kingdom. Eleven key barriers to HACCP guideline adherence were identified. In-depth narrative interviews with food business proprietors then confirmed these findings and demonstrated the subsequent negative effects on HACCP implementation. A resulting HACCP awareness-to-adherence model is proposed that demonstrates the complex range of knowledge-, attitude-, and behavior-related barriers involved in failures of HACCP guideline adherence. The model's specificity and detail provide a tool whereby problems can be identified and located, facilitating tailored and constructive intervention. It is suggested that further investigation into the barriers involved, and how to overcome them, would be of substantial benefit to successful HACCP implementation and would thereby contribute to an overall improvement in public health.
Introduction to the Principles of HACCP
USDA-ARS's Scientific Manuscript database
HACCP is an acronym for Hazard Analysis and Critical Control Point and was initially developed by the Pillsbury Company and NASA. They utilized this program to enhance the safety of the food for manned space flights. The USDA-FSIS implemented the HACCP approach to food safety in the meat and poult...
75 FR 28763 - Permission To Use Air Inflation of Meat Carcasses and Parts
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-24
... inflate carcasses and parts with air if they develop, implement, and maintain written controls to ensure... require that establishments incorporate these controls into their Hazard Analysis and Critical Control... procedures for other than the approved methods are required to submit to FSIS a request for experimental...
10 CFR Appendix A to Subpart B of... - General Statement of Safety Basis Policy
Code of Federal Regulations, 2010 CFR
2010-01-01
... Analysis Reports for Nuclear Power Plants, or successor document. (2) A DOE nonreactor nuclear facility... with DOE Policy 450.2A, “Identifying, Implementing and Complying with Environment, Safety and Health..., the public and the environment from adverse consequences. These analyses and hazard controls...
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and even the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V, at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow a proper definition of boundary conditions for Can Tho city from gauge data alone.
In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed with a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps are then generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events. These combined scenarios are also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
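The independence assumption used in the final combination step has a simple probabilistic consequence, sketched below with illustrative probabilities: for independent sources, the annual probability of flooding from at least one source is 1 - (1 - p_fluvial)(1 - p_pluvial).

```python
# Sketch of combining two independent flood sources. The example annual
# exceedance probabilities (0.1 each, i.e. two 10-year hazards) are
# illustrative assumptions, not values from the study.

def combined_annual_prob(p_fluvial: float, p_pluvial: float) -> float:
    """Annual probability that at least one independent source floods."""
    return 1.0 - (1.0 - p_fluvial) * (1.0 - p_pluvial)

p = combined_annual_prob(0.1, 0.1)
print(round(p, 2))      # combined annual exceedance probability: 0.19
print(round(1 / p, 1))  # combined return period in years: 5.3
```

The combined hazard is therefore more frequent than either source alone, which is why the combined maps in the study differ from a simple overlay of the fluvial and pluvial maps.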
Rockfall Hazard Process Assessment : Implementation Report
DOT National Transportation Integrated Search
2017-10-01
The Montana Department of Transportation (MDT) commissioned a new research program to improve assessment and management of its rock slope assets. The Department implemented a Rockfall Hazard Rating System (RHRS) program in 2005 and wished to add valu...
Risk Governance of Multiple Natural Hazards: Centralized versus Decentralized Approach in Europe
NASA Astrophysics Data System (ADS)
Komendantova, Nadejda; Scolobig, Anna; Vinchon, Charlotte
2014-05-01
The multi-risk approach is a relatively new field; its definition includes the need to consider multiple hazards and vulnerabilities in their interdependency (Selva, 2013), and recent multi-hazard disasters, such as the 2011 Tohoku earthquake, tsunami and nuclear catastrophe, have shown the need for a multi-risk approach in hazard mitigation and management. Our knowledge of multi-risk assessment, including studies from different scientific disciplines and developed assessment tools, is constantly growing (White et al., 2001). However, the link between scientific knowledge, its implementation and the results in terms of improved governance and decision-making has gained significantly less attention (IRGC, 2005; Kappes et al., 2012), even though interest in risk governance in general has increased significantly in recent years (Verweij and Thompson, 2006). The key research question is therefore how risk assessment is implemented and what the potential is for the implementation of a multi-risk approach in different governance systems across Europe; more precisely, how characteristics of risk governance, such as the degree of centralization versus decentralization, influence the implementation of a multi-risk approach. The methodology of this research includes a comparative case study analysis of top-down and bottom-up interactions in governance in the city of Naples (Italy), where the institutional landscape is marked by significant autonomy of the Italian regions in the decision-making processes for assessing the majority of natural risks, excluding volcanic risk, and in Guadeloupe, French West Indies, an overseas department of France, where the decision-making process is marked by greater centralization, associated with well-established state governance within the regions, delegated to the prefect and the decentralised services of central ministries.
The research design included documentary analysis and extensive empirical work involving policy makers, private-sector actors and practitioners in risk and emergency management. This work was informed by 36 semi-structured interviews, three workshops with over seventy participants from eleven different countries, feedback from questionnaires, and focus group discussions (Scolobig et al., 2013). The results show that both governance systems have their own strengths and weaknesses (Komendantova et al., 2013). Elements of a centralized multi-risk governance system could lead to improvements in interagency communication and the creation of an inter-agency environment in which the different departments at the national level can exchange information, identify the communities most exposed to multiple risks and set priorities, while providing consistent information about, and responses to, multi-risk to the relevant stakeholders at the local level. A decentralised multi-risk governance system, by contrast, can favour the creation of local multi-risk commissions to conduct discussions between practitioners and experts in meteorological, geological and technological risks, to elaborate risk and hazard maps, and to develop local capacities, including educational and training activities. Both governance systems suffer from common deficiencies, the most important being the frequent lack of capacities at the local level, especially financial but sometimes also technical and institutional ones, as the responsibilities for disaster risk management are often transferred from the national to the local level without sufficient resources for the implementation of risk management programs (UNISDR, 2013). The difficulty of balancing available resources between short-term and medium-term priorities often complicates the issue.
Our recommendation is that the implementation of a multi-risk approach can be facilitated through knowledge exchange and dialogue between different disciplinary communities, such as the geological and meteorological communities, and between the natural and social sciences. The implementation of a multi-risk approach can be strengthened through the creation of multi-risk platforms and multi-risk commissions, which can liaise between risk management experts and local communities and unify numerous actions on natural hazard management. However, the multi-risk approach cannot be subsidiary to a single-risk approach; both have to be pursued.
References:
IRGC (2011). Concept note: Improving the management of emerging risks: Risks from new technologies, system interactions, and unforeseen or changing circumstances. International Risk Governance Council (IRGC), Geneva.
Kappes, M. S., Keiler, M., von Elverfeldt, K., & Glade, T. (2012). Challenges of analyzing multi-hazard risk: A review. Natural Hazards, 64(2), 1925-1958. doi:10.1007/s11069-012-0294-2.
Komendantova, N., Scolobig, A., & Vinchon, C. (2013). Multi-risk approach in centralized and decentralized risk governance systems: Case studies of Naples, Italy and Guadeloupe, France. International Relations and Diplomacy, 1(3), 224-239.
Scolobig, A., Vinchon, C., Komendantova, N., Bengoubou-Valerius, M., & Patt, A. (2013). Social and institutional barriers to effective multi-hazard and multi-risk decision-making governance. D6.3 MATRIX project.
Selva, J. (2013). Long-term multi-risk assessment: Statistical treatment of interaction among risks. Natural Hazards, 67(2), 701-722.
UNISDR (2013). Implementing the Hyogo Framework for Action in Europe: Regional synthesis report 2011-2013.
Verweij, M., & Thompson, M. (Eds.) (2006). Clumsy solutions for a complex world: Governance, politics, and plural perceptions. New York: Palgrave Macmillan.
White, G., Kates, R., & Burton, I. (2001). Knowing better and losing even more: The use of knowledge in hazards management. Environmental Hazards, 3, 81-92.
NASA Technical Reports Server (NTRS)
1986-01-01
The status of the implementation of the recommendations of the Presidential Commission on the Space Shuttle Challenger Accident is reported. The implementation of recommendations in the following areas is detailed: (1) solid rocket motor design; (2) shuttle management structure, including the shuttle safety panel and astronauts in management; (3) critical item review and hazard analysis; (4) safety organization; (5) improved communication; (6) landing safety; (7) launch abort and crew escape; (8) flight rate; and (9) maintenance safeguards. Supporting memoranda and communications from NASA are appended.
Automated Mixed Traffic Vehicle (AMTV) technology and safety study
NASA Technical Reports Server (NTRS)
Johnston, A. R.; Peng, T. K. C.; Vivian, H. C.; Wang, P. K.
1978-01-01
Technology and safety issues related to the implementation of an Automated Mixed Traffic Vehicle (AMTV) system are discussed. System concepts and technology status were reviewed, and areas where further development is needed are identified. Failure and hazard modes were also analyzed and methods for prevention were suggested. The results presented are intended as a guide for further efforts in AMTV system design and technology development for both near-term and long-term applications. The AMTV systems discussed include a low-speed system and a hybrid system consisting of low-speed sections and high-speed sections operating in a semi-guideway. The safety analysis identified hazards that may arise in a properly functioning AMTV system, as well as hardware failure modes. Safety-related failure modes were emphasized. A risk assessment was performed to establish a priority order, and significant hazards and failure modes were summarized. Corrective measures were proposed for each hazard.
Establishing a proactive safety and health risk management system in the fire service.
Poplin, Gerald S; Pollack, Keshia M; Griffin, Stephanie; Day-Nash, Virginia; Peate, Wayne F; Nied, Ed; Gulotta, John; Burgess, Jefferey L
2015-04-19
Formalized risk management (RM) is an internationally accepted process for reducing hazards in the workplace, with defined steps including hazard scoping, risk assessment, and implementation of controls, all within an iterative process. While required for all industry in the European Union and widely used elsewhere, the United States maintains a compliance-based regulatory structure rather than one based on systematic, risk-based methodologies. Firefighting is a hazardous profession, with high injury, illness, and fatality rates compared with other occupations, and implementation of RM programs has the potential to greatly improve firefighter safety and health; however, no descriptions of RM implementation exist in the peer-reviewed literature for the North American fire service. In this paper we describe the steps used to design and implement the RM process in a moderately sized fire department, with particular focus on prioritizing and managing injury hazards during patient transport, fireground, and physical exercise procedures. Hazard scoping and formalized risk assessments are described, in addition to the identification of participatory-led injury control strategies. Process evaluation methods were conducted primarily to assess the feasibility of voluntarily instituting the RM approach within the fire service setting. The RM process was well accepted by the fire department and led to the development of 45 hazard-specific interventions. Qualitative data documenting the implementation of the RM process revealed that participants emphasized the value of the RM process, especially the participatory bottom-up approach; the usefulness of the RM process for breaking down tasks to identify potential risks; and the potential of RM for reducing firefighter injury. As implemented, this risk-based approach to identifying and managing occupational hazards and risks was successful and is deemed feasible for U.S. (and other) fire services.
While several barriers and challenges exist in the implementation of any intervention such as this, recommendations for adopting the process are provided. Additional work will be performed to determine the effectiveness of selected control strategies that were implemented; however, participants throughout the organizational structure perceived the RM process to be of high utility, while researchers also found that the process improved awareness of, and engagement in, actively enhancing worker safety and health.
Anderson, Joe; Collins, Michele; Devlin, John; Renner, Paul
2012-01-01
The Institute for Sustainable Work and Environment and the Utility Workers Union of America worked with a professional evaluator to design, implement, and evaluate the results of a union-led, systems-of-safety-based hazard identification program that trained workers to use hazard maps to identify workplace hazards and target them for elimination. The evaluation documented program implementation and impact using data collected from both qualitative interviews and an online survey of worker trainers, plant managers, and health and safety staff. Managers and workers reported that many dangerous hazards, some of them long-standing, difficult-to-resolve issues, were eliminated as a result of hazard mapping; the evaluation also documented improved communication between union members and management, which both workers and managers agreed resulted in better, more sustainable hazard elimination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dedrick, Daniel E.; Bradshaw, Robert W.; Behrens, Richard, Jr.
2007-08-01
Safe and efficient hydrogen storage is a significant challenge inhibiting the use of hydrogen as a primary energy carrier. Although energy storage performance properties are critical to the success of solid-state hydrogen storage systems, operator and user safety is of highest importance when designing and implementing consumer products. As researchers are now integrating high-energy-density solid materials into hydrogen storage systems, quantification of the hazards associated with the operation and handling of these materials becomes imperative. The experimental effort presented in this paper focuses on identifying the hazards associated with producing, storing, and handling sodium alanates, thus allowing for the development and implementation of hazard mitigation procedures. The chemical changes of sodium alanates associated with exposure to oxygen and water vapor have been characterized by thermal decomposition analysis using simultaneous thermogravimetric modulated beam mass spectrometry (STMBMS) and X-ray diffraction methods. Partial oxidation of sodium alanate, an alkali metal complex hydride, results in destabilization of the remaining hydrogen-containing material. At temperatures below 70 °C, reaction of sodium alanate with water generates potentially combustible mixtures of H2 and O2. In addition to identifying the reaction hazards associated with the oxidation of alkali-metal-containing complex hydrides, potential treatment methods are identified that chemically stabilize the oxidized material and reduce the hazard associated with handling the contaminated metal hydrides.
40 CFR 63.7570 - Who implements and enforces this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Industrial, Commercial, and Institutional Boilers and Process Heaters Other Requirements and Information § 63.7570 Who implements and...
NASA Astrophysics Data System (ADS)
Schoessow, F. S.; Li, Y.; Howe, P. D.
2016-12-01
Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves can be more comprehensively assessed by coupling the traditional examination of natural hazards, using remote sensing and geospatial analysis techniques, with human vulnerability factors and individual perceptions of hazards. By analyzing remote-sensed and empirical survey data alongside national hazard advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat wave risk perceptions remains relatively unexplored despite the intensification of extreme heat events. The use of "human sensors", georeferenced and timestamped individual response data, provides invaluable contextualized data at high spatial resolution, which will enable policy-makers to implement targeted strategies for risk prevention, mitigation, and communication more effectively. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.
1993-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.
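The core idea, that a rollback data hazard is an anti-dependency shorter than the rollback distance, can be illustrated with a toy detector. This is a sketch under simplified assumptions (a straight-line instruction list with explicit register names), not the paper's actual compiler transformation:

```python
ROLLBACK_N = 4  # hypothetical maximum rollback distance, in instructions

# Each instruction: (destination register, tuple of source registers).
program = [
    ("r1", ("r2", "r3")),   # 0
    ("r4", ("r1",)),        # 1
    ("r2", ("r5",)),        # 2: overwrites r2, which instruction 0 reads
    ("r6", ("r4",)),        # 3
]

def rollback_hazards(instrs, n):
    """Find anti-dependencies shorter than the rollback distance n:
    instruction i overwrites a register read by an earlier instruction j
    with i - j < n, so re-executing j after a rollback past i would read
    the new (wrong) value unless a read buffer or a compiler transformation
    (e.g. register renaming) preserves the old one."""
    hazards = []
    for i, (dest, _) in enumerate(instrs):
        for j in range(max(0, i - n + 1), i):
            if dest in instrs[j][1]:
                hazards.append((j, i, dest))
    return hazards
```

Running the detector on the toy program flags the write to `r2` at instruction 2 against the read at instruction 0; in the hybrid scheme, such a hazard would be resolved either by the operand read buffer or by a compiler-driven transformation, whichever is cheaper.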
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
...The Food and Drug Administration (FDA) is proposing to amend its regulation for Current Good Manufacturing Practice In Manufacturing, Packing, or Holding Human Food (CGMPs) to modernize it and to add requirements for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish and implement hazard analysis and risk-based preventive controls for human food. FDA also is proposing to revise certain definitions in FDA's current regulation for Registration of Food Facilities to clarify the scope of the exemption from registration requirements provided by the FD&C Act for "farms." FDA is taking this action as part of its announced initiative to revisit the CGMPs, which were last revised in 1986, and to implement new statutory provisions in the FD&C Act. The proposed rule is intended to build a food safety system for the future that makes modern, science- and risk-based preventive controls the norm across all sectors of the food system.
BIOREMEDIATION OF HAZARDOUS WASTE SITES: PRACTICAL APPROACHES TO IMPLEMENTATION (EPA/625/K-96/001)
This document contains abstracts and slide hardcopy for the U.S. Environmental Protection Agency's (EPA's) "Seminar Series on Bioremediation of Hazardous Waste Sites: Practical Approaches to Implementation." This technology transfer seminar series, sponsored by EPA's Biosystems ...
A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city
NASA Astrophysics Data System (ADS)
Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan
2013-04-01
We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, defined as a set of assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario, and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated with a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of a correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage to that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficult to access or simply unavailable to the public. Using a heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format.
We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low-probability, high-consequence (extreme) events when correlations are implemented. While the virtual-city concept can suggest the theoretical benefits of multi-risk assessment for decision support, demonstrating its real-world practicality will require the study of real test sites.
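The simulation logic described above, conditional event rates (process I), damage-dependent vulnerability (process II) and loss-depleted exposure (process III), can be sketched in a few lines. All rates, boost factors and the damage model below are invented placeholders, not the study's values:

```python
import random

# Hypothetical parameters (not from the paper): two hazards with base annual
# rates; an occurrence of hazard 0 doubles hazard 1's rate (process I).
BASE_RATE = [0.3, 0.2]
TRIGGER_BOOST = {(0, 1): 2.0}

def simulate_year(exposure=1.0, seed=None):
    """One Monte Carlo risk scenario: total loss over a 1-year time series."""
    rng = random.Random(seed)
    rate = list(BASE_RATE)
    damage_state = 0.0          # accumulated damage raises vulnerability (process II)
    total_loss = 0.0
    for month in range(12):
        for h in range(len(rate)):
            if rng.random() < rate[h] / 12:          # an event of hazard h occurs
                intensity = rng.random()
                # toy vulnerability curve: prior damage amplifies the damage ratio
                mean_damage_ratio = min(1.0, intensity * (1 + damage_state))
                loss = mean_damage_ratio * exposure
                exposure -= loss                     # no reconstruction (process III)
                total_loss += loss
                damage_state += mean_damage_ratio
                # conditional rate update (process I, correlation matrix entry)
                for (src, dst), k in TRIGGER_BOOST.items():
                    if src == h:
                        rate[dst] = BASE_RATE[dst] * k
    return total_loss

# Many scenarios together approximate the aggregated loss distribution.
losses = [simulate_year(seed=s) for s in range(1000)]
```

Sorting `losses` and reading off tail quantiles reproduces, in miniature, the aggregated loss curves on which the low-probability, high-consequence effect of correlations becomes visible.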
NASA Astrophysics Data System (ADS)
DELİCE, Yavuz
2015-04-01
Highways, in both urban and intercity locations, are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur, first from highway project design through the construction and operation stages, and later during highway maintenance and repair, have to be taken into consideration. Assessment of the risks posed by such adverse situations is very important in terms of project design, construction, operation, maintenance and repair costs. Hazard and natural disaster risk analysis largely depends on defining the likelihood of the probable hazards on the highways; however, the assets at risk and the impacts of the events must also be examined and rated in their own right. Through these activities, intended improvements against natural hazards and disasters are made using the Failure Mode and Effects Analysis (FMEA) method, and their effects are analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economical and effective solution. Beyond guiding the measures taken for the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows:
• Natural disasters
1. Meteorological natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.)
2. Geological natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.)
• Human-originated disasters
1. Transport accidents (traffic accidents) originating from road surface defects (icing, signaling-related malfunctions and risks), fire or explosion, etc.
In this study, risk analysis of urban and intercity motorways against natural disasters and hazards was performed with the FMEA method, and solutions were proposed against these risks. Keywords: Failure Modes Effects Analysis (FMEA), Pareto Analysis (PA), Highways, Risk Management.
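FMEA prioritization is commonly computed as a Risk Priority Number (RPN), the product of severity, occurrence and detection ratings, followed by a Pareto-style ranking. The highway hazards and the 1-10 ratings below are hypothetical placeholders, not values from the study:

```python
# Hypothetical highway hazard ratings on 1-10 scales (higher = worse;
# for detection, higher means harder to detect before failure).
hazards = {
    "flood":      {"severity": 8,  "occurrence": 6, "detection": 4},
    "icing":      {"severity": 6,  "occurrence": 7, "detection": 3},
    "earthquake": {"severity": 10, "occurrence": 2, "detection": 9},
    "landslide":  {"severity": 7,  "occurrence": 4, "detection": 5},
}

def rpn(rating):
    """Risk Priority Number = severity x occurrence x detection."""
    return rating["severity"] * rating["occurrence"] * rating["detection"]

# Pareto ranking: address the highest-RPN failure modes first.
ranked = sorted(hazards, key=lambda h: rpn(hazards[h]), reverse=True)
```

With these invented ratings, flooding (RPN 192) would head the Pareto list ahead of earthquakes (180), so mitigation budget would be directed there first.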
Landslide risk models for decision making.
Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio
2009-11-01
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
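The risk model described, expected monetary loss as the product of hazard probability, vulnerability and exposure value, can be sketched as follows; the zoning cells and all figures are invented for illustration, not results from the study:

```python
def expected_loss(p_hazard, vulnerability, exposure_value):
    """Expected annual monetary loss for one zoning cell:
    hazard probability x vulnerability (0-1 damage fraction) x value at risk."""
    return p_hazard * vulnerability * exposure_value

# Hypothetical cells: (annual landslide probability, vulnerability, value in EUR).
cells = {
    "A": (0.02, 0.5, 1_000_000),
    "B": (0.10, 0.3, 200_000),
    "C": (0.01, 0.9, 5_000_000),
}
losses = {name: expected_loss(*params) for name, params in cells.items()}
# Mitigation is most cost-effective where expected losses are highest.
priority = max(losses, key=losses.get)
```

In this toy zoning, cell C dominates despite its low hazard probability, because of its high vulnerability and exposure value, which is exactly the kind of prioritization the risk models are meant to support.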
Using qualitative psychology to investigate HACCP implementation barriers.
Taylor, Eunice; Taylor, Joanne Zaida
2004-02-01
Hazard Analysis Critical Control Point (HACCP) is a system of food safety management that in the last few decades has become an increasing part of national government and international strategy to reduce the prevalence of foodborne disease. Yet despite wide dissemination and scientific support of its principles, successful HACCP implementation has been limited. There has been very little in-depth consideration of the reasons behind this, and qualitative psychological research examining the interplay of factors involved is almost non-existent. For this study, therefore, four in-depth narrative interviews were carried out with small business owners attempting to implement HACCP. Non-prescriptive analysis of the interviews revealed five key themes acting as 'barriers' to successful implementation: perceptions of HACCP as difficult, burdensome and unnecessary, and hindrance by staff and external problems. The analysis furthermore showed the complexity of the issues underpinning problems with HACCP implementation and the way in which they operate at the knowledge, attitude and behavioural levels. From this, essential issues to be addressed for successful HACCP implementation are put forward.
Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo
2013-09-01
Hazard Analysis and Critical Control Points (HACCP) is a preventive system which seeks to ensure food safety and security. It allows product protection and correction of errors, improves the costs derived from quality defects and reduces final over-control. In this paper, the system is applied to the cultivation line of mushrooms and other edible cultivated fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and the harvest (stage 7) have been considered critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products, or authorized ones above the permitted dose (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). This knowledge will allow any plant dedicated to the cultivation of mushrooms or other edible fungi to implement self-control of its production based on the HACCP system.
Using GIS in risk analysis: a case study of hazardous waste transport.
Lovett, A A; Parfitt, J P; Brainard, J S
1997-10-01
This paper provides an illustration of how a geographic information system (GIS) can be used in risk analysis. It focuses on liquid hazardous waste transport and utilizes records archived by the London Waste Regulatory Authority. This data source provides information on the origin and destination of each waste stream, but not the route followed during transport. A GIS was therefore employed to predict the paths used, taking into account different routing criteria and characteristics of the available road network. Details were also assembled on population distribution and ground-water vulnerability, thus providing a basis for evaluating the potential consequences of a waste spillage during transport. Four routing scenarios were implemented to identify sections of road which consistently saw heavy traffic. These simulations also highlighted that some interventions could lead to risk tradeoffs rather than hazard mitigation. Many parts of the research would not have been possible without a GIS, and the study demonstrates the considerable potential of such software in environmental risk assessment and management.
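The routing logic, generating alternative paths under different criteria to expose risk tradeoffs, can be sketched with a plain Dijkstra search over a toy network. The graph, distances and population-exposure figures below are invented, not the London data set:

```python
import heapq

# Hypothetical road network: undirected edge -> (length_km, population_exposed).
edges = {
    ("depot", "a"): (4, 9000), ("depot", "b"): (6, 1000),
    ("a", "site"): (3, 8000),  ("b", "site"): (4, 1500),
}

def shortest_path(graph, start, goal, weight):
    """Dijkstra over an undirected graph with a pluggable weight criterion."""
    adj = {}
    for (u, v), attrs in graph.items():
        adj.setdefault(u, []).append((v, attrs))
        adj.setdefault(v, []).append((u, attrs))
    frontier, seen = [(0, start, [start])], set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, attrs in adj.get(node, []):
            if nxt not in seen:
                heapq.heappush(frontier, (cost + weight(attrs), nxt, path + [nxt]))
    return None

# Two routing scenarios: minimize distance vs. minimize population exposure.
_, by_distance = shortest_path(edges, "depot", "site", lambda a: a[0])
_, by_exposure = shortest_path(edges, "depot", "site", lambda a: a[1])
```

In this toy network the two criteria select different roads (the short route passes the densely populated corridor), illustrating how rerouting for one objective can trade one risk for another rather than eliminating it.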
Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process
NASA Technical Reports Server (NTRS)
Ray, Paul S.
2002-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools at the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the formulation phase of a project, but there is not enough guidance on how to apply these tools in the Continuous Risk Management (CRM) process. Yet the way the safety and risk assessment tools are used makes a significant difference in the effectiveness of the risk management function: decisions regarding what events are to be included in the analysis and to what level of detail the analysis should be continued significantly affect the effectiveness of a risk management program. The choice of risk analysis tool also depends on the phase of a project; for example, at the initial phase, when not much data is available on hardware, a standard FMEA cannot be applied and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty of applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.
Environmental Assessment for Management of South End of Runway Wetlands, Moody AFB, Georgia
2010-11-01
implement a management program for the wetlands at the south end of runway (EOR) at Moody AFB to reduce the bird/wildlife aircraft strike hazard (BASH) risk...because birds and other wildlife pose an increased bird/wildlife aircraft strike hazard (BASH) risk to aircraft utilizing the Moody AFB airfield. In...support of the military mission, Moody AFB has implemented a BASH management program designed to minimize aircraft exposure to potentially hazardous
SAR interferometry monitoring along the ancient Rome City Walls -the PROTHEGO project case study
NASA Astrophysics Data System (ADS)
Carta, Cristina; Cimino, Maria gabriella; Leoni, Gabriele; Marcelli, Marina; Margottini, Claudio; Spizzichino, Daniele
2017-04-01
Led by the Italian Institute for Environmental Protection and Research, in collaboration with the NERC British Geological Survey, the Geological and Mining Institute of Spain, the University of Milano-Bicocca and the Cyprus University of Technology, the PROTHEGO project, co-funded in the framework of the JPI on Cultural Heritage EU program (2015-2018), brings an innovative contribution to the analysis of geo-hazards in areas of cultural heritage in Europe. The project applies InSAR techniques to monitor monuments and sites that are potentially unstable due to natural geo-hazards. After the remote sensing investigation, detailed geological interpretation, hazard analysis, local-scale monitoring, advanced modelling and field surveying are implemented for selected case studies: the Alhambra in Granada (ES); the Choirokoitia village (CY); the Derwent Valley Mills (UK); and the Pompeii archaeological site and the historical centre of Rome (IT). In this work, in particular, we focus on ground deformation measurements (obtained by satellite SAR interferometry) and on their interpretation with respect to the ancient Rome City Walls. The research activities, carried out jointly with the Superintendence's technicians, foresee the implementation of a dedicated web GIS platform as a final repository for data storage and spatial data elaboration. The entire circuit of the ancient city walls (both the Mura Aureliane and the Mura Gianicolensi) was digitized and georeferenced. All the elements (towers, gates and wall segments) were drawn and collected in order to produce a map of elements at risk. A detailed historical analysis (over the last twenty years) of the ground and structural deformations was performed. A specific data sheet of ruptures was created and filled in to produce a geographic inventory of past damage. This data sheet contains the following attributes: triggering data; typology of damage; dimension; triggering mechanism; presence of restoration works.
More than thirty events were collected. The most frequent types of damage relate to human impacts, detachment of the brick outer surface and wall collapse. The resulting damage layer was compared with different local hazard maps (e.g. landslide, subsidence, seismic) and also with the PS (monitored points) coming from the satellite analysis. The satellite monitoring data and analysis were based on the processing of COSMO-SkyMed image data (from 2011 to 2014). The data were obtained from the Extraordinary Monitoring Project Plan, implemented by the Italian Environmental Ministry. The preliminary analysis did not show large areas affected by deformations. A wide area affected by subsidence phenomena was detected in the southern portion of the walls (close to the Ostiense district), while smaller and more localized detachments were detected in the northern sector. Starting from these first results, COSMO-SkyMed SAR interferometry analysis seems to be very efficient due to its capability of providing a large number of deformation measurements over the whole site and its structures at relatively small cost and without any impact. Cross-analysis between the interferometric results, natural hazards and historical data of the site (e.g. collapses, works) is still in progress in order to define a forecasting model aiming at early identification of areas subject to potential instability or sudden collapse.
NASA Astrophysics Data System (ADS)
Li, P.
2016-12-01
In this study, on the basis of 3,200 km of shallow stratigraphic sections and sidescan sonar data from the coastal area of the Yellow River Delta, we delineated and interpreted a total of seven types of typical hazardous geology, including hazardous geology in the shallow strata (buried ancient channels and strata disturbance) and hazardous geology in the seabed surface strata (pits, erosive residual bodies, sand patches, sand waves and scour channels). We selected eight parameters representing the development scale of the hazardous geology as zoning indexes, namely the number of hazardous geology types, pit depth, height of erosive residual bodies, length of scour channels, area of sand patches, length of sand waves, width of buried ancient channels and depth of strata disturbance, and performed grid processing of the research area to calculate the arithmetic sum of the zoning indexes for each unit grid one by one. We then adopted the cluster analysis method to divide the near-shore waters of the Yellow River Delta into five hazardous geology areas, namely the serious erosion disaster area controlled by Diaokou lobe waves, the multi-disaster hazardous geology area under the combined action of the Shenxiangou lobe river-wave flow, the accumulation-type hazardous geology area controlled by the current estuary river, the single-disaster hazardous geology area in the deep water area and the potential hazardous geology area of the Chengdao Oilfield. All four of the main factors affecting the development of hazardous geology, namely the diffusion and movement of the sediment flux of the Yellow River water entering the sea, seabed stability, bottom sediment type and distribution, and the marine hydrodynamic characteristics, show significant regional differentiation patterns.
These characteristics and laws are consistent with the above-mentioned zoning results, in which the distribution, scale and genetic mechanism of hazardous geology are considered comprehensively. This indicates that the hazardous geology zoning based on the cluster analysis is a new attempt in research regarding the hazardous geology zoning of the near-shore waters of the modern Yellow River Delta and that this type of zoning has a high level of reasonability.
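The grid-and-cluster zoning described above can be sketched in code. This is a hypothetical reconstruction, not the authors' implementation: the synthetic data, the min-max normalization, and the use of a simple 1-D k-means all stand in for details the abstract does not give.

```python
import numpy as np

# Hypothetical reconstruction of the zoning workflow (names and data are
# assumptions, not the authors' code): eight per-cell indexes describing
# hazardous-geology scale are normalized and summed, then the cells are
# grouped into five zones; a tiny 1-D k-means stands in for the paper's
# unspecified clustering method.

def kmeans_1d(x, k, iters=50, seed=0):
    """Cluster scalar cell scores into k groups with plain 1-D k-means."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        # assign each cell to the nearest cluster center
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):        # leave empty clusters in place
                centers[j] = x[labels == j].mean()
    return labels

rng = np.random.default_rng(1)
# columns: n_hazard_types, pit_depth, residual_height, channel_length,
#          patch_area, wave_length, paleochannel_width, disturbance_depth
indexes = rng.random((200, 8))             # 200 synthetic grid cells
norm = (indexes - indexes.min(0)) / (indexes.max(0) - indexes.min(0))
cell_score = norm.sum(axis=1)              # "arithmetic sum" per unit grid
zone = kmeans_1d(cell_score, k=5)          # five hazardous-geology areas
```

With real survey data, each row would hold the measured index values for one grid cell, and the five resulting clusters would then be interpreted against the sediment, hydrodynamic, and seabed-stability maps.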
SB 1082 -- Unified hazardous materials/waste program: Local implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, W.
California Senate Bill 1082 was signed into law in the fall of 1993 because business and industry believed there were too many hazardous materials inspectors asking the same questions, looking at the same items and requiring similar information on several variations of the same form. Industry was not happy with the large diversity of programs, each with its own inspectors, permits and fees, essentially doing what industry believed was the same inspection. SB 1082 will allow local city and county agencies to apply to the California Environmental Protection Agency to become a Certified Unified Program Agency (CUPA) or work with a CUPA as a Participating Agency (PA) to manage specific program elements. The CUPA will unify six regulatory programs including hazardous waste/tiered permitting, aboveground storage tanks, underground storage tanks, business and area plans/inventory or disclosure, acutely hazardous materials/risk management prevention and Uniform Fire Code programs related to hazardous materials inventory/plan requirements. The bill requires the CUPA to (1) implement a permit consolidation program; (2) implement a single fee system with a state surcharge; (3) consolidate, coordinate and make consistent any local or regional requirements or guidance documents; and (4) implement a single unified inspection and enforcement program.
Wildfire Research in an Environmental Hazards Course: An Active Learning Approach
ERIC Educational Resources Information Center
Wall, Tamara U.; Halvorson, Sarah J.
2011-01-01
Creating opportunities for students to actively apply hazards theory to real-life situations is often a challenge in hazards geography courses. This article presents a project, the Jocko Lakes Fire Project, that implemented learning strategies to encourage students to be active in wildfire hazards research. Wildfire hazards stand out as an…
Stereo-vision-based terrain mapping for off-road autonomous navigation
NASA Astrophysics Data System (ADS)
Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.
2009-05-01
Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.
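The per-cell map payload and the single-frame-to-world-map merge described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not JPL's implementation; the field names, the confidence-weighted averaging, and the sticky no-go rule are assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch of the per-cell terrain map payload the abstract
# describes (field names, the confidence-weighted average, and the sticky
# no-go rule are assumptions, not JPL's implementation).

@dataclass
class MapCell:
    elevation_m: float = 0.0
    terrain_class: str = "unknown"    # e.g. soil, vegetation, water
    roughness: float = 0.0
    traversability_cost: float = 0.0  # higher = harder to drive over
    confidence: float = 0.0           # 0..1
    no_go: bool = False               # set by any binary obstacle detector

def fuse(cell: MapCell, new: MapCell) -> MapCell:
    """Merge a single-frame observation into the world map: a no-go label
    sticks, continuous fields are confidence-weighted (temporal filter)."""
    w = cell.confidence + new.confidence
    if w == 0.0:
        return new
    mix = lambda a, b: (a * cell.confidence + b * new.confidence) / w
    return MapCell(
        elevation_m=mix(cell.elevation_m, new.elevation_m),
        terrain_class=(new.terrain_class
                       if new.confidence >= cell.confidence
                       else cell.terrain_class),
        roughness=mix(cell.roughness, new.roughness),
        traversability_cost=mix(cell.traversability_cost,
                                new.traversability_cost),
        confidence=min(1.0, w),
        no_go=cell.no_go or new.no_go,
    )

fused = fuse(MapCell(traversability_cost=2.0, confidence=0.5),
             MapCell(traversability_cost=4.0, confidence=0.5, no_go=True))
```

Keeping the no-go flag monotone while averaging the continuous fields is one simple way to reconcile the binary-detector and cost-analysis outputs in a single cell.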
Stereo Vision Based Terrain Mapping for Off-Road Autonomous Navigation
NASA Technical Reports Server (NTRS)
Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.
2009-01-01
Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.
Wilhelm, Barbara; Rajić, Andrijana; Greig, Judy D; Waddell, Lisa; Harris, Janet
2011-09-01
Hazard analysis critical control point (HACCP) programs have been endorsed and implemented globally to enhance food safety. Our objective was to identify, assess, and summarize or synthesize the published research investigating the effect of HACCP programs on microbial prevalence and concentration on food animal carcasses in abattoirs through primary processing. The results of microbial testing pre- and post-HACCP implementation were reported in only 19 studies, mostly investigating beef (n=13 studies) and pork (n=8 studies) carcasses. In 12 of 13 studies measuring aerobic bacterial counts, reductions were reported on beef (7/8 studies), pork (3/3), poultry (1/1), and sheep (1/1). Significant (p<0.05) reductions in prevalence of Salmonella spp. were reported in studies on pork (2/3 studies) and poultry carcasses (3/3); no significant reductions were reported on beef carcasses (0/8 studies). These trends were confirmed through meta-analysis of these data; however, a powerful meta-analysis was precluded by the overall scarcity of individual studies and the significant heterogeneity across studies. Australia reported extensive national data spanning the period from 4 years prior to HACCP implementation to 4 years post-HACCP, indicating reductions in microbial prevalence and concentration on beef carcasses in abattoirs slaughtering beef for export; however, the effect of abattoir changes initiated independently of HACCP could not be excluded. More primary research and access to relevant proprietary data are needed to properly evaluate HACCP program effectiveness using modeling techniques capable of differentiating the effects of HACCP from other concurrent factors.
Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.
Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro
2016-06-01
A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity-building interventions, and evaluation. Birnbaum ML, Daily EK, O'Rourke AP, Loretti A. Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.
Vulnerability Assessment Using LIDAR Data in Silang-Sta Rosa Subwatershed, Philippines
NASA Astrophysics Data System (ADS)
Bragais, M. A.; Magcale-Macandog, D. B.; Arizapa, J. L.; Manalo, K. M.
2016-10-01
Silang-Sta. Rosa Subwatershed is experiencing rapid urbanization. Its downstream area is already urbanized and development is moving fast upstream. With the rapid conversion of pervious to impervious areas and the increased frequency of intense rainfall events, the downstream part of the watershed is at risk of flood hazard. The widely used freeware HEC-RAS (Hydrologic Engineering Center - River Analysis System) model was used to implement a 2D unsteady flow analysis and develop a flood hazard map. The LiDAR-derived digital elevation model (DEM) with 1 m resolution provided the detailed terrain that is vital for producing a reliable flood extent map that can be used for an early warning system. With detailed information from the simulation, such as the areas to be flooded and the predicted depth and duration of flooding, specific flood forecasting and mitigation plans can now be provided even at the community level. The methodology of using 2D unsteady flow modelling and a high-resolution DEM in a watershed can be replicated in neighbouring watersheds, especially areas that are not yet urbanized, so that their development can be guided to be flood-hazard resilient. Local government units (LGUs) all over the country will benefit from having a high-resolution flood hazard map.
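The step from a simulated water surface to a classed flood hazard map can be illustrated with a small sketch. The flat water stage and the depth thresholds are assumptions for illustration; HEC-RAS computes a spatially varying water surface, and the study's own class boundaries are not given in the abstract.

```python
import numpy as np

# Illustrative sketch of turning simulated flood depths into hazard
# classes for a map. The single flat water stage and the depth thresholds
# below are assumptions, not values from the study.

def classify_depth(depth_m: np.ndarray) -> np.ndarray:
    """Bin per-cell flood depth (m) into 0=dry, 1=low, 2=medium, 3=high."""
    bins = np.array([0.01, 0.5, 1.5])   # wet / medium / high cut-offs
    return np.digitize(depth_m, bins)

dem = np.array([[10.0, 10.5],
                [11.0, 12.0]])                     # terrain elevation (m)
water_surface = 11.0                               # simulated stage (m)
depth = np.clip(water_surface - dem, 0.0, None)    # depth = stage - ground
hazard = classify_depth(depth)                     # per-cell hazard class
```

In practice the depth grid would come from the 2D unsteady simulation at each output time step, and the class raster would be styled as the published hazard map.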
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, much progress has been made in generating health-effects and exposure data. However, where detailed quantitative risk analysis is concerned, more research is needed, especially quantitative measures of worker exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university where nanomaterials are widely used and produced, we also face the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies, and the challenges and issues related to the process. Since exact data on nanomaterial hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same whether one uses a simple methodology that determines only exposure potential or one that also takes the hazardousness of the ENPs into account. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity and toxicity) become available, more differentiation will be possible in determining the risk for different materials. On the protective-measures side, all CB methodologies lean toward overprotection; some suggest comprehensive protective/preventive measures while others offer only basic advice. The implementation and control of protective measures in a research environment will also be discussed.
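A control-banding decision of the kind discussed above can be sketched as a small lookup. The band names, the additive matrix, and the precautionary treatment of unknown hazard are illustrative assumptions, not the in-house methodology the paper describes.

```python
# Minimal control-banding sketch. The band names, the additive matrix and
# the precautionary handling of unknown hazard are illustrative
# assumptions, not the in-house methodology discussed in the paper.

HAZARD_BANDS = ("unknown", "low", "medium", "high")   # hazardousness class
EXPOSURE_BANDS = ("low", "medium", "high")            # exposure potential

def control_band(hazard: str, exposure: str) -> int:
    """Return control band 1..4 (4 = containment / expert review)."""
    h = HAZARD_BANDS.index(hazard)
    if h == 0:                          # unknown hazard: treat as high,
        h = HAZARD_BANDS.index("high")  # i.e. the precautionary approach
    e = EXPOSURE_BANDS.index(exposure)
    return min(4, h + e)
```

With the hazard band unknown, the result depends only on exposure, which mirrors the paper's observation that an exposure-only methodology and a hazard-aware one give essentially the same outcome while toxicity data are missing.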
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. It elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operational perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several points: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical and social) scenarios are taken into account. This tool, dedicated to stakeholders, should finally make it possible to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
EPOS Thematic Core Service Anthropogenic Hazards: Implementation Plan
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw; Grasso, Jean Robert; Schmittbuhl, Jean; Styles, Peter; Kwiatek, Grzegorz; Sterzel, Mariusz; Garcia, Alexander
2015-04-01
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to integrate distributed research infrastructures (RI) to facilitate and stimulate research on anthropogenic hazards (AH) especially those associated with the exploration and exploitation of geo-resources. The innovative element is the uniqueness of the integrated RI which comprises two main deliverables: (1) Exceptional datasets, called "episodes", which comprehensively describe a geophysical process; induced or triggered by human technological activity, posing hazard for populations, infrastructure and the environment, (2) Problem-oriented, bespoke services uniquely designed for the discrimination and analysis of correlations between technology, geophysical response and resulting hazard. These objectives will be achieved through the Science-Industry Synergy (SIS) built by EPOS WG10, ensuring bi-directional information exchange, including unique and previously unavailable data furnished by industrial partners. The Episodes and services to be integrated have been selected using strict criteria during the EPOS PP. The data are related to a wide spectrum of inducing technologies, with seismic/aseismic deformation and production history as a minimum data set requirement and the quality of software services is confirmed and referenced in literature. Implementation of TCS AH is planned for four years and requires five major activities: (1) Strategic Activities and Governance: will define and establish the governance structure to ensure the long-term sustainability of these research infrastructures for data provision through EPOS. (2) Coordination and Interaction with the Community: will establish robust communication channels within the whole TCS AH community while supporting global EPOS communication strategy. (3) Interoperability with EPOS Integrated Core Service (ICS) and Testing Activities: will coordinate and ensure interoperability between the RIs and the ICS. 
Within this modality a functional e-research environment with access to High-Performance Computing will be built. A prototype for such an environment is already under construction and will become operational in mid-2015 (is-epos.eu). (4) Integration of AH Episodes: will address at least 20 global episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production which will be integrated into the e-environment of TCS AH. All the multi-disciplinary heterogeneous data from these particular episodes will be transformed to unified structures to form integrated data sets articulated with the defined standards of ICS and other TCSs. (5) Implementation of services for analyzing Episodes: will deliver the protocols and methodologies for analysis of the seismic/deformation response to time-varying georesource exploitation technologies on long and short time scales and the related time- and technology-dependent seismic hazard issues.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
Safety analysis and review system (SARS) assessment report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, E.T.
1981-03-01
Under DOE Order 5481.1, Safety Analysis and Review System for DOE Operations, safety analyses are required for DOE projects in order to ensure that: (1) potential hazards are systematically identified; (2) potential impacts are analyzed; (3) reasonable measures have been taken to eliminate, control, or mitigate the hazards; and (4) there is documented management authorization of the DOE operation based on an objective assessment of the adequacy of the safety analysis. This report is intended to provide the DOE Office of Plans and Technology Assessment (OPTA) with an independent evaluation of the adequacy of the ongoing safety analysis effort. As part of this effort, a number of site visits and interviews were conducted, and FE SARS documents were reviewed. The latter included SARS Implementation Plans for a number of FE field offices, as well as safety analysis reports completed for certain FE operations. This report summarizes SARS related efforts at the DOE field offices visited and evaluates the extent to which they fulfill the requirements of DOE 5481.1.
Informing Workers of Chemical Hazards: The OSHA Hazard Communication Standard.
ERIC Educational Resources Information Center
American Chemical Society, Washington, DC.
Practical information on how to implement a chemical-related safety program is outlined in this publication. Highlights of the federal Occupational Safety and Health Administrations (OSHA) Hazard Communication Standard are presented and explained. These include: (1) hazard communication requirements (consisting of warning labels, material safety…
Advanced Environmental Monitoring and Control Program: Strategic Plan
NASA Technical Reports Server (NTRS)
Schmidt, Gregory
1996-01-01
Human missions in space, from short-duration shuttle missions lasting no more than several days to the medium- and long-duration missions planned for the International Space Station, face a number of hazards that must be understood and mitigated for the mission to be carried out safely. Among these hazards are those posed by the internal environment of the spacecraft itself: outgassing of toxic vapors from plastics and other items, failures or off-nominal operation of spacecraft environmental control systems, and accidental exposure to hazardous compounds used in experiments all present potential hazards that, while individually small, may accumulate and pose a danger to crew health. The first step toward mitigating these hazards is understanding the internal environment of the spacecraft and the compounds contained within it. Future spacecraft will have integrated networks of redundant sensors that will not only inform the crew of hazards but will pinpoint the problem location and, through analysis by intelligent systems, recommend and even implement a course of action to stop the problem. This strategic plan details strategies for determining NASA's requirements for environmental monitoring and control systems for future spacecraft, and the goals and objectives of a program to answer these needs.
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As conventionally done, strategies for incorporating accident-prevention measures in a hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard-reduction (safety) measures, are not linked interactively in existing methodologies. This prevents a quantitative assessment of the impact of safety measures on risk control. We have attempted to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent of risk reduction achieved by each successive safety measure. Based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), it also indicates whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.
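The PFTA component can be illustrated with a minimal fault tree calculation. The example tree, the event probabilities, and the safety measure below are invented for illustration; only the independence assumption and the AND/OR gate algebra are standard.

```python
# Minimal sketch of the PFTA step: top-event probability from basic-event
# probabilities through AND/OR gates, assuming independent events. The
# toy tree and numbers are illustrative, not from the paper's case study.

def p_and(*ps):
    """All contributing events must occur: product of probabilities."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """At least one event occurs: 1 - product of (1 - p)."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

# Toy top event: a release occurs if (valve fails AND the alarm is
# missed) OR the vessel ruptures outright.
p_release = p_or(p_and(1e-2, 1e-1), 1e-4)

# A safety measure halving the valve failure rate shows its risk impact,
# the kind of interactive risk-assessment/safety-measure link the
# methodology aims for:
p_release_after = p_or(p_and(5e-3, 1e-1), 1e-4)
```

Recomputing the top-event probability after each candidate measure gives the "extent of risk reduction by each successive safety measure" quantitatively.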
Probabilistic Tsunami Hazard Analysis
NASA Astrophysics Data System (ADS)
Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.
2006-12-01
The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though such events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective earthquake-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunamis. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip).
This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties, and present the overall hazard in a meaningful and consistent way.
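The slip-weighted Green's function summation described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the array shapes and the toy numbers are assumptions.

```python
import numpy as np

def synthesize_tsunami(subfault_waveforms, slips):
    """Synthesize a scenario tsunami waveform at one coastal point by
    summing precomputed unit-slip subfault waveforms (Green's functions),
    each weighted by the slip on its subfault.

    subfault_waveforms : (n_subfaults, n_samples) array, one waveform per
                         subfault computed for unit slip.
    slips              : (n_subfaults,) slip distribution.
    """
    gf = np.asarray(subfault_waveforms, dtype=float)
    slips = np.asarray(slips, dtype=float)
    return slips @ gf  # slip-weighted sum over subfaults

# Toy example: two subfaults, three time samples.
gf = np.array([[0.1, 0.3, 0.2],
               [0.0, 0.2, 0.4]])
wave = synthesize_tsunami(gf, [2.0, 1.0])  # 2 units and 1 unit of slip
```

Because the summation is linear, waveforms for any slip distribution on the same subfault set come at the cost of one matrix-vector product, which is what makes thousands of probabilistic scenarios tractable.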
Rockfall Hazard Process Assessment: Project Summary
DOT National Transportation Integrated Search
2017-10-01
The Montana Department of Transportation (MDT) implemented its Rockfall Hazard Rating System (RHRS) between 2003 and 2005, obtaining information on the state's rock slopes and their associated hazards. The RHRS data facilitated decision-making in an ...
FY 2017 Hazardous Waste Management Grant Program for Tribes
This notice announces the availability of funds and solicits proposals from federally-recognized tribes or intertribal consortia for the development and implementation of hazardous waste programs and for building capacity to address hazardous waste
NASA Astrophysics Data System (ADS)
Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip
2013-09-01
Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.
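The systematic treatment of magnitude-frequency uncertainty can be illustrated with a Gutenberg-Richter sketch in which the b-value is sampled over a plausible range. The a- and b-values and the sampling range below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def annual_rate_above(m, a=5.0, b=1.0):
    """Gutenberg-Richter annual rate of earthquakes with magnitude >= m:
    log10 N(>= m) = a - b * m."""
    return 10.0 ** (a - b * m)

# Propagate b-value uncertainty to the rate of M >= 8 events by sampling
# over a plausible range (a logic-tree-style treatment of the
# magnitude-frequency distribution).
rng = np.random.default_rng(1)
b_samples = rng.uniform(0.8, 1.2, size=10_000)
rates = annual_rate_above(8.0, b=b_samples)
mean_rate = rates.mean()  # exceeds the central-b rate: the rate is a
                          # convex function of b (Jensen's inequality)
```

The point of the sampling is that the hazard integrated over the b-value uncertainty differs from the hazard at the central b-value, so ignoring the uncertainty biases the estimated rate of large events.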
40 CFR 264.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND... plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents to air...
40 CFR 265.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) SOLID WASTES (CONTINUED) INTERIM STATUS STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT... contingency plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents...
40 CFR 265.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) SOLID WASTES (CONTINUED) INTERIM STATUS STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT... contingency plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents...
40 CFR 265.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) SOLID WASTES (CONTINUED) INTERIM STATUS STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT... contingency plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents...
40 CFR 264.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND... plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents to air...
40 CFR 264.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND... plan must be designed to minimize hazards to human health or the environment from fires, explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents to air...
Ng, S K; McLachlan, G J
2003-04-15
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is by maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.
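As a rough illustration of the two-part mixture specification (a logistic model for the failure-type probability, a hazard model for time to failure given the type), the sketch below fits a fully parametric stand-in with exponential component hazards by direct maximum likelihood on simulated, uncensored data. The semi-parametric ECM machinery of the paper is not reproduced, and all parameter values are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated competing-risks data (no censoring, for simplicity):
# covariate x, failure type g in {1, 2}, failure time t.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
p1 = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))  # logistic mixing model
g = np.where(rng.uniform(size=n) < p1, 1, 2)
rate = np.where(g == 1, np.exp(0.3 * x), np.exp(-0.2 * x))
t = rng.exponential(1.0 / rate)

def negloglik(theta):
    """Negative log-likelihood of the mixture: pi_g(x) * f_g(t | x)."""
    a, b, b1, b2 = theta
    pi1 = 1.0 / (1.0 + np.exp(-(a + b * x)))
    r1, r2 = np.exp(b1 * x), np.exp(b2 * x)  # exponential hazard rates
    f1 = r1 * np.exp(-r1 * t)                # component densities
    f2 = r2 * np.exp(-r2 * t)
    lik = np.where(g == 1, pi1 * f1, (1.0 - pi1) * f2)
    return -np.sum(np.log(np.clip(lik, 1e-300, None)))

fit = minimize(negloglik, x0=np.zeros(4), method="Nelder-Mead")
```

With observed failure types and no censoring, the full likelihood factorizes cleanly; the ECM algorithm in the paper handles the harder case where censored observations have unknown failure type and the baseline hazards are left unspecified.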
Seismic risk management solution for nuclear power plants
Coleman, Justin; Sabharwall, Piyush
2014-12-01
Nuclear power plants should safely operate during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to acceptable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise for minimizing seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in American Society of Civil Engineers Standard 4 (ASCE-4), to be released in the winter of 2014, for light water reactor facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for seismic analysis of safety-related nuclear structures such that the responses to design basis seismic events, computed in accordance with the standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. This article highlights the gaps identified at that meeting.
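The convolution of seismic hazard with fragility mentioned in the abstract can be sketched numerically. The power-law hazard curve and the lognormal fragility parameters below are illustrative assumptions, not values from the article.

```python
import numpy as np
from scipy.stats import norm

def annual_failure_frequency(pga, hazard, median_capacity, beta):
    """Convolve a seismic hazard curve (annual frequency of exceedance vs.
    peak ground acceleration) with a lognormal fragility curve
    (conditional failure probability vs. peak ground acceleration)."""
    pga = np.asarray(pga, dtype=float)
    hazard = np.asarray(hazard, dtype=float)
    fragility = norm.cdf(np.log(pga / median_capacity) / beta)
    occurrence = -np.gradient(hazard, pga)  # density of shaking levels
    integrand = fragility * occurrence
    # trapezoidal integration over ground-motion level
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(pga)))

pga = np.linspace(0.05, 2.0, 400)        # ground-motion levels (g)
hazard = 1e-3 * (pga / 0.1) ** -2.0      # illustrative power-law hazard curve
pf = annual_failure_frequency(pga, hazard, median_capacity=1.0, beta=0.4)
```

In this framing, seismic isolation reduces risk by shifting the fragility curve (raising the effective median capacity), which shrinks the overlap between the hazard and fragility curves and hence the annual failure frequency.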
The HSE management system in practice-implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Primrose, M.J.; Bentley, P.D.; Sykes, R.M.
1996-11-01
This paper sets out the strategic issues that must be dealt with when setting up a management system for HSE. It touches on the setting of objectives using a form of risk matrix and the establishment of corporate risk tolerability levels. Such issue management is vital but can be seen as yet another corporate HQ initiative. It must therefore be linked, and made relevant, to those in middle management tasked with implementing the system and also to those at risk 'at the sharp end' of the business. Setting acceptance criteria is aimed at demonstrating a necessary and sufficient level of control or coverage for those hazards considered as being within the objective setting of the Safety or HSE Case. Critical risk areas addressed via the Safety Case, within Shell companies at least, must show how this coverage is extended to critical health and environmental issues. Methods of achieving this are various, ranging from specific Case deliverables (such as the Hazard Register and Accountability Matrices) through to the incorporation of topics from the hazard analysis in toolbox talks and meetings. Risk analysis techniques are increasingly seen as complementary rather than separate, with environmental assessments, health risk assessments and safety risk analyses taking place together and the results being considered jointly. The paper ends with some views on the way ahead regarding the linking of risk decisions to target setting at the workplace, and on how Case information may be retrieved and used on a daily basis.
FY 2018 Hazardous Waste Management Grant Program For Tribes
This notice announces the availability of funds and solicits proposals from federally-recognized tribes or intertribal consortia for the development and implementation of hazardous waste programs and for building capacity to address hazardous waste management
Sadeghi, Samira; Sadeghi, Leyla; Tricot, Nicolas; Mathieu, Luc
2017-12-01
Accident reports are published in order to communicate the information and lessons learned from accidents. An efficient accident recording and analysis system is a necessary step towards improvement of safety. However, currently there is a shortage of efficient tools to support such recording and analysis. In this study we introduce a flexible and customizable tool that allows structuring and analysis of this information. This tool has been implemented under TEEXMA®. We named our prototype TEEXMA®SAFETY. This tool provides an information management system to facilitate data collection, organization, query, analysis and reporting of accidents. A predefined information retrieval module provides ready access to data which allows the user to quickly identify the possible hazards for specific machines and provides information on the source of hazards. The main target audience for this tool includes safety personnel, accident reporters and designers. The proposed data model has been developed by analyzing different accident reports.
ERIC Educational Resources Information Center
Pivarnik, Lori F.; Patnoad, Martha S.; Nyachuba, David; McLandsborough, Lynne; Couto, Stephen; Hagan, Elsina E.; Breau, Marti
2013-01-01
Food safety training materials, targeted for residential childcare institution (RCCI) staff of facilities of 20 residents or fewer, were developed, piloted, and evaluated. The goal was to assist in the implementation of a Hazard Analysis Critical Control Points (HACCP)-based food safety plan as required by Food and Nutrition Service/United States…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-29
...The Food and Drug Administration (FDA) is proposing regulations for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish requirements for current good manufacturing practice in manufacturing, processing, packing, and holding of animal food. FDA also is proposing regulations to require that certain facilities establish and implement hazard analysis and risk-based preventive controls for food for animals. FDA is taking this action to provide greater assurance that animal food is safe and will not cause illness or injury to animals or humans; the action is intended to build an animal food safety system for the future that makes modern, science- and risk-based preventive controls the norm across all sectors of the animal food system.
Socioeconomic Determinants of Adult Mortality in Namibia Using an Event History Analysis.
Kandjimbi, Alina; Nickanor, Ndeyapo; Kazembe, Lawrence N
2014-01-01
Adult mortality remains a neglected public health issue in sub-Saharan Africa, with most policy instruments concentrated on child and maternal health. In developed countries, adult mortality is negatively associated with socioeconomic factors. A similar pattern is expected in developing countries, but has not been extensively demonstrated because of a dearth of data. Understanding the hazard of adult mortality and the factors associated with it is crucial for informing policies and for implementation of interventions aimed at improving adult survival. This paper applied a geo-additive survival model to elucidate the effects of socioeconomic factors on adult mortality in Namibia, controlling for spatial frailties. Results show a clear disadvantage for adults in rural areas, for those not married, and for those from poor households or female-headed households. The hazard of adult mortality was highly variable, with a 1.5-fold difference between areas; the highest hazard was recorded in the north-eastern, central-western and south-western parts of the country. The analysis emphasizes that, for Namibia to achieve its national development goals, targeted interventions should be aimed at poorly resourced adults, particularly in high-risk areas.
Beyond reducing fire hazard: fuel treatment impacts on overstory tree survival
Brandon M. Collins; Adrian J. Das; John J. Battles; Danny L. Fry; Kevin D. Krasnow; Scott L. Stephens
2014-01-01
Fuel treatment implementation in dry forest types throughout the western United States is likely to increase in pace and scale in response to increasing incidence of large wildfires. While it is clear that properly implemented fuel treatments are effective at reducing hazardous fire potential, there are ancillary ecological effects that can impact forest...
Developments in management and technology of waste reduction and disposal.
Rushbrook, Philip
2006-09-01
Scandals and public dangers from the mismanagement and poor disposal of hazardous wastes during the 1960s and 1970s awakened the modern-day environmental movement. Influential publications such as "Silent Spring" and high-profile disposal failures, for example, Love Canal and Lekkerkerk, focused attention on the use of chemicals in everyday life and the potential dangers from inappropriate disposal. This attention has not abated and developments, invariably increasing expectations and tightening requirements, continue to be implemented. Waste, as a surrogate for environmental improvement, is a topic where elected representatives and administrations continually want to do more. This article will chart the recent changes in hazardous waste management emanating from the European Union legislation, now being implemented in Member States across the continent. These developments widen the range of discarded materials regarded as "hazardous," prohibit the use of specific chemicals, prohibit the use of waste management options, shift the emphasis from risk-based treatment and disposal to inclusive lists, and incorporate waste producers into more stringent regulatory regimes. The impact of the changes is also intended to provide renewed impetus for waste reduction. Under an environmental control system where only certainty is tolerated, the opportunities for innovation within the industry and the waste treatment and disposal sector will be explored. A challenging analysis will be offered on the impact of this regulation-led approach to the nature and sustainability of hazardous waste treatment and disposal in the future.
Performance Analysis: Work Control Events Identified January - August 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Grange, C E; Freeman, J W; Kerr, C E
2011-01-14
This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were included in the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun.
In 2009, training of the workforce began, and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, so a total of 39 causes were identified across the 24 events. The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete.
The second most frequent cause was unclear, incomplete or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events with the cause of ''workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete'' had this error in the first two ISMS functions: define the work and analyze the hazard. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled. The causes are then manifested in events when the work is conducted. The process to operate safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.
Comparative risk analysis of technological hazards (a review).
Kates, R W; Kasperson, J X
1983-01-01
Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625
Correlating regional natural hazards for global reinsurance risk assessment
NASA Astrophysics Data System (ADS)
Steptoe, Hamish; Maynard, Trevor; Economou, Theo; Fox, Helen; Wallace, Emily; Maisey, Paul
2016-04-01
Concurrent natural hazards represent an uncertainty in assessing exposure for the insurance industry. The recently implemented Solvency II Directive requires EU insurance companies to fully understand and justify their capital reserving and portfolio decisions. Lloyd's, the London insurance and reinsurance market, commissioned the Met Office to investigate the dependencies between different global extreme weather events (known to the industry as perils), and the mechanisms for these dependencies, with the aim of helping them assess their compound risk to the exposure of multiple simultaneous hazards. In this work, we base the analysis of hazard-to-hazard dependency on the interaction of different modes of global and regional climate variability. Lloyd's defined 16 key hazard regions, including Australian wildfires, flooding in China and EU windstorms, and we investigate the impact of 10 key climate modes on these areas. We develop a statistical model that facilitates rapid risk assessment whilst allowing for both temporal auto-correlation and, crucially, interdependencies between drivers. The simulator is built conditionally, using an autoregressive regression model for each driver given the others. Whilst the baseline assumption within the (re)insurance industry is that different natural hazards are independent of each other, the assumption of independence of meteorological risks requires greater justification. Although our results suggest that most of the 120 hazard-hazard connections considered are likely to be independent of each other, 13 have significant dependence arising from one or more global modes of climate variability. This allows us to create a matrix of linkages describing the hazard dependency structure that Lloyd's can use to inform their understanding of risk.
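A minimal sketch of a conditionally built simulator, assuming each driver follows an AR(1) process and the second driver additionally regresses on the first. The coefficients are invented for illustration; the actual Met Office model is not reproduced.

```python
import numpy as np

def simulate_drivers(n_years, phi1=0.5, phi2=0.3, gamma=0.6, seed=0):
    """Simulate two climate-driver indices with temporal auto-correlation
    (AR(1) coefficients phi1, phi2) and cross-dependence: driver 2 is
    regressed on driver 1's current value with coefficient gamma."""
    rng = np.random.default_rng(seed)
    d1 = np.zeros(n_years)
    d2 = np.zeros(n_years)
    for t in range(1, n_years):
        d1[t] = phi1 * d1[t - 1] + rng.normal()
        d2[t] = phi2 * d2[t - 1] + gamma * d1[t] + rng.normal()
    return d1, d2

d1, d2 = simulate_drivers(5000)
cross_corr = np.corrcoef(d1, d2)[0, 1]  # positive by construction
```

The conditional construction is what lets the simulator respect both within-driver persistence and between-driver dependence: hazards linked to both drivers then co-occur more often than an independence assumption would predict.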
Agyei-Baffour, Peter; Sekyere, Kofi Boateng; Addy, Ernestine Akosua
2013-11-04
Foodborne diseases claim many lives and are a growing public health concern. Simple preventive techniques such as adoption of, and adherence to, hazard analysis and critical control point (HACCP) policy can significantly reduce this disease burden. Though food screening and inspection are done, the ultimate regulation, Hazard Analysis and Critical Control Point, which is known and accepted worldwide, appears not to be popular among food operators in Ghana. This paper examines the level of awareness of the existence of policy on hazard analysis and critical control point (HACCP) and its adherence to food preparation guidelines among food service providers in Ghana. The results revealed the mean age of food providers as 33.1 years, with a standard deviation of 7.5 and a range of 18-55 years; most were female, in full-time employment and with basic education. Of the fifty institutional managers, 42 (84%) were senior officers and had worked for more than five years. Education and type of food operator had a strong, statistically significant relationship with the implementation of HACCP policy and adherence to food preparation guidelines. The enforcement of HACCP policy and adherence to food safety guidelines was led by the Ghana Tourist Board, Public Health officers, and KMA, respectively. While a majority of food operators, 373/450 (83.3%), did not know that HACCP policy is part of food safety guidelines, staff of food safety law enforcement, 44/50 (88%), confirmed knowing that food operators were not aware of the HACCP policy. The study documents evidence on the practice of food safety principles, HACCP policy and adherence to food preparation guidelines. Existing food safety guidelines incorporate varying principles of HACCP; however, awareness is low among food operators. The implication is that food production is likely to fall short of acceptable standards and not be wholesome, putting consumers at health risk.
Repeating this study in rural and urban areas in Ghana is necessary to provide further evidence to inform food safety guidelines. Further studies on chemical analysis of food, and implementation of training modules on HACCP policy for food producers and law enforcement agencies, may help to improve the existing situation.
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen
2016-04-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are scarce. This study therefore aims to analyse the fluvial and the pluvial flood hazard individually, but also to develop a method for the analysis of the combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundary conditions for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels.
The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation accounting for the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation given by median maps and the uncertainty by percentile maps. The results are critically discussed and their use in flood risk management is outlined.
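The independence assumption used for the combined scenarios can be sketched as a simple probability product. This is an illustrative sketch only; the function name, the example AEP values, and the seasonal coincidence factor are assumptions, not values from the study.

```python
# Hypothetical sketch: combining annual exceedance probabilities (AEPs) of
# independent fluvial and pluvial flood events, with a factor capturing the
# seasonal probability of coincidence. All numbers are illustrative.

def combined_aep(aep_fluvial, aep_pluvial, p_coincidence):
    """Joint AEP of a fluvial and a pluvial event occurring together,
    assuming the two flood types are statistically independent and that
    p_coincidence captures the seasonal overlap of the two hazards."""
    return aep_fluvial * aep_pluvial * p_coincidence

# Example: a 10% AEP river flood combined with a 20% AEP rainstorm that
# can only coincide during the monsoon season (assume 50% seasonal overlap).
print(round(combined_aep(0.10, 0.20, 0.5), 4))  # 0.01
```

The product form follows directly from independence; in practice the seasonal overlap term would be estimated from the joint timing of monsoon flood peaks and convective storms.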
Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel
NASA Astrophysics Data System (ADS)
Katz, Oded
2010-05-01
The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping on a scale fit for municipal/governmental planning is subject to local initiative and is currently not mandatory, as seems necessary. In the following, a few cases of seismic-hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way results were incorporated (or not). The first case is a qualitative seismic-hazard micro-zonation commissioned by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying (1) areas prone to amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. Results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by municipal authorities, practical use by geotechnical engineers working within the frame of the new master plan was not significant. The main reason is apparently a difference of opinion between the city engineers responsible for implementing the new master plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias. Both were heavily damaged more than once by strong earthquakes in past centuries. Work was carried out as part of a governmental seismic-hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within city limits. Maps (in GIS format) were sent to city engineers with reports explaining the methods and results.
As far as we know, widespread implementation of the maps within municipal master plans never came about, and there was no open discussion between city engineers and the Geological Survey. The main reasons apparently are (1) a lack, until recently, of mandatory building codes requiring incorporation of EILS hazard; (2) budget priorities; and (3) failure to involve municipality personnel in planning and executing the EILS hazard evaluation. These cases demonstrate that for seismic hazard data to be incorporated and implemented within municipal master plans there needs to be (1) active involvement of municipal officials and engineers from the early planning stages of the evaluation campaign, and (2) a priori dedication of funds towards implementation of evaluation results.
RiskScape: a new tool for comparing risk from natural hazards (Invited)
NASA Astrophysics Data System (ADS)
Stirling, M. W.; King, A.
2010-12-01
RiskScape is a joint venture between New Zealand's GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example damage and replacement costs, casualties, economic losses, disruption, and number of people affected. It can therefore be used to assist with risk management, land use planning, building codes and design, risk identification, prioritization of risk reduction/mitigation, determination of "best use" risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions have been used to develop and trial RiskScape in New Zealand, and each region is exposed to a different mix of natural hazards. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape developments have thus far focussed on scenario-based risk, future developments will advance the software into providing probabilistic-based solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prooijen, Monique van; Breen, Stephen
Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis, which resulted in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low-probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients, as well as the lower costs for the hospital, we will implement this new tool.
Environmental Assessment: Land Acquisition at Whiteman Air Force Base, Missouri
2011-06-01
Canadian clearweed (Pilea pumila), common duckweed (Lemna minor), common rush (Juncus effusus), cottonwood (Populus deltoides), crabgrass...resources, hazardous materials and hazardous waste, and safety. Implementation of the Proposed Action would result in minor, short-term adverse impacts...consumption of petroleum products during fence construction. As a result of implementing the Proposed Action, minor long-term adverse impacts to land use
ERIC Educational Resources Information Center
Saldaria, Miguel Angel Mariscal; Herrero, Susana Garcia; Rodriguez, Javier Garcia; Ritzel, Dale
2012-01-01
All workers have the right to perform their job duties under the best possible conditions, safeguarded from the harm which the execution of their duties may entail. In addition, employers have the obligation to guarantee this right to health, implementing a preventive system which assures the safety and health of the workers under their charge.…
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and built around a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and built around a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
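Of the three BSA techniques the tool automates, the frequency ratio is the simplest to illustrate. The following is a minimal sketch, not the tool's actual implementation; the class names and pixel counts are made up.

```python
# Illustrative frequency-ratio (FR) calculation: for each conditioning-factor
# class, FR = (hazard share of the class) / (area share of the class).
# FR > 1 indicates a positive association between the class and the hazard.

def frequency_ratio(hazard_pixels, class_pixels):
    """Per-class frequency ratio from hazard-occurrence and total-area
    pixel counts keyed by class name."""
    total_hazard = sum(hazard_pixels.values())
    total_area = sum(class_pixels.values())
    return {c: (hazard_pixels[c] / total_hazard) / (class_pixels[c] / total_area)
            for c in class_pixels}

# Hypothetical slope classes for a landslide inventory
hazard = {"0-15 deg": 10, "15-30 deg": 60, ">30 deg": 30}
area   = {"0-15 deg": 500, "15-30 deg": 300, ">30 deg": 200}
fr = frequency_ratio(hazard, area)
print(fr)
```

In a GIS workflow these counts would come from overlaying the landslide inventory raster with each conditioning-factor raster, which is precisely the repetitive bookkeeping such a tool automates.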
Flood Hazard Management: British and International Perspectives
NASA Astrophysics Data System (ADS)
James, L. Douglas
This proceedings of an international workshop at the Flood Hazard Research Centre (Queensway, Enfield, Middlesex, U.K.) begins by noting how past British research on flood problems concentrated on refining techniques to implement established policy. In contrast, research covered in North American and Australian publications involved normative issues on policy alternatives and administrative implementation. The workshop's participants included 16 widely recognized scientists, whose origins were about equally divided between Britain and overseas; from this group the workshop's organizers expertly drew ideas for refining British urban riverine flood hazard management and for cultivating links among researchers everywhere. Such intellectual exchange should be of keen interest to flood hazard program managers around the world, to students of comparative institutional performance, to those who make policy on protecting people from hazards, and to hydrologists and other geophysicists who must communicate descriptive information for bureaucratic, political, and public decision-making.
Xian, Siyuan; Yin, Jie; Lin, Ning; Oppenheimer, Michael
2018-01-01
Coastal flood protection measures have been widely implemented to improve flood resilience. However, protection levels vary among coastal megacities globally. This study compares the distinct flood protection standards for two coastal megacities, New York City and Shanghai, and investigates potential influences such as risk factors and past flood events. Extreme value analysis reveals that, compared to NYC, Shanghai faces a significantly higher flood hazard. Flood inundation analysis indicates that Shanghai has a higher exposure to extreme flooding. Meanwhile, Shanghai's urban development, population, and economy have increased much faster than NYC's over the last three decades. These risk factors provide part of the explanation for the implementation of a relatively high level of protection (e.g. reinforced concrete sea-wall designed for a 200-year flood return level) in Shanghai and low protection (e.g. vertical brick and stone walls and sand dunes) in NYC. However, individual extreme flood events (typhoons in 1962, 1974, and 1981) seem to have had a greater impact on flood protection decision-making in Shanghai, while NYC responded significantly less to past events (with the exception of Hurricane Sandy). Climate change, sea level rise, and ongoing coastal development are rapidly changing the hazard and risk calculus for both cities and both would benefit from a more systematic and dynamic approach to coastal protection.
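The kind of extreme value analysis behind a "200-year flood return level" can be sketched as fitting an extreme value distribution to annual maxima and reading off a return level. The sketch below uses a method-of-moments Gumbel fit on synthetic data; it is illustrative only and does not reproduce the study's actual analysis or datasets.

```python
# Hedged sketch: method-of-moments fit of a Gumbel (EV Type I) distribution
# to annual maximum water levels, then the return level for a given return
# period. Data are synthetic; function names are our own.
import math
import statistics

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_return_level(annual_maxima, return_period):
    """Fit Gumbel by method of moments and return the water level with
    annual exceedance probability 1/return_period."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi        # scale parameter
    mu = mean - EULER_GAMMA * beta             # location parameter
    return mu - beta * math.log(-math.log(1 - 1 / return_period))

# Synthetic annual maximum surge heights (metres)
maxima = [1.2, 1.5, 1.1, 1.8, 1.4, 2.0, 1.3, 1.6, 1.7, 1.9]
print(round(gumbel_return_level(maxima, 200), 2))
```

A production analysis would typically use maximum-likelihood or L-moment fits of the full GEV family and quantify confidence intervals, but the return-level arithmetic has this same shape.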
Incineration is often the preferred technology for disposing of hazardous waste and remediating Superfund sites. The effective implementation of this technology is frequently impeded by strong public opposition to hazardous waste incineration (HWI). One of the reasons cited for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-03
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Distribution Systems, Gas Transmission and Gathering Systems, and Hazardous Liquid Systems AGENCY: Pipeline and.... SUMMARY: This notice advises owners and operators of gas pipeline facilities and hazardous liquid pipeline...
The national emission standards for hazardous air pollutants for miscellaneous coating manufacturing. Includes a summary, rule history, compliance and implementation information, and Federal Register citations.
Preventing blood transfusion failures: FMEA, an effective assessment method.
Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza
2017-06-30
Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harm to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was conducted in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform FMEA in the blood transfusion process, comprising five steps: establishing the context, selecting team members, analysing the processes, performing the hazard analysis, and developing a risk-reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, among which four failure modes were associated with RPNs over 75. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80) and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Providing personnel with training on blood transfusion, raising awareness of hazards and appropriate preventive measures, and developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.
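The RPN arithmetic behind an FMEA is conventionally severity × occurrence × detectability, each scored on an ordinal scale. The sketch below illustrates that ranking step; the individual factor scores are hypothetical (only the resulting RPNs 100, 80, and 75 match the abstract), and the function name is our own.

```python
# Minimal illustration of FMEA risk-priority-number (RPN) ranking.
# RPN = severity x occurrence x detectability (each typically scored 1-10
# or 1-5). Factor scores below are assumed for illustration.

def rpn(severity, occurrence, detectability):
    return severity * occurrence * detectability

failure_modes = {
    "labelling":              rpn(5, 5, 4),
    "patient identification": rpn(4, 4, 5),
    "sampling":               rpn(5, 3, 5),
}

# Rank failure modes so corrective actions target the highest RPNs first
for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
    print(mode, score)
```

Ranking by RPN is what lets a team such as this one focus corrective actions on labelling and transfusion steps before lower-scoring modes.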
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mansfield, N.J.
1992-01-01
The increasing number of hazardous materials accidents in the United States has resulted in new federal regulations addressing the emergency response activities associated with chemical releases. A significant part of these new federal standards (29 CFR 1910.120 and 40 CFR Part 311) requires compliance with specific criteria by all personnel involved in a hazardous material emergency. This study investigated alternative lesson design models applicable to instruction for hazardous material emergencies. A specialized design checklist was created based on the work of Gagne, Briggs, and Wager (1988), Merrill (1987), and Clark (1989). This checklist was used in the development of lesson plan templates for the hazardous materials incident commander course. Qualitative data for establishing learning objectives were collected by conducting a needs assessment and a job analysis of the incident commander position. Incident commanders from 14 public and private organizations participated in the needs assessment process. Technical information for the lessons was collected from appropriate governmental agencies. The implementation of the checklist and lesson plans can contribute to assuring quality training for incident commanders throughout the United States.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-24
...] Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings AGENCY... for Wind Retrofit Projects for Existing Residential Buildings. DATES: Comments must be received by... property from hazards and their effects. One such activity is the implementation of wind retrofit projects...
40 CFR 265.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) SOLID WASTES (CONTINUED) INTERIM STATUS STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT..., explosions, or any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents to air, soil, or surface water. (b) The provisions of the plan must be carried out immediately...
40 CFR 264.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND... any unplanned sudden or non-sudden release of hazardous waste or hazardous waste constituents to air, soil, or surface water. (b) The provisions of the plan must be carried out immediately whenever there...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-08
... provisions to the Office of Information and Regulatory Affairs, Office of Management and Budget (OMB), Attn... information provided, unless the comment includes information claimed to be Confidential Business Information... EPA information about these programs and hazards in a risk management plan (RMP). The implementing...
Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA
NASA Astrophysics Data System (ADS)
Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.
2012-01-01
In the concept development and design phase of a new space system, such as a crew vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology and trade off several candidate system designs, choosing an optimal design from the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analyses such as FTA have not been used to drive the design, because such techniques focus on component failures, which cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA defines safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase; therefore, STAMP/STPA can be a useful tool to assess the safety of candidate systems and to form part of the rationale for choosing a design as the baseline of the system. In this paper, we explain our case study of safety-guided concept design applying STPA, the new hazard analysis technique, and a model-based specification technique to Crew Return Vehicle design, and we evaluate the benefits of using STAMP/STPA in the concept development phase.
[Failure modes and effects analysis in the prescription, validation and dispensing process].
Delgado Silveira, E; Alvarez Díaz, A; Pérez Menéndez-Conde, C; Serna Pérez, J; Rodríguez Sagrado, M A; Bermejo Vicedo, T
2012-01-01
To apply a failure modes and effects analysis to the prescription, validation and dispensing process for hospitalised patients. A work group analysed all of the stages included in the process from prescription to dispensing, identifying the most critical errors and establishing potential failure modes which could produce a mistake. The possible causes, their potential effects, and the existing control systems were analysed in order to prevent them from developing. The Hazard Score was calculated for each failure mode; those scoring ≥ 8 were chosen, and those with a Severity Index of 4 were selected independently of their Hazard Score value. Corrective measures and an implementation plan were proposed. A flow diagram that describes the whole process was obtained. A risk analysis was conducted of the chosen critical points, indicating: failure mode, cause, effect, severity, probability, Hazard Score, suggested preventive measure, and the strategy to achieve it. The failure modes chosen were: prescription on the nurse's form; progress or treatment order (paper); prescription to an incorrect patient; transcription error by nursing staff and pharmacist; and error preparing the trolley. By applying a failure modes and effects analysis to the prescription, validation and dispensing process, we have been able to identify critical aspects, the stages in which errors may occur, and the causes. It has allowed us to analyse the effects on the safety of the process and establish measures to prevent or reduce them.
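The selection rule described above (keep a failure mode when Hazard Score ≥ 8, or whenever severity reaches 4 regardless of score) can be sketched in a few lines. The mode names echo the abstract, but the severity and probability scores below are illustrative assumptions, not the study's data.

```python
# Sketch of the Hazard Score selection rule: Hazard Score = severity x
# probability; a mode is retained if the score is >= 8 or if severity
# equals the cutoff (4), independently of the score. Scores are assumed.

def select_failure_modes(modes, score_threshold=8, severity_cutoff=4):
    selected = []
    for name, severity, probability in modes:
        hazard_score = severity * probability
        if hazard_score >= score_threshold or severity == severity_cutoff:
            selected.append((name, hazard_score))
    return selected

modes = [
    ("prescription to incorrect patient", 4, 1),  # severe but rare: kept anyway
    ("transcription error",               3, 3),  # score 9 >= 8: kept
    ("trolley preparation error",         2, 2),  # below both cutoffs: dropped
]
print(select_failure_modes(modes))
# [('prescription to incorrect patient', 4), ('transcription error', 9)]
```

The severity override is the interesting design choice: it prevents rare but catastrophic modes from being filtered out by a low probability score.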
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
NASA Astrophysics Data System (ADS)
Schubert, Jochen E.; Burns, Matthew J.; Fletcher, Tim D.; Sanders, Brett F.
2017-10-01
This research outlines a framework for the case-specific assessment of Green Infrastructure (GI) performance in mitigating flood hazard in small urban catchments. The urban hydrologic modeling tool (MUSIC) is coupled with a fine-resolution 2D hydrodynamic model (BreZo) to test to what extent retrofitting an urban watershed with GI (rainwater tanks and infiltration trenches in particular) can propagate flood management benefits downstream and support intuitive flood hazard maps useful for communicating and planning with communities. The hydrologic and hydraulic models are calibrated based on current catchment conditions, then modified to represent alternative GI scenarios, including a complete lack of GI versus a full implementation of GI. Flow in the hydrologic/hydraulic models is forced using a range of synthetic rainfall events with annual exceedance probabilities (AEPs) between 1-63% and durations from 10 min to 24 h. Flood hazard metrics mapped by the framework include maximum flood depths and extents, flow intensity (m2/s), flood duration, and the critical storm duration leading to maximum flood conditions. Application of the system to the Little Stringybark Creek (LSC) catchment shows that across the range of AEPs tested, and for storm durations equal to or less than 3 h, presently implemented GI reduces downstream flooded area on average by 29%, while a full implementation of GI would reduce downstream flooded area on average by 91%. A full implementation of GI could also lower maximum flow intensities by 83% on average, reducing the drowning hazard posed by urban streams and improving the potential for access by emergency responders. For storm durations longer than 3 h, a full implementation of GI lacks the capacity to retain the resulting rainfall depths and only reduces flooded area by 8% and flow intensity by 5.5%.
Organic Liquids Distribution: National Emission Standards for Hazardous Air Pollutants (NESHAP)
National emission standards for hazardous air pollutants (NESHAP) for organic liquids distribution (OLD) (non-gasoline) operations. Includes rule history, Federal Register citations, and implementation and compliance information.
Hanks, Thomas C.; Abrahamson, Norm A.; Boore, David M.; Coppersmith, Kevin J.; Knepprath, Nichole E.
2009-01-01
In April 1997, after four years of deliberations, the Senior Seismic Hazard Analysis Committee released its report 'Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts' through the U.S. Nuclear Regulatory Commission as NUREG/CR-6372, hereafter SSHAC (1997). Known informally ever since as the 'SSHAC Guidelines', SSHAC (1997) addresses why and how multiple expert opinions - and the intrinsic uncertainties that attend them - should be used in Probabilistic Seismic Hazard Analyses (PSHA) for critical facilities such as commercial nuclear power plants. Ten years later, in September 2007, the U.S. Geological Survey (USGS) entered into a 13-month agreement with the U.S. Nuclear Regulatory Commission (NRC) titled 'Practical Procedures for Implementation of the SSHAC Guidelines and for Updating PSHAs'. The NRC was interested in understanding and documenting lessons learned from recent PSHAs conducted at the higher SSHAC Levels (3 and 4) and in gaining input from the seismic community for updating PSHAs as new information became available. This study increased in importance in anticipation of new applications for nuclear power facilities at both existing and new sites. The intent of this project was not to replace the SSHAC Guidelines but to supplement them with the experience gained from putting the SSHAC Guidelines to work in practical applications. During the course of this project, we also learned that updating PSHAs for existing nuclear power facilities involves very different issues from the implementation of the SSHAC Guidelines for new facilities. As such, we report our findings and recommendations from this study in two separate documents, this being the first. The SSHAC Guidelines were written without regard to whether the PSHAs to which they would be applied were site-specific or regional in scope. 
Most of the experience gained to date from high-level SSHAC studies has been for site-specific cases, although three ongoing (as of this writing) studies are regional in scope. Updating existing PSHAs will depend more critically on the differences between site-specific and regional studies, and we will also address these differences in more detail in the companion report. Most of what we report here and in the second report on updating PSHAs emanates from three workshops held by the USGS at their Menlo Park facility: 'Lessons Learned from SSHAC Level 3 and 4 PSHAs' on January 30-31, 2008; 'Updates to Existing PSHAs' on May 6-7, 2008; and 'Draft Recommendations, SSHAC Implementation Guidance' on June 4-5, 2009. These workshops were attended by approximately 40 scientists and engineers familiar with hazard studies for nuclear facilities. This group included four of the authors of SSHAC (1997) and four other experts whose contributions to this document are mentioned in the Acknowledgments section; numerous scientists and engineers who in one role or another have participated in one or more high-level SSHAC PSHAs summarized later in this report; and representatives of the nuclear industry, the consulting world, the regulatory community, and academia with a keen interest and expertise in hazard analysis. This report is a community-based set of recommendations to NRC for improved practical procedures for implementation of the SSHAC Guidelines. In an early publication specifically addressing the SSHAC Guidelines, Hanks (1997) noted that the SSHAC Guidelines were likely to evolve for some time to come, and this remains true today. While the broad philosophical and theoretical dimensions of the SSHAC Guidelines will not change, much has been learned during the past decade from various applications of the SSHAC Guidelines to real PSHAs in terms of how they are implemented.
We anticipate that, in their practical applications, the SSHAC Guidelines will continue to evolve as more experience is gained from future SSHAC applications. Indeed, to the extent that every PSHA has its
NASA Astrophysics Data System (ADS)
Michelini, A.; Wotawa, G.; Arnold-Arias, D.
2017-12-01
ARISTOTLE (http://aristotle.ingv.it/) is a pilot project funded by DG ECHO (EU Humanitarian Aid and Civil Protection) that provides the EU's Emergency Response Coordination Centre (ERCC), and consequently the Union Civil Protection Mechanism Participating States, with expert scientific advice on natural disasters around the world that may cause a country to seek international help. The EU is committed to providing disaster response in a timely and efficient manner and to ensuring that European assistance meets the real needs of the affected population, whether in Europe or beyond. When a disaster strikes, every minute counts for saving lives, and a rapid, coordinated and pre-planned response is essential. The ARISTOTLE consortium includes 15 partner institutions (11 from EU countries, 2 from non-EU countries and 2 European organizations) operating in the meteorological and geophysical domains. The project coordination is shared between INGV and ZAMG for the geophysical and meteorological communities, respectively. ARISTOTLE harnesses operational expertise from across Europe to form a multi-hazard perspective on natural disasters related to volcanoes, earthquakes (and resulting tsunamis), severe weather and flooding. Each hazard group brings together experts from the particular hazard domain to deliver a "collective analysis", which is then fed into the partnership's multi-hazard discussions. The primary target of the pilot project has been the prototyping and implementation of a scalable system (in terms of number of partners and hazards) capable of providing the sought advice to the ERCC. To this end, the activities of the project have focused on the establishment of a "Multi-Hazard Operational Board" assigned a 24/7 operational duty regulated by a "Standard Operating Protocol", and the implementation of a dedicated IT platform to assemble the resulting reports.
The project has reached the point where the routine and emergency advice services are being provided, and they will continue until the end of the project in January 2018. The presentation will illustrate the different modes of operation envisaged, the current status, and the solutions found by the project consortium to respond to the ERCC requirements.
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
NASA Astrophysics Data System (ADS)
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to help agencies efficiently allocate and manage limited resources in preparation for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology yields seismically-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0, and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear-wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for the major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amounts of displacement, and their associated probabilities, for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then used in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for a fully probabilistic hazard evaluation and risk assessment.
a) School of Civil and Construction Engineering, Oregon State University, Corvallis, OR 97331, USA
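The rigid sliding-block step described above pairs a displacement model with the binned seismic hazard curve. The abstract does not name the displacement regression used, so the sketch below assumes the widely used Jibson (2007) empirical equation and purely illustrative hazard-curve numbers:

```python
import math

def newmark_displacement_cm(pga_g, ac_g):
    """Estimate Newmark sliding-block displacement (cm) from peak ground
    acceleration and the slope's critical acceleration, using the Jibson
    (2007) empirical regression. Returns 0 when PGA <= critical accel."""
    if pga_g <= ac_g:
        return 0.0
    ratio = ac_g / pga_g
    log_d = 0.215 + math.log10((1.0 - ratio) ** 2.341 * ratio ** -1.438)
    return 10.0 ** log_d

def prob_exceeding(threshold_cm, ac_g, hazard_bins):
    """Sum the annual probabilities of the PGA bins whose predicted
    displacement exceeds the threshold.
    hazard_bins: list of (pga_g, annual_probability) pairs."""
    return sum(p for pga, p in hazard_bins
               if newmark_displacement_cm(pga, ac_g) > threshold_cm)

# Illustrative (not Oregon-specific) hazard curve bins:
bins = [(0.1, 0.02), (0.3, 0.005), (0.6, 0.001)]
p = prob_exceeding(10.0, 0.05, bins)  # P(displacement > 10 cm) per year
```

Evaluating this per pixel, for each displacement threshold, would produce the probability maps the abstract describes.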
MEditerranean Supersite Volcanoes (MED-SUV) project: from objectives to results
NASA Astrophysics Data System (ADS)
Puglisi, Giuseppe; Spampinato, Letizia
2017-04-01
The MEditerranean Supersite Volcanoes (MED-SUV) project was a 3-year FP7 project aimed at improving the assessment of volcanic hazards at two of the most active European volcanic areas: Campi Flegrei/Vesuvius and Mt. Etna. More than 3 million people are exposed to potential hazards in the two areas, and the geographic location of the volcanoes extends the impact to a much wider region. MED-SUV worked on (1) optimisation and integration of existing and new monitoring systems, (2) understanding of volcanic processes, and (3) the relationship between the scientific and end-user communities. MED-SUV fully exploited the unique multidisciplinary long-term in-situ datasets available for these volcanoes and integrated them with Earth observations. Technological developments and newly implemented algorithms allowed better constraint of the pre-, syn-, and post-eruptive phases. The wide range of styles and intensities of the volcanic phenomena observed at the targeted volcanoes, archetypes of 'closed' and 'open' conduit systems, examined through the long-term multidisciplinary datasets, substantially advanced the understanding of a variety of geo-hazards. Dedicated experiments and studies were carried out to advance the understanding of the volcanoes' internal structure and processes, and to recognise signals related to impending unrest/eruptive phases. The quantitative hazard assessment benefitted from the outcomes of these studies and from their integration with cutting-edge monitoring approaches, leading to step changes in hazard awareness and preparedness and leveraging the close relationship between scientists, SMEs, and end-users.
Among the MED-SUV achievements are: (i) implementation of a data policy compliant with the GEO Open Data Principles to govern the exploitation and shared use of the project outcomes; (ii) creation of the MED-SUV e-infrastructure as a test bed for designing an interoperable infrastructure that manages different data sources, applies the data policy, and envisages post-project sustainability strategies within a coherent national and international framework; (iii) improvement of SAR capability for detecting and monitoring ground deformation; (iv) development, implementation, and testing of prototypes and software for measuring and retrieving more accurate or novel parameters; (v) integration of satellite and in-situ data; and (vi) novel methods of data analysis that increase knowledge of volcanic process dynamics and improve alert systems. The project has fostered the assessment of short-term volcanic hazard in the Italian Supersites and the exploitation of the information provided by monitoring. The main breakthroughs in hazard assessment were the fine-tuning of the Bayesian approach for the probabilistic evaluation of the occurrence of eruptive events at Campi Flegrei and their effects in the area, and its preliminary application to assessing the occurrence of flank eruptions and the effects of volcanic plume fallout at Mt. Etna. MED-SUV also worked on communication between scientists and decision makers by evaluating how informative scientific outcomes (e.g., hazard maps) are for this purpose. Dissemination of the outcomes aimed at spreading new volcanological knowledge among the scientific community, decision-making bodies, and the public, and at allowing the end-user community to access the two Italian Supersites' data through a properly implemented e-infrastructure.
Selected considerations of implementation of the GNSS
NASA Astrophysics Data System (ADS)
Cwiklak, Janusz; Fellner, Andrzej; Fellner, Radoslaw; Jafernik, Henryk; Sledzinski, Janusz
2014-05-01
The article presents a safety and risk analysis for the implementation of precise approach procedures (Localizer Performance with Vertical guidance, LPV) with a GNSS sensor at the Warsaw and Katowice airports. Several techniques were used to identify threats (controlled flight into terrain, landing accident, mid-air collision), together with evaluation methods based on Fault Tree Analysis, risk probability, a safety risk evaluation matrix, and Functional Hazard Assessment; safety goals were also determined. The research determined the probabilities of the threats occurring and allowed comparing them against ILS. As a result of the Preliminary System Safety Assessment (PSSA), the requirements essential to reach the required level of safety were defined. It is worth underlining that the quantitative requirements were defined using FTA.
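As a minimal illustration of the Fault Tree Analysis mentioned above: for independent basic events, probabilities combine multiplicatively at AND gates and complementarily at OR gates. The tree shape and event probabilities below are hypothetical, not values from the study:

```python
from math import prod

def and_gate(probs):
    """Top event occurs only if ALL basic events occur (independence assumed)."""
    return prod(probs)

def or_gate(probs):
    """Top event occurs if ANY basic event occurs (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical minimal tree for a landing-accident top event: either a
# guidance failure AND a missed crew detection, or an unrelated runway event.
p_top = or_gate([and_gate([1e-4, 1e-2]), 1e-7])
```

Comparing such top-event probabilities against a target level of safety is how quantitative requirements are derived from an FTA.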
New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.
Shaaban, Heba
2016-10-01
Greening the analytical methods used for the analysis of pharmaceuticals has been receiving great interest, aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss of chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents, and of waste disposal, motivated the analytical community to search for alternatives that replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desirable. This review gives a comprehensive overview of different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. It presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical abstract: Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.
[Design of a HACCP plan for the industrial process of frozen sardines].
Rosas, Patricia; Reyes, Genara
2009-09-01
Hazard Analysis and Critical Control Point (HACCP) is a system for identifying, assessing, and controlling the hazards related to production, processing, distribution, and consumption in order to obtain safe food. The aim of this study was to design a HACCP plan for implementation in a processing line of frozen whole sardine (Sardinella aurita). The methodology was based on evaluating compliance with the prerequisite programs (GMP/SSOP, in a previous study), applying the HACCP principles, and following the sequence of stages established by Venezuelan COVENIN standard No. 3802. Time-temperature was recorded at each processing step. Histamine was determined with the VERATOX NEOGEN kit. Results showed that some sardine batches arrived at the plant with high time-temperature records, with up to 5 ppm of histamine found due to temperature abuse during transportation. A HACCP plan is proposed, covering the scope, the selection of the team, the description of the product and its intended use, the flow diagram of the process, hazard analysis and identification of CCPs, the monitoring system, corrective actions, and records. The potential hazards identified were pathogen growth, presence of histamine, and physical objects in the sardines. The control measures for the CCPs concern the control of time-temperature during transportation and processing, monitoring of ice supplies, and sanitary conditions in the process.
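A CCP monitoring step of the kind the plan proposes reduces to a critical-limit check on the batch records. The temperature and histamine limits below are illustrative placeholders, not values taken from the plan:

```python
def evaluate_batch(temps_c, histamine_ppm,
                   temp_limit_c=4.5, histamine_limit_ppm=50.0):
    """Monitor a reception CCP: reject the batch if any time-temperature
    record or the histamine assay exceeds its critical limit.
    The default limits are hypothetical, for illustration only."""
    if max(temps_c) > temp_limit_c or histamine_ppm > histamine_limit_ppm:
        return "reject: corrective action required"
    return "accept"

# A batch kept cold in transport, with the 5 ppm histamine level reported
# in the study, passes this illustrative check:
status = evaluate_batch([2.0, 3.1, 2.8], 5.0)
```

In a real plan the limits, monitoring frequency, and corrective actions come from the hazard analysis, not from code defaults.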
This page contains the current National Emission Standards for Hazardous Air Pollutants (NESHAP) for Reciprocating Internal Combustion Engines and additional information regarding rule compliance and implementation.
Secondary Aluminum Production: National Emission Standards for Hazardous Air Pollutants
National emission standards for hazardous air pollutants (NESHAP) for new and existing sources at secondary aluminum production facilities. Includes rule history, summary, federal register citations and implementation information.
National emission standards for hazardous air pollutants (NESHAP) from facilities that manufacture pharmaceutical products. Includes rule history, Federal Register citations, implementation and compliance information, and additional resources.
Precipitation and floodiness: forecasts of flood hazard at the regional scale
NASA Astrophysics Data System (ADS)
Stephens, Liz; Day, Jonny; Pappenberger, Florian; Cloke, Hannah
2016-04-01
In 2008, a seasonal forecast of an increased likelihood of above-normal rainfall in West Africa led the Red Cross to take early humanitarian action (such as prepositioning of relief items) on the basis that this forecast implied heightened flood risk. However, there are a number of factors that lead to non-linearity between precipitation anomalies and flood hazard, so in this presentation we use a recently developed global-scale hydrological model driven by the ERA-Interim/Land precipitation reanalysis (1980-2010) to quantify this non-linearity. Using these data, we introduce the concept of floodiness to measure the incidence of floods over a large area, and quantify the link between monthly precipitation, river discharge and floodiness anomalies. Our analysis shows that floodiness is not well correlated with precipitation, demonstrating the problem of using seasonal precipitation forecasts as a proxy for forecasting flood hazard. This analysis demonstrates the value of developing hydrometeorological forecasts of floodiness for decision-makers. As a result, we are now working with the European Centre for Medium-Range Weather Forecasts and the Joint Research Centre, as partners of the operational Global Flood Awareness System (GloFAS), to implement floodiness forecasts in real-time.
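The claim that floodiness is not well correlated with precipitation reduces to correlating two anomaly series. A minimal sketch (plain Pearson correlation on pre-computed monthly anomalies, which may differ from the authors' exact statistic):

```python
def anomalies(series, climatology):
    """Monthly anomalies: observed value minus the monthly climatology."""
    return [v - c for v, c in zip(series, climatology)]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical regional series: precipitation anomaly vs. floodiness anomaly.
r = pearson([1.2, -0.3, 0.8, -1.1], [0.4, 0.1, 1.5, -0.6])
```

A low `r` over the reanalysis period is the kind of evidence the abstract cites against using precipitation forecasts as a proxy for flood hazard.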
NASA Astrophysics Data System (ADS)
Huang, Qianrui; Cheng, Xianfeng; Qi, Wufu; Xu, Jun; Yang, Shuran
2017-12-01
The Dahongshan Fe&Cu mine in Yunnan Province was awarded the title of "National Green Mine Pilot" by the Chinese Ministry of Land and Resources in April 2013. In order to verify the implementation effects of the green mine and to encourage the construction of green mines by other mining enterprises in Yunnan, the project team surveyed the overlying deposits in rivers around the Dahongshan mine in the wet season (August) of 2016, surveyed the mining enterprises, and applied the Potential Ecological Risk Index to evaluate the potential ecological hazard of heavy metal pollution in the overlying deposits. The results showed that the risk index at all sampling points was less than 105, indicating a low degree of ecological hazard.
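The Potential Ecological Risk Index (Hakanson 1980) sums toxic-response-weighted contamination factors over the measured heavy metals. A sketch, with illustrative concentrations, backgrounds, and toxic response factors (not the study's data):

```python
def contamination_factor(measured, background):
    """Cf_i = C_i / Cn_i: measured concentration over background value."""
    return measured / background

def ecological_risk_index(measured, background, toxic_response):
    """RI = sum_i Tr_i * Cf_i (Hakanson 1980).
    All three lists are per-metal, in the same order."""
    return sum(t * contamination_factor(c, b)
               for c, b, t in zip(measured, background, toxic_response))

# Hypothetical sediment sample (e.g., Cu, Pb, Zn) with assumed Tr factors:
ri = ecological_risk_index(measured=[30.0, 25.0, 90.0],
                           background=[20.0, 20.0, 80.0],
                           toxic_response=[5.0, 5.0, 1.0])
```

The resulting `RI` is compared against published class boundaries; the study's finding was that all sampling points fell below 105.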
76 FR 6564 - Florida: Final Authorization of State Hazardous Waste Management Program Revisions
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-07
...: Final Authorization of State Hazardous Waste Management Program Revisions AGENCY: Environmental... implement the RCRA hazardous waste management program. We granted authorization for changes to their program..., 06/ 62-730.185(1) F.A.C. Universal Waste Management. 29/07. State Initiated Changes to the 62-730.210...
2013 Los Alamos National Laboratory Hazardous Waste Minimization Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salzman, Sonja L.; English, Charles J.
2015-08-24
Waste minimization and pollution prevention are inherent goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE) and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program (a component of the overall Waste Minimization/Pollution Prevention [WMin/PP] Program) administered by the Environmental Stewardship Group (ENV-ES). It also supports the waste minimization and pollution prevention goals of the Environmental Programs Directorate (EP) organizations responsible for implementing remediation activities, and describes programs to incorporate waste reduction practices into remediation activities and procedures. LANS was very successful in fiscal year (FY) 2013 (October 1-September 30) in its WMin/PP efforts. Staff funded four projects specifically related to the reduction of waste with hazardous constituents, and LANS won four national awards for pollution prevention from the National Nuclear Security Administration (NNSA). In FY13, no hazardous, mixed transuranic (MTRU), or mixed low-level (MLLW) remediation waste was generated at the Laboratory. More hazardous waste, MTRU waste, and MLLW were generated in FY13 than in FY12, with the majority of the increase related to MTRU processing or lab cleanouts. These accomplishments and an analysis of the waste streams are discussed in much more detail within this report.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
National emission standards for control of hazardous air pollutants (HAP) from the chemical preparations area source category. Includes rule history, Federal Register citations, implementation information, and additional resources.
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
NASA Astrophysics Data System (ADS)
Masure, P.
2003-04-01
The GEMITIS method has been implemented since 1995 within a global and integrated risk reduction strategy for improving the effectiveness of seismic risk assessment in urban areas, including the generation of crisis scenarios and mid- to long-term seismic impact assessment. GEMITIS required us to provide more precise definitions of notions in common use by natural-hazard specialists, such as elements at risk and vulnerability. Until then, only the physical and human elements had been considered, and analysis of their vulnerability referred to their fragility in the face of aggression by nature. We have complemented this approach by also characterizing the social and cultural vulnerability of a city and its inhabitants and, with a wider scope, the functional vulnerability of the "urban system". This functional vulnerability depends upon the relations between the system elements (weak links in chains, functional relays, and defense systems) and upon the city's relations with the outside world (interdependence). Though well developed in methods for evaluating industrial risk (fault-tree analysis, event-tree analysis, multiple defense barriers, etc.), this aspect had until now been ignored by the "hard-science" specialists working on natural hazards. Based on the implementation of an Urban System Exposure methodology, we were able to identify specific human, institutional, or functional vulnerability factors for each urban system, which until then had been very little discussed by risk-analysis and civil-protection specialists. In addition, we have defined the new concept of the "main stakes" of the urban system, ranked by order of social value (or collective utility). Obviously, vital or strategic stakes must be made more resistant to, or better protected against, natural hazards than those of secondary importance. The ranking of the exposed elements of a city in terms of "main stakes" provides a very useful guide for adapting vulnerability studies and for orienting preventive actions.
For this, GEMITIS is based on a systemic approach to the city and on value analysis of the exposed elements. It facilitates collective expertise for the definition of a preventive action plan based on the participation of the main urban actors (crisis preparedness, construction, land use, etc.).
Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)
NASA Astrophysics Data System (ADS)
Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.
2016-06-01
We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. The procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as the target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, in proportions that depend on the source-target position and the tsunami intensity; and (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
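Step 4, ensemble modelling of epistemic uncertainty, amounts to weighted statistics over the results of the alternative model formulations. A sketch with hypothetical exceedance probabilities and epistemic weights (not values from the Ionian Sea case study):

```python
def weighted_percentile(values, weights, q):
    """Weighted empirical quantile (q in [0, 1]) over model alternatives:
    sort by value and return the first value whose cumulative weight
    reaches q of the total."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc / total >= q:
            return v
    return pairs[-1][0]

# Annual exceedance probabilities for one tsunami intensity level at one
# site, from three alternative (hypothetical) source models with weights:
probs = [1e-3, 4e-4, 2e-3]
wts = [0.5, 0.3, 0.2]
mean = sum(p * w for p, w in zip(probs, wts)) / sum(wts)
median = weighted_percentile(probs, wts, 0.5)
```

Repeating this at every intensity level yields the mean hazard curve plus percentile curves that summarize the epistemic spread.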
Design and implementation of a risk assessment module in a spatial decision support system
NASA Astrophysics Data System (ADS)
Zhang, Kaixi; van Westen, Cees; Bakker, Wim
2014-05-01
The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effect of risk reduction alternatives, to support decision makers in choosing the best alternatives. The risk assessment module within the system is designed to assess the current risk, analyze the risk after implementation of risk reduction alternatives, and analyze the risk in different future years under scenarios such as climate change, land use change, and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced are how to shift the risk assessment from traditional desktop software to an open-source web-based platform, the availability of input data, and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library for the user interface on the client side, and Python scripting together with PostGIS spatial functions for complex computations on the server side. Comprehensive consideration of the underlying uncertainties in the input data can lead to a better quantification of risk and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision-making module within the system. The implementation of this module will contribute to the development of open-source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Programme.
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first is a complete list of the team members involved over the nearly two-year process; the second is the subset of members who reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of that nearly two-year process, which involved formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards.
Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and the environment.
Assessment of oil slick hazard and risk at vulnerable coastal sites.
Melaku Canu, Donata; Solidoro, Cosimo; Bandelj, Vinko; Quattrocchi, Giovanni; Sorgente, Roberto; Olita, Antonio; Fazioli, Leopoldo; Cucco, Andrea
2015-05-15
This work assesses the hazard faced by the Sicilian coast from potential offshore surface oil spill events and provides a risk assessment for Sites of Community Importance (SCI) and Special Protection Areas (SPA). A Lagrangian module, coupled with a high-resolution finite element three-dimensional hydrodynamic model, was used to track the ensemble of a large number of surface trajectories followed by particles released over 6 selected areas inside the Sicily Channel. The analysis was carried out under multiple scenarios of meteorological conditions. Oil evaporation, oil weathering, and shore stranding are also considered. Seasonal hazard maps for different stranding times, and seasonal risk maps, were then produced for the whole Sicilian coastline. The results highlight that, depending on the meteo-marine conditions, particles can reach different areas of the Sicilian coast, including its northern side, and illustrate how impacts can be greatly reduced through prompt implementation of mitigation strategies.
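The ensemble of Lagrangian trajectories can be turned into a stranding-hazard map by counting, per coastal segment, the fraction of released particles stranded within a time threshold. A simplified sketch (the operational module tracks far more state, e.g., weathering and evaporation):

```python
def hazard_by_segment(trajectories, n_segments, t_max_hours):
    """trajectories: list of (segment_index_or_None, stranding_time_hours)
    per particle; None means the particle never stranded.
    Returns the per-segment fraction of all released particles that
    stranded on that segment within t_max_hours."""
    counts = [0] * n_segments
    for seg, t in trajectories:
        if seg is not None and t <= t_max_hours:
            counts[seg] += 1
    n = len(trajectories)
    return [c / n for c in counts]

# Four hypothetical particles, two coastal segments, 12-hour threshold:
h = hazard_by_segment([(0, 5.0), (1, 30.0), (None, 0.0), (0, 10.0)], 2, 12.0)
```

Computing this per season and per stranding-time threshold yields the family of hazard maps the abstract describes.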
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
Report discusses safety program implementation for large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate hazards.
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
Multi Hazard Assessment: The Azores Archipelagos (PT) case
NASA Astrophysics Data System (ADS)
Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos
2016-04-01
The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified the population and assets at risk (infrastructures and environment). The risk assessment was based on the hazard and on the vulnerability of structural elements, road network characteristics, etc. Integration of the different hazards and risks was accounted for in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information, common to the assessment of all of the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a coefficient related to volcanic activity were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013).
The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100-year extreme monthly rainfall) and earthquake (475-year return period) was used. Topography, lithology, soil moisture and LU/LC were also accounted for. Soil erosion risk was assessed with the empirical model RUSLE (Renard et al. 1991b); rainfall erosivity, topography and vegetation cover are the main parameters used to predict proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated by utilizing the Okada code (Okada, 1985). Tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification). To estimate the wave height at the forecast points, the Green's Law function was used (JRC Tsunami Analysis Tool). Historical storm-track data indicate return periods of 17 and 41 years for hurricane categories H1 and H2, respectively. The NOAA WAVEWATCH III model hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms. The associated storm-surge risk assessment also accounted for the coastline morphology. Seven independent empirical indicators were used to express the erosion susceptibility of the coasts. Each indicator is evaluated according to a semi-quantitative score that represents a low, medium or high level of erosion risk or impact. The coastal erosion hazard estimate was derived by aggregating the indicators at grid scale.
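The Green's Law step used for the forecast points can be sketched numerically. This is a minimal illustration of the depth-scaling relation only (not the JRC Tsunami Analysis Tool itself); the depths and offshore wave height below are invented for the example.

```python
# Green's Law: as a tsunami wave travels from depth d1 to shallower depth d2,
# its height scales as H2 = H1 * (d1 / d2) ** 0.25 (energy-flux conservation
# under linear shallow-water theory). All values below are illustrative only.

def greens_law_height(h1, d1, d2):
    """Wave height at depth d2 given height h1 at depth d1."""
    return h1 * (d1 / d2) ** 0.25

# A 0.5 m wave at 4000 m depth shoaling to a 10 m deep forecast point:
h_coast = greens_law_height(0.5, 4000.0, 10.0)
print(round(h_coast, 2))  # 2.24: roughly a 4.5x amplification
```

Note that Green's Law only rescales the wave height between depths; the generation and propagation themselves come from the shallow-water model.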
Natural phenomena hazards design and evaluation criteria for Department of Energy Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-01
The Department of Energy (DOE) has issued Order 420.1, which establishes policy for its facilities in the event of natural phenomena hazards (NPH), along with associated NPH mitigation requirements. This DOE Standard gives design and evaluation criteria for NPH effects as guidance for implementing the NPH mitigation requirements of DOE Order 420.1 and the associated Implementation Guides. These are intended to be consistent design and evaluation criteria for protection against natural phenomena hazards at DOE sites throughout the United States. The goal of these criteria is to assure that DOE facilities can withstand the effects of natural phenomena such as earthquakes, extreme winds, tornadoes, and flooding. These criteria apply to the design of new facilities and the evaluation of existing facilities. They may also be used for modification and upgrading of existing facilities as appropriate. The design and evaluation criteria presented herein control the level of conservatism introduced in the design/evaluation process such that earthquake, wind, and flood hazards are treated on a consistent basis. These criteria also employ a graded approach to ensure that the level of conservatism and rigor in design/evaluation is appropriate for facility characteristics such as importance, hazards to people on and off site, and threat to the environment. For each natural phenomena hazard covered, these criteria consist of the following: Performance Categories and target performance goals as specified in the DOE Order 420.1 NPH Implementation Guide and DOE-STD-1021; specified probability levels from which natural phenomena hazard loading on structures, equipment, and systems is developed; and design and evaluation procedures to evaluate response to NPH loads and criteria to assess whether or not the computed response is permissible.
Water safety plans: bridges and barriers to implementation in North Carolina.
Amjad, Urooj Quezon; Luh, Jeanne; Baum, Rachel; Bartram, Jamie
2016-10-01
First developed by the World Health Organization, and now used in several countries, water safety plans (WSPs) are a multi-step, preventive process for managing drinking water hazards. While the beneficial impacts of WSPs have been documented in diverse countries, how to successfully implement WSPs in the United States remains a challenge. We examine the willingness and ability of water utility leaders to implement WSPs in the US state of North Carolina. Our findings show that water utilities have more of a reactive than preventive organizational culture, that implementation requires prioritization of time and resources, perceived comparative advantage to other hazard management plans, leadership in implementation, and identification of how WSPs can be embedded in existing work practices. Future research could focus on whether WSP implementation provides benefits such as decreases in operational costs, and improved organization of records and communication.
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
PLASMA-field barrier sentry (PFBS)
NASA Astrophysics Data System (ADS)
Gonzaga, Ernesto A.; Cossette, Harold James
2013-06-01
This paper describes the concept and method of designing and developing a unique security system apparatus to counter unauthorized personnel: to deny access to or occupation of an area or facility, to control or direct crowds or large groups, and to incapacitate individuals or small groups until they can be secured by military or law enforcement personnel. The system exploits Tesla coil technology. The application of basic engineering circuit analysis and principles is demonstrated, and the transformation from the classical spark-gap method to a modern solid-state design is presented. The analysis shows how the optimum design can be implemented to maximize the performance of the apparatus. The hazardous effects of electrical elements on human physiology are discussed; this serves to define guidelines for implementing safety limits and precautions on the performance of the system. The project strictly adheres to non-lethal technologies and systems.
ERIC Educational Resources Information Center
Congress of the U. S., Washington, DC. House Committee on Government Operations.
A hearing was held by the Subcommittee on Environment, Energy, and Natural Resources on the Environmental Protection Agency's (EPA) implementation of laws regulating asbestos hazards in schools and in the air. Presented are testimony as well as letters and statements submitted for the record by leading authorities in the area including: (1) James…
40 CFR 63.632 - Implementation and enforcement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Section 63.632 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphate Fertilizers Production Plants § 63...
40 CFR 63.632 - Implementation and enforcement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Section 63.632 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphate Fertilizers Production Plants § 63...
40 CFR 63.632 - Implementation and enforcement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Section 63.632 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphate Fertilizers Production Plants § 63...
40 CFR 63.632 - Implementation and enforcement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Section 63.632 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphate Fertilizers Production Plants § 63...
SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM ANNUAL REPORT TO CONGRESS FY2000
The Superfund Innovative Technology Evaluation Program promotes the development, commercialization, and implementation of innovative hazardous waste treatment technologies. SITE offers a mechanism for conducting joint demonstration and evaluation projects at hazardous waste site...
10 CFR 851.22 - Hazard prevention and abatement.
Code of Federal Regulations, 2010 CFR
2010-01-01
... controls that limit worker exposures; and (4) Personal protective equipment. (c) Contractors must address hazards when selecting or purchasing equipment, products, and services. ... the risk to workers; (ii) Implement interim protective measures pending final abatement; and (iii...
10 CFR 851.22 - Hazard prevention and abatement.
Code of Federal Regulations, 2011 CFR
2011-01-01
... controls that limit worker exposures; and (4) Personal protective equipment. (c) Contractors must address hazards when selecting or purchasing equipment, products, and services. ... the risk to workers; (ii) Implement interim protective measures pending final abatement; and (iii...
Developing a safe on-orbit cryogenic depot
NASA Technical Reports Server (NTRS)
Bahr, Nicholas J.
1992-01-01
New U.S. space initiatives will require technology to realize planned programs such as piloted lunar and Mars missions. Key to the optimal execution of such missions are high performance orbit transfer vehicles and propellant storage facilities. Large amounts of liquid hydrogen and oxygen demand a uniquely designed on-orbit cryogenic propellant depot. Because of the inherent dangers in propellant storage and handling, a comprehensive system safety program must be established. This paper shows how the myriad and complex hazards demonstrate the need for an integrated safety effort to be applied from program conception through operational use. Even though the cryogenic depot is still in the conceptual stage, many of the hazards have been identified, including fatigue due to heavy thermal loading from environmental and operating temperature extremes, micrometeoroid and/or depot ancillary equipment impact (this is an important problem due to the large surface area needed to house the large quantities of propellant), docking and maintenance hazards, and hazards associated with extended extravehicular activity. Various safety analysis techniques were presented for each program phase. Specific system safety implementation steps were also listed. Enhanced risk assessment was demonstrated through the incorporation of these methods.
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking-intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and to technological accidents at fire and chemical hazardous facilities are considered; these are used in geographic information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of scenario earthquake consequence estimation and individual seismic risk assessment, taking into account the secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
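The individual-risk definition in the abstract (an expectation of social losses normalized by population) can be sketched in a few lines. The event probabilities and fatality counts below are invented for illustration; the real models derive these from shaking-intensity and damage simulations.

```python
# Individual risk as defined above: the mathematical expectation of social
# losses over possible hazardous events within one year, divided by the
# number of inhabitants. All numbers here are illustrative only.

def individual_risk(events, population):
    """events: list of (annual_probability, expected_fatalities) pairs."""
    expected_losses = sum(p * fatalities for p, fatalities in events)
    return expected_losses / population

# A scenario earthquake plus a secondary accident at a chemical facility:
scenarios = [(0.01, 200.0),   # scenario earthquake
             (0.002, 50.0)]   # triggered release at a hazardous facility
print(individual_risk(scenarios, population=100_000))  # 2.1e-05 per year
```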
Integrated risk management and communication: case study of Canton Vaud (Switzerland)
NASA Astrophysics Data System (ADS)
Artigue, Veronica; Aye, Zar Chi; Gerber, Christian; Derron, Marc-Henri; Jaboyedoff, Michel
2017-04-01
Canton Vaud's history is marked by events that remind us that any territory may have to cope with natural hazards, such as the devastating floods of the Baye and Veraye rivers in Montreux (1927), the overflowing of the Rhône after a dam failure (1935), the mud flow of Pissot (1995) and avalanches in the Prealps (1999). All of these events caused significant damage, and sometimes fatalities, in the regions of Canton Vaud. In response to these issues, the Swiss Confederation and the local authorities of the Canton decided to implement an integrated management policy for natural risks. The production of natural hazard maps was the first step of the integrated management process. This work resulted in more than 10'000 maps and related documents for 94% of the municipalities of the Canton, covering 17% of its total surface. Given this significant amount of data, the main issue is to propose relevant communication and to build an integrated risk management structure. To make the available information relevant for end users, the teams involved worked to produce documents and tools for a better understanding of these data by all stakeholders. The first step of this process was a statistical and geographical analysis of the hazard maps to identify the areas most exposed to natural hazards; an atlas could thus be created. Continuing under this framework, several topics were discussed for each identified risk. The results show that 88 of 318 municipalities in Canton Vaud have at least a high hazard level on their territory, 108 a moderate hazard level, 41 a low level and 8 a residual level. Only 73 of the 318 municipalities remain with a minimal or zero hazard level. Concerning the type of hazard considered, 16% of the building zones are exposed to floods, 18% to mud flows, 16% to deep landslides, 14% to spontaneous surface landslides, 6% to rockfall, 55% to rock collapses and less than 5% to avalanches.
As national policies require taking risk into account at the building scale, further analysis of the buildings was carried out: 1'154 buildings are exposed to a high hazard level, while 8'409, 21'130 and 14'980 buildings are exposed to moderate, low and residual hazard levels, respectively. This paper addresses the complexity of producing the hazard map products of Canton Vaud, particularly through the statistical analysis and the difficulties encountered with data availability and quality at the building scale. The authors highlight the processes necessary to build robust communication for all the stakeholders involved in risk management in a dynamic and changing area, through the example of Canton Vaud.
Hazard Analysis Guidelines for Transit Projects
DOT National Transportation Integrated Search
2000-01-01
These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...
A pilot GIS database of active faults of Mt. Etna (Sicily): A tool for integrated hazard evaluation
NASA Astrophysics Data System (ADS)
Barreca, Giovanni; Bonforte, Alessandro; Neri, Marco
2013-02-01
A pilot GIS-based system has been implemented for the assessment and analysis of hazard related to the active faults affecting the eastern and southern flanks of Mt. Etna. The system was developed in the ArcGis® environment and consists of different thematic datasets that include spatially referenced arc features and an associated database. The arc-type features, georeferenced to the WGS84 ellipsoid, UTM zone 33 projection, represent the five main fault systems that develop in the analysed region. The backbone of the GIS-based system is the large amount of information that was collected from the literature and then stored and properly geocoded in a digital database. This consists of thirty-five alphanumeric fields which include all fault parameters available from the literature, such as location, kinematics, landform, slip rate, etc. Although the system has been implemented according to the most common procedures used by GIS developers, the architecture and content of the database represent a pilot backbone for the digital storage of fault parameters, providing a powerful tool for modelling hazard related to the active tectonics of Mt. Etna. The database collects, organises and shares all currently available scientific information about the active faults of the volcano. Furthermore, thanks to the strong effort spent on defining the fields of the database, the structure proposed in this paper is open to the collection of further data coming from future improvements in the knowledge of the fault systems. By layering additional user-specific geographic information and managing the proposed database (topological querying), a great diversity of hazard and vulnerability maps can be produced by the user. This is a proposal of a backbone for a comprehensive geographical database of fault systems, universally applicable to other sites.
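The attribute querying the abstract describes can be sketched without GIS software. The field names, fault records and threshold below are invented for illustration (the real database holds thirty-five fields in ArcGis®); only the querying pattern is the point.

```python
# A minimal sketch of attribute querying over a fault-parameter table like
# the one described. Records and field names are hypothetical examples.

faults = [
    {"name": "Fault A", "kinematics": "normal",       "slip_rate_mm_yr": 2.1},
    {"name": "Fault B", "kinematics": "left-lateral", "slip_rate_mm_yr": 14.0},
    {"name": "Fault C", "kinematics": "normal",       "slip_rate_mm_yr": 0.5},
]

# Select candidate high-hazard structures: slip rate above a chosen threshold.
fast = [f["name"] for f in faults if f["slip_rate_mm_yr"] > 1.0]
print(fast)  # ['Fault A', 'Fault B']
```

In the actual system such queries would be combined with spatial (topological) selection to produce hazard and vulnerability maps.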
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Alewine, Neal Jon
1993-01-01
Multiple instruction rollback (MIR) is a technique that provides rapid recovery from transient processor failures; it has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard-removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
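The rollback data hazard targeted by such schemes can be illustrated with a small sketch: if an instruction overwrites a register that an earlier instruction inside the rollback window reads, re-executing that earlier instruction after rollback would see the overwritten value. The instruction encoding and detection routine below are invented for illustration and are not from the thesis.

```python
# Detect rollback (anti-dependence) data hazards: instruction i overwrites a
# register read by some instruction j within the preceding `window`
# instructions, so restarting from j would re-read a corrupted value.
# Instructions are modeled as (dest_register, [source_registers]).

def rollback_hazards(program, window):
    hazards = []
    for i, (dest, _srcs) in enumerate(program):
        for j in range(max(0, i - window), i):
            if dest in program[j][1]:          # program[j] reads `dest`
                hazards.append((j, i, dest))   # hazard pair and register
    return hazards

prog = [("r1", []), ("r2", ["r1"]), ("r1", ["r2"])]  # r1 redefined at index 2
print(rollback_hazards(prog, window=2))  # [(1, 2, 'r1')]
```

A compiler can remove such a hazard by renaming the redefined register or by spacing the definitions beyond the rollback distance, which is the kind of data-flow manipulation the abstract refers to.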
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
A HW-SW Co-Designed System for the Lunar Lander Hazard Detection and Avoidance Breadboarding
NASA Astrophysics Data System (ADS)
Palomo, Pedro; Latorre, Antonio; Valle, Carlos; Gomez de Aguero, Sergio; Hagenfeldt, Miguel; Parreira, Baltazar; Lindoso, Almudena; Portela, Marta; Garcia, Mario; San Millan, Enrique; Zharikov, Yuri; Entrena, Luis
2014-08-01
This paper presents the HW-SW co-design approach followed to tackle the design of the Hazard Detection and Avoidance (HDA) system breadboarding for the Lunar Lander ESA mission, undertaken because the novel GNC technologies used to enable autonomous systems demand processing capabilities that current (and forthcoming) space processors cannot satisfy. The paper shows how the current system design was obtained by partitioning the original, functionally validated HDA design between SW algorithms (intended for execution on a microprocessor) and HW algorithms (to be executed in an FPGA), considering the performance requirements and drawing on a deep analysis of the algorithms in view of their suitability for HW or SW implementation.
Hazard Detection Software for Lunar Landing
NASA Technical Reports Server (NTRS)
Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.
2011-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability is also able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement.
This simulation has also been used to determine the effect of viewing on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
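The slope-and-roughness check described above can be sketched as follows. This is a minimal illustration under stated assumptions (a least-squares plane fit per elevation-map window, with invented terrain and grid spacing), not the flight HDA code.

```python
import numpy as np

# Fit a local plane to each elevation-map window; take the plane tilt as the
# slope and the maximum residual relief as the roughness, then flag cells
# exceeding the stated ALHAT thresholds (0.3 m roughness, 5 deg slope).

def hazard_map(dem, spacing, max_rough=0.3, max_slope_deg=5.0, win=3):
    rows, cols = dem.shape
    hazard = np.zeros_like(dem, dtype=bool)
    half = win // 2
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            patch = dem[r - half:r + half + 1, c - half:c + half + 1]
            ys, xs = np.mgrid[0:win, 0:win] * spacing     # window coordinates
            A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(win * win)])
            coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
            slope = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
            rough = np.max(np.abs(patch.ravel() - A @ coef))
            hazard[r, c] = slope > max_slope_deg or rough > max_rough
    return hazard

# Flat terrain with one 0.5 m boulder: cells near the boulder are flagged.
dem = np.zeros((7, 7))
dem[3, 3] = 0.5
print(hazard_map(dem, spacing=1.0).sum())  # > 0: the boulder is detected
```

The real system adds safe-site selection on top of such a map, searching for a lander-sized region containing no flagged cells.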
An organized approach to the control of hazards to health at work.
Molyneux, M K; Wilson, H G
1990-04-01
Shell U.K. has an approach which facilitates the implementation of its occupational hygiene programme at its many locations. The main elements of the system are Company Policy, Standards, Methods and Management. The Policy sets the scene and is rigorous in its aims. The new COSHH legislation has emphasized particular duties which have influenced the approach. The Company Occupational Health Guidelines [Guidelines on Health at Work for Shell in the U.K. Shell U.K. Ltd, London (1989)] set the standards for control of exposure, among other things, and the Company adopts appropriate methods to achieve them. Of particular note is the Company's COSHH Programme [Implementation of the Shell U.K. Policy on the Control of Substances Hazardous to Health. Shell U.K. Ltd, London (1989)], which applies to all hazards to health (including physical and biological agents) in the workplace. Its introduction has been given full corporate support and is in the process of implementation. Appropriate procedures have been introduced for assessments of risk and for work histories. Guidance has been given on competence, reflecting a philosophy based on a team approach using local resources to the full, supported by corporate resources as required. The awards of the British Examining and Registration Board in Occupational Hygiene (1987) are used as the professional standard. Because of difficulties in obtaining basic hazard data, an internal core hazard data system (CHADS) [Core Hazard Data System. Shell U.K. Ltd, London (1989)] has been introduced. The whole programme is managed through Occupational Hygiene Focal Points (OHFP) which represent local activities but also participate in corporate strategy. Through them the multidisciplinary approach is promoted, working in conjunction with local and sector Medical Advisers. Work done by the central Occupational Hygiene Unit is recorded and the reports are used for time management and recovery of costs.
In its entirety, the approach is being used successfully to implement a comprehensive occupational hygiene programme in a diversified and dispersed industrial organization.
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat.
We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
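The Bayes update at the core of such a belief network can be illustrated with a toy example: one hidden node (whether magma ascent is under way) and one observed node (elevated seismicity). The probabilities below are invented; real observatory BBNs have many more nodes, but each evidence update follows the same rule.

```python
# Toy two-node Bayesian Belief Network: hidden volcanic state -> observation.
# Prior belief and conditional probability table are illustrative only.

prior = {"ascent": 0.1, "no_ascent": 0.9}
likelihood = {"ascent": 0.8, "no_ascent": 0.15}  # P(elevated seismicity | state)

def posterior(observed_elevated=True):
    """Belief over the hidden state after observing the seismicity node."""
    joint = {s: prior[s] * (likelihood[s] if observed_elevated
                            else 1.0 - likelihood[s])
             for s in prior}
    z = sum(joint.values())                      # normalizing constant
    return {s: p / z for s, p in joint.items()}

post = posterior(True)
print(round(post["ascent"], 3))  # 0.372: the evidence raises belief from 0.10
```

The posterior shifts only as far as the strength of the evidence warrants, which is the property the abstract emphasises for auditable hazard assessment.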
NASA Astrophysics Data System (ADS)
Odbert, Henry; Hincks, Thea; Aspinall, Willy
2015-04-01
Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat.
We show how our method enables formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
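The Bayesian updating that underlies a BBN can be illustrated with a minimal two-state example. The sketch below is hypothetical: the prior, the likelihoods of observing elevated seismicity, and the observation itself are illustrative values, not figures from the Soufriere Hills study.

```python
# Minimal two-node Bayesian Belief Network sketch: a hidden volcanic state
# ("eruption" vs "no_eruption") and one monitoring observation (elevated
# seismicity). All probabilities are hypothetical, for illustration only.
prior = {"eruption": 0.1, "no_eruption": 0.9}
likelihood = {"eruption": 0.8, "no_eruption": 0.2}  # P(obs | state)

# Observe elevated seismicity and apply Bayes' rule.
unnorm = {s: prior[s] * likelihood[s] for s in prior}
z = sum(unnorm.values())  # marginal probability of the observation
posterior = {s: p / z for s, p in unnorm.items()}
print(round(posterior["eruption"], 3))  # 0.308
```

In a full BBN each monitoring timeseries becomes another evidence node, and the same update propagates through the graph.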
Moser, Heidrun; Roembke, Joerg; Donnevert, Gerhild; Becker, Roland
2011-02-01
The ecotoxicological characterization of waste is part of its assessment as hazardous or non-hazardous according to the European Waste List. For this classification 15 hazard criteria are derived from the Council Directive 91/689/EEC on hazardous waste. Some of the hazard criteria are based on the content of dangerous substances. The criterion H14 'ecotoxic' lacks an assessment and testing strategy, and no specific threshold values have been defined so far. Based on the recommendations of CEN guideline 14735 (2005), an international round robin test (ring test) was organized by the German Federal Environment Agency in order to define suitable test methods for the biological assessment of waste and waste eluates. A basic test battery, consisting of three aquatic and three terrestrial tests, was compiled. In addition, data were submitted for ten additional tests: five aquatic (including a genotoxicity test) and five terrestrial. The tests were performed with three representative waste types: an ash from an incineration plant, a soil containing high concentrations of organic contaminants (polycyclic aromatic hydrocarbons) and a preserved wood waste. The results of this ring test confirm that a combination of a battery of biological tests and chemical residual analysis is needed for an ecotoxicological characterization of wastes. With small modifications the basic test battery is considered to be well suited for the hazard and risk assessment of wastes and waste eluates. All results and documents are accessible via a web-based database application.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel, not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.
SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel
2013-04-01
The SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology comprises (1) the definition of the potential impacts of global environmental changes (climate system; ecosystems, e.g. land use; socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation), and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, together with the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project lie in combining different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and in gathering interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could potentially lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land use modifications (reforestation, abandonment of traditional agricultural practices) and human interferences (urban expansion, ski resort construction) over the last century.
Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas
2013-01-01
The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
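The multi-scenario combination at the heart of the exact CS calculation can be sketched as a deaggregation-weighted average of per-scenario conditional means. The weights, GMPM medians and sigmas, correlation coefficient, and epsilon values below are all hypothetical illustrations, not values from the paper:

```python
import math

# Hedged sketch: conditional mean of ln Sa(Ti), given the conditioning-period
# epsilon, combined over multiple causal earthquake scenarios weighted by
# (hypothetical) deaggregation probabilities. Illustrative numbers only.
def conditional_mean(mu_ln, sigma_ln, rho, eps_star):
    """Conditional mean ln Sa(Ti) given epsilon* at the conditioning period."""
    return mu_ln + rho * sigma_ln * eps_star

scenarios = [  # (deagg weight, median ln Sa, sigma ln Sa, rho(Ti, T*), eps*)
    (0.6, math.log(0.20), 0.6, 0.8, 1.5),
    (0.4, math.log(0.35), 0.7, 0.8, 1.0),
]
# Combined CS mean: weight each scenario's conditional mean by its deagg
# probability (the weights sum to 1 across causal scenarios).
mu_cs = sum(w * conditional_mean(m, s, r, e) for w, m, s, r, e in scenarios)
```

The exact calculation in the paper additionally weights over multiple GMPMs and propagates the conditional standard deviation; this fragment shows only the mean combination.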
2016 Los Alamos National Laboratory Hazardous Waste Minimization Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salzman, Sonja L.; English, Charles Joe
Waste minimization and pollution prevention are goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE), inclusive of the National Nuclear Security Administration (NNSA) and the Office of Environmental Management, and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program, which is a component of the overall Pollution Prevention (P2) Program, administered by the Environmental Stewardship Group (EPC-ES). This report also supports the waste minimization and P2 goals of the Associate Directorate of Environmental Management (ADEM) organizations that are responsible for implementing remediation activities and describes its programs to incorporate waste reduction practices into remediation activities and procedures. This report includes data for all waste shipped offsite from LANL during fiscal year (FY) 2016 (October 1, 2015 – September 30, 2016). LANS was active during FY2016 in waste minimization and P2 efforts. Multiple projects were funded that specifically related to reduction of hazardous waste. In FY2016, there was no hazardous, mixed-transuranic (MTRU), or mixed low-level (MLLW) remediation waste shipped offsite from the Laboratory. More non-remediation hazardous waste and MLLW was shipped offsite from the Laboratory in FY2016 compared to FY2015. Non-remediation MTRU waste was not shipped offsite during FY2016. These accomplishments and analysis of the waste streams are discussed in much more detail within this report.
40 CFR 63.611 - Implementation and enforcement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Section 63.611 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.611...
40 CFR 63.853 - Implementation and enforcement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Section 63.853 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Primary Aluminum Reduction Plants § 63.853...
40 CFR 63.853 - Implementation and enforcement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Section 63.853 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Primary Aluminum Reduction Plants § 63.853...
40 CFR 63.611 - Implementation and enforcement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Section 63.611 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.611...
40 CFR 63.611 - Implementation and enforcement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Section 63.611 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.611...
40 CFR 63.853 - Implementation and enforcement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Section 63.853 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Primary Aluminum Reduction Plants § 63.853...
40 CFR 63.611 - Implementation and enforcement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Section 63.611 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.611...
40 CFR 63.853 - Implementation and enforcement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Section 63.853 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Primary Aluminum Reduction Plants § 63.853...
44 CFR 79.3 - Responsibilities.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION GRANTS § 79.3...-related hazard mitigation programs and grants, including: (1) Issue program implementation procedures, as... governments regarding the mitigation and grants management process; (5) Review and approve State, Indian...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...
Code of Federal Regulations, 2013 CFR
2013-01-01
... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...
Code of Federal Regulations, 2012 CFR
2012-01-01
... Federally Assisted New Building Construction § 1792.101 General. (a) The Earthquake Hazards Reduction Act of... establishment and maintenance of an effective earthquake hazards reduction program (the National Earthquake... development and implementation of feasible design and construction methods to make structures earthquake...
E-research platform of EPOS Thematic Core Service "ANTHROPOGENIC HAZARDS"
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanisław; Grasso, Jean Robert; Schmittbuhl, Jean; Kwiatek, Grzegorz; Garcia, Alexander; Cassidy, Nigel; Sterzel, Mariusz; Szepieniec, Tomasz; Dineva, Savka; Biggare, Pascal; Saccorotti, Gilberto; Sileny, Jan; Fischer, Tomas
2016-04-01
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to create new research opportunities in the field of anthropogenic hazards arising from the exploitation of georesources. TCS AH, based on the prototype built in the framework of the IS-EPOS project (https://tcs.ah-epos.eu/), financed from Polish structural funds (POIG.02.03.00-14-090/13-00), is being further developed within the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). TCS AH is designed as a functional e-research environment that gives researchers the maximum possible freedom for in silico experimentation by providing a virtual laboratory in which they can create their own workspaces with their own processing streams. The unique integrated RI comprises: (i) data gathered in so-called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment, and (ii) problem-oriented, specific high-level services, with particular attention devoted to methods analyzing correlations between technology, geophysical response and resulting hazard.
Services to be implemented are grouped within six blocks: (1) Basic services for data integration and handling; (2) Services for physical models of stress/strain changes over time and space as driven by geo-resource production; (3) Services for analysing geophysical signals; (4) Services to extract the relation between technological operations and observed induced seismicity/deformation; (5) Services for quantitative probabilistic assessment of anthropogenic seismic hazard - statistical properties of anthropogenic seismic series and their dependence on time-varying anthropogenesis; ground motion prediction equations; stationary and time-dependent probabilistic seismic hazard estimates, related to time-changeable technological factors inducing the seismic process; (6) Simulator for Multi-hazard/multi-risk assessment in ExploRation/exploitation of GEoResources (MERGER) - numerical estimation of the occurrence probability of chains of events or processes impacting the environment. TCS AH will also provide the public sector with expert knowledge and background information. To this end, services for outreach, dissemination and communication will be implemented. From a technical point of view, the implementation of services will proceed according to the methods developed within the IS-EPOS project mentioned above. Detailed workflows for the implementation of these services and for the interaction between users and TCS AH have already been prepared.
Chi, Feng; Zhou, Jun; Zhang, Qi; Wang, Yong; Huang, Panling
2017-01-01
The vibration control of a construction vehicle must be carried out in order to meet the aims of sustainable environmental development and to avoid potential human health hazards. In this paper, based on market feedback, the driver seat vibration of a type of wheel loader in the left-right direction is found to be significant over a certain speed range. In order to find abnormal vibration components, the order tracking technique (OTT) and transmission path analysis (TPA) were used to analyze the vibration sources of the wheel loader. This analysis showed that the abnormal vibration comes from the interaction between the tire tread and the road, amplified by the cab mount and eventually transmitted to the cab seat. Finally, the seat vibration amplitudes were decreased by up to 50.8% after implementing the vibration reduction strategy. PMID:28282849
Jonkman, Nini H; Westland, Heleen; Groenwold, Rolf H H; Ågren, Susanna; Atienza, Felipe; Blue, Lynda; Bruggink-André de la Porte, Pieta W F; DeWalt, Darren A; Hebert, Paul L; Heisler, Michele; Jaarsma, Tiny; Kempen, Gertrudis I J M; Leventhal, Marcia E; Lok, Dirk J A; Mårtensson, Jan; Muñiz, Javier; Otsu, Haruka; Peters-Klimm, Frank; Rich, Michael W; Riegel, Barbara; Strömberg, Anna; Tsuyuki, Ross T; van Veldhuisen, Dirk J; Trappenburg, Jaap C A; Schuurmans, Marieke J; Hoes, Arno W
2016-03-22
Self-management interventions are widely implemented in the care for patients with heart failure (HF). However, trials show inconsistent results, and whether specific patient groups respond differently is unknown. This individual patient data meta-analysis assessed the effectiveness of self-management interventions in patients with HF and whether subgroups of patients respond differently. A systematic literature search identified randomized trials of self-management interventions. Data from 20 studies, representing 5624 patients, were included and analyzed with the use of mixed-effects models and Cox proportional-hazard models, including interaction terms. Self-management interventions reduced the risk of time to the combined end point of HF-related hospitalization or all-cause death (hazard ratio, 0.80; 95% confidence interval [CI], 0.71-0.89), time to HF-related hospitalization (hazard ratio, 0.80; 95% CI, 0.69-0.92), and improved 12-month HF-related quality of life (standardized mean difference, 0.15; 95% CI, 0.00-0.30). Subgroup analysis revealed a protective effect of self-management on the number of HF-related hospital days in patients <65 years of age (mean, 0.70 versus 5.35 days; interaction P=0.03). Patients without depression did not show an effect of self-management on survival (hazard ratio for all-cause mortality, 0.86; 95% CI, 0.69-1.06), whereas in patients with moderate/severe depression, self-management reduced survival (hazard ratio, 1.39; 95% CI, 1.06-1.83, interaction P=0.01). This study shows that self-management interventions had a beneficial effect on time to HF-related hospitalization or all-cause death and HF-related hospitalization alone and elicited a small increase in HF-related quality of life. The findings do not endorse limiting self-management interventions to subgroups of patients with HF, but increased mortality in depressed patients warrants caution in applying self-management strategies in these patients. 
© 2016 American Heart Association, Inc.
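The pooling of hazard ratios reported in meta-analyses of this kind can be sketched with fixed-effect inverse-variance weighting. The two (HR, 95% CI) study results below are hypothetical, and this simple aggregate-level pooling is a simplification of the paper's individual-patient-data approach using mixed-effects and Cox models:

```python
import math

# Fixed-effect inverse-variance pooling of hazard ratios. Each study is
# summarised as (HR, 95% CI lower, 95% CI upper); values are made up.
def pooled_hr(studies):
    num = den = 0.0
    for hr, lo, hi in studies:
        log_hr = math.log(hr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se ** 2  # inverse-variance weight
        num += w * log_hr
        den += w
    return math.exp(num / den)

print(round(pooled_hr([(0.75, 0.60, 0.94), (0.85, 0.70, 1.03)]), 2))  # 0.81
```

Subgroup-by-treatment interaction tests, as used in the paper, require patient-level data and cannot be recovered from these aggregate summaries.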
Social transformation in transdisciplinary natural hazard management
NASA Astrophysics Data System (ADS)
Attems, Marie-Sophie; Fuchs, Sven; Thaler, Thomas
2017-04-01
Due to annual increases in natural hazard losses, there is a discussion among authorities and communities in Europe on innovative solutions to increase resilience, and consequently, business-as-usual in risk management practices is often questioned. The current situation of risk management therefore calls for a societal transformation to respond adequately and effectively to the new global dynamics. An emerging concept is the implementation of multiple-use mitigation systems against hazards such as floods, avalanches and landslides. However, one key aspect refers to the involvement of knowledge outside academic research. Therefore, transdisciplinary knowledge can be used to discuss vital factors which are needed to upscale the implementation of multiple-use mitigation measures. The method used in this contribution is an explorative scenario analysis applied in Austria, processing the knowledge gained in transdisciplinary workshops. The scenario analysis combines qualitative data and quantitative relations in order to generate a set of plausible future outcomes. The goal is to establish a small number of consistent scenarios which are efficient and thereby representative, as well as significantly different from each other. The results of the discussions among relevant stakeholders within the workshops, and a subsequent quantitative analysis, showed that the vital variables influencing the multiple use of mitigation measures are (1) current legislation, (2) risk acceptance among authorities and the public, (3) land-use pressure, (4) the demand for innovative solutions, (5) the available technical standards and possibilities, and (6) policy entrepreneurship. Four different scenarios were the final result of the analysis. In conclusion, contemporary risk management settings will have to change in the future if multiple-use alleviation systems are to become possible.
Legislation, and thereby current barriers, will have to be altered in order to create opportunities for innovative solutions. If the technical state of the art allows structures with limited additional risk, multiple-use structures are an option in risk management. Present and future land-use pressure also intensifies the economic interest in finding and accepting such measures.
A Software Tool for Quantitative Seismicity Analysis - ZMAP
NASA Astrophysics Data System (ADS)
Wiemer, S.; Gerstenberger, M.
2001-12-01
Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab, a commercial language from The MathWorks that is widely used in the natural sciences. ZMAP was first published in 1994, and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
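One core calculation in seismicity analysis tools of this kind is the maximum-likelihood b-value of the Gutenberg-Richter relation (Aki's estimator, with the usual half-bin correction for magnitude binning). The sketch below uses a synthetic catalog, not ZMAP code or real data:

```python
import math

# Aki (1965) maximum-likelihood b-value estimator with Utsu's binning
# correction. The magnitude list and completeness magnitude are synthetic.
def b_value(mags, m_c, dm=0.1):
    """b = log10(e) / (mean(M) - (Mc - dm/2)), over events with M >= Mc."""
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2))

mags = [2.0, 2.1, 2.3, 2.0, 2.6, 3.1, 2.2, 2.4, 2.0, 2.8]
print(round(b_value(mags, m_c=2.0), 2))  # 1.09
```

Mapping transients in b-values, as ZMAP does, amounts to evaluating this estimator in moving spatial or temporal windows of the catalog.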
A New Seismic Hazard Model for Mainland China
NASA Astrophysics Data System (ADS)
Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.
2017-12-01
We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
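The tapered Gutenberg-Richter survival function (in the Kagan seismic-moment form) shows how a corner magnitude damps rates relative to a plain G-R extrapolation. The beta, corner magnitude, threshold, and activity rate below are illustrative assumptions, not the calibrated values of this model:

```python
import math

# Tapered Gutenberg-Richter (TGR) survival function, parameterised in
# seismic moment with an exponential corner-magnitude taper. Values are
# illustrative only.
def moment(m):
    """Seismic moment in N*m from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * m + 9.05)

def tgr_survival(m, m_t=5.0, beta=0.65, m_corner=8.0):
    """Fraction of events with magnitude >= m, given threshold m_t."""
    mo, mo_t, mo_c = moment(m), moment(m_t), moment(m_corner)
    return (mo_t / mo) ** beta * math.exp((mo_t - mo) / mo_c)

# Hypothetical zone: if M>=5 events occur at 10/yr, the implied M>=7 rate is
rate_m7 = 10.0 * tgr_survival(7.0)  # events per year
```

In a model like the one described, the a- and b-values fix the low-magnitude part of this curve from the catalog, while the corner magnitude is constrained by the geodetic moment rate.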
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
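A common closed-form expression for this kind of factor-of-safety calculation is the infinite-slope model. The sketch below uses that standard expression with illustrative soil properties; the paper's numerical modelling is more sophisticated than this:

```python
import math

# Infinite-slope factor of safety: ratio of resisting shear strength
# (Mohr-Coulomb) to driving shear stress on a planar failure surface.
# All soil parameters below are illustrative, not from the study.
def factor_of_safety(c, phi_deg, gamma, depth, slope_deg, m=0.0,
                     gamma_w=9.81):
    """c: effective cohesion (kPa); phi: friction angle (deg);
    gamma: soil unit weight (kN/m^3); depth: failure depth z (m);
    slope: slope angle (deg); m: fraction of z that is saturated."""
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    effective_stress = (gamma * depth - m * gamma_w * depth) * math.cos(b) ** 2
    resisting = c + effective_stress * math.tan(phi)
    driving = gamma * depth * math.sin(b) * math.cos(b)
    return resisting / driving

# Hypothetical cutting: FoS > 1 indicates stability under these conditions.
fos = factor_of_safety(c=5.0, phi_deg=30.0, gamma=18.0, depth=2.0,
                       slope_deg=25.0, m=0.5)
```

Raising the saturation fraction m (e.g. under the simulated climate scenarios the paper describes) lowers the effective stress and hence the factor of safety.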
Integrated risk reduction framework to improve railway hazardous materials transportation safety.
Liu, Xiang; Saat, M Rapik; Barkan, Christopher P L
2013-09-15
Rail transportation plays a critical role in the safe and efficient movement of hazardous materials. A number of strategies have been implemented or are being developed to reduce the risk of hazardous materials release from train accidents. Each of these risk reduction strategies has its safety benefit and corresponding implementation cost. However, the cost effectiveness of integrating different risk reduction strategies is not well understood. Meanwhile, there has been growing interest in the U.S. rail industry and government in how best to allocate resources for improving hazardous materials transportation safety. This paper presents an optimization model that considers the combination of two types of risk reduction strategies: broken rail prevention and tank car safety design enhancement. A Pareto-optimality technique is used to maximize risk reduction at a given level of investment. The framework presented in this paper can be adapted to address a broader set of risk reduction strategies and is intended to assist decision makers in local, regional and system-wide risk management of rail hazardous materials transportation. Copyright © 2013 Elsevier B.V. All rights reserved.
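The Pareto-optimality idea can be sketched by enumerating portfolios of risk reduction options and discarding dominated ones. The costs, risk-reduction fractions, and the assumption that independent measures reduce risk multiplicatively are all our illustration, not the paper's model:

```python
from itertools import combinations

# Hypothetical options: name -> (cost in $M, fraction of risk removed).
options = {"broken_rail": (10.0, 0.30), "tank_car": (15.0, 0.40)}

def portfolios(opts):
    """Enumerate every subset of options with its total cost and the
    residual risk, assuming independent multiplicative risk reductions."""
    names = list(opts)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(opts[n][0] for n in combo)
            risk = 1.0
            for n in combo:
                risk *= 1.0 - opts[n][1]
            yield combo, cost, risk

def pareto(points):
    """Keep portfolios not weakly dominated in (cost, residual risk)."""
    pts = list(points)
    return [p for p in pts
            if not any(q[1] <= p[1] and q[2] <= p[2] and q != p for q in pts)]

frontier = pareto(portfolios(options))
```

Here every subset happens to lie on the frontier (cheaper portfolios carry more residual risk); with more options, dominated combinations start to drop out, which is exactly what the optimization exploits.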
NASA Astrophysics Data System (ADS)
Yurkovich, E. S.; Howell, D. G.
2002-12-01
Exploding population growth and unprecedented urban development within the last century have helped fuel an increase in the severity of natural disasters. Not only has the world become more populated, but people, information and commodities now travel greater distances to service larger concentrations of people. While many of the earth's natural hazards remain relatively constant, understanding the risk to increasingly interconnected and large populations requires an expanded analysis. To improve mitigation planning we propose a model that is accessible to planners and implemented with public domain data and industry standard GIS software. The model comprises 1) the potential impact of five significant natural hazards: earthquake, flood, tropical storm, tsunami and volcanic eruption, assessed by a comparative index of risk, 2) population density, 3) infrastructure distribution represented by a proxy, 4) the vulnerability of the elements at risk (population density and infrastructure distribution) and 5) the connections and dependencies of our increasingly 'globalized' world, portrayed by a relative linkage index. We depict this model with the equation Risk = f(H, E, V, I), where H is an index normalizing the impact of five major categories of natural hazards; E is one element at risk, population or infrastructure; V is a measure of the vulnerability of the elements at risk; and I is a measure of the interconnectivity of the elements at risk as a result of economic and social globalization. We propose that future risk analyses include the variable I to better define and quantify risk. Each assessment reflects different repercussions from natural disasters: losses of life or economic activity. Because population and infrastructure are distributed heterogeneously across the Pacific region, two contrasting representations of risk emerge from this study.
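The proposed Risk = f(H, E, V, I) relation can be sketched as a simple multiplicative index. The functional form and all example values are assumptions for illustration, since the abstract does not specify how the components are combined:

```python
# Illustrative implementation of the abstract's Risk = f(H, E, V, I) idea
# as a multiplicative index; the functional form is an assumption.
def risk_index(hazard, exposure, vulnerability, interconnectivity):
    """All inputs normalized to [0, 1]; returns a relative risk score."""
    for v in (hazard, exposure, vulnerability, interconnectivity):
        if not 0.0 <= v <= 1.0:
            raise ValueError("components must be normalized to [0, 1]")
    return hazard * exposure * vulnerability * interconnectivity

# Two hypothetical cities: same hazard, exposure and vulnerability, but
# the second is a densely linked trade hub, so its I term raises its risk.
city_a = risk_index(0.6, 0.5, 0.4, 0.2)
city_b = risk_index(0.6, 0.5, 0.4, 0.9)
print(city_a, city_b)
```

The point of the sketch is the abstract's argument: with identical H, E and V, the interconnectivity term alone separates the two assessments.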
Berlin, M A; Anand, Sheila
2014-01-01
This paper presents the Direction based Hazard Routing Protocol (DHRP) for disseminating information about fixed road hazards such as road blocks, fallen trees, boulders on the road, snow pile-up, landslides, road maintenance work and other obstacles to the vehicles approaching the hazardous location. The proposed work focuses on dissemination of hazard messages on highways with sparse traffic. The vehicle coming across the hazard reports its presence. It is proposed to use roadside fixed infrastructure units for reliable and timely delivery of hazard messages to vehicles. The vehicles can then take appropriate safety action to avoid the hazardous location. The proposed protocol has been implemented and tested using the SUMO simulator to generate road traffic and the NS-2.33 network simulator to analyze the performance of DHRP. The performance of the proposed protocol was also compared with a simple flooding protocol, and the results are presented.
Using Integrated Earth and Social Science Data for Disaster Risk Assessment
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; Yetman, G.
2016-12-01
Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominates. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied based on traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and how human and environmental systems respond to hazard events, as in the case of the Fukushima nuclear disaster that followed from the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support the use case of disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure.
The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.
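The kind of spatial query such a mapping tool supports, estimating population near a hazard event, can be sketched as a great-circle distance filter over a toy population grid. All coordinates and counts below are illustrative, and a real tool would query gridded population data services rather than an in-memory list:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical gridded population cells: (lat, lon, population count).
cells = [
    (35.00, 139.00, 12000),
    (35.10, 139.05,  8000),
    (35.50, 139.60, 25000),
    (36.20, 140.10,  4000),
]

def population_within(lat, lon, radius_km, grid):
    """Sum population of cells whose centers fall inside the radius."""
    return sum(p for clat, clon, p in grid
               if haversine_km(lat, lon, clat, clon) <= radius_km)

# Population within 20 km of a hypothetical epicenter.
print(population_within(35.05, 139.02, 20.0, cells))  # → 20000
```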
SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM: TECHNOLOGY WITH AN IMPACT
SITE promotes the development and implementation of innovative technologies for remediating hazardous waste sites and for evaluating the nature and extent of hazardous waste site contamination through four component segments. The SITE Program is a key element in EPA's efforts...
On November 8, 1984, the President signed into law the ...
On November 8, 1984, the President signed into law the Hazardous and Solid Waste Amendments of 1984. These amendments to the RCRA of 1976 require EPA to promulgate rules to implement new section 3017 regarding exports of hazardous waste.
40 CFR 63.7141 - Who implements and enforces this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... subpart? 63.7141 Section 63.7141 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Lime Manufacturing Plants Other...
40 CFR 63.7141 - Who implements and enforces this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... subpart? 63.7141 Section 63.7141 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Lime Manufacturing Plants Other...
40 CFR 63.7141 - Who implements and enforces this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... subpart? 63.7141 Section 63.7141 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Lime Manufacturing Plants Other...
40 CFR 63.7141 - Who implements and enforces this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... subpart? 63.7141 Section 63.7141 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Lime Manufacturing Plants Other...
DETERMINING INHALATION RISK -- TOOLS FOR ASSESSING HAZARD.
The Clean Air Act focuses on reduction of the potential for specific air pollutants to cause adverse health effects. Implementation of standards to control release of the 188 hazardous air pollutants (HAPs) requires the Environmental Protection Agency (EPA) to evaluate the healt...
USEPA SITE PROGRAM APPROACH TO TECHNOLOGY TRANSFER AND REGULATORY ACCEPTANCE
The SITE Program was created to meet the increased demand for innovative technologies for hazardous waste treatment. To accomplish this mission, the program seeks to advance the development, implementation and commercialization of innovative technologies for hazardous waste chara...
Tools for Material Design and Selection
NASA Astrophysics Data System (ADS)
Wehage, Kristopher
The present thesis focuses on applications of numerical methods to create tools for material characterization, design and selection. The tools generated in this work incorporate a variety of programming concepts, from digital image analysis, geometry, optimization, and parallel programming to data-mining, databases and web design. The first portion of the thesis focuses on methods for characterizing clustering in bimodal 5083 Aluminum alloys created by cryomilling and powder metallurgy. The bimodal samples analyzed in the present work contain a mixture of a coarse grain phase, with a grain size on the order of several microns, and an ultra-fine grain phase, with a grain size on the order of 200 nm. The mixing of the two phases is not homogeneous and clustering is observed. To investigate clustering in these bimodal materials, various microstructures were created experimentally by conventional cryomilling, Hot Isostatic Pressing (HIP), Extrusion, Dual-Mode Dynamic Forging (DMDF) and a new 'Gradient' cryomilling process. Two techniques for quantitative clustering analysis are presented, formulated and implemented. The first technique, the Area Disorder function, provides a metric of the quality of coarse grain dispersion in an ultra-fine grain matrix, and the second technique, the Two-Point Correlation function, provides a metric of long- and short-range spatial arrangements of the two phases, as well as an indication of the mean feature size in any direction. The two techniques are implemented on digital images created by Scanning Electron Microscopy (SEM) and Electron Backscatter Diffraction (EBSD) of the microstructures. To investigate structure-property relationships through modeling and simulation, strategies for generating synthetic microstructures are discussed, and a computer program that generates randomized microstructures with desired configurations of clustering described by the Area Disorder function is formulated and presented.
In the computer program, two-dimensional microstructures are generated by Random Sequential Adsorption (RSA) of voxelized ellipses representing the coarse grain phase. A simulated annealing algorithm is used to geometrically optimize the placement of the ellipses in the model to achieve varying user-defined configurations of spatial arrangement of the coarse grains. During the simulated annealing process, the ellipses are allowed to overlap up to a specified threshold, allowing triple junctions to form in the model. Once the simulated annealing process is complete, the remaining space is populated by smaller ellipses representing the ultra-fine grain phase. Uniform random orientations are assigned to the grains. The program generates text files that can be imported into Crystal Plasticity Finite Element Analysis software for stress analysis. Finally, numerical methods and programming are applied to current issues in green engineering and hazard assessment. To understand hazards associated with materials and select safer alternatives, engineers and designers need access to up-to-date hazard information. However, hazard information comes from many disparate sources, and aggregating, interpreting and taking action on the wealth of data is not trivial. In light of these challenges, a Framework for Automated Hazard Assessment based on the GreenScreen list translator is presented. The framework consists of a computer program that automatically extracts data from the GHS-Japan hazard database, loads the data into a machine-readable JSON format, transforms the JSON document into a GreenScreen JSON document using the GreenScreen List Translator v1.2 and performs GreenScreen Benchmark scoring on the material. The GreenScreen JSON documents are then uploaded to a document storage system to allow human operators to search for, modify or add additional hazard information via a web interface.
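The Random Sequential Adsorption step can be sketched in simplified form: candidates are drawn uniformly and accepted only if they do not overlap previously placed particles. Disks stand in here for the thesis's voxelized ellipses, the subsequent simulated-annealing optimization is omitted, and all dimensions and counts are illustrative:

```python
import random

# Minimal RSA sketch: place non-overlapping circular "coarse grains" in a
# rectangular domain by rejection sampling (illustrative parameters).
def rsa_disks(width, height, radius, n_target, max_attempts=10_000, seed=42):
    """Place up to n_target non-overlapping disks of equal radius."""
    rng = random.Random(seed)
    disks = []
    attempts = 0
    while len(disks) < n_target and attempts < max_attempts:
        attempts += 1
        x = rng.uniform(radius, width - radius)
        y = rng.uniform(radius, height - radius)
        # Accept only if the candidate overlaps no already-placed disk.
        if all((x - cx) ** 2 + (y - cy) ** 2 >= (2 * radius) ** 2
               for cx, cy in disks):
            disks.append((x, y))
    return disks

coarse = rsa_disks(100.0, 100.0, radius=5.0, n_target=20)
print(len(coarse))  # number of coarse-grain sites successfully adsorbed
```

In the thesis's program the accepted positions would then be perturbed by simulated annealing to reach a target Area Disorder value before the ultra-fine phase fills the remaining space.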
A new Geo-Information Architecture for Risk Management in the Alps
NASA Astrophysics Data System (ADS)
Baruffini, Mi.; Thuering, M.
2009-04-01
During the last decades land-use increased significantly in the Swiss (and European) mountain regions. Due to the scarcity of areas suitable for development, anthropic activities were extended into areas prone to natural hazards such as avalanches, debris flows and rockfalls (Smith 2001). Furthermore, the necessity of developing effective transalpine transport links in this important area collides with the need to ensure the safety of travelers and the health of the population. Consequently, an increase in losses due to hazards can be observed. To mitigate these associated losses, both traditional protective measures and land-use planning policies are to be developed and implemented to optimize future investments. Efficient protection alternatives can be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be achieved by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. As part of the Swiss National Science Foundation Project 54 "Evaluation of the optimal resilience for vulnerable infrastructure networks - An interdisciplinary pilot study on the transalpine transportation corridors" we study the vulnerability of infrastructures due to natural hazards. The project aims to study various natural hazards (and later, even man-made ones) and to obtain an evaluation of resilience according to an interdisciplinary approach, considering the possible damage by means of risk criteria and pointing out the feasibility of conceivable measures to reduce potential damage. The project consists of a geoscientific part and an application.
The first part consists of studying the natural hazards and related risks in terms of infrastructure vulnerability. The application considers different types of danger (logically intersected with the transport infrastructure) and compares them with fixed values to obtain a so-called deficit. As a framework we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). In this way the project develops a methodology that makes possible a risk analysis aimed at optimizing infrastructure vulnerability, and therefore allows a model to be obtained that is designed to optimize the functionality of the network infrastructure. A simulation environment, RiskBox, is developed within the open-source GIS environment GRASS (Geographic Resources Analysis Support System) together with a database (PostgreSQL) in order to manage an infrastructure data catalog. The targeted simulation environment includes the elements that identify the consecutive steps of risk analysis: hazard - vulnerability - risk. The initial results of the experimental case study show how useful a GIS-based system, which identifies the risk of any single vulnerable element in the corridor and assesses the risk to the global system on the basis of the priorities of the actors involved, can be for effective and efficient disaster response management, as explained in (ARMONIA Project 2007). In our work we wanted to highlight the complexity of the risk analysis methodology, a difficulty that is amplified by many peculiarities of mountain areas. In particular, the illustrative process performed can give an overview of the interests at stake and the need to act to reduce the vulnerability and hazardousness of the Gotthard corridor. We present the concept and current state of development of our project and our application to the testbed, the Alps-crossing corridor of St. Gotthard. REFERENCES ARMONIA Project 2007: Land use plans in Risky areas from Unwise to Wise Practices - Materials of the 2nd conference. Politecnico di Milano.
BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL), Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk Management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds) 2002: Colloque Arche de la Défense 22-24 octobre 2002, in Risques naturels et aménagement en Europe, 108-120. Smith, K. 2001: Environmental hazards: Assessing the risk and reducing disaster. Third edition. London.
A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.
Morag, Ido; Luria, Gil
2013-01-01
Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.
Abdollahian, Nina; Ratliff, Jamie L.; Wood, Nathan J.
2016-11-09
Introduction: Understanding if and how community exposure to coastal hazards may change over time is crucial information for coastal managers tasked with developing climate adaptation plans. This report summarizes estimates of population and asset exposure to coastal-inundation hazards associated with sea-level-rise and storm scenarios in six coastal communities of the Great Marsh region of Essex County, Massachusetts. This U.S. Geological Survey (USGS) analysis was conducted in collaboration with National Wildlife Federation (NWF) representatives, who are working with local stakeholders to develop local climate adaptation plans for the Towns of Salisbury, Newbury, Rowley, Ipswich, and Essex and the City of Newburyport (hereafter referred to as communities). Community exposure was characterized by integrating various community indicators (land cover and land use, population, economic assets, critical facilities, and infrastructure) with coastal-hazard zones that estimate inundation extents and water depth for three time periods. Estimates of community exposure are based on the presence of people, businesses, and assets in hazard zones that are calculated from geospatial datasets using geographic-information-system (GIS) tools. Results are based on current distributions of people and assets in hazard zones and do not take into account projections of human population, asset, or land-use changes over time. Results are not loss estimates based on engineering analysis or field surveys for any particular facility and do not take into account aspects of individual and household preparedness before an extreme event, adaptive capacity of a community during an event, or long-term resilience of individuals and communities after an event.
Potential losses would match reported inventories only if all residents, business owners, public managers, and elected officials were unaware of what to do if warned of an imminent threat, failed to take protective measures during an extreme event, or failed to implement any long-term strategies to mitigate potential impacts. This analysis is intended to serve as a foundation for additional risk-related studies, plans, and mitigation efforts that are tailored to local needs. After a summary of the geospatial methods used in the analysis, results are organized by community so that local officials can easily use them in their local adaptation planning efforts.
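The core exposure calculation described above, overlaying where people and assets are with where a scenario inundates, can be sketched with toy rasters. The values below are illustrative, not from the Great Marsh analysis:

```python
import numpy as np

# Toy population raster (persons per cell) and an inundation-depth raster
# (metres) for one sea-level-rise/storm scenario; values are illustrative.
population = np.array([
    [120,  80,   0,  40],
    [200, 150,  60,  10],
    [ 90,  30,  20,   5],
])
depth_m = np.array([
    [0.0, 0.5, 1.2, 0.0],
    [0.3, 0.0, 0.8, 0.0],
    [0.0, 0.0, 0.2, 1.5],
])

# Exposure = population in cells the scenario inundates (depth > 0).
inundated = depth_m > 0.0
exposed = int(population[inundated].sum())
print(exposed)  # → 365
```

The same mask could be applied to rasters of economic assets or critical facilities to produce the other exposure indicators the report describes.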
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary
2015-06-01
PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
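An absolute hazard curve of the kind described above can be sketched by combining an eruption probability with sampled intensities at one grid point. The lognormal load distribution, the eruption probability and all numbers below are assumptions for illustration, not PyBetVH outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tephra loads (kPa) at one grid point from 10,000 simulated
# eruptions; a lognormal spread is assumed purely for illustration.
loads = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)

# Assumed probability that an eruption occurs in the time frame.
p_eruption = 0.05

# Absolute hazard curve:
# P(load >= t) = P(eruption) * P(load >= t | eruption)
thresholds = np.array([0.5, 1.0, 2.0, 4.0])
exceed = p_eruption * (loads[None, :] >= thresholds[:, None]).mean(axis=1)

for t, p in zip(thresholds, exceed):
    print(f"P(load >= {t} kPa) = {p:.4f}")
```

Repeating this at every grid point yields the hazard map; PyBetVH additionally computes the curves at several percentiles to express the epistemic uncertainty.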
Abdullahi, Auwalu; Hassan, Azmi; Kadarman, Norizhar; Junaidu, Yakubu Muhammad; Adeyemo, Olanike Kudrat; Lua, Pei Lin
2016-01-01
Purpose: This study aims to investigate the occupational hazards among abattoir workers associated with noncompliance with the meat processing and waste disposal laws in Terengganu State, Malaysia. Occupational hazards are a major source of morbidity and mortality among animal workers due to exposure to many hazardous situations in their daily practices. Occupational infections mostly contracted by abattoir workers could be caused by iatrogenic or transmissible agents, including viruses, bacteria, fungi, and parasites and the toxins produced by these organisms. Materials and methods: The methodology was based on a cross-sectional survey using a cluster sampling technique in the four districts of Terengganu State, Malaysia. One hundred and twenty-one abattoir workers from five abattoirs were assessed using a validated structured questionnaire and an observation checklist. Results: The mean and standard deviation of the occupational hazard scores of the workers were 2.32 (2.721). Physical, chemical, biological, psychosocial, musculoskeletal, and ergonomic hazards were the major findings of this study. The highest prevalences of occupational hazards identified among the workers were injury by sharp equipment such as knives (20.0%), noise exposure (17.0%), and offensive odor within the abattoir premises (12.0%). Conclusion: The major occupational hazards encountered by the workers in the study area were physical, chemical, biological, psychosocial, musculoskeletal, and ergonomic hazards. To ensure proper control of occupational health hazards among the abattoir workers, standard design and good environmental hygiene must be taken into consideration at all times.
Exposure control plan, which includes risk identification, risk characterization, assessment of workers at risk, risk control, workers’ education/training, and implementation of safe work procedures, should be implemented by the government and all the existing laws governing the abattoir operation in the country should be enforced. PMID:27471416
The hidden costs of coastal hazards: Implications for risk assessment and mitigation
Kunreuther, H.; Platt, R.; Baruch, S.; Bernknopf, R.L.; Buckley, M.; Burkett, V.; Conrad, D.; Davidson, T.; Deutsch, K.; Geis, D.; Jannereth, M.; Knap, A.; Lane, H.; Ljung, G.; McCauley, M.; Mileti, D.; Miller, T.; Morrow, B.; Meyers, J.; Pielke, R.; Pratt, A.; Tripp, J.
2000-01-01
Society has limited hazard mitigation dollars to invest. Which actions will be most cost effective, considering the true range of impacts and costs incurred? In 1997, the H. John Heinz III Center for Science, Economics and the Environment began a two-year study with a panel of experts to help develop new strategies to identify and reduce the costs of weather-related hazards associated with rapidly increasing coastal development activities. The Hidden Costs of Coastal Hazards presents the panel's findings, offering the first in-depth study that considers the costs of coastal hazards to natural resources, social institutions, business, and the built environment. Using Hurricane Hugo, which struck South Carolina in 1989, as a case study, it provides for the first time information on the full range of economic costs caused by a major coastal hazard event. The book: describes and examines unreported, undocumented, and hidden costs such as losses due to business interruption, reduction in property values, interruption of social services, psychological trauma, damage to natural systems, and others; examines the concepts of risk and vulnerability, and discusses conventional approaches to risk assessment and the emerging area of vulnerability assessment; recommends a comprehensive framework for developing and implementing mitigation strategies; and documents the human impact of Hurricane Hugo and provides insight from those who lived through it. The Hidden Costs of Coastal Hazards takes a structured approach to the problem of coastal hazards, offering a new framework for community-based hazard mitigation along with specific recommendations for implementation. Decisionmakers -- both policymakers and planners -- who are interested in coastal hazard issues will find the book a unique source of new information and insight, as will private-sector decisionmakers including lenders, investors, developers, and insurers of coastal property.
International Space Station (ISS) Low Pressure Intramodule Quick Disconnect Failures
NASA Technical Reports Server (NTRS)
Lewis, John F.; Harris, Danny; Link, Dwight; Morrison, Russel
2004-01-01
A failure of an ISS intermodule Quick Disconnect (QD) during protoflight vibration testing of ISS regenerative Environmental Control and Life Support (ECLS) hardware led to the discovery of QD design, manufacturing, and test flaws that can leave the male QD susceptible to failure of the secondary housing seal and to inadequate housing assembly locking mechanisms. The discovery of this failure had large implications, considering that there are currently 399 similar units on orbit and approximately 1,100 units on the ground integrated into flight hardware. Determining the nature of the failure required testing and analysis, and implementing a recovery plan required part screening and a review of element-level and project hazard analyses to determine whether secondary seals are required. Implementation also involved coordination with the Nodes and MPLM project offices, the Regenerative ECLS Project, ISS Payloads, JAXA, ESA, and ISS Logistics and Maintenance.
NASA Astrophysics Data System (ADS)
Ding, R.; He, T.
2017-12-01
With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources like the census bureau, which often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system that more accurately processes and assesses issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture, including data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system has been developed that geographically covers the entire region of China and combines the population heatmap with data from the Earthquake Catalogs database. Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helping responders and decision makers to evaluate and assess earthquake damage.
Correlation analyses revealed that the aggregation and movement of people depended on various factors, including the time of earthquake occurrence and the location of the epicenter. This research hopes to build upon the success of the prototype system in order to improve and extend it to support the analysis of earthquakes and other types of natural hazard events.
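The heatmap-generation step described above can be sketched by binning location pings into a grid, for example with NumPy's 2-D histogram. All coordinates below are illustrative stand-ins for real-time LBS data:

```python
import numpy as np

# Hypothetical real-time LBS pings as (lat, lon) pairs near an epicenter.
pings = np.array([
    [30.01, 103.02], [30.02, 103.01], [30.01, 103.03],
    [30.11, 103.21], [30.12, 103.22],
    [30.31, 103.41],
])

# Bin the points into a coarse grid to form a population-density heatmap.
lat_edges = np.linspace(30.0, 30.4, 5)    # 4 latitude bins
lon_edges = np.linspace(103.0, 103.5, 6)  # 5 longitude bins
heat, _, _ = np.histogram2d(pings[:, 0], pings[:, 1],
                            bins=[lat_edges, lon_edges])

print(heat.astype(int))
print(int(heat.max()))  # the densest cell drives evacuation priority
```

A production system would run this aggregation in parallel over streaming data and render the grid as a colour-mapped overlay; the principle is the same.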
NASA Technical Reports Server (NTRS)
Miller, Ralinda R.
2016-01-01
This document presents the Corrective Measures Implementation (CMI) Year 10 Annual Report for implementation of corrective measures at the Hypergol Maintenance Facility (HMF) Hazardous Waste South Staging Areas at Kennedy Space Center, Florida. The work is being performed by Tetra Tech, Inc., for the National Aeronautics and Space Administration (NASA) under Indefinite Delivery Indefinite Quantity (IDIQ) NNK12CA15B, Task Order (TO) 07. Mr. Harry Plaza, P.E., of NASA's Environmental Assurance Branch is the Remediation Project Manager for John F. Kennedy Space Center. The Tetra Tech Program Manager is Mr. Mark Speranza, P.E., and the Tetra Tech Project Manager is Robert Simcik, P.E.
NASA Astrophysics Data System (ADS)
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables.
An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar sliding, wedge sliding, and toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was downscaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as the source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas; they also reduce processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where larger population groups are exposed to mass movements from steep slopes.
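The stochastic kinematic test described above can be pictured as a Monte Carlo procedure: joint orientation and friction angle are drawn from fitted distributions, and the fraction of draws satisfying a planar-sliding (Markland-type) criterion gives a per-cell feasibility probability. The sketch below is illustrative only; the function name, the Gaussian PDFs, and the 20° lateral limit are assumptions, not the authors' exact implementation.

```python
import random

def planar_sliding_probability(slope_dip, slope_aspect,
                               joint_dip_mean, joint_dip_std,
                               joint_dir_mean, joint_dir_std,
                               phi_mean, phi_std,
                               n_samples=10000, lateral_limit=20.0):
    """Monte Carlo estimate of the probability that planar sliding is
    kinematically feasible in one DEM cell (hypothetical Markland-type
    test). Joint orientation and friction angle are random variables."""
    feasible = 0
    for _ in range(n_samples):
        dip = random.gauss(joint_dip_mean, joint_dip_std)
        ddir = random.gauss(joint_dir_mean, joint_dir_std)
        phi = random.gauss(phi_mean, phi_std)
        # Daylighting: the joint dips out of the face, less steeply than
        # the slope but more steeply than the friction angle
        daylights = phi < dip < slope_dip
        # Dip direction within the lateral limit of the slope aspect
        offset = abs((ddir - slope_aspect + 180.0) % 360.0 - 180.0)
        if daylights and offset <= lateral_limit:
            feasible += 1
    return feasible / n_samples
```

Run per cell over the DEM, this yields a probability map from which the highest-probability cells can be taken as rockfall source areas.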
Installation Restoration Program Records Search for Kingsley Field, Oregon.
1982-06-01
The Hazard Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review...Installation History D. Industrial Facilities E. POL Storage Tanks F. Abandoned Tanks G. Oil/Water Separators H. Site Hazard Rating Methodology I. Site...and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.
Graham, T; Lessin, N; Mirer, F
1993-07-01
The Supreme Court's March 1991 ruling in United Automobile Workers (UAW) versus Johnson Controls barring corporate "fetal protection policies" was a major victory for women's employment rights and has health and safety implications for both sexes. However, 2 years after the Court's decision, the union's work is far from over. The UAW has yet to see what policy Johnson Controls will implement in place of the old one. Formulating solutions to the concerns of workers who are exposed daily to reproductive health hazards on the job will continue to be on labor's agenda. Preventing hazardous exposures is the first priority. This goal would be furthered by setting occupational health and safety standards designed to protect workers' general and reproductive health. Support for the Comprehensive Occupational Safety and Health Reform Act (COSHRA) would also positively affect health and safety in the workplace. Where hazards have not yet been abated, the framework of transfers and income protections for all workers with temporary job restrictions should be examined. The Legal/Labor Working Group convened at the Occupational and Environmental Reproductive Hazards Working Conference authored guidelines for developing a model reproductive hazards policy. These recommendations can serve as a guide for implementation of nondiscriminatory and health-protective policies by employers.
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
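Of the three LiDAR/optical-data-fusion techniques named above, post-classification data fusion is the simplest to illustrate: two independently produced per-pixel classifications are merged after the fact. The sketch below is a hypothetical minimal version; the function name, the flat list inputs, and the confidence tie-break rule are assumptions, not the ELF implementation.

```python
def post_classification_fusion(lidar_labels, optical_labels,
                               lidar_conf, optical_conf):
    """Merge two per-pixel classifications (illustrative sketch):
    where the LiDAR-derived and optical-derived labels agree, keep the
    common label; where they disagree, keep the label whose per-pixel
    classifier confidence is higher."""
    fused = []
    for ll, ol, lc, oc in zip(lidar_labels, optical_labels,
                              lidar_conf, optical_conf):
        if ll == ol:
            fused.append(ll)
        else:
            fused.append(ll if lc >= oc else ol)
    return fused
```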
Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai
2017-01-01
Biomass gasification stations face many hazard factors, so it is necessary to perform hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). When SPA is used for hazard assessment, however, calculating the connection degree (CD) requires classifying hazard grades and their corresponding thresholds. For hazard assessment with ESPA, a novel algorithm is therefore worked out to calculate the CD when hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple, concise calculation, and the hazard of each sample is ranked based on its ED value. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonableness of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
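The distance-based ranking idea can be shown with a deliberately simplified stand-in: normalize each hazard indicator, then rank stations by Euclidean distance from a zero-hazard ideal point. This is only a schematic illustration of ranking by ED, not the paper's ESPA algorithm; the function name and the min-max normalization are assumptions.

```python
def rank_by_euclidean_distance(samples):
    """Illustrative sketch: `samples` is a list of indicator vectors,
    one per station (higher value = more hazardous). Each indicator is
    min-max normalized to [0, 1]; each station's Euclidean distance from
    the zero-hazard ideal point is computed; stations are returned in
    order of decreasing distance (most hazardous first)."""
    n_ind = len(samples[0])
    lo = [min(s[k] for s in samples) for k in range(n_ind)]
    hi = [max(s[k] for s in samples) for k in range(n_ind)]

    def norm(v, k):
        return 0.0 if hi[k] == lo[k] else (v - lo[k]) / (hi[k] - lo[k])

    eds = [sum(norm(v, k) ** 2 for k, v in enumerate(s)) ** 0.5
           for s in samples]
    order = sorted(range(len(samples)), key=lambda i: eds[i], reverse=True)
    return eds, order
```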
Interdisciplinary modeling and analysis to reduce loss of life from tsunamis
NASA Astrophysics Data System (ADS)
Wood, N. J.
2016-12-01
Recent disasters have demonstrated the significant loss of life and community impacts that can occur from tsunamis. Minimizing future losses requires an integrated understanding of the range of potential tsunami threats, how individuals are specifically vulnerable to these threats, what is currently in place to improve their chances of survival, and what risk-reduction efforts could be implemented. This presentation will provide a holistic perspective of USGS research enabled by recent advances in geospatial modeling to assess and communicate population vulnerability to tsunamis and the range of possible interventions to reduce it. Integrated research includes efforts to characterize the magnitude and demography of at-risk individuals in tsunami-hazard zones, their evacuation potential based on landscape conditions, nature-based mitigation to improve evacuation potential, evacuation pathways and population demand at assembly areas, siting considerations for vertical-evacuation refuges, community implications of multiple evacuation zones, car-based evacuation modeling for distant tsunamis, and projected changes in population exposure to tsunamis over time. Collectively, this interdisciplinary research supports emergency managers in their efforts to implement targeted risk-reduction efforts based on local conditions and needs, instead of generic regional strategies that only focus on hazard attributes.
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
40 CFR 63.11567 - Who implements and enforces this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
...). 2. A high-efficiency air filter or fiber bed filter: a. Inlet gas temperature, and b. Pressure drop...) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt...
APPLICATION OF EXAMS AS THE SURFACE WATER MODULE IN THE HWIR MULTIMEDIA RISK ASSESSMENT SYSTEM
Multimedia, multipathway risk assessment software has been developed for implementing the Hazardous Waste Identification Rule (HWIR). This regulation is intended to determine whether a waste should be considered hazardous, and confined to Subtitle D facilities, or safely release...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Liquid Systems AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice; Issuance of Advisory Bulletin. SUMMARY: This notice advises owners and operators of gas pipeline...
44 CFR 80.17 - Project implementation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Project implementation. 80.17... RELOCATION FOR OPEN SPACE Post-Award Requirements § 80.17 Project implementation. (a) Hazardous materials. The subgrantee shall take steps to ensure it does not acquire or include in the project properties...
44 CFR 80.17 - Project implementation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Project implementation. 80.17... RELOCATION FOR OPEN SPACE Post-Award Requirements § 80.17 Project implementation. (a) Hazardous materials. The subgrantee shall take steps to ensure it does not acquire or include in the project properties...
44 CFR 80.17 - Project implementation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Project implementation. 80.17... RELOCATION FOR OPEN SPACE Post-Award Requirements § 80.17 Project implementation. (a) Hazardous materials. The subgrantee shall take steps to ensure it does not acquire or include in the project properties...
Dujardin, Pierre-Philippe; Reverdy, Thomas; Valette, Annick; François, Patrice
2016-06-01
Introduction: project management is one of the expected proficiencies of head nurses. Context: the work head nurses carry out to improve organizations is rarely covered in the literature. Objectives: to follow the implementation of actions from projects led by head nurses and to analyze the parameters of success. Method: for one year, an intervention study followed 17 projects initiating improvement measures. Semi-structured interviews were conducted with health-care teams and managers, all of whom reported the results of the implementation of each measure as an operational improvement. A mixed analysis including a logistic regression investigated associations between the result of each action and various contextual characteristics. Results: the study involved 111 actions, 71% of which led to an operational improvement. Organizational and supporting actions had a high success rate, which decreased when hazards were not managed by health-care managers. Discussion: this study highlights the role of strategy in the implementation methods and the chosen actions. Recommendations are made to promote collective assessment. Conclusion: scientific approaches are proposed to discuss organizational work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Hazardous child labor in Nepal: The case of brick kilns.
Larmar, Stephen; O'Leary, Patrick; Chui, Cheryl; Benfer, Katherine; Zug, Sebastian; Jordan, Lucy P
2017-10-01
Hazardous child labor in Nepal is a serious concern, particularly in the brick kiln industry. Although a range of interventions have been implemented in Nepal to address hazardous child labor, there is a lack of research to both measure success and shape further development in interventions that integrate sound child protection practices to ensure the wellbeing of all children. This paper provides a review of the literature outlining interventions for children working in brick kilns in Nepal, and presents preliminary case study findings of one current intervention in the Kathmandu Valley. The paper highlights the strength of applying foundational child protection principles and advocates for the development and implementation of future programs underpinned by broad civil society principles within a child rights and protection framework. Copyright © 2017. Published by Elsevier Ltd.
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.
1995-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.
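The rollback data hazards discussed above can be illustrated with a toy analysis: a hazard exists when a source register of instruction i is overwritten within the rollback distance n, because re-executing i after a rollback would then read the clobbered value. The IR format and function name below are hypothetical, a minimal sketch rather than the paper's compiler pass.

```python
def find_rollback_hazards(instrs, n):
    """Sketch of rollback data-hazard detection. `instrs` is a list of
    (dest_reg, src_regs) tuples in a hypothetical IR. A hazard (i, j, r)
    is reported when register r, read by instruction i, is overwritten
    by instruction j within rollback distance n (including i itself, for
    an instruction that reads and writes the same register): rolling
    back past i and re-executing would re-read a clobbered value."""
    hazards = []
    for i, (_, srcs) in enumerate(instrs):
        for j in range(i, min(i + n + 1, len(instrs))):
            dest, _ = instrs[j]
            if dest in srcs:
                hazards.append((i, j, dest))
    return hazards
```

A read buffer resolves such hazards in hardware by saving the operand values read; a compiler transformation resolves them by renaming the destination register instead.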
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls that prevent the occurrence or mitigate the effects of the release were identified in the What-If analysis table. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" was identified for these scenarios (see Section 4) which prevent the occurrence or mitigate the effects of events with significant consequences.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
NASA Technical Reports Server (NTRS)
Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.
2005-01-01
This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and sub-system reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters, and present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single-engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, and unnecessary shutdown fraction), propellant-specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold-down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing, ground operation support, pre- and post-flight checkouts and inspection, and extensiveness of the development program. We present sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single-engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
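A minimal parametric model can illustrate how engine count, catastrophic fraction, and engine-out design interact. The formula below is a textbook-style sketch under strong independence assumptions (identical engines, independent failures, tolerance of exactly one benign shutdown), not the authors' model.

```python
def stage_reliability(n_engines, r_engine, catastrophic_fraction,
                      engine_out=False):
    """Sketch of multi-engine stage reliability. Each engine succeeds
    with probability r_engine; a failing engine is catastrophic (loss of
    vehicle) with probability catastrophic_fraction, otherwise it shuts
    down benignly. With engine-out capability the stage survives one
    benign shutdown; otherwise any failure ends the mission."""
    p_all = r_engine ** n_engines
    if not engine_out:
        return p_all
    # Exactly one engine fails, the failure is benign, and the
    # remaining n-1 engines all succeed.
    p_one_benign = (n_engines * (1.0 - r_engine)
                    * (1.0 - catastrophic_fraction)
                    * r_engine ** (n_engines - 1))
    return p_all + p_one_benign
```

Varying one parameter at a time in such a model is the basic mechanism behind the sensitivity results the paper reports.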
Towards sets of hazardous waste indicators. Essential tools for modern industrial management.
Peterson, Peter J; Granados, Asa
2002-01-01
Decision-makers require useful tools, such as indicators, to help them make environmentally sound decisions leading to effective management of hazardous wastes. Four hazardous waste indicators are being tested for such a purpose by several countries within the Sustainable Development Indicator Programme of the United Nations Commission for Sustainable Development. However, these indicators only address the 'down-stream' end-of-pipe industrial situation. More creative thinking is clearly needed to develop a wider range of indicators that not only reflects all aspects of industrial production that generates hazardous waste but considers socio-economic implications of the waste as well. Sets of useful and innovative indicators are proposed that could be applied to the emerging paradigm shift away from conventional end-of-pipe management actions and towards preventive strategies that are being increasingly adopted by industry often in association with local and national governments. A methodological and conceptual framework for the development of a core-set of hazardous waste indicators has been developed. Some of the indicator sets outlined quantify preventive waste management strategies (including indicators for cleaner production, hazardous waste reduction/minimization and life cycle analysis), whilst other sets address proactive strategies (including changes in production and consumption patterns, eco-efficiency, eco-intensity and resource productivity). Indicators for quantifying transport of hazardous wastes are also described. It was concluded that a number of the indicators proposed could now be usefully implemented as management tools using existing industrial and economic data. As cleaner production technologies and waste minimization approaches are more widely deployed, and industry integrates environmental concerns at all levels of decision-making, it is expected that the necessary data for construction of the remaining indicators will soon become available.
Ball, Brita; Wilcock, Anne; Aung, May
2009-06-01
Small and medium sized food businesses have been slow to adopt food safety management systems (FSMSs) such as good manufacturing practices and Hazard Analysis Critical Control Point (HACCP). This study identifies factors influencing workers in their implementation of food safety practices in small and medium meat processing establishments in Ontario, Canada. A qualitative approach was used to explore in-plant factors that influence the implementation of FSMSs. Thirteen in-depth interviews in five meat plants and two focus group interviews were conducted. These generated 219 pages of verbatim transcripts which were analysed using NVivo 7 software. Main themes identified in the data related to production systems, organisational characteristics and employee characteristics. A socio-psychological model based on the theory of planned behaviour is proposed to describe how these themes and underlying sub-themes relate to FSMS implementation. Addressing the various factors that influence production workers is expected to enhance FSMS implementation and increase food safety.
Sensitivity test and ensemble hazard assessment for tephra fallout at Campi Flegrei, Italy
NASA Astrophysics Data System (ADS)
Selva, J.; Costa, A.; De Natale, G.; Di Vito, M. A.; Isaia, R.; Macedonio, G.
2018-02-01
We present the results of a statistical study on tephra dispersal in the case of a reactivation of the Campi Flegrei volcano. To represent the spectrum of possible eruptive sizes, four classes of eruptions were considered. Excluding the lava emission, three classes are explosive (Small, Medium, and Large) and can produce a significant quantity of volcanic ash. Hazard assessments were made through simulations of atmospheric dispersion of ash and lapilli, considering the full variability of winds and eruptive vents. The results are presented in the form of conditional hazard curves given the occurrence of specific eruptive sizes, representative members of each size class, and then combined to quantify the conditional hazard given an eruption of any size. The main focus of this analysis was to constrain the epistemic uncertainty (i.e., that associated with the level of scientific knowledge of the phenomena), in order to provide unbiased hazard estimations. The epistemic uncertainty on the estimation of hazard curves was quantified, making use of scientifically acceptable alternatives to be aggregated in the final results. The choice of such alternative models was made after a comprehensive sensitivity analysis which considered different weather databases, alternative modelling of submarine eruptive vents, tephra total grain-size distributions (TGSD) with different relative mass fractions of fine ash, and the effect of ash aggregation. The results showed that the dominant uncertainty is related to the combined effect of the uncertainty in the fraction of fine particles with respect to the total mass and in how ash aggregation is modelled. The latter is particularly relevant in the case of magma-water interactions during explosive eruptive phases, when a large fraction of fine ash can form accretionary lapilli that might contribute significantly to increasing the tephra load in the proximal areas.
The variability induced by the use of different meteorological databases and by the selected approach to modelling offshore eruptions was relatively insignificant. The uncertainty arising from the alternative implementations, which would have been neglected in standard (Bayesian) quantifications, was finally quantified by ensemble modelling and represented by hazard and probability maps produced at different confidence levels.
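Ensemble quantification of this kind can be sketched as computing per-threshold percentiles across the member hazard curves: each member gives exceedance probabilities at common intensity thresholds, and the spread across members summarizes the epistemic uncertainty. The nearest-rank percentile convention and function name below are illustrative assumptions, not the study's exact aggregation scheme.

```python
def ensemble_hazard_curves(member_curves, percentiles=(16, 50, 84)):
    """Illustrative ensemble aggregation: `member_curves` is a list of
    hazard curves, each a list of exceedance probabilities at the same
    intensity thresholds (e.g. tephra load levels). Returns, for each
    requested percentile, the per-threshold percentile curve across
    ensemble members (nearest-rank convention)."""
    n_thresh = len(member_curves[0])
    out = {}
    for p in percentiles:
        curve = []
        for k in range(n_thresh):
            vals = sorted(c[k] for c in member_curves)
            idx = int(round(p / 100.0 * (len(vals) - 1)))
            curve.append(vals[idx])
        out[p] = curve
    return out
```

The median curve gives the central hazard estimate, while the outer percentile curves bound it at the chosen confidence levels.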
ARISTOTLE (All Risk Integrated System TOwards The hoListic Early-warning)
NASA Astrophysics Data System (ADS)
Michelini, Alberto; Wotawa, Gerhard; Arnold-Arias, Delia
2017-04-01
The Emergency Response Coordination Centre (ERCC) is the EU coordination office for humanitarian aid and civil protection operations of DG ECHO (EU Humanitarian Aid and Civil Protection). ERCC rapidly needs authoritative multi-hazard scientific expertise and analysis on a 24/7 basis since, when a disaster strikes, every minute counts for saving lives, and an immediate, coordinated and pre-planned response is essential. The EU is committed to providing disaster response in a timely and efficient manner and to ensuring European assistance meets the real needs of the affected population, whether in Europe or beyond. The ARISTOTLE consortium was awarded the European Commission's DG ECHO "Pilot project in the area of Early Warning System for natural disasters" (OJ 2015 S/154-283349). The tender articulates the needs and expectations of DG ECHO in respect of the provision of multi-hazard advice to the Emergency Response & Coordination Centre in Brussels. Specifically, the tender aims to fill the gap in knowledge that exists in the: • first 3 hours immediately after an event that has the potential to require a country to call on international help • provision of longer-term advice following an emergency • provision of advice when a potentially hazardous event is starting to form; this will usually be restricted to severe weather and flooding events and, when possible, to volcanic events. The ARISTOTLE Consortium was awarded the tender, and the project effectively started on February 1st, 2016, for a duration of 2 years. ARISTOTLE (aristotle.ingv.it) is a multi-hazard partnership created by combining expertise from a total of 5 hazard groups [4 main hazard groups plus a sub-hazard: Severe Weather, Floods, Volcanoes (only for the ash and gas hazards deriving from eruptions), Earthquakes, and the related Tsunamis as a sub-hazard given their peculiarities and potentially huge impact].
Each Hazard Group brings together experts from the particular hazard domain to deliver a 'collective analysis' which is then fed into the partnership's multi-hazard discussions. The hazards are very different and have very diverse timelines of phenomenological occurrence (Figure 1). The ARISTOTLE consortium includes 15 partner institutions (11 from EU countries, 2 from non-EU countries, and 2 European organizations) operating in the meteorological and geophysical domains. Project coordination is shared between INGV and ZAMG for the geophysical and meteorological communities, respectively. The primary target of the tender project is the prototyping and implementation of a scalable system (in terms of number of partners and hazards) capable of providing the "desiderata" above to ERCC. To this end, the activities of the project have focused on the establishment of a multi-hazard operational board (MHOB) that is assigned the 24/7 operational duty regulated by a "Standard Operating Protocol". The presentation will illustrate the different modes of operation envisaged, and the status of and solutions found by the project consortium in responding to the ERCC requirements.
Tsunamis: Global Exposure and Local Risk Analysis
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.
2014-12-01
The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
NASA Astrophysics Data System (ADS)
Bartolini, Stefania; Sobradelo, Rosa; Martí, Joan
2016-08-01
Short-term hazard assessment is an important part of the volcanic management cycle, especially at the onset of an episode of volcanic unrest. For this reason, one of the main tasks of modern volcanology is to use monitoring data to identify and analyse precursory signals and so determine where and when an eruption might occur. This work follows from Sobradelo and Martí [Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis. Journal of Volcanology and Geothermal Research 290, 111, 2015], who defined the principles of a new methodology for conducting short-term hazard assessment at volcanoes in unrest. Using the same case study, the eruption of Pinatubo (15 June 1991), this work introduces a new free Python tool, ST-HASSET, for applying the Sobradelo and Martí (2015) methodology to the time evolution of unrest indicators in short-term volcanic hazard assessment. Moreover, this tool is designed to complement long-term hazard assessment with continuous monitoring data when the volcano goes into unrest. It is based on Bayesian inference and transforms different pre-eruptive monitoring parameters into a common probabilistic scale for comparison among unrest episodes from the same volcano or from similar ones. This makes it possible to identify common pre-eruptive behaviours and patterns. ST-HASSET is especially designed to assist experts and decision makers as a crisis unfolds, and allows sudden changes in the activity of a volcano to be detected. Therefore, it makes an important contribution to the analysis and interpretation of relevant data for understanding the evolution of volcanic unrest.
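The core step of such a Bayesian approach can be sketched in a few lines: a long-term (background) eruption probability is sequentially updated as unrest indicators are observed. This is a minimal illustration of the principle, not the ST-HASSET code; the indicator names and all likelihood values below are hypothetical.

```python
# Minimal sketch of Bayesian updating of an eruption probability from
# monitoring indicators, in the spirit of short-term volcanic hazard
# assessment. All numbers are illustrative assumptions, not published values.

def bayes_update(prior, p_obs_given_eruption, p_obs_given_no_eruption):
    """Posterior P(eruption | indicator observed) via Bayes' rule."""
    numer = p_obs_given_eruption * prior
    denom = numer + p_obs_given_no_eruption * (1.0 - prior)
    return numer / denom

# Long-term (background) probability of eruption in the assessment window.
p = 0.05

# Unrest indicators, each with assumed probabilities of being observed
# during pre-eruptive vs. non-eruptive unrest (hypothetical values).
indicators = [
    ("seismic swarm",      0.8, 0.3),
    ("ground deformation", 0.7, 0.2),
    ("gas-emission rise",  0.6, 0.1),
]

for name, p_e, p_ne in indicators:
    p = bayes_update(p, p_e, p_ne)
    print(f"after {name}: P(eruption) = {p:.3f}")
```

Because each update maps very different monitoring parameters onto the same probabilistic scale, episodes from the same volcano (or from analogue volcanoes) become directly comparable, which is the behaviour the abstract describes.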
NEL, ANDRE; XIA, TIAN; MENG, HUAN; WANG, XIANG; LIN, SIJIE; JI, ZHAOXIA; ZHANG, HAIYUAN
2014-01-01
Conspectus The production of engineered nanomaterials (ENMs) is a scientific breakthrough in material design and the development of new consumer products. While the successful implementation of nanotechnology is important for the growth of the global economy, we also need to consider the possible environmental health and safety (EHS) impact as a result of the novel physicochemical properties that could generate hazardous biological outcomes. In order to assess ENM hazard, reliable and reproducible screening approaches are needed to test the basic materials as well as nano-enabled products. A platform is required to investigate the potentially endless number of bio-physicochemical interactions at the nano/bio interface, in response to which we have developed a predictive toxicological approach. We define a predictive toxicological approach as the use of mechanisms-based high throughput screening in vitro to make predictions about the physicochemical properties of ENMs that may lead to the generation of pathology or disease outcomes in vivo. The in vivo results are used to validate and improve the in vitro high throughput screening (HTS) and to establish structure-activity relationships (SARs) that allow hazard ranking and modeling by an appropriate combination of in vitro and in vivo testing. This notion is in agreement with the landmark 2007 report from the US National Academy of Sciences, “Toxicity Testing in the 21st Century: A Vision and a Strategy” (http://www.nap.edu/catalog.php?record_id=11970), which advocates increased efficiency of toxicity testing by transitioning from qualitative, descriptive animal testing to quantitative, mechanistic and pathway-based toxicity testing in human cells or cell lines using high throughput approaches. Accordingly, we have implemented HTS approaches to screen compositional and combinatorial ENM libraries to develop hazard ranking and structure-activity relationships that can be used for predicting in vivo injury outcomes. 
This predictive approach allows the bulk of the screening analysis and high volume data generation to be carried out in vitro, following which limited, but critical, validation studies are carried out in animals or whole organisms. Risk reduction in the exposed human or environmental populations can then focus on limiting or avoiding exposures that trigger these toxicological responses as well as implementing safer design of potentially hazardous ENMs. In this communication, we review the tools required for establishing predictive toxicology paradigms to assess inhalation and environmental toxicological scenarios through the use of compositional and combinatorial ENM libraries, mechanism-based HTS assays, hazard ranking and development of nano-SARs. We will discuss the major injury paradigms that have emerged based on specific ENM properties, as well as describing the safer design of ZnO nanoparticles based on characterization of dissolution chemistry as a major predictor of toxicity. PMID:22676423
EVALUATION OF THE IMPLEMENTATION OF OPERATIONS AND MAINTENANCE PROGRAMS IN NEW JERSEY SCHOOLS
The Asbestos Hazard Emergency Response Act (AHERA) required all schools to develop and implement an asbestos management plan (AMP). The key component of the AMP is the operations and maintenance (O&M) program. A study was conducted to evaluate the implementation of O&M programs a...
Medical Services: DoD Hazardous Food and Nonprescription Drug Recall System
1986-08-15
This publication implements policy of the Office of the Under Secretary of Defense for Research and Engineering for the establishment of a hazardous ... food and nonprescription drug recall system. It has been coordinated with and concurred in by the DMSB and the Services.
40 CFR 262.102 - What special definitions are included in this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO GENERATORS OF HAZARDOUS WASTE University... particularly hazardous substances as designated in a University's Chemical Hygiene Plan under OSHA, or... Management Plan (EMP) means a written program developed and implemented by the university which sets forth...
78 FR 2359 - Approval and Promulgation of State Implementation Plans: Idaho
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-11
... travel hazards, protecting sensitive populations, and recordkeeping. Under IDAPA 58.01.01.624.04.c, spot... provisions and provisions relating to preventing travel hazards, protecting sensitive populations, and... limited to, the Coeur d'Alene Reservation, the Duck Valley Reservation, the Reservation of the Kootenai...
77 FR 21961 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-12
... the responses of California and Colorado residents to different scenarios related to fire hazard... researchers provide better information to natural resources, forest, and fire managers when they are contemplating the kind and type of fire hazard reduction programs to implement to achieve forest land management...
Frequent Questions about the Hazardous Waste Export-Import Revisions Final Rule
Answers questions such as: What new requirements did EPA finalize in the Hazardous Waste Export-Import Revisions Final Rule? Why did EPA implement these changes now? What are the benefits of the final rule? What are the compliance dates for the final rule?
14 CFR 417.407 - Hazard control implementation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... conduct periodic inspections of related hardware, software, and facilities. A launch operator must ensure... operations. (f) Hazardous materials. A launch operator must establish procedures for the receipt, storage... protecting the public that complies with the accident investigation plan as defined in § 417.111(h)(2). These...
14 CFR 417.407 - Hazard control implementation.
Code of Federal Regulations, 2013 CFR
2013-01-01
... conduct periodic inspections of related hardware, software, and facilities. A launch operator must ensure... operations. (f) Hazardous materials. A launch operator must establish procedures for the receipt, storage... protecting the public that complies with the accident investigation plan as defined in § 417.111(h)(2). These...
14 CFR 417.407 - Hazard control implementation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... conduct periodic inspections of related hardware, software, and facilities. A launch operator must ensure... operations. (f) Hazardous materials. A launch operator must establish procedures for the receipt, storage... protecting the public that complies with the accident investigation plan as defined in § 417.111(h)(2). These...
14 CFR 417.407 - Hazard control implementation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... conduct periodic inspections of related hardware, software, and facilities. A launch operator must ensure... operations. (f) Hazardous materials. A launch operator must establish procedures for the receipt, storage... protecting the public that complies with the accident investigation plan as defined in § 417.111(h)(2). These...
14 CFR 417.407 - Hazard control implementation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... conduct periodic inspections of related hardware, software, and facilities. A launch operator must ensure... operations. (f) Hazardous materials. A launch operator must establish procedures for the receipt, storage... protecting the public that complies with the accident investigation plan as defined in § 417.111(h)(2). These...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-13
... Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No...: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements.'' The final rule...
41 CFR 128-1.8001 - Background.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...
41 CFR 128-1.8001 - Background.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...
41 CFR 128-1.8001 - Background.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...
41 CFR 128-1.8001 - Background.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...
41 CFR 128-1.8001 - Background.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Earthquake Hazards Reduction Act of 1977 (Act), 42 U.S.C. 7701, et seq., as amended, directs the Federal government to establish and maintain an effective earthquake hazards reduction program to reduce the risks to life and property from future earthquakes. Executive Order 12699 implements certain provisions of the...
Snell, Kym I E; Hua, Harry; Debray, Thomas P A; Ensor, Joie; Look, Maxime P; Moons, Karel G M; Riley, Richard D
2016-01-01
Our aim was to improve meta-analysis methods for summarizing a prediction model's performance when individual participant data are available from multiple studies for external validation. We suggest multivariate meta-analysis for jointly synthesizing calibration and discrimination performance, while accounting for their correlation. The approach estimates a prediction model's average performance, the heterogeneity in performance across populations, and the probability of "good" performance in new populations. This allows different implementation strategies (e.g., recalibration) to be compared. Application is made to a diagnostic model for deep vein thrombosis (DVT) and a prognostic model for breast cancer mortality. In both examples, multivariate meta-analysis reveals that calibration performance is excellent on average but highly heterogeneous across populations unless the model's intercept (baseline hazard) is recalibrated. For the cancer model, the probability of "good" performance (defined by C statistic ≥0.7 and calibration slope between 0.9 and 1.1) in a new population was 0.67 with recalibration but 0.22 without recalibration. For the DVT model, even with recalibration, there was only a 0.03 probability of "good" performance. Multivariate meta-analysis can be used to externally validate a prediction model's calibration and discrimination performance across multiple populations and to evaluate different implementation strategies. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
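The final step described above, estimating the probability of "good" performance in a new population, can be sketched by simulating from the predictive distribution implied by the pooled means and the between-study covariance of the two performance measures. This is a hedged illustration: the pooled estimates and covariance below are invented for the example, not the values reported for the DVT or breast cancer models.

```python
# Sketch: probability of "good" performance (C statistic >= 0.7 and
# calibration slope in [0.9, 1.1]) in a new population, simulated from a
# bivariate normal predictive distribution. Mean and covariance are
# hypothetical stand-ins for multivariate meta-analysis output.
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([0.72, 1.00])           # pooled means: [C statistic, calibration slope]
tau = np.array([[0.0009, 0.0006],     # between-study (heterogeneity) covariance,
                [0.0006, 0.0100]])    # with a positive C-slope correlation

draws = rng.multivariate_normal(mu, tau, size=100_000)
good = (draws[:, 0] >= 0.7) & (draws[:, 1] >= 0.9) & (draws[:, 1] <= 1.1)
p_good = good.mean()
print(f"P(good performance in a new population) ~ {p_good:.2f}")
```

The same simulation run with and without an intercept-recalibration assumption (which shrinks the slope heterogeneity term) is what allows the implementation strategies in the abstract to be compared on a single probability scale.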
NASA Astrophysics Data System (ADS)
Ballesteros-Cánovas, Juan Antonio; Stoffel, Markus; Trappmann, Daniel; Shekhar, Mayank; Bhattacharyya, Amalava
2016-04-01
Floods are a common natural hazard in the Western Indian Himalayas. They usually occur when humid monsoon air is lifted along the Himalayan relief, creating intense orographic rainfall and runoff, a process often enhanced by simultaneous snowmelt. Monsoon floods are considered a major threat in the region and frequently affect inhabited valleys, disrupting communities and stressing their future welfare and economic development. Given the assumption that ongoing and future climatic changes may affect monsoon patterns and extreme precipitation, the implementation of adaptation policies in this region is critically needed in order to improve the local resilience of Himalayan communities. However, their successful implementation is highly dependent on system knowledge and hence on reliable baseline data on past disasters. In this communication, we demonstrate how newly gained knowledge of past flood incidents may improve flood hazard and risk assessments. Based on growth-ring analysis of trees growing in the floodplains and other, more classical paleo-hydrology techniques, we reconstruct the regional flood activity for the last decades. This information is then included as non-systematic data in the regional flood frequency analysis using Bayesian Markov chain Monte Carlo (MCMC) algorithms, so as to analyse the impact of the additional data on flood hazard assessments. Moreover, through a detailed analysis of three flood risk hotspots, we demonstrate how the newly gained knowledge of past flood disasters derived from indirect proxies can explain failures in the implementation of disaster risk management (DRM). Our methodology allowed the identification of thirty-four unrecorded flood events since the early 20th century at study sites located in the upper reaches, and thus the completion of the existing flood history of the region, previously based on flow measurements in the lower part of the catchment. 
We observe that 56% of the floods occurred simultaneously in more than two catchments, and that in 15% of the cases more than four catchments were affected. By contrast, 44% of event years were related to one specific catchment, corroborating the assumption that both large-scale atmospheric conditions and specific local weather and/or geomorphic conditions may operate as triggers of floods in Kullu district. The inclusion of peak discharge data related to these ungauged extreme flood events in the regional flood frequency analysis showed that flood hazard had been systematically underestimated. Our results allowed us to highlight the potential causes of three paradigmatic flood disaster incidents in Kullu district, suggesting that the lack of knowledge of past flood disasters could play an important role in DRM at three actor levels, i.e. civil engineers, local authorities, and inhabitants. These observations show that reliable DRM implementation is constrained by the lack of data characterizing the flood process, and therefore demonstrate the value of the palaeohydrological approach used in this study.
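The way non-systematic (historical) flood information enters such a Bayesian frequency analysis can be sketched with a toy Metropolis sampler: gauged annual maxima contribute density terms, while the historical period contributes a binomial term for k exceedances of a perception threshold in h years. All data, the threshold, the Gumbel model choice, and the flat priors below are illustrative assumptions, not the study's data or code.

```python
# Sketch: Bayesian MCMC flood frequency analysis combining systematic
# (gauged) annual maxima with non-systematic historical information
# ("threshold X0 exceeded k times in h years"), under a Gumbel model.
import math, random

random.seed(42)

gauged = [310, 280, 450, 390, 520, 300, 610, 340, 480, 370]  # annual maxima, m3/s (invented)
X0, h, k = 700.0, 80, 3  # perception threshold exceeded k times in h historical years

def gumbel_logpdf(x, loc, scale):
    z = (x - loc) / scale
    return -math.log(scale) - z - math.exp(-z)

def gumbel_cdf(x, loc, scale):
    return math.exp(-math.exp(-(x - loc) / scale))

def log_post(loc, scale):
    if scale <= 0:
        return -math.inf
    ll = sum(gumbel_logpdf(x, loc, scale) for x in gauged)
    F = gumbel_cdf(X0, loc, scale)
    # binomial likelihood of the k historical threshold exceedances
    ll += k * math.log(max(1 - F, 1e-300)) + (h - k) * math.log(max(F, 1e-300))
    return ll  # flat priors on loc and on scale > 0

loc, scale = 400.0, 100.0
cur = log_post(loc, scale)
samples = []
for i in range(20000):  # Metropolis random walk with 5000-step burn-in
    loc_p, scale_p = loc + random.gauss(0, 15), scale + random.gauss(0, 10)
    prop = log_post(loc_p, scale_p)
    if math.log(random.random()) < prop - cur:
        loc, scale, cur = loc_p, scale_p, prop
    if i >= 5000:
        samples.append((loc, scale))

# Posterior mean of the 100-year quantile: loc - scale * ln(-ln(1 - 1/100))
q100 = sum(l - s * math.log(-math.log(0.99)) for l, s in samples) / len(samples)
print(f"estimated 100-year flood: {q100:.0f} m3/s")
```

Running the same sampler with and without the historical binomial term is the comparison that reveals whether the gauged record alone underestimates the hazard, which is the effect the abstract reports.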
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conrads, T.J.
1997-09-12
Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by the DOE Richland Operations Office (DOE-RL). Agreement was also reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCs) 60 days following issue of new order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The Order also requires that commitments addressing existing SSCs be included in an implementation plan to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, is Rev. 1 of that plan.
Risk watershed analysis: a new approach to manage torrent control structures
NASA Astrophysics Data System (ADS)
Quefféléan, Yann; Carladous, Simon; Deymier, Christian; Marco, Olivier
2017-04-01
Torrential check dams have been built in French public forests since the 19th century under the Restoration and Conservation of Mountainous Areas (RTM) laws (1860, 1864, 1882). The RTM department of the National Forestry Office (ONF) helps the government decide on protective actions to implement within these areas. While more than 100 000 structures were registered in 1964, more than 14 000 check dams are currently registered and maintained within approximately 380 000 ha of RTM public forests. The RTM department officers thus have long experience in using check dams for soil restoration, as well as in implementing other kinds of torrential protective structures such as sediment traps, embankments, and bank protection. As part of the ONF, they are also experienced in forestry engineering. Nevertheless, some limits in torrent control management have been highlighted: as existing protective structures are ageing, their effectiveness in protecting elements at risk must be assessed, which is a difficult task; and as the available maintenance budget is continuously decreasing, priorities must be set, but the decisions are difficult: what are the functions of the existing check dams? What is their expected effect on torrential hazard? Is the maintenance cost too high given this expected effect in protecting elements at risk? Given these questions, a new policy has been pursued by the RTM department since 2012. A technical overview at the torrential watershed scale is now needed to support better maintenance decisions: it is called a Risk Watershed Analysis (Etude de Bassin de Risque in French, EBR) and is funded by the government. 
Its objectives are to: recall the initial objectives of the protective structures, based on a detailed archive analysis; describe the current elements at risk to protect; describe natural hazards at the torrential watershed scale and their evolution since the protective structures were implemented; describe the civil engineering and forestry works implemented within the watershed, including their cost; and decide on the current protective works to implement (maintenance and new investment). For each EBR, a multidisciplinary team is involved, with specialists in geomorphology, hydrology, hydraulics, geology, civil engineering and forestry. Approximately 1 100 EBRs should be implemented at the national scale, including for other natural phenomena such as snow avalanches and rock falls. Since 2012, approximately 10% have been carried out in the areas with the most significant elements at risk. From a practical point of view, these studies have fostered a better understanding of torrential watershed conditions and of the expected effect of torrent control over the years. An analysis of these studies will be performed soon to obtain a first overview of the effect of torrent control. We claim that these EBRs could be a significant source of information to support a comprehensive evaluation of the long-term effectiveness of torrent control.
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
NASA Astrophysics Data System (ADS)
Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara
2017-04-01
Geo-hazards and their effects are distributed geographically over wide regions. Their effective mapping and monitoring is essential for hazard assessment and mitigation, and is often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damage, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards, as well as web-based risk maps and decision support systems. The European Commission also implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user in the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web processing, and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data, other geospatial data on geo-hazards, and descriptions and protocols for the data processing and analysis. An interface to extend the data integration from external sources (e.g. Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services form the service tier. Sub-services include, for example, map services, feature editing services, geometry services, geoprocessing services, and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and REST interfaces are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyse landslide delineations from different sources, and (4) an infrastructure service to identify infrastructure affected by landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards make use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, so that it can assist the targeted mapping and monitoring of geo-hazards in a global context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-03-01
Estimates of the costs associated with implementation of the Resource Conservation and Recovery Act (RCRA) regulations for non-hazardous and hazardous material disposal in the utility industry are provided. These costs are based on engineering studies at a number of coal-fired power plants in which the costs for hazardous and non-hazardous disposal are compared to the costs developed for the current-practice design for each utility. The relationship of the three costs is displayed. The emphasis of this study is on the determination of incremental costs rather than the absolute costs for each case (current practice, non-hazardous, or hazardous). For the purpose of this project, the hazardous design cost was determined for both minimum and maximum compliance.
The Control of Hazardous Wastes and the Role of Environmental Educators.
ERIC Educational Resources Information Center
Pfortner, Ray
1984-01-01
Discusses legislation aimed at hazardous waste issues that is implemented by the Environmental Protection Agency and state governments. Particular attention is given to the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). A case study of an abandoned acres Superfund site is included with two related student…
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Effective public education and communication campaigns about wildland fire and fuels management should have clear objectives, and use the right techniques to achieve these objectives. This fact sheet lists seven important considerations for planning or implementing a hazard communication effort.
40 CFR 266.106 - Standards to control metals emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...
40 CFR 266.106 - Standards to control metals emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...
77 FR 54863 - Polychlorinated Biphenyls (PCBs): Revisions to Manifesting Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
... and Recovery Act (RCRA) Uniform Hazardous Waste Manifest, under the Toxic Substances Control Act (TSCA... implement the Uniform Hazardous Waste Manifest form were promulgated on March 4, 2005. DATES: Written... governmental jurisdiction that is a government of a city, county, town, school district or special district...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Register (75 FR 72878) titled: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting...
40 CFR 262.102 - What special definitions are included in this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... implementation of the Environmental Management Plan as measured against policy, objectives and targets... potential hazards to human health or the environment and which must include RCRA “P” wastes, and may include... work practices that both protect human health and the environment from the hazards presented by...
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
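The relationship between hazard, density, and survival summarized above can be illustrated numerically. A minimal sketch (mine, not from the article), assuming the exponential case, where the hazard h(t) = f(t)/S(t) reduces to a constant rate λ, mirroring the interchangeability of the force of mortality and the hazard rate noted above:

```python
import math

def exponential_hazard(t, lam):
    """Hazard h(t) = f(t) / S(t) for an exponential lifetime distribution."""
    f = lam * math.exp(-lam * t)   # density f(t)
    s = math.exp(-lam * t)         # survival S(t) = 1 - F(t)
    return f / s                   # equals lam for every t

# The force of mortality (hazard rate) is constant under the exponential model:
for t in [0.5, 1.0, 10.0]:
    print(round(exponential_hazard(t, lam=0.2), 6))  # prints 0.2 each time
```

For other distributions (normal, or the proportional-hazards models the abstract mentions) the same ratio f(t)/S(t) applies, but the hazard is no longer constant in t.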
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards.
Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, in order to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to overview previous hazards in the region. After performing the analysis, a double click on the visualised infrastructures opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards on the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis tool at the municipality level. The tool is extensible and can be expanded with additional modules.
The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will be a data base containing all information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help the technical management to decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES Bründl M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern. 416 S. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL). Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk Management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds) 2002: Colloque Arche de la Défense 22-24 octobre 2002, dans Risques naturels et aménagement en Europe, 108-120. Maggi R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405 440, Final Scientific Report, Lugano
Comín-Colet, Josep; Verdú-Rotellar, José María; Vela, Emili; Clèries, Montse; Bustins, Montserrat; Mendoza, Lola; Badosa, Neus; Cladellas, Mercè; Ferré, Sofía; Bruguera, Jordi
2014-04-01
The efficacy of heart failure programs has been demonstrated in clinical trials, but their applicability in real-world practice settings is more controversial. This study evaluates the feasibility and efficacy of an integrated hospital-primary care program for the management of patients with heart failure in an integrated health area covering a population of 309,345. For the analysis, we included all patients consecutively admitted with heart failure as the principal diagnosis who had been discharged alive from all of the hospitals in Catalonia, Spain, from 2005 to 2011, the period when the program was implemented, and compared mortality and readmissions among patients exposed to the program with the rates in the patients of all the remaining integrated health areas of the Servei Català de la Salut (Catalan Health Service). We included 56,742 patients in the study. There were 181,204 hospital admissions and 30,712 deaths during the study period. In the adjusted analyses, when compared to the 54,659 patients from the other health areas, the 2083 patients exposed to the program had a lower risk of death (hazard ratio=0.92 [95% confidence interval, 0.86-0.97]; P=.005), a lower risk of clinically-related readmission (hazard ratio=0.71 [95% confidence interval, 0.66-0.76]; P<.001), and a lower risk of readmission for heart failure (hazard ratio=0.86 [95% confidence interval, 0.80-0.94]; P<.001). The positive impact on morbidity and mortality rates was more marked once the program had become well established. The implementation of multidisciplinary heart failure management programs that integrate the hospital and the community is feasible and is associated with a significant reduction in patient morbidity and mortality. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.
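The hazard ratios above are reported with 95% confidence intervals. A small sketch (mine, not part of the study) of how the standard error of the log hazard ratio can be recovered from such an interval, assuming the interval is symmetric on the log scale, as is conventional for Cox models:

```python
import math

def se_log_hr(lower, upper, z_crit=1.96):
    """Approximate standard error of log(HR) recovered from a 95% CI,
    assuming the interval is symmetric on the log scale (the usual convention)."""
    return (math.log(upper) - math.log(lower)) / (2 * z_crit)

# Reported above: HR = 0.92, 95% CI 0.86-0.97 for all-cause death
hr, lo, hi = 0.92, 0.86, 0.97
se = se_log_hr(lo, hi)
z = math.log(hr) / se
print(round(se, 4), round(z, 2))  # prints 0.0307 -2.72
```

A z-statistic near -2.7 is roughly consistent with the reported P=.005; small discrepancies are expected, since the published P-value comes from the fitted model rather than from the rounded interval endpoints.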
[Detection of occupational hazards in a large shipbuilding factory].
Du, Weijia; Wang, Zhi; Zhang, Hai; Zhou, Liping; Huang, Minzhi; Liu, Yimin
2014-03-01
To provide evidence for the prevention and treatment of occupational diseases by the analysis of existing major occupational hazards and health conditions of workers in a large shipbuilding factory. Field investigation of occupational conditions was conducted to examine the existence of occupational hazards from 2009 to 2012 in a large shipbuilding factory, and then the results of physical examination among its workers were analyzed. Other than the metal dust (total dust), the levels of other dusts and manganese dioxide were beyond the national standard to various degrees, and through sampling point detection, it was found that the levels of manganese dioxide exceeded the standard by 42.8%. The maximum time-weighted average concentration in individuals was 27.927 mg/m(3), much higher than the national standard limit. For harmful gas detection in individuals, xylene was 38.4% above the standard level (the highest concentration reached 1447.7 mg/m(3)); moreover, both toluene and ethylbenzene exceeded the national standard at different levels. Among the noise-exposed workers, 71% worked in an environment where the daily noise was above the limit of the national standard (85 dB). Physical examinations in 2010 and 2012 showed that the abnormal rate of audiometry in workers was higher than 15%. Dust (total dust), manganese dioxide, benzene, and noise are the main occupational hazards among the workers in the large shipbuilding factory, and strict protection and control for these hazards should be implemented for the workers in the factory.
The Application of Software Safety to the Constellation Program Launch Control System
NASA Technical Reports Server (NTRS)
Kania, James; Hill, Janice
2011-01-01
The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.
Asante-Duah, K; Nagy, I V
2001-06-01
The production of large quantities of wastes globally has created a commercial activity involving the transfrontier shipment of hazardous wastes, intended to be managed at economically attractive waste-handling facilities located elsewhere. In fact, huge quantities of hazardous wastes apparently travel the world in search of "acceptable" waste management facilities. For instance, within the industrialized countries alone, millions of tonnes of potentially hazardous waste cross national frontiers each year on their way to recycling or to treatment, storage, and disposal facilities (TSDFs), because there is no local disposal capacity for these wastes, because legal disposal or reuse in a foreign country may be more environmentally sound, or because managing the wastes in the foreign country may be less expensive than at home. The cross-boundary traffic in hazardous wastes has lately been under close public scrutiny, however, resulting in the adoption of several international agreements and laws to regulate such activities. This paper discusses and analyzes the most significant control measures and major agreements in this new commercial activity involving hazardous wastes. In particular, the discussion recognizes the difficulties with trying to implement the relevant international agreements among countries of vastly different socioeconomic backgrounds. Nonetheless, it is also noted that global environmental agreements will generally be a necessary component of ensuring adequate environmental protection for the world community, and thus there is a need for the careful implementation of such agreements and regulations.
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
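The Gauss-Hermite quadrature mentioned above can be sketched in a few lines. The following illustration is mine, not the authors' Stata implementation: it integrates a shared normal random intercept out of an exponential proportional-hazards likelihood with a fixed 3-point rule, whereas production code uses adaptive quadrature with many more nodes.

```python
import math

# 3-point Gauss-Hermite rule for the physicists' weight e^{-x^2}:
# nodes 0, +/-sqrt(3/2); weights 2*sqrt(pi)/3 and sqrt(pi)/6.
NODES = [-math.sqrt(1.5), 0.0, math.sqrt(1.5)]
WEIGHTS = [math.sqrt(math.pi) / 6, 2 * math.sqrt(math.pi) / 3, math.sqrt(math.pi) / 6]

def expect_under_normal(f, sigma):
    """E[f(b)] for b ~ N(0, sigma^2), via the change of variable b = sqrt(2)*sigma*x."""
    return sum(w * f(math.sqrt(2) * sigma * x)
               for x, w in zip(NODES, WEIGHTS)) / math.sqrt(math.pi)

def marginal_likelihood_exponential(t, event, lam, sigma):
    """Cluster-level likelihood for an exponential PH model with a shared normal
    random intercept b (hazard = lam * exp(b)), with b integrated out numerically."""
    def conditional(b):
        rate = lam * math.exp(b)
        lik = 1.0
        for ti, di in zip(t, event):
            lik *= (rate ** di) * math.exp(-rate * ti)  # density^d * survival^(1-d)
        return lik
    return expect_under_normal(conditional, sigma)

# Sanity check: an n-point rule is exact for polynomials up to degree 2n-1,
# so the second moment of a standard normal is recovered (essentially) exactly.
print(expect_under_normal(lambda b: b * b, sigma=1.0))  # ~1.0
```

Maximizing this marginal likelihood over lam and sigma across clusters is the core of the frailty-model estimation the paper describes; the extension to Weibull, Gompertz, or spline-based baselines changes only the per-observation density and survival terms.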
Garbayo, Luciana; Stahl, James
2017-03-01
Guidelines orient best practices in medicine, yet, in health care, many real-world constraints limit their optimal realization. Since guideline implementation problems are not systematically anticipated, they will be discovered only post facto, in a learning-curve period, while the already implemented guideline is tweaked, debugged and adapted. This learning process comes with costs to human health and quality of life. Despite this predictable hazard, the study and modeling of medical guideline implementation is still seldom pursued. In this article we argue that systematically identifying, predicting and preventing medical guideline implementation errors is both an epistemic responsibility and an ethical imperative in health care, in order to properly provide beneficence, minimize or avoid harm, show respect for persons, and administer justice. Furthermore, we suggest that implementation knowledge is best achieved technically by providing simulation modeling studies to anticipate the realization of medical guidelines, in multiple contexts, with system and scenario analysis, in alignment with the emerging field of implementation science and in recognition of learning health systems. It follows from both claims that it is an ethical imperative and an epistemic responsibility to simulate medical guidelines in context to minimize (avoidable) harm in health care, before guideline implementation.
Gori, Paula L.
1993-01-01
INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. 
PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans are now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. 
The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a
Influences on Adaptive Planning to Reduce Flood Risks among Parishes in South Louisiana.
Paille, Mary; Reams, Margaret; Argote, Jennifer; Lam, Nina S-N; Kirby, Ryan
2016-02-01
Residents of south Louisiana face a range of increasing, climate-related flood exposure risks that could be reduced through local floodplain management and hazard mitigation planning. A major incentive for community planning to reduce exposure to flood risks is offered by the Community Rating System (CRS) of the National Flood Insurance Program (NFIP). The NFIP encourages local collective action by offering reduced flood insurance premiums for individual policy holders of communities where suggested risk-reducing measures have been implemented. This preliminary analysis examines the extent to which parishes (counties) in southern Louisiana have implemented the suggested policy actions and identifies key factors that account for variation in the implementation of the measures. Implementing more measures results in higher CRS scores. Potential influences on scores include socioeconomic attributes of residents, government capacity, average elevation, and past flood events. The results of multiple regression analysis indicate that higher CRS scores are associated most closely with higher median housing values. Furthermore, higher scores are found in parishes with more local municipalities that participate in the CRS program. The number of floods in the last five years and the revenue base of the parish do not appear to influence CRS scores. The results shed light on the conditions under which local adaptive planning to mitigate increasing flood risks is more likely to be implemented and offer insights for program administrators, researchers and community stakeholders.
Influences on Adaptive Planning to Reduce Flood Risks among Parishes in South Louisiana
Paille, Mary; Reams, Margaret; Argote, Jennifer; Lam, Nina S.-N.; Kirby, Ryan
2016-01-01
Residents of south Louisiana face a range of increasing, climate-related flood exposure risks that could be reduced through local floodplain management and hazard mitigation planning. A major incentive for community planning to reduce exposure to flood risks is offered by the Community Rating System (CRS) of the National Flood Insurance Program (NFIP). The NFIP encourages local collective action by offering reduced flood insurance premiums for individual policy holders of communities where suggested risk-reducing measures have been implemented. This preliminary analysis examines the extent to which parishes (counties) in southern Louisiana have implemented the suggested policy actions and identifies key factors that account for variation in the implementation of the measures. Implementing more measures results in higher CRS scores. Potential influences on scores include socioeconomic attributes of residents, government capacity, average elevation, and past flood events. The results of multiple regression analysis indicate that higher CRS scores are associated most closely with higher median housing values. Furthermore, higher scores are found in parishes with more local municipalities that participate in the CRS program. The number of floods in the last five years and the revenue base of the parish do not appear to influence CRS scores. The results shed light on the conditions under which local adaptive planning to mitigate increasing flood risks is more likely to be implemented and offer insights for program administrators, researchers and community stakeholders. PMID:27330828
Novel Flood Detection and Analysis Method Using Recurrence Property
NASA Astrophysics Data System (ADS)
Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert
2016-04-01
Temporal changes in flood hazard are known to be difficult to detect and attribute because of multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land-use change, can act variably across space and time scales and can influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase-space trajectories, i.e. trajectories constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis is on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.
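The recurrence-plot construction the abstract describes (delay embedding plus a distance threshold) can be sketched without any libraries; the parameter values below are illustrative assumptions, not those of the study:

```python
# A recurrence plot marks time pairs (i, j) whose delay-embedded state vectors
# lie within a distance threshold eps of each other.
def embed(series, dim, delay):
    """Time-delay embedding: x_i = (s_i, s_{i+delay}, ..., s_{i+(dim-1)*delay})."""
    n = len(series) - (dim - 1) * delay
    return [[series[i + k * delay] for k in range(dim)] for i in range(n)]

def recurrence_matrix(series, dim=2, delay=1, eps=0.5):
    """Binary recurrence matrix: R[i][j] = 1 if embedded states i, j are within eps."""
    states = embed(series, dim, delay)
    dist = lambda a, b: max(abs(u - v) for u, v in zip(a, b))  # Chebyshev norm
    return [[1 if dist(a, b) <= eps else 0 for b in states] for a in states]

# A periodic signal produces the diagonal-line structure typical of recurrent dynamics
signal = [(-1) ** (i // 3) for i in range(12)]  # simple square wave
R = recurrence_matrix(signal, dim=2, delay=1, eps=0.1)
print(all(R[i][i] == 1 for i in range(len(R))))  # main diagonal always recurrent -> True
```

For flood series, the recurrence properties of interest (diagonal-line lengths, clustering of recurrences in time) are then quantified from this matrix rather than read off visually.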
Dow, Christopher B; Collins, Brandon M; Stephens, Scott L
2016-03-01
Finding novel ways to plan and implement landscape-level forest treatments that protect sensitive wildlife and other key ecosystem components, while also reducing the risk of large-scale, high-severity fires, can prove difficult. We examined alternative approaches to landscape-scale fuel-treatment design for the same landscape. These approaches included two different treatment scenarios generated from an optimization algorithm that reduces modeled fire spread across the landscape, one with resource-protection constraints and one without. We also included a treatment scenario that was the actual fuel-treatment network implemented, as well as a no-treatment scenario. For all four scenarios, we modeled hazardous fire potential based on conditional burn probabilities, and projected fire emissions. Results demonstrate that in all three active treatment scenarios, hazardous fire potential, fire area, and emissions were reduced by approximately 50% relative to the untreated condition. Results show that incorporation of constraints is more effective at reducing modeled fire outputs, possibly due to the greater aggregation of treatments, which creates greater continuity of fuel-treatment blocks across the landscape. The implementation of fuel-treatment networks using different planning techniques that incorporate real-world constraints can reduce the risk of large problematic fires, allow for landscape-level heterogeneity that can provide necessary ecosystem services, create mixed forest stand structures on a landscape, and promote resilience in the uncertain future of climate change.
Adaptation to Climatic Hazards in the Savannah Ecosystem: Improving Adaptation Policy and Action
NASA Astrophysics Data System (ADS)
Yiran, Gerald A. B.; Stringer, Lindsay C.
2017-10-01
People in Ghana's savannah ecosystem have historically experienced a range of climatic hazards that have affected their livelihoods. In view of current climate variability and change, and projected increases in extreme events, adaptation to climate risks is vital. Policies have been put in place to enhance adaptation across sub-Saharan Africa in accordance with international agreements. At the same time, local people, through experience, have learned to adapt. This paper examines current policy actions and their implementation alongside an assessment of barriers to local adaptation. In doing so, it links adaptation policy and practice. Policy documents were analysed that covered key livelihood sectors identified as climate sensitive. These included agriculture, water, housing and health policies, as well as the National Climate Change Policy. In-depth interviews and focus group discussions were also held with key stakeholders in the Upper East Region of Ghana. Analyses were carried out using thematic content analysis. Although policies and actions complement each other, their integration is weak. Financial, institutional, social, and technological barriers hinder successful local implementation of some policy actions, while lack of local involvement in policy formulation also hinders adaptation practice. Integration of local perspectives into policy needs to be strengthened in order to enhance adaptation. Coupled with this is a need to consider adaptation to climate change in development policies and to pursue efforts to reduce or remove the key barriers to implementation at the local level.
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many major accidents due to toxic release in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, currently there is no commercial tool available that has such capability. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via the inherent safety principle, utilizing an integrated process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage. 2010 Elsevier B.V. All rights reserved.
Multi-hazards risk assessment at different levels
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2012-04-01
Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to those events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of natural and technological integrated risk assessment and mapping at different levels [1, 2]. At the country level the most hazardous natural processes, which may result in fatalities, injuries and economic loss in the Russian Federation, are considered. They are earthquakes, landslides, mud flows, floods, storms, avalanches. A special GIS environment for the country's territory was developed which includes information about hazard levels and recurrence, an impact database for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of seismic individual and collective risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire, explosion and chemical hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities: fire and chemical hazardous facilities, including oil pipeline routes located in the earthquake-prone areas. The estimations of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events.
The results also allow the development of effective emergency response plans that take possible scenario events into account. Given the extent of the oil pipeline systems located in highly active seismic zones, the results of the seismic risk computation are also used by TRANSNEFT JSC.
Raemer, Daniel B
2014-06-01
The story of Ignaz Semmelweis suggests a lesson: beware of unintended consequences, especially with in situ simulation. In situ simulation offers many important advantages over center-based simulation, such as learning about the real setting, putting participants at ease, saving travel time, minimizing space requirements, and involving patients and families. Some substantial disadvantages include frequent distractions, lack of privacy, logistics of setup, availability of technology, and supply costs. Importantly, in situ simulation amplifies some of the safety hazards of simulation itself, including maintaining control of simulated medications and equipment, limiting the use of valuable hospital resources, preventing incorrect learning from simulation shortcuts, and profoundly upsetting patients and their families. Mitigating these hazards by labeling effectively, publishing policies and procedures, securing simulation supplies and equipment, educating simulation staff, and informing participants of the risks are all methods that may lessen the potential for an accident. Each requires a serious effort of analysis, design, and implementation.
Total Diet Studies as a Tool for Ensuring Food Safety
Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung
2015-01-01
With the diversification and internationalization of the food industry and the increased focus on health from the majority of consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. A total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and analyze exposure trends to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: chemicals are analyzed in food in the form in which it will be consumed, and composite samples prepared from multiple ingredients are analyzed together, making the approach cost-effective. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881
44 CFR 73.1 - Purpose of part.
Code of Federal Regulations, 2014 CFR
2014-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1316 OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 § 73.1 Purpose of part. This part implements section 1316 of the National Flood Insurance Act of 1968. ...
44 CFR 73.1 - Purpose of part.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1316 OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 § 73.1 Purpose of part. This part implements section 1316 of the National Flood Insurance Act of 1968. ...
Babiker, Amir; Amer, Yasser S; Osman, Mohamed E; Al-Eyadhy, Ayman; Fatani, Solafa; Mohamed, Sarar; Alnemri, Abdulrahman; Titi, Maher A; Shaikh, Farheen; Alswat, Khalid A; Wahabi, Hayfaa A; Al-Ansary, Lubna A
2018-02-01
Implementation of clinical practice guidelines (CPGs) has been shown to reduce variation in practice and improve health care quality and patient safety. There is limited experience of CPG implementation (CPGI) in the Middle East. The CPG program in our institution was launched in 2009. The Quality Management department conducted a Failure Mode and Effect Analysis (FMEA) to further improve CPGI. This is a prospective study of qualitative/quantitative design. Our FMEA included (1) process review: recording the steps and activities of CPGI; (2) hazard analysis: recording activity-related failure modes and their effects, identifying required actions, assigning severity, occurrence, and detection scores to each failure mode, and calculating the risk priority number (RPN) using an online interactive FMEA tool; (3) planning: RPNs were prioritized, and recommendations and further plans for new interventions were identified; and (4) monitoring: after reduction or elimination of the failure modes, the calculated RPNs will be compared with a subsequent analysis in the post-implementation phase. The data were scrutinized from feedback of quality team members using an FMEA framework to enhance the implementation of 29 adapted CPGs. The identified common potential failure modes with the highest RPNs (≥ 80) included awareness/training activities, accessibility of CPGs, few advocates among clinical champions, and CPG auditing. Actions included (1) organizing regular awareness activities, (2) making printed and electronic copies of CPGs accessible, (3) encouraging senior practitioners to get involved in CPGI, and (4) enhancing CPG auditing as part of the quality sustainability plan. In our experience, FMEA could be a useful tool to enhance CPGI. It helped us identify potential barriers and prepare relevant solutions. © 2017 John Wiley & Sons, Ltd.
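The RPN calculation underlying the FMEA described above is conventionally the product of three ordinal scores. The sketch below illustrates that arithmetic and the ≥ 80 action threshold mentioned in the abstract; the failure-mode names are taken from the abstract, but the individual scores are invented for illustration.

```python
# Illustrative FMEA risk-priority calculation; scores are hypothetical.
def risk_priority_number(severity, occurrence, detection):
    """RPN is conventionally the product of the three 1-10 scores."""
    return severity * occurrence * detection

# (severity, occurrence, detection) - example scores, not the study's data.
failure_modes = {
    "awareness/training activities": (8, 5, 4),
    "accessibility of CPGs": (6, 4, 5),
    "CPG auditing": (7, 4, 3),
}

THRESHOLD = 80  # action threshold used in the abstract
prioritized = sorted(
    ((name, risk_priority_number(*scores)) for name, scores in failure_modes.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in prioritized:
    flag = "ACTION" if rpn >= THRESHOLD else "monitor"
    print(f"{rpn:4d}  {flag:8s} {name}")
```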
NASA Astrophysics Data System (ADS)
Schaub, Y.; Huggel, C.; Serraino, M.; Haeberli, W.
2012-04-01
The changes in high-mountain environments are increasingly fast and complex. GIS-based models of the Swiss Alps show that numerous topographic overdeepenings are likely to appear on progressively exposed glacier beds, which are considered potential sites of future lake formation. In many cases these newly forming lakes will be situated in an over-steepened and destabilized high-mountain environment and are, therefore, prone to impact waves from landslides. The risk of glacier lake outburst floods, endangering infrastructure, residential areas and persons further downvalley, is increasing with further lake formation and glacier recession. This risk may persist for many decades if not centuries. Future-oriented hazard assessments have to be integrative and must deal with all possible process chains. Reference studies and methodologies are still scarce, however. We present an approach to compare risks resulting from high-mountain lakes in the Swiss Alps amongst each other. Both already existing lakes and future ones are included in the analysis. The presented risk assessment approach integrates the envisaged high-mountain hazard process chain with present and future socio-economic conditions. Applying the concept of integral risk management, the hazard and damage potentials have to be analyzed. The areas that feature the topographic potential for rock/ice avalanches to reach a lake were analyzed regarding their susceptibility to slope failure, including the factors slope inclination, permafrost occurrence, glacier recession and bedrock lithology. Together with the analysis of the lakes (volume and runout path of potential outburst floods), the hazard analysis of the process chain was completed. High long-term hazard potentials in the Swiss Alps are, for instance, to be expected in the area of the Great Aletsch glacier. A methodology for the assessment of the damage potential was elaborated and will be presented.
In order to estimate the location of the largest damage potentials, the driving forces of different spatial development scenarios will be implemented in a land allocation model for the Swiss Alps. By bringing together hazard, exposure and vulnerability analyses, a risk assessment for the entire Swiss Alps regarding lake outburst floods triggered by impacts of rock/ice avalanches can be conducted for today, the middle of the century and beyond.
Integration of expert knowledge and uncertainty in natural risk assessment
NASA Astrophysics Data System (ADS)
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazards occurring in alpine regions during recent decades, such as those causing interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies subsequently designed and implemented are not able to mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome the lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes when analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is a probability approach based on the triangular probability density function (T-PDF), which can follow the same flow chart as FR.
We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probability in order to obtain hazard zoning and the associated uncertainties. We followed the same approach for each term of the risk equation, i.e. hazard, vulnerability, elements at risk and exposure. This risk approach can be achieved through a comprehensive use of several artificial intelligence (AI) technologies: (1) GIS techniques; (2) FR or T-PDF for qualitatively predicting risks; and (3) a multi-criteria evaluation for analyzing weak points. The main advantages of FR and T-PDF include the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out a quite wide zone of uncertainty. REFERENCES Zadeh L.A. 1965: Fuzzy Sets. Information and Control, 8:338-353.
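The triangular probability density function mentioned above has a simple closed form. The sketch below shows one standard parameterization, with a, b, c denoting the minimum, mode and maximum; these parameter names are generic conventions, not values from the study.

```python
# Minimal sketch of a triangular PDF (T-PDF); a, b, c = min, mode, max.
# A fuzzy triangular membership function has the same shape, rescaled
# so that the peak equals 1 instead of 2 / (c - a).
def triangular(x, a, b, c):
    """Density of the triangular distribution on [a, c] with mode b."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 2.0 * (x - a) / ((c - a) * (b - a))
    return 2.0 * (c - x) / ((c - a) * (c - b))

# The peak density is 2 / (c - a), reached at the mode b.
print(triangular(0.5, 0.0, 0.5, 1.0))
```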
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. It makes the best use of recent advancements in computer technology, in both software and hardware, and is well structured for implementation using conventional GIS tools.
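One of the cell-based ingredients named above is the seismic activity rate. A common way to express it, sketched below, is a truncated Gutenberg-Richter relation giving the annual rate of events at or above magnitude m in a cell; the a, b and maximum-magnitude values here are generic illustrations, not parameters from the paper.

```python
# Sketch of a per-cell activity-rate term for probabilistic seismic hazard:
# a truncated Gutenberg-Richter rate. a, b, m_max below are illustrative.
def gr_rate(m, a=4.0, b=1.0, m_max=7.5):
    """Annual rate of events with magnitude >= m (truncated G-R)."""
    if m >= m_max:
        return 0.0
    return 10 ** (a - b * m) - 10 ** (a - b * m_max)

print(f"{gr_rate(5.0):.4f} events/yr with M >= 5.0 in this cell")
```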
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-21
... ``ROADWAY STRIPING''. (5) Operational controls. A non-DOT specification cargo tank used for roadway striping... package or ship a hazardous material in a manner that varies from the regulations provided an equivalent... least an equivalent level of safety to that specified in the HMR. Implementation of new technologies and...
Code of Federal Regulations, 2011 CFR
2011-07-01
... associated with entering the port's confined spaces, and develop a confined space safe entry program that... implement the confined space safe entry program, the deepwater port operator must determine the education... protecting personnel from hazards associated with confined spaces? 150.623 Section 150.623 Navigation and...
40 CFR 261.1 - Purpose and scope.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Purpose and scope. 261.1 Section 261.1... AND LISTING OF HAZARDOUS WASTE General § 261.1 Purpose and scope. (a) This part identifies those solid... hazardous for purposes of the regulations implementing subtitle C of RCRA. For example, it does not apply to...
40 CFR 261.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 27 2012-07-01 2012-07-01 false Purpose and scope. 261.1 Section 261.1... AND LISTING OF HAZARDOUS WASTE General § 261.1 Purpose and scope. (a) This part identifies those solid... hazardous for purposes of the regulations implementing subtitle C of RCRA. For example, it does not apply to...
A proposed new policy for planetary protection
NASA Technical Reports Server (NTRS)
Barengoltz, J. B.; Bergstrom, S. L.; Hobby, G. L.; Stabekis, P. D.
1981-01-01
A critical review of the present policy was conducted with emphasis on its application to future planetary exploration. The probable impact of recent data on the implementation of the present policy was also assessed. The existing policy and its implementation were found to: be excessive for certain missions (e.g., Voyager), neglect the contamination hazard posed by the bulk constituent organics of spacecraft, be ambiguous for certain missions (e.g., Pioneer Venus), and treat all extraterrestrial sample return missions alike. The major features of the proposed policy are planet/mission combinations, a qualitative top level statement, and implementation by exception rather than rule. The concept of planet/mission categories permits the imposition of requirements according to both biological interest in the target planet and the relative contamination hazard of the mission type.
The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines
NASA Astrophysics Data System (ADS)
Wesseloo, Johan
2018-06-01
Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.
1996-01-01
failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National
A framework for the probabilistic analysis of meteotsunamis
Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.
2014-01-01
A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
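The Monte Carlo aggregation described above can be sketched schematically: sample a synthetic catalog of disturbance parameters, compute an amplitude per event, and convert exceedance fractions to annualized rates. The amplitude "model" and parameter ranges below are invented placeholders standing in for the paper's hydrodynamic simulations.

```python
import random

# Schematic Monte Carlo hazard-curve aggregation; the amplitude proxy and
# parameter ranges are illustrative assumptions, not the paper's model.
random.seed(42)

RATE = 5.0        # assumed Poisson rate of squall lines per year
N_EVENTS = 10000  # synthetic catalog size

def synthetic_amplitude():
    """Stand-in for the numerical model: amplitude from sampled parameters."""
    speed = random.uniform(15.0, 35.0)     # disturbance speed, m/s
    pressure = random.uniform(1.0, 5.0)    # pressure jump, hPa
    return 0.01 * pressure * speed / 25.0  # toy amplitude proxy, metres

catalog = [synthetic_amplitude() for _ in range(N_EVENTS)]

def exceedance_rate(amplitude):
    """Annualized rate at which events exceed the given amplitude."""
    frac = sum(1 for a in catalog if a > amplitude) / len(catalog)
    return RATE * frac

# Hazard curve: annualized rate of exceedance vs. amplitude threshold.
for thr in (0.01, 0.02, 0.05):
    print(f"{thr:.2f} m: {exceedance_rate(thr):.3f} /yr")
```

Resampling the catalog many times, as the paper does, would yield mean and quantile hazard curves rather than a single one.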
40 CFR 63.368 - Implementation and enforcement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Implementation and enforcement. 63.368 Section 63.368 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES Ethylene Oxide...
DESCRIPTION OF RISK REDUCTION ENGINEERING LABORATORY TEST AND EVALUATION FACILITIES
An onsite team of multidisciplinary engineers and scientists conducts research and provides technical services in the areas of testing, design, and field implementation for both solid and hazardous waste management. Engineering services focus on the design and implementation of...
Sun, F; Chen, J; Tong, Q; Zeng, S
2007-01-01
Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, it is necessary first to conduct a strategic screening analysis at the national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate the drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems for implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the hazards of concern in drinking water, from which the vulnerability of a conventional water supply system is characterized.
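The "violation probability" idea above can be illustrated as the fraction of monitoring samples at a point that exceed a regulatory limit. The sample values and the limit below are invented for illustration, not data from the study.

```python
# Sketch of estimating a violation probability from monitoring samples.
# Turbidity values and the limit below are hypothetical examples.
def violation_probability(samples, limit):
    """Estimate P(concentration > limit) from a list of sample values."""
    return sum(1 for s in samples if s > limit) / len(samples)

turbidity_ntu = [0.4, 0.6, 1.2, 0.8, 0.3, 1.5, 0.7, 0.9]
LIMIT_NTU = 1.0  # assumed limit, for illustration only

p = violation_probability(turbidity_ntu, LIMIT_NTU)
print(f"Turbidity violation probability: {p:.3f}")  # 2 of 8 samples exceed
```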
14 CFR 417.227 - Toxic release hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...
An offline-online Web-GIS Android application for fast data acquisition of landslide hazard and risk
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Sudmeier-Rieux, Karen; Jaboyedoff, Michel; Derron, Marc-Henri; Devkota, Sanjaya
2017-04-01
Regional landslide assessment and mapping have been actively pursued by research institutions, national and local governments, non-governmental organizations (NGOs), and other stakeholders for some time, and a wide range of methodologies and technologies have consequently been proposed. Land-use maps and hazard event inventories are mostly created from remote-sensing data and face difficulties, such as accessibility and terrain, that need to be overcome. Landslide data acquisition in the field can likewise improve the accuracy of databases and analyses. Open-source Web and mobile GIS tools can be used for improved ground-truthing of critical areas to sharpen the analysis of hazard patterns and triggering factors. This paper reviews the implementation and selected results of a secure mobile mapping application called ROOMA (Rapid Offline-Online Mapping Application) for rapid data collection on landslide hazard and risk. This prototype assists the quick creation of landslide inventory maps (LIMs) by collecting information on the type, feature, volume, date, and pattern of landslides using open-source Web-GIS technologies such as Leaflet maps, Cordova, GeoServer, PostgreSQL as the DBMS (database management system), and PostGIS as its plug-in for spatial database management. The application comprises Leaflet maps coupled with satellite images as a base layer, drawing tools, geolocation (using GPS and the Internet), photo mapping, and event clustering. All features and information are recorded in a GeoJSON text file in the offline version (Android) and subsequently uploaded in the online mode (using any browser) when an Internet connection is available. Finally, the events can be accessed and edited after approval by an administrator and then visualized by the general public.
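A field record of the kind such an app stores offline and syncs later is naturally a GeoJSON Feature. The sketch below builds one; the property names (event, feature, volume_m3, etc.) are plausible assumptions, not ROOMA's actual schema.

```python
import json

# Illustrative GeoJSON Feature for an offline landslide field record.
# All attribute names here are assumptions, not ROOMA's real schema.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [85.3240, 27.7172]},
    "properties": {
        "event": "landslide",
        "feature": "scarp",
        "volume_m3": 1500,
        "date": "2017-04-12",
        "photo": "IMG_0042.jpg",
    },
}

# Serialize for the offline text file; parse again on upload.
record = json.dumps(feature)
print(json.loads(record)["properties"]["event"])
```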
NASA Astrophysics Data System (ADS)
Kwiatek, Grzegorz; Blanke, Aglaja; Olszewska, Dorota; Orlecka-Sikora, Beata; Lasocki, Stanisław; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean-Robert; Schaming, Marc; Bigarre, Pascal; Kinscher, Jannes-Lennart; Saccorotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz
2017-04-01
The Thematic Core Service "Anthropogenic Hazards" (TCS AH) integrates data and provides various data services in the form of a complete e-research infrastructure for advanced analysis and geophysical modelling of anthropogenic hazard due to georesources exploitation. TCS AH is based on the prototype built in the framework of the IS-EPOS project POIG.02.03.00-14-090/13-00 (https://tcs.ah-epos.eu/). The TCS AH is currently being further developed within the EPOS Implementation Phase (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The TCS AH aims to have a measurable impact on innovative research and development by providing a comprehensive, wide-scale and high-quality research infrastructure available to the scientific community, industrial partners and the public. One of the main deliverables of TCS AH is access to numerous induced seismicity datasets called "episodes". An episode is defined as a comprehensive set of data describing a geophysical process induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. The episode is a time-correlated, standardized collection of geophysical, technological and other relevant geodata forming complete documentation of a seismogenic process. In addition to the 6 episodes already implemented during the previous phase of integration, and 3 episodes integrated within the SHEER project, at least 18 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are currently being integrated into the TCS AH. The heterogeneous multi-disciplinary data from different episodes are subjected to an extensive quality control (QC) procedure composed of five steps and involving the collaborative work of data providers, the quality control team and the IT team, supervised by the quality control manager with the aid of the Redmine platform.
The first three steps of QC are performed at the local data center and include (1) transfer of episode data to the local data center, (2) data standardization and validation of formats, and (3) metadata preparation according to the TCS AH metadata scheme. The final two steps of QC are performed at the level of the TCS AH website and include (4) contextual analysis of data quality, followed by the appearance of the episode in the TCS AH maintenance area, and finally (5) episode publication on the TCS AH website.
Murray, Justine V; Jansen, Cassie C; De Barro, Paul
2016-01-01
In an effort to eliminate dengue, a successful technology was developed with the stable introduction of the obligate intracellular bacterium Wolbachia pipientis into the mosquito Aedes aegypti to reduce its ability to transmit dengue fever through life-shortening and inhibition of viral replication. An analysis of risk was required before considering release of the modified mosquito into the environment. Expert knowledge and a risk assessment framework were used to identify the risks associated with the release of the modified mosquito. Individual and group expert elicitation was performed to identify potential hazards. A Bayesian network (BN) was developed to capture the relationship between hazards and the likelihood of events occurring. Risk was calculated from the expert likelihood estimates populating the BN and the consequence estimates elicited from experts. The risk model for "Don't Achieve Release" provided an estimated 46% likelihood that the release would not occur by a nominated time but generated an overall risk rating of very low. The ability to obtain compliance had the greatest influence on the likelihood of release occurring. The risk model for "Cause More Harm" provided a 12.5% likelihood that more harm would result from the release, but the overall risk was considered negligible. The efficacy of mosquito management had the most influence: the perception that the threat of dengue fever had been eliminated, resulting in less household mosquito control, was scored as the highest-ranked individual hazard (albeit low risk). The risk analysis was designed to incorporate the interacting complexity of hazards that may affect the release of the technology into the environment. The risk analysis was a small, but important, implementation phase in the success of this innovative research introducing a new technology to combat dengue transmission in the environment.
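The abstract's combination of elicited likelihoods and consequences into qualitative ratings ("very low", "negligible") can be sketched as a simple scoring rule. The scales and cut-offs below are illustrative assumptions, not the study's elicitation protocol.

```python
# Minimal sketch of mapping (likelihood, consequence) to a coarse risk
# rating. Scales and thresholds are illustrative assumptions only.
def risk_rating(likelihood, consequence):
    """Map likelihood (0-1) x consequence score (1-5) to a rating."""
    score = likelihood * consequence
    if score < 0.25:
        return "negligible"
    if score < 0.75:
        return "very low"
    if score < 1.5:
        return "low"
    return "moderate or higher"

# High likelihood but benign outcome vs. unlikely, benign outcome.
print(risk_rating(0.46, 1))   # "Don't Achieve Release"-style case
print(risk_rating(0.125, 1))  # "Cause More Harm"-style case
```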
Yilmaz, Ozge; Can, Zehra S; Toroz, Ismail; Dogan, Ozgur; Oncel, Salim; Alp, Emre; Dilek, Filiz B; Karanfil, Tanju; Yetis, Ulku
2014-08-01
Hazardous waste (HW) generation information is an absolute necessity for ensuring the proper planning, implementation, and monitoring of any waste management system. Unfortunately, environmental agencies in developing countries face difficulties in gathering data directly from the creators of such wastes. It is possible, however, to construct theoretical HW inventories using waste generation factors (WGFs). The objective of this study was to develop a complete nationwide HW inventory of Turkey that relies on nation-specific WGFs to support the management activities of the Turkish Ministry of Environment and Urbanization (MoEU). Inventory studies relied on WGFs from (a) the literature and (b) field studies and analysis of waste declarations reflecting country-specific industrial practices. Moreover, new tools were introduced into the monitoring infrastructure of the MoEU to obtain a comprehensive waste generation data set. Through field studies and consideration of country-specific conditions, it was possible to more thoroughly elucidate HW generation trends in Turkey, an approach deemed superior to the alternatives. Declaration- and literature-based WGFs also proved most helpful in supplementing field observations that could not always be conducted. These theoretical inventories could become valuable assets in supporting regulatory agencies in developing countries toward a more thorough implementation of HW management systems. © The Author(s) 2014.
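The WGF-based inventory idea above reduces to multiplying a sector-specific factor by an activity level (e.g. tonnes of waste per employee per year times the number of employees) and summing over sectors. All factors and figures below are invented for illustration, not Turkey's actual values.

```python
# Sketch of a WGF-based hazardous-waste inventory. Sector names, factors
# (t per employee per year) and employee counts are hypothetical examples.
wgf_t_per_employee = {
    "metal finishing": 0.85,
    "textile dyeing": 0.30,
    "petrochemical": 1.60,
}
employees = {
    "metal finishing": 12000,
    "textile dyeing": 45000,
    "petrochemical": 8000,
}

# Estimated HW per sector = WGF x activity level; national total = sum.
inventory = {
    sector: wgf_t_per_employee[sector] * employees[sector]
    for sector in wgf_t_per_employee
}
total = sum(inventory.values())
print(f"Estimated national HW generation: {total:,.0f} t/yr")
```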
Index based regional vulnerability assessment to cyclones hazards of coastal area of Bangladesh
NASA Astrophysics Data System (ADS)
Mohammad, Q. A.; Kervyn, M.; Khan, A. U.
2016-12-01
Cyclone, storm surge, coastal flooding, salinity intrusion, tornado, nor'wester, and thunderstorms are the listed natural hazards in the coastal areas of Bangladesh. Bangladesh was hit by devastating cyclones in 1970, 1991, 2007, 2009, and 2016. The intensity and frequency of natural hazards in the coastal area are likely to increase in the future due to climate change. Risk assessment is one of the most important steps of disaster risk reduction. As a climate change victim nation, Bangladesh claims compensation from the Green Climate Fund and has also created its own climate funds. It is therefore very important to assess the vulnerability of the coast of Bangladesh to natural hazards for efficient allocation of financial investment to support national risk reduction. This study aims at identifying the spatial variations in factors contributing to the vulnerability of the coastal inhabitants of Bangladesh to natural hazards. An exploratory factor analysis method was used to assess the vulnerability of each local administrative unit. The 141 initially selected socio-economic indicators were reduced to 41 by converting some of them to meaningful, widely accepted indicators and removing highly correlated indicators. Principal component analysis further reduced the 41 indicators to 13 dimensions, which explained 79% of the total variation. The PCA dimensions reveal three types of characteristics that may lead people towards vulnerability: (a) demographics, education, and job opportunities, (b) access to basic needs and facilities, and (c) special-needs populations. Vulnerability maps of the study area have been prepared by weighted overlay of the dimensions. The study revealed that 29 and 8 percent of the total coastal area are very highly and highly vulnerable to natural hazards, respectively. These areas are distributed along the sea boundary and major rivers.
Comparison of this spatial distribution with the capacities to face disaster shows that highly vulnerable areas are well covered by cyclone shelters but are not the zones with the most resistant buildings or the densest road networks. The findings will be helpful for policy makers to initiate, plan, and implement short-, medium-, and long-term DRR strategies.
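The indicator-reduction step described above (41 indicators compressed to the principal components explaining about 79% of the variance) can be sketched with a plain SVD-based PCA. The data here are synthetic and the 120 × 41 matrix size is an assumption; only the mechanics mirror the study.

```python
import numpy as np

# Hedged sketch of the PCA step: standardize a units-by-indicators matrix,
# take its SVD, and keep enough components to explain a target share of
# variance. The data matrix below is synthetic, not the study's indicators.

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 41))            # 120 admin units x 41 indicators (synthetic)

X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each indicator
U, s, Vt = np.linalg.svd(X, full_matrices=False)

explained = s**2 / np.sum(s**2)           # explained-variance ratio per component
cum = np.cumsum(explained)
k = int(np.searchsorted(cum, 0.79)) + 1   # components needed for >= 79% variance

scores = X @ Vt[:k].T                     # component scores per admin unit
print(k, scores.shape)
```

On real socio-economic indicators, which are strongly correlated, far fewer components are needed than on this uncorrelated synthetic matrix; the study reached 79% with 13 dimensions.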
Quality Management Framework for Total Diet Study centres in Europe.
Pité, Marina; Pinchen, Hannah; Castanheira, Isabel; Oliveira, Luisa; Roe, Mark; Ruprich, Jiri; Rehurkova, Irena; Sirot, Veronique; Papadopoulos, Alexandra; Gunnlaugsdóttir, Helga; Reykdal, Ólafur; Lindtner, Oliver; Ritvanen, Tiina; Finglas, Paul
2018-02-01
A Quality Management Framework to improve quality and harmonization of Total Diet Study practices in Europe was developed within the TDS-Exposure Project. Seventeen processes were identified and hazards, Critical Control Points and associated preventive and corrective measures described. The Total Diet Study process was summarized in a flowchart divided into planning and practical (sample collection, preparation and analysis; risk assessment analysis and publication) phases. Standard Operating Procedures were developed and implemented in pilot studies in five organizations. The flowchart was used to develop a quality framework for Total Diet Studies that could be included in formal quality management systems. Pilot studies operated by four project partners were visited by project assessors who reviewed implementation of the proposed framework and identified areas that could be improved. The quality framework developed can be the starting point for any Total Diet Study centre and can be used within existing formal quality management approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
Abebe, Gumataw K; Chalak, Ali; Abiad, Mohamad G
2017-07-01
Food safety is a key public health issue worldwide. This study aims to characterise existing governance mechanisms - governance structures (GSs) and food safety management systems (FSMSs) - and analyse their alignment in detecting food safety hazards, based on empirical evidence from Lebanon. Firm-to-firm and public baseline FSMSs are the dominant systems applied at large scale, while chain-wide FSMSs are observed only at small scale. Most transactions involving farmers are relational and market-based, in contrast to (large-scale) processors, which opt for hierarchical GSs. Large-scale processors use a combination of FSMSs and GSs to minimise food safety hazards despite a potential increase in coordination costs; this is an important feature of modern food supply chains. The econometric analysis reveals that contract period, on-farm inspection, and experience have significant effects in minimising food safety hazards. However, the potential to implement farm-level FSMSs is influenced by the formality of the contract, herd size, trading partner choice, and experience. Public baseline FSMSs appear effective in controlling food safety hazards; however, this may not be viable given the scarcity of public resources. We suggest that public policies focus on long-lasting governance mechanisms by introducing incentive schemes and on farm-level FSMSs by providing loans and education to farmers. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Abed Gatea, Mezher; Ahmed, Anwar A.; jundee kadhum, Saad; Ali, Hasan Mohammed; Hussein Muheisn, Abbas
2018-05-01
The Safety Assessment Framework (SAFRAN) software was implemented here for radiological safety analysis, to verify that the dose acceptance criteria and safety goals are met with a high degree of confidence for the dismantling of the Tammuz-2 reactor core at the Al-Tuwaitha nuclear site. Characterizing, dismantling, and packaging activities were practiced to manage the generated radioactive waste. Dose to the worker was considered an endpoint scenario, while dose to the public was neglected because the Tammuz-2 facility is located in a restricted zone and a 30 m berm surrounds the Al-Tuwaitha site. The safety assessment for the dismantling-worker endpoint scenario was based on the maximum external dose at component position level in the reactor pool and the internal dose via airborne activity, while the characterizing- and packaging-worker endpoint scenarios were assessed via external dose only, because there was no evidence of airborne radioactivity hazards outside the reactor pool. In-situ measurements confirmed that the reactor core components are radiologically activated by the Co-60 radioisotope. SAFRAN results showed that the maximum doses received by workers are 1.85, 0.64, and 1.3 mSv/y for the dismantling, characterizing, and packaging of reactor core components, respectively. Hence, the radiological hazards remain below the low-level hazard threshold and within the acceptable annual dose for radiation workers.
PSHREG: A SAS macro for proportional and nonproportional subdistribution hazards regression
Kohl, Maria; Plischke, Max; Leffondré, Karen; Heinze, Georg
2015-01-01
We present a new SAS macro %pshreg that can be used to fit a proportional subdistribution hazards model for survival data subject to competing risks. Our macro first modifies the input data set appropriately and then applies SAS's standard Cox regression procedure, PROC PHREG, using weights and counting-process style of specifying survival times to the modified data set. The modified data set can also be used to estimate cumulative incidence curves for the event of interest. The application of PROC PHREG has several advantages, e.g., it directly enables the user to apply the Firth correction, which has been proposed as a solution to the problem of undefined (infinite) maximum likelihood estimates in Cox regression, frequently encountered in small sample analyses. Deviation from proportional subdistribution hazards can be detected by both inspecting Schoenfeld-type residuals and testing correlation of these residuals with time, or by including interactions of covariates with functions of time. We illustrate application of these extended methods for competing risk regression using our macro, which is freely available at: http://cemsiis.meduniwien.ac.at/en/kb/science-research/software/statistical-software/pshreg, by means of analysis of a real chronic kidney disease study. We discuss differences in features and capabilities of %pshreg and the recent (January 2014) SAS PROC PHREG implementation of proportional subdistribution hazards modelling. PMID:25572709
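As a language-neutral companion to the SAS macro, here is a minimal Python sketch of the nonparametric cumulative incidence function for the event of interest under competing risks, one of the quantities the macro's modified data set can be used to estimate. The `cumulative_incidence` helper and the toy data are illustrative assumptions, not part of %pshreg.

```python
import numpy as np

# Hedged sketch (not the SAS macro itself): Aalen-Johansen-style cumulative
# incidence under competing risks. At each event time t, the CIF of the event
# of interest grows by S(t-) * d_event / n_at_risk, where S is the all-cause
# Kaplan-Meier survival just before t.
# status coding: 0 = censored, 1 = event of interest, 2 = competing event.

def cumulative_incidence(times, status, event=1):
    times = np.asarray(times, float)
    status = np.asarray(status, int)
    grid = np.unique(times[status > 0])      # distinct event times (any cause)
    surv = 1.0                               # all-cause survival S(t-)
    cif = []
    for t in grid:
        at_risk = np.sum(times >= t)
        d_event = np.sum((times == t) & (status == event))
        d_any = np.sum((times == t) & (status > 0))
        cif.append((cif[-1] if cif else 0.0) + surv * d_event / at_risk)
        surv *= 1.0 - d_any / at_risk
    return grid, np.array(cif)

# toy data: 7 subjects with events of interest, competing events, and censoring
t = [2, 3, 3, 5, 7, 8, 10]
s = [1, 2, 1, 0, 1, 2, 0]
grid, cif = cumulative_incidence(t, s)
print(grid, np.round(cif, 3))
```

Unlike one minus a cause-specific Kaplan-Meier curve, this estimator keeps subjects with competing events out of the numerator while still letting them deplete the risk set, which is the same distinction the Fine-Gray subdistribution model addresses in regression form.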
Cyber Safety and Security for Reduced Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Driscoll, Kevin
2017-01-01
NASA and the aviation industry are looking into reduced crew operations (RCO) that would cut today's required two-person flight crews down to a single pilot with support from ground-based crews. Shared responsibility across air and ground personnel will require highly reliable and secure data communication and supporting automation, which will be safety-critical for passenger and cargo aircraft. This paper looks at the different types and degrees of authority delegation given from the air to the ground and the ramifications of each, including the safety and security hazards introduced, the mitigation mechanisms for these hazards, and other demands on an RCO system architecture, which would be highly invasive into (almost) all safety-critical avionics. The adjacent fields of unmanned aerial systems and autonomous ground vehicles are surveyed for problems that RCO may face, and related aviation accident scenarios are described. The paper explores possible data communication architectures to meet the stringent performance and information security (INFOSEC) requirements of RCO. Subsequently, potential challenges for RCO data communication authentication, encryption, and non-repudiation are identified. The approach includes a comprehensive safety-hazard analysis of the RCO system to determine top-level INFOSEC requirements for RCO and proposes an option for effective RCO implementation. This paper concludes by questioning the economic viability of RCO in light of the expense of overcoming the operational safety and security hazards it would introduce.
Food safety issues affecting the dairy beef industry.
Stefan, G
1997-12-01
The ability of dairy farmers to market cull cows and veal calves may be affected by the final rule on Pathogen Reduction and HACCP (Hazard Analysis Critical Control Points) Systems, a sweeping reform of USDA food safety regulations that was published on July 25, 1996. Although the regulations apply only to slaughter and processing plants handling meat and poultry, the rule will have an impact on food animal producers, including dairy farmers. Under this regulation, plant operators are required to evaluate potential hazards and to devise and implement controls that are appropriate for each product and plant to prevent or reduce those hazards. Processing plants may need to consider the potential hazards associated with incoming animals, such as illegal drug residues, which may result in marked changes in the relationships among some producers, livestock markets, and slaughter plants. Such information may actually improve the marketability of some animal classes because documentation will help the packer ensure the safety of products for sale to domestic and foreign markets. Dairy scientists are in an excellent position to explain the food safety issues to dairy farmers and to help develop the appropriate strategies that are necessary to guide the changes needed. These scientists can be conduits for information, the research leaders for practical solutions to reduce public health risks, and valuable resources to help farmers adjust to the impact of these new in-plant regulatory systems.
Trasande, Leonardo; Liu, Yinghua
2011-05-01
A 2002 analysis documented $54.9 billion in annual costs of environmentally mediated diseases in US children. However, few important changes in federal policy have been implemented to prevent exposures to toxic chemicals. We therefore updated and expanded the previous analysis and found that the costs of lead poisoning, prenatal methylmercury exposure, childhood cancer, asthma, intellectual disability, autism, and attention deficit hyperactivity disorder were $76.6 billion in 2008. To prevent further increases in these costs, efforts are needed to institute premarket testing of new chemicals; conduct toxicity testing on chemicals already in use; reduce lead-based paint hazards; and curb mercury emissions from coal-fired power plants.
Dixon, Shane Michael; Theberge, Nancy
2011-11-01
This article provides an analysis of the evolution of the division of labour in participatory ergonomics (PE) programmes in two worksites. The analysis is based on interviews and field observations in the worksites. In both settings there was meaningful participation by both worker and management members of ergonomic change teams (ECTs) in the hazard assessment and solution identification stages, but as the teams moved to the implementation stage, worker representatives were marginalised and the participatory nature of the programmes was severely curtailed. The removal of workers from the process was the outcome of the interplay among the type of activities pursued in the implementation stage, the skills and knowledge required to carry out those activities, and workers' limited influence in the organisational hierarchies. Findings highlight the salience of the social context in which participatory programmes are located and the importance of examining participatory programmes as they evolve over time. STATEMENT OF RELEVANCE: This article contributes to a growing literature on the process and implementation of PE programmes. The article's focus on social and organisational factors that affect the division of labour and attention to the evolution of involvement over time extend current understandings of participation in ergonomics programmes.
ERIC Educational Resources Information Center
Cliffe, Roger
1978-01-01
Hearing damage from noise exposure and approaches to implementing hearing safety in school industrial laboratories through noise reduction and protective equipment are discussed. Although not all states have adopted the Occupational Safety and Health Act, teachers should be aware of noise hazards and act to protect hearing. (MF)
Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites
NASA Astrophysics Data System (ADS)
Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni
2017-04-01
A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Process (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of sites have no documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards at the European scale are: fire (wildfire), storm, flooding, earthquake, and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, a technique for multi-attribute decision making that decomposes a problem into a hierarchy, based on the opinion of different experts about the dominance of risks. The weights are obtained by rescaling, between 0 and 1, the eigenvector corresponding to the maximum eigenvalue of the matrix of coefficients. The internal coherence of the experts' attributions is assessed through the calculation of the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, where the site most at risk turns out to be the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results depends on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, the information in the Periodic Reports was compared with available high-quality datasets (earthquake, volcano, and landslide) at the Italian scale.
Sites properly classified by the Periodic Reports range from 65% (earthquake hazard) to 98% (volcano hazard), with landslide hazard strongly underestimated. Because of this high level of uncertainty, we developed a new methodology to identify and rank the most critical UNESCO Heritage sites on the basis of three natural hazards (landslide, earthquake, and volcano) for which reliable European-scale hazard maps are available. For each UNESCO site, a potential risk was calculated as the product of hazard (from the available maps) and potential vulnerability. The latter is obtained by considering the typology of the site (e.g. monument, cultural landscape, cultural road), the presence or absence of residents and/or tourists, and the position of the site (underground/above-ground). Through this methodology, a new ranking of the European UNESCO sites has been obtained, in which the historic centre of Naples emerges as the most at-risk site on the European continent.
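The AHP weighting step described above (dominant-eigenvector weights rescaled to sum to one, checked with Saaty's consistency ratio) can be sketched as follows. The 3×3 pairwise-comparison matrix is hypothetical; the study's actual 13-hazard comparisons are not reproduced.

```python
import numpy as np

# Hedged sketch of AHP weight extraction: principal eigenvector of a
# pairwise-comparison matrix, rescaled to sum to 1, plus Saaty's
# consistency ratio (CR < 0.1 is conventionally acceptable).

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)                  # dominant eigenvalue index
    w = np.abs(vecs[:, i].real)
    w = w / w.sum()                           # rescale eigenvector to weights
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)         # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty random index
    cr = ci / ri if ri else 0.0               # consistency ratio
    return w, cr

# hypothetical pairwise comparisons of three hazards (fire, storm, flood):
# fire is judged twice as important as storm and four times as flood
A = np.array([[1.0,  2.0, 4.0],
              [0.5,  1.0, 2.0],
              [0.25, 0.5, 1.0]])
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 4))
```

This example matrix is perfectly consistent, so the consistency ratio is zero and the weights come out as 4/7, 2/7, 1/7; real expert judgments are rarely this clean, which is why the CR check matters.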
Cha, DongHwan; Wang, Xin; Kim, Jeong Woo
2017-01-01
Hotspot analysis was implemented to find regions in the province of Alberta (Canada) with high frequency Cloud to Ground (CG) lightning strikes clustered together. Generally, hotspot regions are located in the central, central east, and south central regions of the study region. About 94% of annual lightning occurred during warm months (June to August) and the daily lightning frequency was influenced by the diurnal heating cycle. The association rule mining technique was used to investigate frequent CG lightning patterns, which were verified by similarity measurement to check the patterns’ consistency. The similarity coefficient values indicated that there were high correlations throughout the entire study period. Most wildfires (about 93%) in Alberta occurred in forests, wetland forests, and wetland shrub areas. It was also found that lightning and wildfires occur in two distinct areas: frequent wildfire regions with a high frequency of lightning, and frequent wild-fire regions with a low frequency of lightning. Further, the preference index (PI) revealed locations where the wildfires occurred more frequently than in other class regions. The wildfire hazard area was estimated with the CG lightning hazard map and specific land use types. PMID:29065564
Ta, Goh Choo; Mokhtar, Mazlin Bin; Mohd Mokhtar, Hj Anuar Bin; Ismail, Azmir Bin; Abu Yazid, Mohd Fadhil Bin Hj
2010-01-01
Chemical classification and labelling systems may be roughly similar from one country to another, but there are significant differences too. In order to harmonize various chemical classification systems and ultimately provide consistent chemical hazard communication tools worldwide, the Globally Harmonized System of Classification and Labelling of Chemicals (GHS) was endorsed by the United Nations Economic and Social Council (ECOSOC). Several countries, including Japan, Taiwan, Korea, and Malaysia, are now in the process of implementing GHS. It is essential to ascertain the comprehensibility of the chemical hazard communication tools described in the GHS documents, namely chemical labels and Safety Data Sheets (SDS). Comprehensibility Testing (CT) was carried out with a mixed group of industrial workers in Malaysia (n=150), and factors that influence comprehensibility were analysed using one-way ANOVA. The ability of the respondents to retrieve information from the SDS was also tested in this study. The findings show that almost all the GHS pictograms meet the ISO comprehension criteria. It is concluded that training and education are the core elements that enhance comprehension of GHS pictograms and that are essential in developing persons competent in the use of SDS.
NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for conflict probes or conflict prediction for various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design. Even once a design basis tsunami height is set, the actual tsunami height may exceed it because of uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and then executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
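The hazard-curve arithmetic sketched in the abstract (logic-tree branches combined into a mean annual exceedance rate, converted to an exceedance probability under a Poisson occurrence assumption) can be illustrated as follows. The branch rates, branch weights, and 50-year exposure period are all hypothetical.

```python
import numpy as np

# Hedged sketch of PTHA hazard-curve arithmetic: each logic-tree branch gives
# an annual exceedance rate lambda(h) per tsunami height h; the mean curve is
# the weight-averaged rate, and Poisson occurrence converts rates to
# probabilities over an exposure time T. All numbers below are hypothetical.

heights = np.array([2.0, 4.0, 6.0, 8.0, 10.0])          # tsunami height (m)

# two hypothetical logic-tree branches: annual exceedance rates per height
branch_rates = np.array([[1e-2, 2e-3, 4e-4, 8e-5, 1e-5],
                         [2e-2, 5e-3, 1e-3, 2e-4, 4e-5]])
weights = np.array([0.6, 0.4])                          # branch weights (sum to 1)

mean_rate = weights @ branch_rates                      # weighted mean hazard curve
T = 50.0                                                # exposure period (years)
p_exceed = 1.0 - np.exp(-mean_rate * T)                 # Poisson exceedance prob.

for h, lam, p in zip(heights, mean_rate, p_exceed):
    print(f"h={h:4.1f} m  rate={lam:.2e}/yr  P(exceed in {T:.0f} yr)={p:.3f}")
```

Percentile curves (the 5th to 95th percentiles mentioned above) would be read across the branch distribution at each height rather than from the weighted mean.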
User interface prototype for geospatial early warning systems - a tsunami showcase
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Lendholt, M.; Esbrí, M. Á.
2012-03-01
The command and control unit's graphical user interface (GUI) is a central part of early warning systems (EWS) for man-made and natural hazards. The GUI combines and concentrates the relevant information of the system and offers it to human operators. It has to support operators in successfully performing their tasks in complex workflows. Most notably in critical situations, when operators make important decisions in a limited amount of time, the command and control unit's GUI has to work reliably and stably, providing the relevant information and functionality with the required quality and in time. The design of the GUI application is essential in the development of any EWS to manage hazards effectively. The design and development of such GUIs is performed repeatedly, for each EWS, by various software architects and developers. Implementations differ based on their application in different domains, but similar approaches to designing and implementing the GUIs of EWS are not harmonized across related activities and do not exploit possible synergy effects. Thus, the GUI implementation of an EWS for tsunamis is introduced step by step, providing a generic approach to be applied in each EWS for man-made and natural hazards.
A critical analysis of hazard resilience measures within sustainability assessment frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu
Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.
Seismic Hazard analysis of Adjaria Region in Georgia
NASA Astrophysics Data System (ADS)
Jorjiashvili, Nato; Elashvili, Mikheil
2014-05-01
The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious (such as dams and chemical plants), it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., a maximum site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during the hazard calculation.
Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in seismicity input parameters such as the maximum magnitude value. CRISIS offers a set of built-in ground-motion prediction equations (GMPEs), as well as the possibility of defining new ones by providing information in tabular format. Our study shows that, in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values, and the two attenuation laws considered give quite different PGA and SA values.
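The truncated exponential Gutenberg-Richter distribution mentioned above can be written out as a short sketch: the annual rate of events with magnitude at least m, truncated at m_max. The parameter values (m0, m_max, overall rate, b-value) are hypothetical placeholders, not those of the Adjaria study.

```python
import numpy as np

# Hedged sketch of a truncated exponential Gutenberg-Richter MFD.
# lam0 is the total annual rate of events with M >= m0; beta = b * ln(10).
# All parameter values below are hypothetical.

def gr_truncated_rate(m, m0=4.0, m_max=7.5, lam0=0.2, b=1.0):
    """Annual rate of events with magnitude >= m, truncated at m_max."""
    beta = b * np.log(10.0)
    num = np.exp(-beta * (m - m0)) - np.exp(-beta * (m_max - m0))
    den = 1.0 - np.exp(-beta * (m_max - m0))
    return lam0 * num / den

mags = np.arange(4.0, 7.6, 0.5)
rates = gr_truncated_rate(mags)
for m, r in zip(mags, rates):
    print(f"M>={m:.1f}: {r:.3e}/yr")
```

The rate equals lam0 at the minimum magnitude and falls to zero at m_max, which is exactly the property that makes the hazard sensitive to the assumed Mmax for the local sources discussed above.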
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abston, J.P.
1997-04-01
The Lockheed Martin Energy Systems, Inc. (Energy Systems) policy is to provide a safe and healthful workplace for all employees and subcontractors. The accomplishment of this policy requires that operations at the Gunite and Associated Tanks (GAAT) in the North and South Tank Farms (NTF and STF) at the Department of Energy (DOE) Oak Ridge National Laboratory be guided by an overall plan and a consistent, proactive approach to health and safety (H and S) issues. The policy and procedures in this plan apply to all GAAT operations in the NTF and STF. The provisions of this plan are to be carried out whenever activities identified as part of the GAAT are initiated that could be a threat to human health or the environment. This plan implements a policy and establishes criteria for the development of procedures for day-to-day operations to prevent or minimize any adverse impact to the environment and personnel safety and health and to meet standards that define acceptable management of hazardous and radioactive materials and wastes. The plan is written to utilize past experience and best management practices in order to minimize hazards to human health or the environment from events such as fires, explosions, falls, mechanical hazards, or any unplanned release of hazardous or radioactive materials to the air. This plan explains additional task-specific health and safety requirements, such as the Site Safety and Health Addendum and the Activity Hazard Analysis, which should be used in concert with this plan and existing established procedures.
Integrated hazard assessment of Cirenmaco glacial lake in Zhangzangbo valley, Central Himalayas
NASA Astrophysics Data System (ADS)
Wang, Weicai; Gao, Yang; Iribarren Anacona, Pablo; Lei, Yanbin; Xiang, Yang; Zhang, Guoqing; Li, Shenghai; Lu, Anxin
2018-04-01
Glacial lake outburst floods (GLOFs) have recently become one of the primary natural hazards in the Himalayas. There is therefore an urgent need to assess GLOF hazards in the region. Cirenmaco, a moraine-dammed lake located in the upstream portion of Zhangzangbo valley, Central Himalayas, has received public attention after its damaging 1981 outburst flood. Here, by combining remote sensing methods, bathymetric survey and 2D hydraulic modeling, we assessed the hazard posed by Cirenmaco in its current status. Inter-annual variation of Cirenmaco lake area indicates a rapid lake expansion from 0.10 ± 0.08 km² in 1988 to 0.39 ± 0.04 km² in 2013. Bathymetric survey shows the maximum water depth of the lake in 2012 was 115 ± 2 m, and the lake volume was calculated to be 1.8 × 10⁷ m³. Field geomorphic analysis shows that Cirenmaco glacial lake is prone to GLOFs, as mass movements and ice and snow avalanches can impact the lake, and the melting of the dead ice in the moraine can lower the dam level. The HEC-RAS 2D model was then used to simulate moraine dam failure of the Cirenmaco and assess GLOF impacts downstream. Reconstruction of the 1981 Cirenmaco GLOF shows that HEC-RAS can produce reasonable flood extent and water depth, thus demonstrating its ability to effectively model complex GLOFs. The GLOF modeling results presented here can be used as a basis for the implementation of disaster prevention and mitigation measures. As a case study, this work shows how different methods can be integrated into GLOF hazard assessment.
NASA Astrophysics Data System (ADS)
Su, Weizhong
2017-03-01
There is growing interest in using the urban landscape for stormwater management studies, where land patterns and processes can be important controls on the sustainability of urban development and planning. This paper proposes an original index of Major Hazard Oriented Level (MHOL) and investigates the structure distribution, driving factors, and controlling suggestions of urban-rural land growth in flood-prone areas in the Taihu Lake watershed, China. The MHOL of incremental urban-rural land increased from M 31.51 during the years 1985-1995 to M 38.37 during the years 1995-2010 (M denoting medium structure distribution, and the number denoting the high-hazard value). The index shows that urban-rural land was distributed uniformly across flood hazard levels and tended to move rapidly into high-hazard areas, where 72.68% of incremental urban-rural land was aggregated maximally in new urban districts along the Huning traffic line and the Yangtze River. Thus, the current accelerating growth of new urban districts could account for the increased exposure to high-hazard areas. New districts are driven by the powerful link between land financial benefits and political achievements for local governments and the past unsustainable process of "single objective" oriented planning. A correlation categorical analysis of the current development intensity and carrying capacity of hydrological ecosystems for sub-basins was used to determine four types of development areas and provide decision makers with indications for the future watershed-scale subdivision of Major Function Oriented Zoning implemented by the Chinese government.
GAPHAZ: improving knowledge management of glacier and permafrost hazards and risks in mountains
NASA Astrophysics Data System (ADS)
Huggel, Christian; Burn, Chris; Clague, John J.; Hewitt, Ken; Kääb, Andreas; Krautblatter, Michael; Kargel, Jeffrey S.; Reynolds, John; Sokratov, Sergey
2014-05-01
High-mountain environments worldwide are undergoing changes at a historically unprecedented pace due to the sensitivity of the high-mountain cryosphere to climate change. Humans settled in many mountain regions hundreds or even thousands of years ago, but recent intensive socio-economic development has increased the exposure and vulnerability of people and infrastructure to a large range of natural hazards related to high-mountain processes. The resulting risks are therefore increasing and highly dynamic. GAPHAZ, the Standing Group on Glacier and Permafrost Hazards in Mountains of the International Association of Cryospheric Sciences (IACS) and the International Permafrost Association (IPA), is positioned in this context. The objectives of GAPHAZ are to: • improve international scientific communication on glacier and permafrost hazards; • stimulate and strengthen research collaborations in the field of glacier and permafrost hazards; • compile a state of knowledge related to glacier and permafrost hazards in high mountains; • work towards a greater transfer of information and improved communication between the scientific and governmental/policy communities; • signpost sources of advice to international and national agencies, responsible authorities, and private companies; and • act as a focal point for information for international media during relevant crises. GAPHAZ has initiated a variety of activities over the past years to meet these objectives. One of the important issues is the development of standards for (1) how to make and portray technical assessments of glacier and permafrost related hazards and risks; and (2) how to communicate these to the public and a range of actors, including those who implement measures. The difficulties of, and need for, better translation between techno-scientific understanding and the situations and concerns of people most at risk in cold regions need to be recognized.
Knowledge transfer from the few well-researched and monitored regions to the more extensive and diverse regions also needs to be addressed. Standards are required to ensure an adequate level of quality and to avoid incorrect assessments with potentially adverse consequences, as past experience has shown. Concepts and terminologies related to hazard and risk assessments must follow recently issued consensus statements, such as those of UN-ISDR and IPCC. Hazard assessments must be undertaken routinely and regularly, combined with appropriate ground-based and remote sensing monitoring. Assessments need to adequately consider the physical processes and their interactions. Integrative risk assessments should be achieved by interdisciplinary cooperation. There is still a lack of integration of physical/engineering and social aspects of glacier and permafrost hazards; therefore, communication and exchange between natural and social science experts must be strengthened. In the design and implementation of risk reduction and adaptation measures, a close collaboration among scientists, policy makers, and local populations is necessary. Recognizing different perceptions of risk among actors is particularly important if risk reduction efforts are to be successful. Measures should generally be adapted to the local social, cultural, economic, political, and institutional context. Early warning systems are becoming increasingly important, and a growing number of experiences are available for high-mountain environments. A systematic analysis and exchange of experiences using dedicated expert networks will be fostered by GAPHAZ in collaboration with other initiatives and actors.
40 CFR 63.11427 - Who implements and enforces this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 14 2010-07-01 2010-07-01 false Who implements and enforces this subpart? 63.11427 Section 63.11427 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Lead Acid Battery...
40 CFR 63.6670 - Who implements and enforces this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 13 2010-07-01 2010-07-01 false Who implements and enforces this subpart? 63.6670 Section 63.6670 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Stationary Reciprocating Internal...
29 CFR 1960.56 - Training of safety and health specialists.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., laboratory experiences, field study, and other formal learning experiences to prepare them to perform the... program development and implementation, as well as hazard recognition, evaluation and control, equipment... tasks. (b) Each agency shall implement career development programs for their occupational safety and...
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
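The Monte Carlo weight-sensitivity step described above can be illustrated with a minimal sketch: perturb the criteria weights of a weighted linear combination (used here as a simple stand-in for the AHP/OWA aggregation in the paper), renormalize, and record the spread of the susceptibility score for one map cell. All numbers are hypothetical.

```python
import random

def wlc_score(criteria, weights):
    """Weighted linear combination of normalized criterion values."""
    return sum(c * w for c, w in zip(criteria, weights))

def weight_sensitivity(criteria, weights, n_runs=1000, jitter=0.10, seed=42):
    """Monte Carlo sensitivity analysis: multiply each weight by a random
    factor in [1 - jitter, 1 + jitter], renormalize so the weights sum
    to 1, and record the spread of the resulting susceptibility score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_runs):
        w = [wi * (1.0 + rng.uniform(-jitter, jitter)) for wi in weights]
        total = sum(w)
        scores.append(wlc_score(criteria, [wi / total for wi in w]))
    return sum(scores) / len(scores), min(scores), max(scores)

# One hypothetical map cell with three normalized criteria
mean, lo, hi = weight_sensitivity([0.8, 0.3, 0.6], [0.5, 0.3, 0.2])
```

The spread `hi - lo`, computed cell by cell, is one simple way to portray how strongly each location's susceptibility class depends on the analyst's weight choices.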
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Camp
Over the past four years, the Electrical Safety Program at PPPL has evolved in addressing changing regulatory requirements and lessons learned from accident events, particularly with regard to arc flash hazards and implementing NFPA 70E requirements. This presentation will discuss PPPL's approaches to electrical hazards evaluation, covering both shock and arc flash; engineered solutions for hazards mitigation, such as remote racking of medium voltage breakers; operational changes for hazards avoidance; targeted personnel training; and hazard-appropriate personal protective equipment. Practical solutions for nominal voltage identification and zero voltage checks for lockout/tagout will also be covered. Finally, we will review the value of a comprehensive electrical drawing program, employee attitudes expressed as a personal safety work ethic, integrated safety management, and sustained management support for continuous safety improvement.
Implementation of Haccp in the Mexican Poultry Processing Industry
NASA Astrophysics Data System (ADS)
Maldonado-Siman, Ema; Martínez-Hernández, Pedro Arturo; Ruíz-Flores, Agustín; García-Muñiz, José G.; Cadena-Meneses, José A.
Hazard Analysis and Critical Control Point (HACCP) is a safety and quality management tool that has become a major issue in international and domestic trade in the food industry. However, detailed information on the costs and benefits of HACCP implementation is needed to provide appropriate advice to food processing plants. This paper reports on the perceptions of costs and benefits by Mexican poultry processing plants and their sale destinations. The results suggest that the major costs of implementing and operating HACCP within poultry processing plants are record keeping and external technical advice. The main benefit indicated by the majority of processing plants is a reduction in microbial counts. Over 39% of poultry production is sent to nationwide chains of supermarkets, and less than 13% is sent to international markets. It was concluded that the adoption of HACCP by the Mexican poultry processing sector is driven by the concern to increase and retain the domestic market, rather than to compete in the international market.
Drummond, Colin; Deluca, Paolo; Coulton, Simon; Bland, Martin; Cassidy, Paul; Crawford, Mike; Dale, Veronica; Gilvarry, Eilish; Godfrey, Christine; Heather, Nick; McGovern, Ruth; Myles, Judy; Newbury-Birch, Dorothy; Oyefeso, Adenekan; Parrott, Steve; Patton, Robert; Perryman, Katherine; Phillips, Tom; Shepherd, Jonathan; Touquet, Robin; Kaner, Eileen
2014-01-01
Background Alcohol misuse is common in people attending emergency departments (EDs) and there is some evidence of the efficacy of alcohol screening and brief interventions (SBI). This study investigated the effectiveness of SBI approaches of different intensities delivered by ED staff in nine typical EDs in England: the SIPS ED trial. Methods and Findings Pragmatic multicentre cluster randomized controlled trial of SBI for hazardous and harmful drinkers presenting to the ED. Nine EDs were randomized to three conditions: a patient information leaflet (PIL), 5 minutes of brief advice (BA), and referral to an alcohol health worker who provided 20 minutes of brief lifestyle counseling (BLC). The primary outcome measure was Alcohol Use Disorders Identification Test (AUDIT) status at 6 months. Of 5899 patients aged 18 or over presenting to the EDs, 3737 (63.3%) were eligible to participate and 1497 (40.1%) screened positive for hazardous or harmful drinking, of whom 1204 (80.4%) gave consent to participate in the trial. Follow-up rates were 72% (n = 863) at 6 months and 67% (n = 810) at 12 months. There was no evidence of any differences between intervention conditions for AUDIT status or any other outcome measures at months 6 or 12 in an intention-to-treat analysis. At month 6, compared to the PIL group, the odds ratio of being AUDIT negative for brief advice was 1.103 (95% CI 0.328 to 3.715). The odds ratio comparing BLC to PIL was 1.247 (95% CI 0.315 to 4.939). A per-protocol analysis confirmed these findings. Conclusions SBI is difficult to implement in typical EDs. The results do not support widespread implementation of alcohol SBI in EDs beyond screening followed by simple clinical feedback and alcohol information, which is likely to be easier and less expensive to implement than more complex interventions. Trial Registration Current Controlled Trials ISRCTN 93681536 PMID:24963731
NASA Astrophysics Data System (ADS)
Morris, Phillip A.
The prevalence of low-cost side scanning sonar systems mounted on small recreational vessels has created improved opportunities to identify and map submerged navigational hazards in freshwater impoundments. However, these economical sensors also present unique challenges for automated techniques. This research explores related literature in automated sonar imagery processing and mapping technology, proposes and implements a framework derived from these sources, and evaluates the approach with video collected from a recreational grade sonar system. Image analysis techniques including optical character recognition and an unsupervised computer automated detection (CAD) algorithm are employed to extract the transducer GPS coordinates and slant range distance of objects protruding from the lake bottom. The retrieved information is formatted for inclusion into a spatial mapping model. Specific attributes of the sonar sensors are modeled such that probability profiles may be projected onto a three dimensional gridded map. These profiles are computed from multiple points of view as sonar traces crisscross or come near each other. As lake levels fluctuate over time so do the elevation points of view. With each sonar record, the probability of a hazard existing at certain elevations at the respective grid points is updated with Bayesian mechanics. As reinforcing data is collected, the confidence of the map improves. Given a lake's current elevation and a vessel draft, a final generated map can identify areas of the lake that have a high probability of containing hazards that threaten navigation. The approach is implemented in C/C++ utilizing OpenCV, Tesseract OCR, and QGIS open source software and evaluated in a designated test area at Lake Lavon, Collin County, Texas.
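The Bayesian grid update described above can be sketched with the standard log-odds (occupancy-grid) update rule; this is a simplified illustration of the general mechanics, not the author's implementation, and the observation likelihoods below are hypothetical.

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def update_cell(prior_p, obs_likelihood):
    """One Bayesian (log-odds) update of the probability that a grid
    cell contains a hazard, given a sonar observation whose strength is
    expressed as a probability: > 0.5 supports a hazard at this cell,
    < 0.5 counts as evidence against one."""
    return sigmoid(logit(prior_p) + logit(obs_likelihood))

# Two reinforcing detections raise confidence; a miss would lower it
p = 0.5                      # uninformative prior for the cell
p = update_cell(p, 0.7)      # first sonar pass over the cell
p = update_cell(p, 0.7)      # second pass from another point of view
```

Because evidence accumulates additively in log-odds space, crisscrossing sonar traces that repeatedly observe the same cell steadily increase (or decrease) the map's confidence, which is exactly the reinforcement behavior the abstract describes.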
Generic Safety Requirements for Developing Safe Insulin Pump Software
Zhang, Yi; Jetley, Raoul; Jones, Paul L; Ray, Arnab
2011-01-01
Background The authors previously introduced a highly abstract generic insulin infusion pump (GIIP) model that identified common features and hazards shared by most insulin pumps on the market. The aim of this article is to extend our previous work on the GIIP model by articulating safety requirements that address the identified GIIP hazards. These safety requirements can be validated by manufacturers, and may ultimately serve as a safety reference for insulin pump software. Together, these two publications can serve as a basis for discussing insulin pump safety in the diabetes community. Methods In our previous work, we established a generic insulin pump architecture that abstracts functions common to many insulin pumps currently on the market and near-future pump designs. We then carried out a preliminary hazard analysis based on this architecture that included consultations with many domain experts. Further consultation with domain experts resulted in the safety requirements used in the modeling work presented in this article. Results Generic safety requirements for the GIIP model are presented, as appropriate, in parameterized format to accommodate clinical practices or specific insulin pump criteria important to safe device performance. Conclusions We believe that there is considerable value in having the diabetes, academic, and manufacturing communities consider and discuss these generic safety requirements. We hope that the communities will extend and revise them, make them more representative and comprehensive, experiment with them, and use them as a means for assessing the safety of insulin pump software designs. One potential use of these requirements is to integrate them into model-based engineering (MBE) software development methods. 
We believe, based on our experiences, that implementing safety requirements using MBE methods holds promise in reducing design/implementation flaws in insulin pump development and evolutionary processes, therefore improving overall safety of insulin pump software. PMID:22226258
An Arduino project to record ground motion and to learn on earthquake hazard at high school
NASA Astrophysics Data System (ADS)
Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David
2015-04-01
Through a multidisciplinary effort integrating technology education with Earth sciences, we implemented an educational program to raise students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g., the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task the students were directed to web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux OS, controlled the process of recording and displaying data. The main part of the experiment was documented using the DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill the gaps in their background knowledge. In addition, lectures by seismologists and laboratory activity allowed the class to explore different aspects of the physics of earthquakes, particularly of seismic waves, and to become familiar with the topics of seismic hazard through inquiry-based learning. The completed Arduino seismograph can be used for educational purposes and can display tremors on the school's local network. It can record the ground motion of seismic events occurring in the area, but further improvements are necessary for a quantitative analysis of the recorded signals.
Hazardous Materials Pharmacies - A Vital Component of a Robust P2 Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarter, S.
2006-07-01
Integrating pollution prevention (P2) into the Department of Energy Integrated Safety Management (ISM) - Environmental Management System (EMS) approach, required by DOE Order 450.1, leads to an enhanced ISM program at large and complex installations and facilities. One of the building blocks of integrating P2 into a comprehensive environmental and safety program is the control and tracking of the amounts, types, and flow of hazardous materials used at a facility. Hazardous materials pharmacies (typically called HazMarts) provide a solid approach to resolving this issue through business practice changes that reduce use, avoid excess, and redistribute surplus. If understood from concept to implementation, the HazMart is a powerful tool for reducing pollution at the source, tracking inventory storage, controlling usage and flow, and summarizing data for reporting requirements. Pharmacy options can range from a strict, single control point for all hazardous materials to a virtual system, where the inventory is user controlled and reported over a common system. Designing and implementing HazMarts on large, diverse installations or facilities presents a unique set of issues. This is especially true of research and development (R and D) facilities, where the chemical use requirements are extensive and often classified. There are often multiple sources of supply; a wide variety of chemical requirements; a mix of containers ranging from small ampoules to large bulk storage tanks; and a wide range of tools used to track hazardous materials, ranging from simple purchase inventories to sophisticated tracking software. Computer systems are often not uniform in capacity, capability, or operating systems, making it difficult to use server-based unified tracking system software. Each of these issues has a solution or set of solutions tied to fundamental business practices.
Each requires an understanding of the problem at hand, which, in turn, requires good communication among all potential users. A key attribute of a successful HazMart is that everybody must use the same program. That requirement often runs directly into the biggest issue of all... institutional resistance to change. To be successful, the program has to be both a top-down and bottom-up driven process. The installation or facility must set the policy and the requirement, but all of the players have to buy in and participate in building and implementing the program. Dynamac's years of experience assessing hazardous materials programs, providing business case analyses, and recommending and implementing pharmacy approaches for federal agencies have provided us with key insights into the issues, problems, and the array of solutions available. This paper presents the key steps required to implement a HazMart, explores the advantages and pitfalls associated with a HazMart, and presents some options for implementing a pharmacy or HazMart on complex installations and R and D facilities. (authors)
Open space suitability analysis for emergency shelter after an earthquake
NASA Astrophysics Data System (ADS)
Anhorn, J.; Khazai, B.
2015-04-01
In an emergency situation shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index uses the combination of two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites and another quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment implementation issues, environmental considerations and basic utility supply are the main categories to rank candidate shelter sites. A geographic information system is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of an earthquake hazard case study in the Kathmandu Metropolitan City. According to the results, out of 410 open spaces under investigation, 12.2% have to be considered not suitable (Category D and E) while 10.7% are Category A and 17.6% are Category B. Almost two-thirds (59.55%) are fairly suitable (Category C).
A joint frailty-copula model between tumour progression and death for meta-analysis.
Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie
2017-12-01
Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from the intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence there exists residual dependence. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are modeled nonparametrically using a spline basis. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and are generalized to accommodate left truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis assessing a recently suggested biomarker, CXCL12, for survival in ovarian cancer patients. We implement our proposed methods in the R joint.Cox package.
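The Cox proportional hazards component underlying the joint model can be illustrated with the textbook single-covariate partial log-likelihood (assuming no tied event times); this is a teaching sketch, not the penalized frailty-copula estimator of the paper.

```python
import math

def cox_partial_loglik(beta, times, events, x):
    """Cox partial log-likelihood for one covariate, assuming no tied
    event times. events[i] is 1 for an observed event, 0 for censoring."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    loglik = 0.0
    for rank, i in enumerate(order):
        if events[i]:
            # subjects still at risk at the i-th ordered event time
            risk_sum = sum(math.exp(beta * x[j]) for j in order[rank:])
            loglik += beta * x[i] - math.log(risk_sum)
    return loglik
```

A crude point estimate of the log hazard ratio beta maximizes this function, e.g. over a coarse grid; real implementations such as joint.Cox add spline baseline hazards, frailty terms, and the penalization described above.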
Mohamed, Heba M; Lamie, Nesrine T
2016-09-01
In the past few decades the analytical community has been focused on eliminating or reducing the usage of hazardous chemicals and solvents, in different analytical methodologies, that have been ascertained to be extremely dangerous to human health and the environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and a short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment.
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
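The "efficient set of surveillance setups" in part (1) of the concept is essentially a Pareto filter over technical performance and surveillance cost. A minimal sketch follows; the setups and their performance/cost figures are hypothetical, not data from the paper.

```python
def efficient_setups(setups):
    """Return the efficient set: setups for which no other setup offers
    performance at least as high at a cost no greater, with at least one
    strict improvement. Each setup is (name, performance, cost)."""
    efficient = []
    for name, perf, cost in setups:
        dominated = any(
            p2 >= perf and c2 <= cost and (p2 > perf or c2 < cost)
            for _, p2, c2 in setups
        )
        if not dominated:
            efficient.append(name)
    return efficient

# Hypothetical setups: (name, detection performance, annual cost)
setups = [("A", 0.90, 100), ("B", 0.80, 120), ("C", 0.95, 150), ("D", 0.70, 90)]
```

Here setup B is dominated by A (lower performance at higher cost), so only A, C, and D would be passed on to the subjective multi-criteria evaluation step.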
Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment
NASA Astrophysics Data System (ADS)
Catelli, J.; Nong, S.
2014-12-01
Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coast are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations was performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package such as R to fit cell values to extreme value distributions and return values for specified recurrence intervals. While this is not a new process, the value behind this work is the ability to keep the process in a single geospatial environment and to replicate it easily for other natural hazard applications and extreme event modeling.
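The extreme-value step for a single grid cell can be sketched in Python alone, here with SciPy standing in for the external R package; the synthetic maxima and distribution parameters below are assumed for illustration, not SLOSH output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for one grid cell's maximum surge elevations across synthetic
# storms (assumed data; the study derived these grids from SLOSH runs).
maxima = stats.genextreme.rvs(c=-0.1, loc=2.0, scale=0.5,
                              size=500, random_state=rng)

# Fit a generalized extreme value (GEV) distribution and read off return
# levels as upper quantiles: the T-year level is exceeded with
# probability 1/T per year.
shape, loc, scale = stats.genextreme.fit(maxima)
rl_100 = stats.genextreme.ppf(1 - 1/100, shape, loc=loc, scale=scale)
rl_500 = stats.genextreme.ppf(1 - 1/500, shape, loc=loc, scale=scale)
print(f"100-yr: {rl_100:.2f} m, 500-yr: {rl_500:.2f} m")
```

Applied per cell over the stacked event grids, this yields exactly the two output surfaces the abstract describes, without leaving the Python geospatial environment.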
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark, and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log-rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
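The Kaplan-Meier estimate at the core of such tools is a product over event times of one-minus-the-event-fraction among those still at risk. A from-scratch sketch so the mechanics are visible (a generic illustration, not CASAS code; real analyses would use a survival package such as R's `survival`):

```python
# Minimal Kaplan-Meier estimator. Censored observations leave the risk set
# without contributing an event, which is the whole trick of the method.
def kaplan_meier(times, events):
    """Return (event_times, survival_probabilities).

    times  : observed follow-up times
    events : 1 if the event occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        n = sum(1 for tt, _ in data if tt >= t)   # still at risk at t
        if d > 0:
            surv *= 1 - d / n
            out_t.append(t)
            out_s.append(surv)
        i += sum(1 for tt, _ in data if tt == t)  # skip past ties
    return out_t, out_s

t, s = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print(list(zip(t, [round(x, 3) for x in s])))
```

The log-rank p-values and hazard ratios reported in the CASAS summary table compare two or more such curves rather than describing a single one.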
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Meeks, Derek W; Takian, Amirhossein; Sittig, Dean F; Singh, Hardeep; Barber, Nick
2014-01-01
Objective: The intersection of electronic health records (EHR) and patient safety is complex. To examine the applicability of two previously developed conceptual models comprehensively to understand safety implications of EHR implementation in the English National Health Service (NHS). Methods: We conducted a secondary analysis of interview data from a 30-month longitudinal, prospective, case study-based evaluation of EHR implementation in 12 NHS hospitals. We used a framework analysis approach to apply conceptual models developed by Sittig and Singh to better understand EHR implementation and use: an eight-dimension sociotechnical model and a three-phase patient safety model (safe technology, safe use of technology, and use of technology to improve safety). Results: The intersection of patient safety and EHR implementation and use was characterized by risks involving technology (hardware and software, clinical content, and human-computer interfaces), the interaction of technology with non-technological factors, and improper or unsafe use of technology. Our data support that patient safety improvement activities as well as patient safety hazards change as an organization evolves from concerns about safe EHR functionality, through ensuring safe and appropriate EHR use, to using the EHR itself to provide ongoing surveillance and monitoring of patient safety. Discussion: We demonstrate the face validity of two models for understanding the sociotechnical aspects of safe EHR implementation and the complex interactions of technology within a healthcare system evolving from paper to integrated EHR. Conclusions: Using sociotechnical models, including those presented in this paper, may be beneficial to help stakeholders understand, synthesize, and anticipate risks at the intersection of patient safety and health information technology. PMID:24052536
OSHA safety requirements and the general duty clause.
Mills, Anne C; Chillock, Cynthia A; Edelman, Harold; Mills, Shannon E
2005-03-01
Dental offices and clinics are subject to the same general safety requirements as other workplaces. Current guidelines, inspections, education, and training focus on infectious disease as the major workplace hazard for dental health care personnel (DHCP). However, the Occupational Safety and Health Administration has cited an increasing variety and number of general safety hazards during inspections of dental offices. A review of the general safety requirements for personal protective equipment and fire safety as they relate to DHCP follows. The authors discuss the responsibility of both employers and employees to perform workplace hazard evaluation and to implement education, engineering controls, and work practice controls to minimize their exposure to recognized and emerging workplace hazards.
Evolution of International Space Station Program Safety Review Processes and Tools
NASA Technical Reports Server (NTRS)
Ratterman, Christian D.; Green, Collin; Guibert, Matt R.; McCracken, Kristle I.; Sang, Anthony C.; Sharpe, Matthew D.; Tollinger, Irene V.
2013-01-01
The International Space Station Program at NASA is constantly seeking to improve the processes and systems that support safe space operations. To that end, the ISS Program decided to upgrade their Safety and Hazard data systems with three goals: make safety and hazard data more accessible; better support the interconnection of different types of safety data; and increase the efficiency (and compliance) of safety-related processes. These goals are accomplished by moving data into a web-based structured data system that includes strong process support and supports integration with other information systems. Along with the data systems, ISS is evolving its submission requirements and safety process requirements to support the improved model. In contrast to existing operations (where paper processes and electronic file repositories are used for safety data management) the web-based solution provides the program with dramatically faster access to records, the ability to search for and reference specific data within records, reduced workload for hazard updates and approval, and process support including digital signatures and controlled record workflow. In addition, integration with other key data systems provides assistance with assessments of flight readiness, more efficient review and approval of operational controls, and better tracking of international safety certifications. This approach will also provide new opportunities to streamline the sharing of data with ISS international partners while maintaining compliance with applicable laws and respecting restrictions on proprietary data. One goal of this paper is to outline the approach taken by the ISS Program to determine requirements for the new system and to devise a practical and efficient implementation strategy. From conception through implementation, ISS and NASA partners utilized a user-centered software development approach focused on user research and iterative design methods.
The user-centered approach used on the new ISS hazard system utilized focused user research and iterative design methods employed by the Human Computer Interaction Group at NASA Ames Research Center. In particular, the approach emphasized the reduction of workload associated with document and data management activities so that more resources can be allocated to the operational use of data in problem solving, safety analysis, and recurrence control. The methods and techniques used to understand existing processes and systems, to recognize opportunities for improvement, and to design and review improvements are described with the intent that similar techniques can be employed elsewhere in safety operations. A second goal of this paper is to provide an overview of the web-based data system implemented by ISS. The software selected for the ISS hazard system, the Mission Assurance System (MAS), is a NASA-customized variant of the open source software project Bugzilla. The origin and history of MAS as a NASA software project and the rationale for (and advantages of) using open-source software are documented elsewhere (Green, et al., 2009).
This summary of the implementation requirements document for the Aerospace Manufacturing and Rework Facilities NESHAP was originally prepared in August 1997 and updated in January 2001 to incorporate new amendments.
40 CFR 63.7956 - Who implements and enforces this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 14 2012-07-01 2011-07-01 true Who implements and enforces this subpart? 63.7956 Section 63.7956 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Site Remediation Other Requirements and...
40 CFR 63.7956 - Who implements and enforces this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 13 2011-07-01 2011-07-01 false Who implements and enforces this subpart? 63.7956 Section 63.7956 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Site Remediation Other Requirements and...
40 CFR 63.7956 - Who implements and enforces this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 14 2014-07-01 2014-07-01 false Who implements and enforces this subpart? 63.7956 Section 63.7956 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Site Remediation Other Requirements and...
40 CFR 63.7956 - Who implements and enforces this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 13 2010-07-01 2010-07-01 false Who implements and enforces this subpart? 63.7956 Section 63.7956 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Site Remediation Other Requirements and...
40 CFR 63.7956 - Who implements and enforces this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 14 2013-07-01 2013-07-01 false Who implements and enforces this subpart? 63.7956 Section 63.7956 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Site Remediation Other Requirements and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a...
Designing a low-cost pollution prevention plan to pay off at the University of Houston.
Bialowas, Yurika Diaz; Sullivan, Emmett C; Schneller, Robert D
2006-09-01
The University of Houston is located just south of downtown Houston, TX. Many different chemical substances are used in scientific research and teaching activities throughout the campus. These activities generate a significant amount of waste materials that must be discarded as regulated hazardous waste per U.S. Environmental Protection Agency (EPA) rules. The Texas Commission on Environmental Quality (TCEQ) is the state regulatory agency that has enforcement authority for EPA hazardous waste rules in Texas. Currently, the University is classified as a large quantity generator and generates >1000 kg per month of hazardous waste. In addition, the University has experienced a major surge in research activities during the past several years, and overall the quantity of hazardous waste generated has increased. The TCEQ requires large quantity generators to prepare a 5-yr Pollution Prevention (P2) Plan, which describes efforts to eliminate or minimize the amount of hazardous waste generated. This paper addresses the design and development of a low-cost P2 plan with minimal implementation obstacles and strong payoff potential for the University. The projects identified can be implemented with existing University staff resources. This benefits the University by enhancing its environmental compliance efforts, and the disposal cost savings can be used for other purposes. Other educational institutions may benefit by undertaking a similar process.
Incidence of enterotoxigenic staphylococci and their toxins in foods.
Soriano, J M; Font, G; Rico, H; Moltó, J C; Mañes, J
2002-05-01
Of 504 food samples collected from cafeterias, 19 (3.8%) yielded strains of enterotoxigenic staphylococci, and 10 (52.6%), 4 (21.1%), 3 (15.8%), and 2 (10.5%) of these strains produced enterotoxins C (SEC), D (SED), B (SEB), and A (SEA), respectively. Moreover, SEA, SEB, and SEC were isolated from three hamburger samples. Of 181 food samples collected from four restaurants before the implementation of the hazard analysis and critical control point (HACCP) system, 7 (3.9%) were found to contain enterotoxigenic strains, and SED, SEC, and SEA were produced by 4 (57.1%), 2 (28.6%), and 1 (14.3%) of these strains, respectively. One meatball sample with SEC was detected in a restaurant. After the implementation of the HACCP system in four restaurants, neither enterotoxigenic staphylococci nor enterotoxins were detected in 196 studied samples.
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, a category that includes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique.
Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
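One mechanical step of STPA is enumerating candidate unsafe control actions (UCAs) by applying four standard guide phrases to each control action in the functional control diagram. A minimal sketch; the control action shown is illustrative, not the actual HTV command set.

```python
# The four standard STPA guide phrases for unsafe control actions: a control
# action can be hazardous when it is...
UCA_TYPES = [
    "not provided when needed",
    "provided when it causes a hazard",
    "provided too early, too late, or out of order",
    "stopped too soon or applied too long",
]

def candidate_ucas(control_actions):
    """Cross each control action with the four guide phrases; the analyst
    then decides which combinations are actually hazardous in context."""
    return [(action, uca_type)
            for action in control_actions
            for uca_type in UCA_TYPES]

# Hypothetical control action, echoing the descent-engine example above.
for action, uca in candidate_ucas(["shut down descent engine"]):
    print(f"'{action}' {uca}")
```

Each combination the analyst judges hazardous becomes the seed of a causal scenario, which is where STPA goes beyond the component-failure cut sets of fault tree analysis.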
Risk analysis for renewable energy projects due to constraints arising
NASA Astrophysics Data System (ADS)
Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.
2016-02-01
Starting from the European Union (EU) binding target of a 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for the implementation of wind energy projects in Romania, which could lead to complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the reasonable time periods that may arise. Renewable energy technologies face a number of constraints that delay scaling up their production processes, their transport processes, equipment reliability, etc., so implementing these types of projects requires a complex, specialized team, the coordination of which also involves specific risks. The research team applied an analytical risk approach to identify major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of major risks was based on the conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.
Safety analysis, risk assessment, and risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamali, K.; Stack, D.W.; Sullivan, L.H.
1997-08-01
This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities.
Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.
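An RAC comparison of the kind discussed above can be sketched as a frequency-consequence limit check: a scenario meets the criterion if its estimated frequency falls below the limit for its consequence band. The limit line and scenarios below are assumed for illustration, not taken from any DOE or NRC guidance.

```python
# Illustrative check of quantified scenario risks against a numerical risk
# acceptance criterion (RAC): larger consequences get stricter frequency
# limits, the shape of a classic frequency-consequence limit line.
def acceptable(frequency_per_yr, consequence, limit_line):
    """True if the scenario's frequency is within the limit for its
    consequence band; bands beyond the table are rejected outright."""
    for max_consequence, max_frequency in limit_line:
        if consequence <= max_consequence:
            return frequency_per_yr <= max_frequency
    return False

# Hypothetical limit line: (consequence band upper bound, frequency/yr limit).
LIMIT = [(1, 1e-2), (25, 1e-4), (100, 1e-6)]

scenarios = [("small leak", 1e-3, 0.5), ("fire", 1e-5, 20), ("explosion", 1e-4, 80)]
for name, freq, cons in scenarios:
    print(name, "meets RAC" if acceptable(freq, cons, LIMIT) else "exceeds RAC")
```

Note that this check consumes exactly the two quantities the paper says RACs require: estimated consequence magnitude and estimated frequency, both of which must come from a quantitative assessment rather than a purely qualitative HA.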
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higgins, Kristin A., E-mail: kristin.higgins@emory.edu; Winship Cancer Institute, Emory University, Atlanta, Georgia; O'Connell, Kelli
Purpose: To analyze outcomes and predictors associated with proton radiation therapy for non-small cell lung cancer (NSCLC) in the National Cancer Database. Methods and Materials: The National Cancer Database was queried to capture patients with stage I-IV NSCLC treated with thoracic radiation from 2004 to 2012. A logistic regression model was used to determine the predictors for utilization of proton radiation therapy. The univariate and multivariable associations with overall survival were assessed by Cox proportional hazards models along with log-rank tests. A propensity score matching method was implemented to balance baseline covariates and eliminate selection bias. Results: A total of 243,822 patients (photon radiation therapy: 243,474; proton radiation therapy: 348) were included in the analysis. Patients in a ZIP code with a median income of <$46,000 per year were less likely to receive proton treatment, with the income cohort of $30,000 to $35,999 least likely to receive proton therapy (odds ratio 0.63 [95% confidence interval (CI) 0.44-0.90]; P=.011). On multivariate analysis of all patients, non-proton therapy was associated with significantly worse survival compared with proton therapy (hazard ratio 1.21 [95% CI 1.06-1.39]; P<.01). On propensity matched analysis, proton radiation therapy (n=309) was associated with better 5-year overall survival compared with non-proton radiation therapy (n=1549), 22% versus 16% (P=.025). For stage II and III patients, non-proton radiation therapy was associated with worse survival compared with proton radiation therapy (hazard ratio 1.35 [95% CI 1.10-1.64], P<.01). Conclusions: Thoracic radiation with protons is associated with better survival in this retrospective analysis; further validation in the randomized setting is needed to account for any imbalances in patient characteristics, including positron emission tomography-computed tomography staging.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 12 2010-07-01 2010-07-01 true What is an implementation plan for open... AIR POLLUTANTS FOR SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants for Boat Manufacturing Standards for Open Molding Resin and Gel Coat Operations § 63.5707 What is an implementation plan...
Data Model for Multi Hazard Risk Assessment Spatial Support Decision System
NASA Astrophysics Data System (ADS)
Andrejchenko, Vera; Bakker, Wim; van Westen, Cees
2014-05-01
The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy, and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements-at-risk maps (buildings, land parcels, linear features, etc.), and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of a combination of different scenarios (e.g. related to climate change, land use change, or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregation of the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and spatial multi-criteria evaluation (SMCE).
The data model includes data structures for CBA and SMCE. The model is at the stage where risk and cost-benefit calculations can be stored, but the remaining part is currently under development. Adding multi-criteria information and user management, and relating these to the rest of the model, is our next step. Having a carefully designed data model plays a crucial role in the development of the whole system: it enables rapid development, keeps the data consistent, and, in the end, supports the end-user in making good decisions on risk-reduction measures related to multiple natural hazards. This work is part of the EU FP7 Marie Curie ITN "CHANGES" project (www.changes-itn.edu).
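The core entities described above (hazard types, return-period maps, elements at risk, vulnerability curves) can be sketched as a relational schema. For a self-contained example this uses SQLite rather than the system's PostgreSQL/PostGIS, and all table and column names are assumptions, not the CHANGES schema.

```python
# Sketch of a multi-hazard risk data model. A real implementation would add
# spatial geometry columns (PostGIS) and the scenario/alternative tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hazard_type (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE hazard_map (
    id INTEGER PRIMARY KEY,
    hazard_type_id INTEGER REFERENCES hazard_type(id),
    return_period_yr INTEGER            -- e.g. 10, 100, 500
);
CREATE TABLE element_at_risk (id INTEGER PRIMARY KEY, kind TEXT, value REAL);
CREATE TABLE vulnerability_curve (
    id INTEGER PRIMARY KEY,
    hazard_type_id INTEGER REFERENCES hazard_type(id),
    element_kind TEXT,
    intensity REAL,                     -- e.g. flood depth in metres
    damage_fraction REAL                -- 0..1
);
""")
conn.execute("INSERT INTO hazard_type (name) VALUES ('flood'), ('debris flow')")
n = conn.execute("SELECT COUNT(*) FROM hazard_type").fetchone()[0]
print(n)
```

Keeping vulnerability curves keyed by both hazard type and element kind is what lets one loss calculation serve every hazard the user defines, which is the flexibility the abstract emphasizes.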
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, would serve as the basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
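The underestimation argument can be made concrete by perturbing a loss-exceedance curve: if subsidence deepens flooding at every return period, the expected annual loss (EAL), the integral of loss over annual exceedance probability, rises accordingly. All numbers below are illustrative, not results from the Jakarta or Bandung cases.

```python
# Sketch: expected annual loss as the trapezoidal integral of loss over
# annual exceedance probability (1 / return period).
def expected_annual_loss(return_periods, losses):
    probs = [1 / t for t in return_periods]      # exceedance probabilities
    pairs = sorted(zip(probs, losses))           # ascending probability
    eal = 0.0
    for (p0, l0), (p1, l1) in zip(pairs, pairs[1:]):
        eal += (p1 - p0) * (l0 + l1) / 2
    return eal

periods = [10, 100, 500]
loss_baseline = [1.0, 5.0, 9.0]     # assumed losses (M EUR), no subsidence
loss_subsided = [2.5, 7.0, 11.0]    # deeper flooding after subsidence

print(round(expected_annual_loss(periods, loss_baseline), 3))
print(round(expected_annual_loss(periods, loss_subsided), 3))
```

A single-hazard flood assessment would report the first figure; accounting for the interacting hazard yields the larger second figure, which is the gap the abstract describes in the most exposed areas.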
A policy analysis of the problem of the reproductive health of women in the workplace.
Kotch, J B; Ossler, C C; Howze, D C
1984-06-01
Many occupations in which women comprise the majority of the workforce involve exposure to biological, physical, and chemical hazards. Potential reproductive effects of work-related substances include impaired reproductive capacity, mutagenesis, teratogenesis, and transplacental carcinogenesis. However, female-dominated occupations tend to be only minimally regulated by the US Occupational Safety and Health Administration, and the corporate response to the issue of reproductive and fetal health has been to institute "protective discrimination policies" such as the demotion or exclusion of women of childbearing age from certain jobs. This article rates the effectiveness of alternate policy responses to increase women's occupational health and safety through use of a series of analysis criteria: equity, efficiency, preference satisfaction, right to privacy, avoidance of stigma, and unintended consequences. Policy options include the following: 1) do nothing, 2) leave current policies intact while supporting a research program to document the health consequences of specific occupational risks to women's reproductive health, 3) restrict women for whom pregnancy is not ruled out from occupations or work areas known or suspected to be hazardous, 4) improve working conditions for all women, and 5) improve working conditions for all workers. Policy analysis suggests the working conditions of all workers should be improved. This alternative reduces inequity, eliminates stigma, maintains privacy, and honors preferences. Implementation of this policy would be expensive, requiring an increase in knowledge of the effects of industrial substances on female and male reproductive health, expansion of the technical capacity to control occupational hazards, and an increase in the resources of programs that monitor and regulate occupational health. However, this approach is in accord with growing concern that workers should not have to compromise their health to keep their jobs.
Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R
2016-04-30
In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and impact the assessment of a treatment's benefit on health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late-stage cancer trial context given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
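The weighted logrank statistic discussed in the abstract above is straightforward to sketch. The following is a generic two-sample implementation using the usual hypergeometric variance at each distinct event time, with the weight function left as a user-supplied argument; this is an illustrative sketch, not the authors' specific switching-informed weighting scheme.

```python
import numpy as np

def weighted_logrank(time, event, group, weights=None):
    """Weighted logrank Z statistic for two groups coded 0/1.

    time:    observed times (event or censoring)
    event:   1 if an event was observed, 0 if censored
    group:   0/1 group membership
    weights: callable mapping an event time to a weight
             (default: constant 1, i.e. the standard logrank test)
    """
    time = np.asarray(time, float)
    event = np.asarray(event, bool)
    group = np.asarray(group, int)
    if weights is None:
        weights = lambda t: 1.0
    obs_minus_exp = 0.0
    var = 0.0
    for t in np.unique(time[event]):            # distinct event times
        at_risk = time >= t
        n = at_risk.sum()                       # total at risk at t
        n1 = (at_risk & (group == 1)).sum()     # at risk in group 1
        d = (event & (time == t)).sum()         # events at t
        d1 = (event & (time == t) & (group == 1)).sum()
        w = weights(t)
        obs_minus_exp += w * (d1 - d * n1 / n)  # observed minus expected
        if n > 1:                               # hypergeometric variance
            var += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return obs_minus_exp / np.sqrt(var)
```

With the default weights this reduces to the standard logrank test; Schoenfeld's result motivates choosing weights proportional to the anticipated log hazard ratio at each event time, which is where the trial's switching pattern would enter.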
DOT National Transportation Integrated Search
1985-10-01
This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
77 FR 34066 - Notice of Lodging of Consent Decree Under the Clean Air Act
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... Organic Chemical Manufacturing Industry and Organic Hazardous Air Pollutants for Equipment Leaks). The... Defendants to implement an Enhanced Leak Detection and Repair Program to mitigate any potential excess emissions resulting from past CAA violations; implement controls on an API oil/water separator as additional...
Report #12-P-0362, March 21, 2012. Region 4 took actions to implement all recommendations made in EPA OIG Report No. 10-P-0130, EPA Activities Provide Limited Assurance of the Extent of Contamination and Risk at a North Carolina Hazardous Waste Site.
40 CFR 265.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Purpose and implementation of contingency plan. 265.51 Section 265.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... contingency plan must be designed to minimize hazards to human health or the environment from fires...
40 CFR 264.51 - Purpose and implementation of contingency plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Purpose and implementation of contingency plan. 264.51 Section 264.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... plan must be designed to minimize hazards to human health or the environment from fires, explosions, or...
40 CFR 257.28 - Implementation of the corrective action program.
Code of Federal Regulations, 2012 CFR
2012-07-01
...-Hazardous Waste Disposal Units Ground-Water Monitoring and Corrective Action § 257.28 Implementation of the... ground-water monitoring program that: (i) At a minimum, meets the requirements of an assessment monitoring program under § 257.25; (ii) Indicates the effectiveness of the corrective action remedy; and (iii...
44 CFR 63.1 - Purpose of part.
Code of Federal Regulations, 2011 CFR
2011-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1306(c) OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 General § 63.1 Purpose of part. The purpose of this part is to implement section 1306(c) of the National Flood Insurance Act of 1968, as amended (the Act...
44 CFR 63.1 - Purpose of part.
Code of Federal Regulations, 2013 CFR
2013-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1306(c) OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 General § 63.1 Purpose of part. The purpose of this part is to implement section 1306(c) of the National Flood Insurance Act of 1968, as amended (the Act...
44 CFR 63.1 - Purpose of part.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1306(c) OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 General § 63.1 Purpose of part. The purpose of this part is to implement section 1306(c) of the National Flood Insurance Act of 1968, as amended (the Act...
44 CFR 63.1 - Purpose of part.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1306(c) OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 General § 63.1 Purpose of part. The purpose of this part is to implement section 1306(c) of the National Flood Insurance Act of 1968, as amended (the Act...
44 CFR 63.1 - Purpose of part.
Code of Federal Regulations, 2014 CFR
2014-10-01
... SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IMPLEMENTATION OF SECTION 1306(c) OF THE NATIONAL FLOOD INSURANCE ACT OF 1968 General § 63.1 Purpose of part. The purpose of this part is to implement section 1306(c) of the National Flood Insurance Act of 1968, as amended (the Act...
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented advanced analysis tools and software packages. A seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and actual software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥5.8), including analysis tools. - Data from three one-year OBS deployments at three sites, Atlantic, Ionian and Ligurian Sea, within the general SEED format, thus creating the core integrated database for ocean, sea and land based seismological observatories.
Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community accepted forecasting testing and model validation approach, and the core hazard portal developed along the same technologies as the NERIES data portal. - Implemented homogeneous shakemap estimation tools at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and the earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative: the European Plate Observing System (EPOS). Its preparatory phase (2010 - 2014) is also funded by the EC.
NASA Astrophysics Data System (ADS)
Su, Pengcheng; Sun, Zhengchao; Li, Yong
2017-04-01
The Luding-Kangding highway crosses the eastern edge of the Qinghai-Tibet Plateau, one of the most deeply incised canyon areas of the plateau and mountains of western Sichuan, characterized by high relief and steep slopes. The area lies at the intersection of the Xianshuihe, Longmenshan and Anninghe fault zones, which are among the best known in Sichuan province. In the region, seismicity is frequent and strong, neotectonic movement is active, the rock mass is fractured, and abundant loose solid material is available. Debris flow disasters are well developed under the combined effects of earthquakes, intense rainfall and human activity, and pose a great threat to the life and property of local people. This paper therefore takes Kangding and Luding as the study area for debris flow hazard assessment, based on an in-depth analysis of the development characteristics and formation mechanisms of debris flows. The results can provide important evidence for local disaster assessment and early warning forecasts, and have scientific significance and practical value for safeguarding life and property and for the secure implementation of major national projects. In this article, the occurrence mechanism of debris flow disasters in the study area is explored, evaluation factors with high impact on debris flow hazard are identified, and a database of initial evaluation factors is built with the drainage basin as the evaluation unit. Factors with high impact on hazard occurrence are selected, and those with low impact eliminated, using the stepwise regression method within a logistic regression model; this determines the hazard evaluation factor system for the study area. Each factor in the evaluation factor system is then quantified, and the weights of all evaluation factors are determined by stepwise regression analysis.
After the hazard assessment model is established, debris flow hazard assessment and zonation are carried out for the whole study area. The regional debris flow hazard assessment method presented in this paper is broadly applicable and yields reliable evaluation results. The whole study area is divided into 1674 units by automatic extraction and manual identification, and 11 factors are selected as the initial assessment factors for debris flow hazard assessment. The factors of the evaluation index system are quantified using the standardized watershed-unit amount-ratio method. The relationship between debris flow occurrence and each evaluation factor is simulated using a logistic regression model, the weights of the evaluation factors are determined, and the debris flow hazard assessment model for the study area is established. The hazard assessment results were applied to route optimization and engineering disaster mitigation for the Luding-Kangding section of the Sichuan-Tibet highway.
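The logistic regression step described above, relating debris flow occurrence per basin unit to standardized evaluation factors, can be sketched with plain gradient ascent on the log-likelihood. The factor matrix and labels below are hypothetical placeholders, not the study's dataset:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit P(debris flow) = sigmoid(b0 + X @ b) by gradient ascent.

    X: (units, factors) matrix of standardized evaluation factors
    y: 0/1 debris flow occurrence per basin unit
    Returns the coefficient vector [intercept, b1, ..., bk]."""
    X = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))     # predicted probabilities
        beta += lr * X.T @ (y - p) / len(y)     # ascend the log-likelihood
    return beta

def hazard_score(X, beta):
    """Per-unit hazard probability under the fitted model."""
    X = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-X @ beta))
```

In practice the fitted coefficients (or a stepwise-selected subset of them, as in the study) serve as the factor weights, and the per-unit scores are binned to produce the hazard zonation map.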
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
About 1.5 billion tons of hazardous materials per year are moved in the US by truck, rail, barge, and air. The Hazardous Materials Transportation Act was the first attempt at a comprehensive Federal scheme for regulation. This hearing looks at the Secretary of Transportation's implementation of the statute for oversight and reauthorization responsibilities. Testimony was heard from 16 witnesses, representatives of Chemical Manufacturers Association, the American Trucking Association, the Association of American Railroads, the Department of Transportation, the Environmental Protection Agency, the Environmental Policy Institute, Office of Technology Assessment, Hazardous Materials Advisory Council, National Tank Truck Carriers, Federal Emergency Management Agency, National Paint and Coatings Association, and a representative from Ohio.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
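The Monte Carlo weight-sensitivity step can be illustrated for a simple weighted-linear-combination susceptibility model. Sampling the criteria weights from a Dirichlet distribution centred on the nominal AHP weights is one plausible perturbation scheme (an assumption for illustration, not necessarily the authors' exact procedure); the `conc` parameter controls how tightly samples cluster around the nominal weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def susceptibility_uncertainty(criteria, weights, n_sim=1000, conc=200.0):
    """Monte Carlo uncertainty of a weighted-linear-combination
    susceptibility map.

    criteria: (cells, k) array of standardized criterion layers
    weights:  length-k nominal criteria weights summing to 1
    Returns per-cell mean and standard deviation of the susceptibility
    score over n_sim Dirichlet-perturbed weight vectors."""
    weights = np.asarray(weights, float)
    samples = rng.dirichlet(conc * weights, size=n_sim)   # (n_sim, k), rows sum to 1
    scores = samples @ np.asarray(criteria, float).T      # (n_sim, cells)
    return scores.mean(axis=0), scores.std(axis=0)
```

Cells whose criterion values disagree strongly show a large standard deviation (their score depends heavily on the weights), while cells with consistent criterion values are insensitive to weight perturbation; mapping the per-cell standard deviation gives the kind of spatially-explicit uncertainty surface the paper analyzes.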
Dean, Brandon; Bagwell, Dee Ann; Dora, Vinita; Khan, Sinan; Plough, Alonzo
2013-01-01
All communities, explicitly or implicitly, assess and prepare for the natural and manmade hazards that they know could impact their community. The commonality of hazard-based threats in almost all communities does not usually result in standard or evidence-based preparedness practice and outcomes across those communities. Without specific efforts to build a shared perspective and prioritization, "all-hazards" preparedness can result in a random hodgepodge of priorities and preparedness strategies, resulting in diminished emergency response capabilities. Traditional risk assessments, with a focus on physical infrastructure, do not present the potential health and medical impacts of specific hazards and threats. With the implementation of the Centers for Disease Control and Prevention's capability-based planning, there is broad recognition that a health-focused hazard assessment process--one that engages the "Whole of Community"--is needed. Los Angeles County's Health Hazard Assessment and Prioritization tool provides a practical and innovative approach to enhance existing planning capacities. Successful utilization of this tool can provide a way for local and state health agencies and officials to more effectively identify the health consequences related to hazard-specific threats and risk, determine priorities, and develop improved and better coordinated agency planning, including community engagement in prioritization.
Can hazard risk be communicated through a virtual experience?
Mitchell, J T
1997-09-01
Cyberspace, defined by William Gibson as a consensual hallucination, now refers to all computer-generated interactive environments. Virtual reality, one of a class of interactive cyberspaces, allows us to create and interact directly with objects not available in the everyday world. Despite successes in the entertainment and aviation industries, this technology has been called a 'solution in search of a problem'. The purpose of this commentary is to suggest such a problem: the inability to acquire experience with a hazard to motivate mitigation. Direct experience with a hazard has been demonstrated to be a powerful incentive to adopt mitigation measures. While we lack the ability to summon hazard events at will in order to gain access to that experience, a virtual environment can provide an arena where potential victims are exposed to a hazard's effects. Immersion as an active participant within the hazard event through virtual reality may stimulate users to undertake mitigation steps that might otherwise remain undone. This paper details the possible direction in which virtual reality may be applied to hazards mitigation through a discussion of the technology, the role of hazard experience, the creation of a hazard simulation and the issues constraining implementation.
Cusato, Sueli; Gameiro, Augusto H; Corassin, Carlos H; Sant'ana, Anderson S; Cruz, Adriano G; Faria, José de Assis F; de Oliveira, Carlos Augusto F
2013-01-01
The present study describes the implementation of a food safety system in a dairy processing plant located in the State of São Paulo, Brazil, and the challenges found during the process. In addition, microbiological indicators have been used to assess the system's implementation performance. The steps involved in the implementation of a food safety system included a diagnosis of the prerequisites, implementation of the good manufacturing practices (GMPs), sanitation standard operating procedures (SSOPs), training of the food handlers, and hazard analysis and critical control point (HACCP). In the initial diagnosis, conformity with 70.7% (n=106) of the items analyzed was observed. A total of 12 critical control points (CCPs) were identified: (1) reception of the raw milk, (2) storage of the raw milk, (3 and 4) reception of the ingredients and packaging, (5) milk pasteurization, (6 and 7) fermentation and cooling, (8) addition of ingredients, (9) filling, (10) storage of the finished product, (11) dispatching of the product, and (12) sanitization of the equipment. After implementation of the food safety system, a significant reduction in the yeast and mold count was observed (p<0.05). The main difficulties encountered in the implementation of the food safety system were related to the implementation of actions established in the flow chart and to the need for constant training of, and adherence to the system by, the workers. Despite these difficulties, implementation of the food safety system, though challenging, proved feasible for small-scale food industries.
Towards identifying the next generation of superfund and hazardous waste site contaminants
Ela, Wendell P.; Sedlak, David L.; Barlaz, Morton A.; Henry, Heather F.; Muir, Derek C.G.; Swackhamer, Deborah L.; Weber, Eric J.; Arnold, Robert G.; Ferguson, P. Lee; Field, Jennifer A.; Furlong, Edward T.; Giesy, John P.; Halden, Rolf U.; Henry, Tala; Hites, Ronald A.; Hornbuckle, Keri C.; Howard, Philip H.; Luthy, Richard G.; Meyer, Anita K.; Saez, A. Eduardo; vom Saal, Frederick S.; Vulpe, Chris D.; Wiesner, Mark R.
2011-01-01
Conclusions: A need exists for a carefully considered and orchestrated expansion of programmatic and research efforts to identify, evaluate, and manage CECs of hazardous waste site relevance, including developing an evolving list of priority CECs, intensifying the identification and monitoring of likely sites of present or future accumulation of CECs, and implementing efforts that focus on a holistic approach to prevention.
A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume II.
1982-04-01
Technical Center. The report was divided into two parts: Part I described the improved technology investigated to upgrade existing methods for testing...proper implementation of the computerized data acquisition and reduction programs will improve materials hazards measurement precision. Thus, other...the hold chamber before and after injection of a sample, will improve precision and repeatability of measurement. The listed data acquisition and
NASA Astrophysics Data System (ADS)
Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina
2017-11-01
This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10 % probability of exceedance in 5 and 30 years, Poisson and time dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for the densely inhabited Etna's eastern flank, and the change in expected ground motion is finally commented on. These results do not account for M > 6 regional seismogenic sources which control the hazard at long return periods. 
However, by focusing on the impact of M < 6 local volcano-tectonic earthquakes, which dominate the hazard at the short- to mid-term exposure times considered in this study, we present a different viewpoint that, in our opinion, is relevant for retrofitting the existing buildings and for driving impending interventions of risk reduction.
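The exposure-time framing used above (e.g. 10 % probability of exceedance in 5 or 30 years) rests on the standard Poisson relation between a ground motion level's annual exceedance rate and its probability of exceedance over an exposure window; a minimal sketch:

```python
import numpy as np

def poe(annual_rate, years):
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - np.exp(-annual_rate * years)

def rate_for_poe(p, years):
    """Annual exceedance rate whose Poisson probability of exceedance
    over `years` equals p (inverse of poe)."""
    return -np.log(1.0 - p) / years
```

For example, 10 % in 5 years corresponds to an annual exceedance rate of about 0.021 (return period ~47 years), versus about 0.0035 (return period ~285 years) for 10 % in 30 years, which is why short exposure times emphasize the frequent, moderate local volcano-tectonic events over rare M > 6 regional sources.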
Göransson, Mona; Magnusson, Asa; Heilig, Markus
2006-01-01
It has been repeatedly demonstrated that hazardous alcohol use during pregnancy is rarely detected in regular antenatal care, and that detection can be markedly improved using systematic screening. A major challenge is to translate research-based strategies into regular antenatal care. Here, we examined whether a screening strategy using the Alcohol Use Disorder Test (AUDIT) and time-line follow-back (TLFB) could be implemented under naturalistic conditions and within available resources; and whether it would improve detection to the extent previously shown in a research context. Regular midwives at a large antenatal care clinic were randomized to receive brief training and then implement AUDIT and TLFB ("intervention"); or to a waiting-list control group continuing to deliver regular care ("control"). In the intervention condition, AUDIT was used to collect data about alcohol use during the year preceding pregnancy, and TLFB to assess actual consumption during the first trimester. Data were collected from new admissions over 6 months. Dropout was higher among patients of the intervention group than control midwives: 14% (23/162) versus 0% (0/153; p<0.0001). A one-day training session combined with continuous expert support was sufficient to implement systematic screening with AUDIT and TLFB largely within the resources of regular antenatal care. The use of these instruments identified patients with hazardous consumption during the year preceding pregnancy, i.e. an AUDIT score of 6 or higher (17%, 23/139), and patients with ongoing consumption exceeding 70 g/week and/or binge consumption according to TLFB (17%, 24/139), to a significantly higher degree than regular antenatal screening (0/162). The AUDIT- and TLFB-positive populations overlapped partially, with 36/139 subjects screening positive on either instrument and 11/139 positive on both.
We confirm previous findings that alcohol use during pregnancy is more extensive in Sweden than has generally been realized. Systematic screening using AUDIT and TLFB detects hazardous use in a manner which regular antenatal care does not. This remains true under naturalistic conditions, following minimal training of regular antenatal care staff, and can be achieved with minimal resources. The proposed strategy appears attractive for broad implementation.
NASA Astrophysics Data System (ADS)
Arndt, J.; Kreimer, J.
2010-09-01
The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject of the safety verification is on one hand the on-orbit configuration itself including the hardware and software products, on the other hand the related Ground facilities needed for commanding of and communication to the on-orbit System. But also the operational products, e.g. the procedures prepared for crew and ground control in accordance to increment planning, are subject of the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related Safety required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement including hardware owned by International Partners is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments.
A major challenge is implementing a Safety process rigid enough to provide reliable verification of on-board Safety, while retaining the flexibility desired by manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learnt have already been implemented, especially in the IEHA, allowing the flexibility of on-board operations to be improved without degradation of Safety.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping
NASA Astrophysics Data System (ADS)
Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai
2015-04-01
Shallow landsliding involving Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeological parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized.
The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis, taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of uncertainty of input parameters into the final shallow landslide hazard estimation was estimated. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows one i) to discriminate regions where hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) to assess the reliability of the procedure used for hazard assessment.
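The physically based models named above (SINMAP, SHALSTAB, TRIGRS) share an infinite-slope stability core. A minimal Monte Carlo sketch of that idea follows; the parameter PDFs are invented, and the dry-slope simplification (no pore pressure term) is an assumption for brevity, not the paper's actual workflow:

```python
import math
import random

random.seed(42)  # reproducible illustration

def factor_of_safety(c, phi_deg, gamma, depth, slope_deg):
    """Infinite-slope factor of safety (dry case, illustrative form).

    c        : effective cohesion, kPa
    phi_deg  : effective friction angle, degrees
    gamma    : unit weight of the hillslope deposit, kN/m^3
    depth    : deposit thickness measured vertically, m
    slope_deg: slope angle, degrees
    """
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * depth * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

def failure_probability(n=10000):
    """Fraction of Monte Carlo draws with FS < 1 (assumed Gaussian inputs)."""
    failures = 0
    for _ in range(n):
        c = max(random.gauss(5.0, 2.0), 0.0)  # cohesion, kPa (assumed PDF)
        phi = random.gauss(30.0, 3.0)         # friction angle, deg (assumed PDF)
        if factor_of_safety(c, phi, 19.0, 1.5, 35.0) < 1.0:
            failures += 1
    return failures / n
```

In a regional application this loop would run per grid cell, with the PDFs conditioned on the regionalized field data, which is where the uncertainty propagation discussed above enters.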
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
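The prioritization described above can be sketched in a few lines. The scenarios, the 1-3 scoring scale, and the combination rule (severity times likelihood, discounted by modeling difficulty) are illustrative assumptions, not the paper's exact scheme:

```python
# Illustrative scores (1 = low, 3 = high); names and values are invented.
scenarios = [
    {"name": "wake encounter on parallel approach", "severity": 3, "likelihood": 2, "difficulty": 1},
    {"name": "runway incursion", "severity": 3, "likelihood": 1, "difficulty": 3},
    {"name": "minor taxiway scrape", "severity": 1, "likelihood": 3, "difficulty": 1},
]

def priority(s):
    # Severe, likely scenarios that are also tractable to model rank highest.
    return s["severity"] * s["likelihood"] / s["difficulty"]

ranked = sorted(scenarios, key=priority, reverse=True)
```

Under this rule the severe-but-hard-to-model scenario drops below the severe, likely, tractable one, which is the point of adding modeling difficulty as a third axis.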
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levander, Alan Richard; Zelt, Colin A.
2015-03-17
The work plan for this project was to develop and apply advanced seismic reflection and wide-angle processing and inversion techniques to high-resolution seismic data in order to characterize the shallow subsurface at hazardous waste sites as an aid to containment and cleanup activities. We proposed to continue work on seismic data that we had already acquired under a previous DOE grant, as well as to acquire additional new datasets for analysis. The project successfully developed and/or implemented the use of 3D reflection seismology algorithms, waveform tomography and finite-frequency tomography using compressional and shear waves for high-resolution characterization of the shallow subsurface at two waste sites. These two sites have markedly different near-surface structures, groundwater flow patterns, and hazardous waste problems. This is documented in the list of refereed documents, conference proceedings, and Rice graduate theses, listed below.
Liu, Xiang; Peng, Yingwei; Tu, Dongsheng; Liang, Hua
2012-10-30
Survival data with a sizable cure fraction are commonly encountered in cancer research. The semiparametric proportional hazards cure model has been recently used to analyze such data. As seen in the analysis of data from a breast cancer study, a variable selection approach is needed to identify important factors in predicting the cure status and risk of breast cancer recurrence. However, no specific variable selection method for the cure model is available. In this paper, we present a variable selection approach with penalized likelihood for the cure model. The estimation can be implemented easily by combining the computational methods for penalized logistic regression and the penalized Cox proportional hazards models with the expectation-maximization algorithm. We illustrate the proposed approach on data from a breast cancer study. We conducted Monte Carlo simulations to evaluate the performance of the proposed method. We used and compared different penalty functions in the simulation studies. Copyright © 2012 John Wiley & Sons, Ltd.
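The EM combination of penalized logistic regression and the penalized Cox model described above rests on an E-step that weights each censored subject by its posterior probability of being uncured. The sketch below shows that step in the standard mixture-cure form; it is a hedged illustration, not necessarily the authors' exact estimator:

```python
def e_step_weights(pi, surv, event):
    """E-step of the EM algorithm for a mixture cure model (sketch).

    pi[i]   : P(subject i is uncured), from the logistic component
    surv[i] : S_u(t_i), uncured survival evaluated at subject i's observed time
    event[i]: 1 if the event was observed, 0 if censored

    An observed event proves the subject is uncured (weight 1); a censored
    subject gets the posterior probability of being uncured:
        w = pi * S_u(t) / (1 - pi + pi * S_u(t))
    """
    weights = []
    for p, s, d in zip(pi, surv, event):
        if d == 1:
            weights.append(1.0)
        else:
            weights.append(p * s / (1.0 - p + p * s))
    return weights
```

In the M-step these weights would enter the penalized logistic likelihood (as pseudo-responses) and the penalized Cox partial likelihood (as case weights), iterating to convergence.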
Bui, David P; Pollack Porter, Keshia; Griffin, Stephanie; French, Dustin D; Jung, Alesia M; Crothers, Stephen; Burgess, Jefferey L
2017-11-17
Emergency service vehicle crashes (ESVCs) are a leading cause of death in the United States fire service. Risk management (RM) is a proactive process for identifying occupational risks and reducing hazards and unwanted events through an iterative process of scoping hazards, risk assessment, and implementing controls. We describe the process, outputs, and lessons learned from the application of a proactive RM process to reduce ESVCs in US fire departments. Three fire departments, representative of urban, suburban, and rural geographies, participated in a facilitated RM process delivered through focus groups and stakeholder discussion. Crash reports from department databases were reviewed to characterize the context, circumstances, hazards and risks of ESVCs. Identified risks were ranked using a risk matrix that considered risk likelihood and severity. Department-specific control measures were selected based on group consensus. Interviews and focus groups were used to assess the acceptability and utility of the RM process and perceived facilitators and barriers to implementation. Three to six RM meetings were conducted at each fire department. There were 7.4 crashes per 100 personnel in the urban department and 10.5 per 100 personnel in the suburban department; the rural department experienced zero crashes. All departments identified emergency response, backing, on-scene struck-by incidents, driver distraction, vehicle/road visibility, and driver training as high or medium concerns. Additional high-priority risks varied by department: the urban department prioritized turning and rear-ending crashes; the suburban firefighters prioritized inclement weather/road environment and low-visibility related crashes; and the rural volunteer fire department prioritized exiting the station, vehicle failure, and inclement weather/road environment related incidents.
Selected controls included new policies and standard operating procedures to reduce emergency response, cameras to enhance driver visibility while backing, and increased training frequency and enhanced training. The RM process was generally acceptable to department participants and considered useful. All departments reported that the focused and systematic analysis of crashes was particularly helpful. Implementation of controls was a commonly cited challenge. Proactive RM of ESVCs in three US fire departments was positively received and supported the establishment of interventions tailored to each department's needs and priorities.
Accident analysis and control options in support of the sludge water system safety analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
HEY, B.E.
A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.
Supplemental Hazard Analysis and Risk Assessment - Hydrotreater
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
NASA Astrophysics Data System (ADS)
Aubrecht, Christoph; Steinnocher, Klaus; Humer, Heinrich; Huber, Hermann
2014-05-01
In the context of proactive disaster risk management as well as immediate situational crisis management, knowledge of locational social aspects in terms of spatio-temporal population distribution dynamics is considered among the most important factors for disaster impact minimization (Aubrecht et al., 2013a). This applies both to the pre-event stage, for designing appropriate preparedness measures, and to acute crisis situations when an event chain actually unfolds, for efficient situation-aware response. The presented DynaPop population dynamics model is developed at the interface of those interlinked crisis stages and aims at providing basic input for social impact evaluation and decision support in crisis management. The model provides the starting point for assessing population exposure dynamics - thus here labeled DynaPop-X - which can be applied either to illustrate the changing locations and numbers of affected people at different stages during an event or for ex-ante estimation of probable and maximum expected clusters of affected population (Aubrecht et al., 2013b; Freire & Aubrecht, 2012). DynaPop is implemented via a gridded spatial disaggregation approach and integrates previous efforts on spatio-temporal modeling that account for various aspects of population dynamics, such as human mobility and activity patterns, that are particularly relevant in picturing the highly dynamic daytime situation (Ahola et al., 2007; Bhaduri, 2008; Cockings et al., 2010). We will present ongoing developments, particularly focusing on the implementation logic of the model using the emikat software tool, a data management system initially designed for inventorying and analysis of spatially resolved regional air pollutant emission scenarios. This study was performed in the framework of the EU CRISMA project. CRISMA is funded from the European Community's Seventh Framework Programme FP7/2007-2013 under grant agreement no. 284552.
REFERENCES Ahola, T., Virrantaus, K., Krisp, J.K., Hunter, G.J. (2007) A spatio-temporal population model to support risk assessment and damage analysis for decision-making. International Journal of Geographical Information Science, 21(8), 935-953. Aubrecht, C., Fuchs, S., Neuhold, C. (2013a) Spatio-temporal aspects and dimensions in integrated disaster risk management. Natural Hazards, 68(3), 1205-1216. Aubrecht, C., Özceylan, D., Steinnocher, K., Freire, S. (2013b) Multi-level geospatial modeling of human exposure patterns and vulnerability indicators. Natural Hazards, 68(1), 147-163. Bhaduri, B. (2008) Population distribution during the day. In S. Shekhar & X. Hui, eds., Encyclopedia of GIS. Springer US, 880-885. Cockings, S., Martin, D. & Leung, S. (2010) Population 24/7: building space-time specific population surface models. In M. Haklay, J. Morley, & H. Rahemtulla, eds., Proceedings of the GIS Research UK 18th Annual conference. GISRUK 2010. London, UK, 41-47. Freire, S., Aubrecht, C. (2012) Integrating population dynamics into mapping human exposure to seismic hazard. Natural Hazards and Earth System Sciences, 12(11), 3533-3543.
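The gridded spatial disaggregation underlying a model like DynaPop can be illustrated with a toy dasymetric split: a zone's population total is distributed over grid cells in proportion to a time-dependent weighting layer. Cell names and weights below are invented, and the real model draws on far richer mobility and activity data:

```python
def disaggregate(total_population, cell_weights):
    """Distribute a zonal population total over cells, proportional to weights."""
    total_weight = sum(cell_weights.values())
    return {cell: total_population * w / total_weight
            for cell, w in cell_weights.items()}

# Two weighting layers for the same zone: daytime activity vs. night-time
# residence. Weights are illustrative assumptions.
daytime = disaggregate(1000, {"cbd": 6.0, "residential": 3.0, "park": 1.0})
night = disaggregate(1000, {"cbd": 1.0, "residential": 8.0, "park": 1.0})
```

Swapping the weighting layer by time of day is what makes the exposure estimate dynamic: the same 1000 people concentrate in the business district by day and in residential cells by night, changing which cells a hazard footprint would affect.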
Keurhorst, Myrna N; Anderson, Peter; Spak, Fredrik; Bendtsen, Preben; Segura, Lidia; Colom, Joan; Reynolds, Jillian; Drummond, Colin; Deluca, Paolo; van Steenkiste, Ben; Mierzecki, Artur; Kłoda, Karolina; Wallace, Paul; Newbury-Birch, Dorothy; Kaner, Eileen; Gual, Toni; Laurant, Miranda G H
2013-01-24
The European level of alcohol consumption, and the subsequent burden of disease, is high compared to the rest of the world. While screening and brief interventions in primary healthcare are cost-effective, in most countries they have hardly been implemented in routine primary healthcare. In this study, we aim to examine the effectiveness and efficiency of three implementation interventions that have been chosen to address key barriers for improvement: training and support to address lack of knowledge and motivation in healthcare providers; financial reimbursement to compensate the time investment; and internet-based counselling to reduce workload for primary care providers. In a cluster randomized factorial trial, data from Catalan, English, Netherlands, Polish, and Swedish primary healthcare units will be collected on screening and brief advice rates for hazardous and harmful alcohol consumption. The three implementation strategies will be provided separately and in combination in a total of seven intervention groups and compared with a treatment as usual control group. Screening and brief intervention activities will be measured at baseline, during 12 weeks and after six months. Process measures include health professionals' role security and therapeutic commitment of the participating providers (SAAPPQ questionnaire). A total of 120 primary healthcare units will be included, equally distributed over the five countries. Both intention to treat and per protocol analyses are planned to determine intervention effectiveness, using random coefficient regression modelling. Effective interventions to implement screening and brief interventions for hazardous alcohol use are urgently required. This international multi-centre trial will provide evidence to guide decision makers.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
NASA Astrophysics Data System (ADS)
Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.
Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.
Ahmed, Munerah; Nagin, Deborah; Clark, Nancy
2014-01-01
Lead-based paint and occupational lead hazards remain the primary sources of lead exposure in New York City (NYC) children and men, respectively. Lead poisoning has also been associated with the use of certain consumer products in NYC. The NYC Department of Health and Mental Hygiene developed the Intervention Model for Contaminated Consumer Products, a comprehensive approach to identify and reduce exposure to lead and other hazards in consumer products. The model identifies hazardous consumer products, determines their availability in NYC, takes enforcement action against these products, and provides risk communication and public education. Implementation of the model has resulted in the removal of thousands of contaminated products from local businesses and continues to raise awareness of these hazardous products. PMID:24922141
Herzer, Kurt R; Mirrer, Meredith; Xie, Yanjun; Steppan, Jochen; Li, Matthew; Jung, Clinton; Cover, Renee; Doyle, Peter A; Mark, Lynette J
2012-08-01
Since 1999, hospitals have made substantial commitments to health care quality and patient safety through individual initiatives of executive leadership involvement in quality, investments in safety culture, education and training for medical students and residents in quality and safety, the creation of patient safety committees, and implementation of patient safety reporting systems. At the Weinberg Surgical Suite at The Johns Hopkins Hospital (Baltimore), a 16-operating-room inpatient/outpatient cancer center, a patient safety reporting process was developed to maximize the usefulness of the reports and the long-term sustainability of quality improvements arising from them. A six-phase framework was created incorporating UHC's Patient Safety Net (PSN): Identify, report, analyze, mitigate, reward, and follow up. Unique features of this process included a multidisciplinary team to review reports, mitigate hazards, educate and empower providers, recognize the identifying/reporting individuals or groups with "Good Catch" awards, and follow up to determine if quality improvements were sustained over time. Good Catch awards have been given in recognition of 29 patient safety hazards identified since 2008; in each of these cases, an initiative was developed to mitigate the original hazard. Twenty-five (86%) of the associated quality improvements have been sustained. Two Good Catch award-winning projects--vials of heparin with an unusually high concentration of the drug that posed a potential overdose hazard and a rapid infusion device that resisted practitioner control--are described in detail. A multidisciplinary team's analysis and mitigation of hazards identified in a patient safety reporting process entailed positive recognition with a Good Catch award, education of practitioners, and long-term follow-up.
Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A
Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.
2010-01-01
The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of the surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is an approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments infilling ancient bedrock topography that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.
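The "2 percent probability of exceedance in 50 years" level quoted above corresponds, under the usual Poisson assumption behind such maps, to a mean return period of roughly 2,475 years:

```python
import math

def return_period(p_exceed, years):
    """Mean return period implied by a probability of exceedance over `years`,
    under the Poisson assumption used in standard hazard maps."""
    annual_rate = -math.log(1.0 - p_exceed) / years
    return 1.0 / annual_rate

t = return_period(0.02, 50)  # the "2% in 50 years" hazard level
```

The same conversion gives about 475 years for the "10% in 50 years" level, which is why those two return periods recur throughout hazard-map literature.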
Seibert, P J
1994-02-01
In an earlier article (JAVMA, Jan 15, 1994), the author outlined some of the first steps necessary in establishing a hospital safety program that will comply with current Occupational Safety and Health Administration (OSHA) guidelines. One of the main concerns of the OSHA guidelines is that there be written plans for managing hazardous materials, performing dangerous jobs, and dealing with other potential safety problems. In this article, the author discusses potentially hazardous situations commonly found in veterinary practices and provides details on how to minimize the risks associated with those situations and how to implement safety procedures that will comply with the OSHA guidelines.
Evans, Richard M; Scholze, Martin; Kortenkamp, Andreas
2015-10-01
The way in which mixture risk assessment (MRA) should be included in chemical risk assessment is a current topic of debate. We used data from 67 recent pesticide evaluations to build a case study using Hazard Index calculations to form risk estimates in a tiered MRA approach in line with a Framework proposed by WHO/IPCS. The case study is used to illustrate the approach and to add detail to the existing Framework, and includes many more chemicals than previous case studies. A low-tier MRA identified risk as being greater than acceptable, but refining risk estimates in higher tiers was not possible due to data requirements not being readily met. Our analysis identifies data requirements, which typically expand dramatically in higher tiers, as being the likely cause for an MRA to fail in many realistic cases. This forms a major obstacle to routine implementation of MRA and shows the need for systematic generation and collection of toxicological data. In low tiers, hazard quotient inspection identifies chemicals that contribute most to the HI value and thus require attention if further refinement is needed. Implementing MRA requires consensus on issues such as scope setting, criteria for performing refinement, and decision criteria for actions. Copyright © 2015 Elsevier Ltd. All rights reserved.
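A Hazard Index calculation of the kind used in the case study is a sum of hazard quotients, and inspecting the quotients identifies the driver chemicals the abstract mentions. The exposures and reference doses below are invented for illustration:

```python
# Toy exposures and reference doses (mg/kg-day); all values are invented.
exposures = {"pesticide_a": 0.002, "pesticide_b": 0.010, "pesticide_c": 0.0005}
reference_doses = {"pesticide_a": 0.01, "pesticide_b": 0.02, "pesticide_c": 0.005}

def hazard_quotients(exposures, reference_doses):
    """HQ = exposure / reference dose, per chemical."""
    return {c: exposures[c] / reference_doses[c] for c in exposures}

def hazard_index(exposures, reference_doses):
    """HI = sum of hazard quotients; HI > 1 flags a potential mixture concern."""
    return sum(hazard_quotients(exposures, reference_doses).values())

hq = hazard_quotients(exposures, reference_doses)
driver = max(hq, key=hq.get)  # chemical contributing most to the HI
```

In a tiered assessment, an HI above 1 at a low tier triggers refinement of the largest quotients first, which is exactly where the data requirements discussed above begin to bite.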
Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations
NASA Technical Reports Server (NTRS)
Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art
2012-01-01
This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
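The path-finding step such an analysis requires can be sketched on a small directed connectivity graph; the subsystem names below are hypothetical, not from the paper:

```python
from collections import deque

# Hypothetical subsystem connectivity: a hazard influence propagates along
# directed edges from a source subsystem toward downstream subsystems.
edges = {
    "power_bus": ["avionics", "pump"],
    "pump": ["coolant_loop"],
    "coolant_loop": ["avionics"],
    "avionics": [],
}

def propagation_paths(graph, source, target):
    """Enumerate acyclic paths from a hazard source to a vulnerable target."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:  # skip nodes already on the path (no cycles)
                queue.append(path + [nxt])
    return paths

routes = propagation_paths(edges, "power_bus", "avionics")
```

Each returned path is a candidate hazard-vulnerability propagation route; severity and vulnerability ratings could then be accumulated along each path to rank them, in the spirit of the cumulative measures described above.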
Chang, Yen-Hou; Li, Wai-Hou; Chang, Yi; Peng, Chia-Wen; Cheng, Ching-Hsuan; Chang, Wei-Pin; Chuang, Chi-Mu
2016-03-17
In the analysis of survival data for cancer patients, the problem of competing risks is often ignored. Competing risks have been recognized as a special case of time-to-event analysis. The conventional techniques for time-to-event analysis applied in the presence of competing risks often give biased or uninterpretable results. Using a prospectively collected administrative health care database in a single institution, we identified patients diagnosed with stage III or IV primary epithelial ovarian, tubal, and peritoneal cancers with minimal residual disease after primary cytoreductive surgery between 1995 and 2012. Here, we sought to evaluate whether intraperitoneal chemotherapy outperforms intravenous chemotherapy in the presence of competing risks. Unadjusted and multivariable subdistribution hazards models were applied to this database with two types of competing risks (cancer-specific mortality and other-cause mortality) coded to measure the relative effects of intraperitoneal chemotherapy. A total of 1263 patients were recruited as the initial cohort. After propensity score matching, 381 patients in each arm entered into final competing risk analysis. Cumulative incidence estimates for cancer-specific mortality were statistically significantly lower (p = 0.017, Gray test) in patients receiving intraperitoneal chemotherapy (5-year estimates, 34.5%; 95% confidence interval [CI], 29.5-39.6%, and 10-year estimates, 60.7%; 95% CI, 52.2-68.0%) versus intravenous chemotherapy (5-year estimates, 41.3%; 95% CI, 36.2-46.3%, and 10-year estimates, 67.5%, 95% CI, 61.6-72.7%). In subdistribution hazards analysis, for cancer-specific mortality, intraperitoneal chemotherapy outperforms intravenous chemotherapy (Subdistribution hazard ratio, 0.82; 95% CI, 0.70-0.96) after correcting other covariates. 
In conclusion, results from this comparative effectiveness study provide supportive evidence for previously published randomized trials that intraperitoneal chemotherapy outperforms intravenous chemotherapy, even after eliminating the confounding of competing risks. We suggest that implementation of competing risk analysis should be strongly considered for the investigation of cancer patients who have a medium- to long-term follow-up period.
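Cumulative incidence estimates of the kind quoted above come from a nonparametric competing-risks estimator (Aalen-Johansen type): at each event time, the cause-specific increment is the overall Kaplan-Meier survival just before that time multiplied by the cause-specific event fraction among those at risk. A compact sketch under standard assumptions, not the authors' actual software:

```python
def cumulative_incidence(times, causes, cause_of_interest):
    """Nonparametric cumulative incidence function under competing risks.

    times  : observed event/censoring times
    causes : 0 = censored, 1/2/... = event type
    Returns (event_times, cif_values) for the cause of interest.
    """
    data = sorted(zip(times, causes))
    n = len(data)
    surv = 1.0          # overall Kaplan-Meier survival just before t
    cif = 0.0
    out_t, out_c = [], []
    i = 0
    while i < n:
        t = data[i][0]
        d_all = d_k = 0
        at_risk = n - i
        while i < n and data[i][0] == t:  # handle ties at time t
            if data[i][1] != 0:
                d_all += 1
                if data[i][1] == cause_of_interest:
                    d_k += 1
            i += 1
        cif += surv * d_k / at_risk       # cause-specific increment
        surv *= 1.0 - d_all / at_risk     # all event types deplete survival
        if d_k:
            out_t.append(t)
            out_c.append(cif)
    return out_t, out_c
```

Unlike one minus the cause-specific Kaplan-Meier curve, this estimator lets deaths from other causes deplete the risk set without being treated as censoring, which is why the two approaches diverge when competing mortality is sizable.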
Rehm, Jürgen; Anderson, Peter; Prieto, Jose Angel Arbesu; Armstrong, Iain; Aubin, Henri-Jean; Bachmann, Michael; Bastus, Nuria Bastida; Brotons, Carlos; Burton, Robyn; Cardoso, Manuel; Colom, Joan; Duprez, Daniel; Gmel, Gerrit; Gual, Antoni; Kraus, Ludwig; Kreutz, Reinhold; Liira, Helena; Manthey, Jakob; Møller, Lars; Okruhlica, Ľubomír; Roerecke, Michael; Scafato, Emanuele; Schulte, Bernd; Segura-Garcia, Lidia; Shield, Kevin David; Sierra, Cristina; Vyshinskiy, Konstantin; Wojnar, Marcin; Zarco, José
2017-09-28
Hazardous and harmful alcohol use and high blood pressure are central risk factors related to premature non-communicable disease (NCD) mortality worldwide. A reduction in the prevalence of both risk factors has been suggested as a route to reach the global NCD targets. This study aims to highlight that screening and interventions for hypertension and hazardous and harmful alcohol use in primary healthcare can contribute substantially to achieving the NCD targets. A consensus conference based on systematic reviews, meta-analyses, clinical guidelines, experimental studies, and statistical modelling which had been presented and discussed in five preparatory meetings, was undertaken. Specifically, we modelled changes in blood pressure distributions and potential lives saved for the five largest European countries if screening and appropriate intervention rates in primary healthcare settings were increased. Recommendations to handle alcohol-induced hypertension in primary healthcare settings were derived at the conference, and their degree of evidence was graded. Screening and appropriate interventions for hazardous alcohol use and use disorders could lower blood pressure levels, but there is a lack in implementing these measures in European primary healthcare. Recommendations included (1) an increase in screening for hypertension (evidence grade: high), (2) an increase in screening and brief advice on hazardous and harmful drinking for people with newly detected hypertension by physicians, nurses, and other healthcare professionals (evidence grade: high), (3) the conduct of clinical management of less severe alcohol use disorders for incident people with hypertension in primary healthcare (evidence grade: moderate), and (4) screening for alcohol use in hypertension that is not well controlled (evidence grade: moderate). 
The first three measures were estimated to result in a decreased hypertension prevalence and hundreds of lives saved annually in the examined countries. The implementation of the outlined recommendations could contribute to reducing the burden associated with hypertension and hazardous and harmful alcohol use, and thus to achievement of the NCD targets. Implementation should be conducted in controlled settings with evaluation, including, but not limited to, economic evaluation.
Set-up and validation of a Delft-FEWS based coastal hazard forecasting system
NASA Astrophysics Data System (ADS)
Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya
2017-04-01
European coasts are increasingly threatened by hazards related to low-probability, high-impact hydro-meteorological events. Uncertainties in hazard prediction and in the capability to cope with their impact lie both in future storm patterns and in increasing coastal development. Therefore, adaptation to future conditions requires a re-evaluation of coastal disaster risk reduction (DRR) strategies and the introduction of a more efficient mix of prevention, mitigation and preparedness measures. The latter presumes that the development of tools that can manage the complex process of merging data and models and generate products on the current and expected hydro- and morpho-dynamic states of the coasts, such as a forecasting system for flooding and erosion hazards at vulnerable coastal locations (hotspots), is of vital importance. The output of such a system can be of utmost value to coastal stakeholders and the entire coastal community. In response to these challenges, Delft-FEWS provides a state-of-the-art framework for the implementation of such a system, with vast capabilities to trigger the early warning process. In addition, this framework is highly customizable to the specific requirements of any individual coastal hotspot. Since its release, many Delft-FEWS based forecasting systems related to inland flooding have been developed; however, only a limited number of coastal applications have been implemented. In this paper, the set-up of a Delft-FEWS based forecasting system for Varna Bay (Bulgaria), a coastal hotspot that includes a sandy beach and port infrastructure, is presented. It was implemented in the frame of the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The system output generated in hindcast mode is validated against available observations of surge levels and wave and morphodynamic parameters for a sequence of three short-duration, relatively weak storm events that occurred during February 4-12, 2015.
Generally, the models' performance is considered very good, and the results obtained are quite promising for reliable prediction of both boundary conditions and coastal hazards, giving a good basis for estimation of onshore impact.
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface sufficiently to comprehend integrated hazards. This paper will discuss various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort will be utilized to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used inter-changeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
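The PSHA approach the abstract debates combines scenario occurrence rates with a ground-motion exceedance probability. As a minimal sketch of that computation (a hypothetical log-normal ground-motion model with illustrative, uncalibrated numbers; not the author's analysis or any published hazard model):

```python
import math

def exceedance_rate(a, scenarios):
    """Annual rate of ground motion exceeding level `a` (in g).

    `scenarios` is a list of (annual_rate, median_gm, sigma_ln) tuples:
    each earthquake scenario's occurrence rate and its log-normal
    ground-motion distribution (median in g, log-standard deviation).
    """
    rate = 0.0
    for nu, median, sigma in scenarios:
        # P(GM > a) for a log-normal distribution via the complementary
        # error function.
        z = (math.log(a) - math.log(median)) / sigma
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
        rate += nu * p_exceed
    return rate

# Two hypothetical scenarios: frequent moderate shaking and rare large shaking.
scenarios = [(0.1, 0.05, 0.6), (0.01, 0.2, 0.6)]
lam = exceedance_rate(0.1, scenarios)   # annual rate of exceeding 0.1 g
p_50yr = 1.0 - math.exp(-lam * 50.0)    # Poisson exceedance probability in 50 years
```

The final line illustrates the time-independent (Poissonian) assumption that the DSHA/SHA critique in the abstract targets: the 50-year probability follows mechanically from a constant annual rate.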
Maxim, L D; Allshouse, J N; Chen, S H; Treadway, J; Venturin, D
1998-04-01
The traditional hierarchy of measures for control of potential respiratory hazards in the workplace includes (in order of preference) engineering controls, workplace practices, and use of respiratory protection. Although third in this hierarchy, respirators can be an important component of the control mix, particularly for difficult-to-control jobs, as an interim measure (pending implementation of other controls), and in cases where exposure is intermittent. One of the problems associated with the use of respirators as a control measure is that valid and adequate data on respirator usage are often not available. Absent these data, it is difficult to determine the practical effectiveness of respirators, and exposure calculations that include the protective effect of respirators are speculative. This paper presents models (and appropriate statistical fitting techniques) suitable for quantification of respirator usage and defines three potentially useful measures of effectiveness for a respirator program. These models are illustrated with monitoring data on refractory ceramic fiber (RCF) developed as part of a Consent Agreement between the RCF industry and the U.S. Environmental Protection Agency. For this substance, extensive and comprehensive monitoring data are available. The models and methods of analysis may prove applicable to other potential respiratory hazards in the workplace. Copyright 1998 Academic Press.
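The abstract's point that exposure calculations including respirator protection are speculative without usage data can be illustrated with a simple sketch. This is not the authors' specific model; it is a generic blend of unprotected and protected exposure weighted by a hypothetical usage fraction and assigned protection factor (APF):

```python
def effective_exposure(concentration, usage_fraction, apf):
    """Time-weighted effective exposure for a given respirator usage fraction.

    `concentration` is the ambient contaminant level, `usage_fraction` the
    fraction of exposure time the respirator is actually worn, and `apf` the
    assigned protection factor (exposure reduction while worn).
    """
    protected = concentration / apf
    return (1.0 - usage_fraction) * concentration + usage_fraction * protected

# Illustrative numbers: 1.0 f/cc fiber concentration, respirator worn 80%
# of the time, APF of 10.
e = effective_exposure(1.0, 0.8, 10.0)  # 0.2*1.0 + 0.8*0.1 = 0.28
```

The sensitivity of `e` to `usage_fraction` is exactly why the paper argues that quantifying usage matters: the same APF yields very different effective exposures depending on how consistently the respirator is worn.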
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. These clusters are followed by periods of quiescence during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to account for series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard of the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
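The three-part hazard function described in the abstract can be sketched as follows. The functional forms and all parameter values below are hypothetical choices for illustration, not the authors' parameterization:

```python
import math

def hazard_rate(t, h_post=0.05, decay=0.02, h_pre=0.04, t_mean=500.0, k=2.0,
                h_bg=0.01):
    """Composite annual hazard rate `t` years after the last earthquake cluster.

    Three parts, mirroring the model structure in the abstract:
      1) a decreasing hazard from the last large-earthquake cluster,
      2) an increasing hazard toward the next large-earthquake cluster,
      3) a constant background hazard of small-to-moderate earthquakes.
    All parameter values are hypothetical.
    """
    decreasing = h_post * math.exp(-decay * t)   # post-cluster decay
    increasing = h_pre * (t / t_mean) ** k       # quasi-periodic build-up
    return decreasing + increasing + h_bg        # constant background

# Shortly after a cluster the decaying term dominates; late in the cycle the
# increasing term dominates, producing a "bathtub"-shaped hazard over time.
early = hazard_rate(10.0)
late = hazard_rate(600.0)
```

The bathtub shape is the key departure from the constant (Poissonian) hazard of conventional analysis: hazard is elevated both immediately after a cluster and late in the inter-cluster cycle.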
NASA Astrophysics Data System (ADS)
Azzaro, Raffaele; Barberi, Graziella; D'Amico, Salvatore; Pace, Bruno; Peruzza, Laura; Tuvè, Tiziana
2017-11-01
The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect lab for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various geophysical monitoring, and particularly the rapid geodynamics, which clearly reveals some seismotectonic processes. We present here the model components and the procedures adopted for defining seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about 3 centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults using a historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented in the recently developed software tool FiSH (Pace et al., 2016), used here to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area, the traditional assumptions of uniform and Poissonian seismicity can be relaxed; time-dependent fault-based modeling, joined with 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be implemented in PSHA maps.
They can be relevant for the retrofitting of the existing building stock and for driving risk-reduction interventions. These analyses do not account for regional M > 6 seismogenic sources, which dominate the hazard over long return times (≥ 500 years).
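The fault-based recipe described above (a characteristic magnitude from a magnitude-size scaling relation, then a mean recurrence time from moment balance) can be sketched generically. The scaling coefficients and fault parameters below are hypothetical placeholders, not the Etna-specific relationship derived in the paper or the exact calculation in FiSH:

```python
import math

MU = 3.0e10  # assumed crustal shear modulus, Pa (typical textbook value)

def characteristic_magnitude(area_km2, a=4.0, b=1.0):
    # Generic scaling relation of the form Mw = a + b*log10(rupture area);
    # coefficients a, b are illustrative, not the paper's fitted values.
    return a + b * math.log10(area_km2)

def mean_recurrence_time(mag, area_km2, slip_rate_mm_yr):
    # Seismic moment of the characteristic event (Hanks-Kanamori convention).
    m0 = 10 ** (1.5 * mag + 9.1)                            # N*m
    # Tectonic moment accumulation rate: mu * fault area * slip rate.
    area_m2 = area_km2 * 1.0e6
    moment_rate = MU * area_m2 * slip_rate_mm_yr * 1.0e-3   # N*m per year
    return m0 / moment_rate                                 # years

# Hypothetical fault: 50 km^2 rupture area, 2 mm/yr slip rate.
m_char = characteristic_magnitude(50.0)
t_mean = mean_recurrence_time(m_char, 50.0, 2.0)
```

The resulting mean recurrence time is the quantity that feeds a time-dependent (non-Poissonian) renewal model per fault, which is the relaxation of the uniform-seismicity assumption the abstract argues for.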
ERIC Educational Resources Information Center
Hall-Wallace, Michelle K.; McAuliffe, Carla M.
2002-01-01
Investigates student learning that occurred with a Geographic Information Systems (GIS) based module on plate tectonics and geologic hazards. Examines factors in the design and implementation of the materials that impacted student learning. Reports positive correlations between students' spatial ability and performance. Includes 17 references.…