Science.gov

Sample records for hazard analysis guidance

  1. Technical Guidance for Hazards Analysis, Emergency Planning for Extremely Hazardous Substances

    EPA Pesticide Factsheets

    This guide supplements NRT-1 by providing technical assistance to LEPCs in assessing the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  2. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    SciTech Connect

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator/Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
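    The exceedance framing in this abstract can be made concrete. A minimal sketch (not from the report itself) of the standard Poisson relation between an annual exceedance rate and the probability of at least one exceedance over an exposure period:

    ```python
    import math

    def exceedance_probability(annual_rate, t_years):
        """Probability of at least one exceedance of a given ground-motion
        level within t_years, assuming Poisson occurrence at annual_rate."""
        return 1.0 - math.exp(-annual_rate * t_years)

    # A level exceeded on average once per 475 years has roughly a 10%
    # chance of being exceeded during a 50-year exposure period:
    p_50yr = exceedance_probability(1 / 475, 50)   # approx. 0.10
    ```

    PSHA hazard curves report the annual rate as a function of ground-motion level; this relation converts any point on such a curve into a design-life probability.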

  3. A Guidance Manual: Waste Analysis at Facilities that Generate, Treat, Store, and Dispose of Hazardous Wastes

    EPA Pesticide Factsheets

    Discusses how a person can perform waste analyses and develop waste analysis plans (WAPs) in accordance with the federal hazardous waste regulations of the Resource Conservation and Recovery Act (RCRA)

  4. HANDBOOK: HAZARDOUS WASTE INCINERATION MEASUREMENT GUIDANCE

    EPA Science Inventory

    This publication, Volume III of the Hazardous Waste Incineration Guidance Series, contains general guidance for permit writers reviewing hazardous waste incineration permit applications and trial burn plans. The handbook is a how-to document dealing with how incineration measure...

  5. K Basins Hazard Analysis

    SciTech Connect

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev. 4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  6. K Basin Hazard Analysis

    SciTech Connect

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  7. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  8. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process, and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations, and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  9. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during project conceptual design and provides input to it. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  10. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  11. HAZARD ANALYSIS SOFTWARE

    SciTech Connect

    Sommer, S; Tinh Tran, T

    2008-04-08

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  12. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the U.S. Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs and should not be considered a complete resource on PrHA methods. Likewise, to determine whether a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  13. Job Hazard Analysis

    DTIC Science & Technology

    1998-01-01

    ...lifting heavy objects? Do environmental... welding rays, heat, or excessive... Job Hazard Analysis, U.S. Department of Labor, Occupational Safety and... Performing Organization Name(s) and Address(es): U.S. Department of Labor, Occupational Safety & Health Administration, 200 Constitution Avenue, Washington, DC... not itself alter or determine compliance responsibilities, which are set forth in OSHA standards themselves and the Occupational Safety and Health Act

  14. Federal Guidance Report No. 8: Guidance for the Control of Radiation Hazards in Uranium Mining

    EPA Pesticide Factsheets

    This report contains background material used in the development of guidance concerning radiation protection in the mining of uranium ore, and seeks to provide guidance for long-term radiation protection in uranium mining.

  15. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally meant 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and to see regularity in event patterns which is illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using a simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However...
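    The dice paradigm in this abstract is easy to check numerically; a small illustrative simulation (not part of the original work):

    ```python
    import random

    def quiet_year_probability(trials=100_000, months=12, p_event=1/6):
        """Estimate the chance that a full year passes with no 'disaster'
        when, each month, rolling a six (probability 1/6) turns a
        near-miss into a recorded event."""
        quiet = sum(
            all(random.random() >= p_event for _ in range(months))
            for _ in range(trials)
        )
        return quiet / trials

    # Analytic check: (5/6)**12 is about 0.112, i.e. roughly an 11%
    # chance that a year of monthly near-misses yields no event at all.
    ```

    The point of the abstract survives the check: twelve months of genuine one-in-six danger still leave better than a one-in-ten chance of a deceptively quiet record.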

  16. Track 2 sites: Guidance for assessing low probability hazard sites at the INEL. Revision 6

    SciTech Connect

    Not Available

    1994-01-01

    This document presents guidance for the assessment of Track 2 low probability hazard sites (LPHS) at the Idaho National Engineering Laboratory (INEL). The Track 2 classification was developed specifically for the INEL to streamline the implementation of the Comprehensive Environmental Response, Compensation, and Liability Act. Track 2 LPHSs are described as sites where insufficient data are available to make a decision concerning the risk level or to select or design a remedy. As such, these types of sites are not described in the National Contingency Plan or existing regulatory guidance. The goal of the Track 2 process is to evaluate LPHSs using existing qualitative and quantitative data to minimize the collection of new environmental data. To this end, this document presents a structured format consisting of a series of questions and tables. A qualitative risk assessment is used. The process is iterative and addresses an LPHS from multiple perspectives (i.e., historical, empirical, process) in an effort to generate a reproducible and defensible method. This rigorous approach follows the data quality objective process and establishes a well-organized, logical approach to consolidate and assess existing data and set decision criteria. If necessary, the process allows for the design of a sampling and analysis strategy to obtain new environmental data of appropriate quality to support decisions for each LPHS. Finally, the guidance expedites consensus between regulatory parties by emphasizing a team approach to Track 2 investigations.

  17. CHARACTERIZATION OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...

  19. Preparedness of hazardous materials emergencies in railyards: Guidance for railroads and adjacent communities

    SciTech Connect

    Not Available

    1991-09-01

    Railroads are a key part of the distribution system for hazardous materials and, thus, much hazardous material passes through railyards en route to intermediate or final consumers. While the vast majority of these materials are shipped without incident, both the number of shipments and the nature of the materials themselves dictate that railyards and surrounding communities be prepared to respond quickly and effectively to emergencies. This report contains information on 11 emergency preparedness functions and 150 guidance recommendations.

  20. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  1. Hazard categorization -- interpretation of DOE guidance DOE-STD-1027-92

    SciTech Connect

    Gibson, K.D.

    1996-04-30

    The methodology used in the production of the Hazard Category 2 TQs is verified and presented in the above calculations (Section 3.0). The analysis is done as consistently as possible with that prescribed in DOE-STD-1027-92. Determining the TQ limits with the standard's variables consistently produced the listed Hazard Category 2 TQs for the six major isotopes present in the CWC waste (100% of the isotopes attempted in their analysis) and verifies their approach without changing release fractions or receptor locations. An example of a modification of the TQ for a change in the release fraction is provided in the standard, Attachment 1, pages A-6 through A-10. However, changing the TQ's receptor location is another issue. Modification of the TQ because of facility location is allowed in the standard, but no definite approach is described there. This change is best represented through a change in the X/Q value. This paper provides a defensible interface with DOE guidance DOE-STD-1027-92 and proposes a methodology for standardizing changes in the TQs because of release fractions or receptor locations. By using this information as the basis for the construction of the TQs as presented in DOE-STD-1027-92, it is determined that the GXQ program will produce TQs that are consistently conservative. The X/Q values generated with the GXQ program do not reproduce the TQs listed in the standard, and as such, X/Q values generated from the GXQ code should not be used in the determination of TQs. The information presented in this paper is expected to impact future analyses and other facilities. Specifically, the establishment of a receptor location for the final hazard category determination will be needed for future analyses. The methodology presented in this paper can be applied to the production of a hazard category threshold that takes into account different release fractions and/or receptor locations, while ensuring the expected dose to the onsite receptor

  2. Hazardous Materials Hazard Analysis, Portland, Oregon.

    DTIC Science & Technology

    1981-06-01

    ...regional equipment, crossings, and other aspects of overall system safety. For further information: Henry Jacobs, Hazardous Materials Inspector, U.S... cross-training and an increased inspection effort. In the last year, twelve Oregon State Police Officers who patrol highways around the Portland area... Response System, The International Fire Chief (date unknown). Lee, Myra T. and Roe, Penelope G., Hazardous Materials Management System: A Guide for Local

  3. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during project conceptual design and provides input to it. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  4. 76 FR 23823 - Guidance for Industry on Fish and Fishery Products Hazards and Controls, Fourth Edition...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... fishery products using hazard analysis and critical control point (HACCP) methods. DATES: Submit either... current thinking on fish and fishery products hazards and controls. It does not create or confer any...

  5. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic-type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than the empirical attenuation relationships used in PSHA to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and for sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
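    The Monte Carlo approach mentioned in this abstract can be caricatured in a few lines of Python. Every parameter here (the annual source rate, the lognormal runup distribution, the threshold heights) is an illustrative placeholder, not a value from the study:

    ```python
    import random

    def toy_tsunami_hazard(n_years=100_000, annual_rate=0.01,
                           mu=0.5, sigma=0.8, thresholds=(1.0, 2.0, 5.0)):
        """Toy Monte Carlo PTHA sketch: simulate many years of
        Poisson-occurring tsunami sources, draw a lognormal runup height
        for each event, then count threshold exceedances."""
        runups = [random.lognormvariate(mu, sigma)
                  for _ in range(n_years) if random.random() < annual_rate]
        # Estimated annual probability that runup exceeds each threshold (m)
        return {h: sum(r > h for r in runups) / n_years for h in thresholds}
    ```

    A real PTHA replaces the lognormal draw with numerical propagation from a catalog of near- and far-field sources, but the structure, simulate sources and tabulate exceedance rates per runup level, is the same.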

  6. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    SciTech Connect

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  7. Occupational Safety and Health Guidance Manual for Hazardous Waste Site Activities

    DTIC Science & Technology

    1985-10-01

    ...environment and the cleanup of inactive waste disposal sites." This manual is a guidance document for managers responsible for occupational safety and health... programs at inactive hazardous waste sites. It assumes a basic knowledge of science and experience in occupational safety and health. It is the... product of a four-agency committee (the National Institute for Occupational Safety and Health (NIOSH), the Occupational Safety and Health Administration (OSHA)

  8. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  9. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  10. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  11. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  12. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  13. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing of a VTOL aircraft onboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is utilized for assessing the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions called lulls and swells. A real-time algorithm for lull/swell classification based upon ship motion pattern features is developed. The classification algorithm is used to command a go/no-go signal to indicate the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go, pattern-based letdown guidance strategy improves touchdown performance.

  14. CHARACTERISTICS OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    EPA's recently published draft Risk Burn Guidance recommends that hazardous waste combustion facilities complete a mass balance of the total organics (TOs) that may be emitted from the combustor. TOs, consisting of three distinct fractions (volatile, semivolatile, and nonvolatile...

  16. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment, using Hazard Identification and Preliminary Hazards Analysis (PHA) techniques, identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warrant additional analysis. The subsequent assessment was based on a qualitative What-If analysis, augmented as necessary by quantitative analysis for scenarios involving a release of hazardous material or energy with the potential to affect the public.

  17. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations, and disposal activities associated with hardware, software, maintenance, operations, and environments. An integrated hazard is an event or condition that is caused or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, system, and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls, and verifications used to mitigate the risk of catastrophic loss of crew, vehicle, and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan, and then communicates/coordinates with large and small components, each contributing necessary hardware, software, and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors, and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators, and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do into designing the hardware, writing the software code, and analyzing competitive proposals. This paper will discuss the necessary and...

  18. Hazardous waste site remediation cost savings using technical impracticability and intrinsic remediation guidance

    SciTech Connect

    Saroff, S.T.; Gefell, M.J.; Thomas, G.

    1998-12-31

    The application of recent USEPA guidance and regulations pertaining to the technical impracticability (TI) of remediating ground-water contamination source areas and to intrinsic (natural) remediation of aqueous contaminant plumes at hazardous waste sites can result, and has resulted, in significant cost savings for industry and utilities. The USEPA (and certain state programs) now allow potentially responsible parties (PRPs) to implement these alternative restoration methods, which are practical, implementable, and cost-effective. TI determinations involve a regulatory process to: (1) document that it is not practical in many cases to comply with certain applicable or relevant and appropriate requirements (ARARs) (i.e., ground-water standards); and (2) establish an alternative strategy while still meeting the requirements of the NCP. Additionally, the USEPA supports intrinsic remediation, or natural attenuation, as a restoration method through the NCP and various CERCLA, RCRA, and Leaking Underground Storage Tank (LUST) guidance, and has encouraged it at recent USEPA-sponsored conferences.

  19. 78 FR 18273 - Draft Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning; Availability AGENCY: Food and...: Purchasing Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning.'' The draft guidance, when finalized, will advise primary seafood processors who purchase reef fish how to minimize the...

  20. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. (Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  1. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  2. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  3. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  4. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  5. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    SciTech Connect

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  6. Canister Storage Building (CSB) Hazard Analysis Report

    SciTech Connect

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first is a complete list of the team members involved over the two-year process; the second is the subset of members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other...

  7. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    EPA Science Inventory

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  9. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  10. Canister storage building hazard analysis report

    SciTech Connect

    POWERS, T.B.

    1999-05-11

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process". This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports".

  11. Abstraction and analysis of clinical guidance trees.

    PubMed

    Turner, Kenneth J

    2009-04-01

    The aims of this work were: to define an abstract notation for interactive decision trees; to formally analyse exploration errors in such trees through automated translation to Lotos (language of temporal ordering specification); to generate tree implementations through automated translation for an existing tree viewer, and to demonstrate the approach on healthcare examples created by the CGT (clinical guidance tree) project. An abstract and machine-readable notation was developed for describing clinical guidance trees: Ad/it (abstract decision/interactive trees). A methodology has been designed for creating trees using Ad/it. In particular, tree structure is separated from tree content. Tree structure and flow are designed and evaluated before committing to detailed content of the tree. Software tools have been created to translate Ad/it tree descriptions into Lotos and into CGT Viewer format. These representations support formal analysis and interactive exploration of decision trees. Through automated conversion of existing CGT trees, realistic healthcare applications have been used to validate the approach. All key objectives of the work have been achieved. An abstract notation has been created for decision trees, and is supported by automated translation and analysis. Although healthcare applications have been the main focus to date, the approach is generic and of value in almost any domain where decision trees are useful.
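    The paper's separation of tree structure from tree content can be illustrated with a minimal sketch. This is hypothetical; Ad/it's actual notation and the Lotos translation are not reproduced here.

    ```python
    # Structure: which child each answer leads to, keyed by decision node.
    # Content: the question or outcome text, keyed separately, so the tree's
    # flow can be designed and analysed before its wording is finalised.
    structure = {"root": {"yes": "treat", "no": "refer"}}
    content = {
        "root": "Is the symptom present?",
        "treat": "Begin treatment.",
        "refer": "Refer to a specialist.",
    }

    def traverse(node, answers):
        """Follow the structure using the given answers until a leaf is
        reached, then return that leaf's content (the recommendation)."""
        while node in structure:
            node = structure[node][answers[node]]
        return content[node]
    ```

    Keeping structure and content in separate tables is what allows the structure alone to be translated and formally checked for exploration errors.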

  12. Hazard screening application guide. Safety Analysis Report Update Program

    SciTech Connect

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses, with no credit given for mitigation of an accident except for certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.
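    The bounding-comparison style of screening described above can be sketched as a simple threshold lookup. The thresholds and category names below are illustrative only; an actual screening program takes its values from the governing standard.

    ```python
    # Hypothetical screening thresholds (kg of unmitigated inventory).
    # Ordered highest first so the first match is the bounding category.
    THRESHOLDS = [(1000.0, "high"), (100.0, "moderate"), (0.0, "low")]

    def screen_hazard(inventory_kg):
        """Group a process by hazard magnitude using a bounding comparison
        of its unmitigated material inventory against screening thresholds
        (no credit taken for mitigation, per the bounding-analysis approach)."""
        for limit, category in THRESHOLDS:
            if inventory_kg >= limit:
                return category
        return "low"
    ```

    The resulting category then drives the extent of follow-on safety analysis for that process or facility.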

  13. An application of extremal field guidance in statistical low thrust guidance analysis

    NASA Technical Reports Server (NTRS)

    Jacobson, R. A.

    1980-01-01

    The subject of this paper is the application of extremal field guidance to the covariance analysis of low thrust retargeting guidance. The general procedure for low thrust navigation analysis is briefly reviewed. The basic equations of trajectory optimization and extremal field guidance are presented with emphasis on the low thrust problem. The adopted formulation includes thruster on-off times as control variables and accounts for the effects of model parameter variations. A method of solution of the guidance equations is outlined; and the application of the solution in a covariance analysis, including a procedure for the treatment of model parameter variations as either biases or first order Gauss-Markov stochastic processes, is discussed.

  14. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters, and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  15. MONITORED GEOLOGIC REPOSITORY INTERNAL HAZARDS ANALYSIS

    SciTech Connect

    S.E. Salzman

    1999-11-05

    This analysis was performed by the Management and Operating Contractor (M&O) Safety Assurance Department to identify and document the internal hazards and preliminary events associated with preclosure operations of the Monitored Geologic Repository (MGR). Internal hazards are those hazards presented by operation of the facility and associated processes. These are in contrast to external hazards which involve natural phenomena and external man-made hazards. The hazard analysis methodology used in this analysis provides a systematic means to identify facility hazards and associated events that may result in radiological consequences to the public and facility worker during the MGR preclosure period. The events are documented in a preliminary events list and are intended to be used as input to the MGR Design Basis Event (DBE) selection process. It is expected that the results from this analysis will undergo further screening and analysis based on the criteria that apply to the performance of DBE analyses for the preclosure period of repository operation. As the MGR design progresses, this analysis will be reviewed to ensure no new hazards are introduced and that previously evaluated hazards have not increased in severity.

  16. Cold Vacuum Drying Facility hazard analysis report

    SciTech Connect

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  17. Canister storage building hazard analysis report

    SciTech Connect

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  18. POTW Sludge Sampling and Analysis Guidance Document

    EPA Pesticide Factsheets

    In view of the variability of municipal sludge quality, appropriate procedures must be followed to collect and analyze samples that accurately represent each POTW's sludge quality. This manual was developed to provide that guidance to POTW operators, engin...

  19. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  20. Guidance on spatial wildland fire analysis: models, tools, and techniques

    Treesearch

    Richard D. Stratton

    2006-01-01

    There is an increasing need for spatial wildland fire analysis in support of incident management, fuel treatment planning, wildland-urban assessment, and land management plan development. However, little guidance has been provided to the field in the form of training, support, or research examples. This paper provides guidance to fire managers, planners, specialists,...

  1. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
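    The core idea of a hazard interaction, one hazard modifying the input to another hazard's risk model, can be sketched as follows. This is a hypothetical illustration of the subsidence/flood example, not the RASOR implementation.

    ```python
    def flood_loss_with_subsidence(base_depth_m, subsidence_m, damage_fn):
        """Illustrative hazard interaction: land subsidence deepens the
        effective inundation, so flood damage is evaluated at the adjusted
        depth rather than at the flood depth alone. Ignoring the interaction
        (using base_depth_m only) underestimates the loss."""
        return damage_fn(base_depth_m + subsidence_m)
    ```

    With a monotone depth-damage function, any positive subsidence yields a strictly larger loss than the single-hazard estimate, which is the underestimation the abstract points to.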

  2. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  3. Integrating waste management with Job Hazard analysis

    SciTech Connect

    2007-07-01

    The web-based Automated Job Hazard Analysis (AJHA) system is a tool designed to help capture and communicate the results of the hazard review and mitigation process for specific work activities. In Fluor Hanford's day-to-day work planning and execution process, AJHA has become the focal point for integrating Integrated Safety Management (ISM) through industrial health and safety principles; environmental safety measures; and involvement by workers, subject-matter experts, and management. This paper illustrates how AJHA has become a key element in involving waste-management and environmental-control professionals in planning and executing work. To support implementing requirements for waste management and environmental compliance within the core functions and guiding principles of an integrated safety management system (ISMS), Fluor Hanford has incorporated AJHA into the work management process. This web-based software tool helps integrate the knowledge of site workers and subject-matter experts with the safety principles and requirements established in standards and regulations. AJHA facilitates a process of work-site review, hazard identification, analysis, and the determination of specific work controls. The AJHA application produces a well-organized job hazard analysis report, including training and staffing requirements, prerequisite actions, notifications, and specific work controls listed for each sub-task determined for the job. AJHA lists common hazards addressed in U.S. Occupational Safety and Health Administration (OSHA) federal codes and in state regulations such as the Washington Industrial Safety and Health Administration (WISHA) requirements. AJHA also lists extraordinary hazards that are unique to a particular industry sector, such as radiological hazards and waste management. The work-planning team evaluates the scope of work and reviews the work site to identify potential hazards. Hazards...

  4. Preliminary hazards analysis conducted on FCCU complex

    SciTech Connect

    Rooney, J.J.; Turner, J.H.; Arendt, J.S.

    1988-08-08

    A preliminary hazards analysis (PHA) was conducted on an actual fluid catalytic cracking unit (FCCU) complex at a refinery. The PHA was conducted to improve the FCCU's availability and profitability. Such analyses can be important because the turbulent economic conditions in the refining industry have complicated the decision-making process for refinery and process plant managers. To decide how to allocate their limited resources, managers need to know the hazards that present the greatest threat to refinery profitability (economic risks) and the most cost-effective options for dealing with those hazards. An economic-risk-based preliminary hazards analysis is an efficient way to identify areas in a refinery or process unit that are at unacceptably high risk. A PHA also allows for the formulation of practical, cost-effective recommendations for increasing unit on-stream time factors.
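    An economic-risk-based ranking of the kind described reduces to ordering hazards by expected annual loss (frequency times consequence). A minimal sketch, with illustrative field names and data:

    ```python
    def rank_by_economic_risk(hazards):
        """Rank hazard scenarios by expected annual loss, the screening
        metric an economic-risk-based PHA uses to find the areas at
        unacceptably high risk. The dict keys are illustrative."""
        return sorted(
            hazards,
            key=lambda h: h["freq_per_yr"] * h["loss_usd"],
            reverse=True,  # highest expected loss first
        )
    ```

    A low-frequency, high-consequence scenario can outrank a frequent minor one, which is why ranking on the product, rather than on frequency or consequence alone, guides resource allocation.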

  5. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard analysis. (1) Every official establishment shall conduct, or have conducted for it, a hazard analysis...

  6. Analysis of Regulatory Guidance for Health Monitoring

    NASA Technical Reports Server (NTRS)

    Munns, Thomas E.; Beard, Richard E.; Culp, Aubrey M.; Murphy, Dennis A.; Kent, Renee M.; Cooper, Eric G. (Technical Monitor)

    2000-01-01

    The purpose of this study was to assess the connection between current FAA regulations and the incorporation of Health Management (HM) systems into commercial aircraft. To address the overall objectives, ARINC: (1) investigated FAA regulatory guidance; (2) investigated airline maintenance practices; (3) systematically identified regulations and practices that would be affected by, or could act as barriers to, the introduction of HM technology; and (4) assessed regulatory and operational tradeoffs that should be considered for implementation. The assessment procedure was validated on a postulated structural HM capability for the B757 horizontal stabilizer.

  7. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: the occurrence of a fire or related event; a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public, or the environment; vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; property losses from fire and related events exceeding limits established by DOE; and critical process controls and safety-class systems being damaged as a result of a fire and related events.

  8. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) the occurrence of a fire or related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public, or the environment; (3) vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from fire and related events exceeding limits established by DOE; and (5) critical process controls and safety-class systems being damaged as a result of a fire and related events.

  9. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point-mass, three-degree-of-freedom model is presented as a basic development tool for PC-based simulation models. The model has been used in the development of guidance algorithms as well as in other applications, such as performance management systems that compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.
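    A point-mass, three-degree-of-freedom model of this kind amounts to integrating translational acceleration along each axis; a minimal Euler-step sketch with a hypothetical interface:

    ```python
    def step(state, accel, dt):
        """One Euler integration step of a point-mass (3-DOF translational)
        model. state = (position, velocity), each a 3-element list; accel is
        the commanded plus environmental acceleration in the same frame.
        Semi-implicit form: velocity is updated first, then position."""
        pos, vel = state
        new_vel = [v + a * dt for v, a in zip(vel, accel)]
        new_pos = [p + v * dt for p, v in zip(pos, new_vel)]
        return new_pos, new_vel
    ```

    Repeatedly stepping this model under a guidance law's commanded accelerations is the basic loop such a PC-based simulation would run; attitude dynamics are deliberately omitted in a 3-DOF formulation.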

  10. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  11. General RMP Guidance - Appendix D: OSHA Guidance on PSM

    EPA Pesticide Factsheets

    OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.

  12. Repository Subsurface Preliminary Fire Hazard Analysis

    SciTech Connect

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features, fire, and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c) which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  13. Guidance on the Major Source Determination for Certain Hazardous Air Pollutants

    EPA Pesticide Factsheets

    This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  14. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to correctly treat dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design

  15. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  16. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  17. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  18. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  19. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  20. WIPP waste characterization program sampling and analysis guidance manual

    SciTech Connect

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  1. 16 CFR 1500.231 - Guidance for hazardous liquid chemicals in children's products.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...). A toy or other article intended for use by children that contains an accessible and harmful amount..., maze toys, liquid timers, and keychains, that contain hazardous chemicals. In several of these cases...

  2. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic-tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
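
    The computational core of such a probabilistic hazard estimate can be sketched in a few lines: integrate ground-motion exceedance probabilities over a magnitude-recurrence model for each source, then weight alternative logic-tree branches. The sources, ground-motion relation, and branch weights below are invented placeholders, not the models used in the Cairo study.

```python
import math

# Toy seismic sources: annual rate of M>=5 events and distance to site (km).
# These numbers are illustrative placeholders, not the Cairo model.
sources = [
    {"rate_m5": 0.05, "dist_km": 30.0},   # near-field source
    {"rate_m5": 0.20, "dist_km": 150.0},  # far-field source
]

def gmpe_median_pga(magnitude, dist_km):
    """Very simplified ground-motion relation (g). Placeholder functional form."""
    return math.exp(-1.0 + 0.9 * magnitude - 1.3 * math.log(dist_km + 10.0))

def prob_exceed(pga_target, magnitude, dist_km, sigma_ln=0.6):
    """P(PGA > target | event), lognormal scatter about the median."""
    z = (math.log(pga_target) - math.log(gmpe_median_pga(magnitude, dist_km))) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(pga_target, b_value=1.0, m_min=5.0, m_max=7.5, dm=0.1):
    """Sum over sources and magnitude bins (truncated Gutenberg-Richter recurrence)."""
    beta = b_value * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    rate = 0.0
    n_bins = int(round((m_max - m_min) / dm))
    for src in sources:
        for i in range(n_bins):
            m = m_min + i * dm
            # Fraction of events falling in this magnitude bin.
            f = (math.exp(-beta * (m - m_min)) - math.exp(-beta * (m + dm - m_min))) / norm
            rate += src["rate_m5"] * f * prob_exceed(pga_target, m + dm / 2, src["dist_km"])
    return rate

# Epistemic uncertainty via logic-tree branches on the b-value.
branches = [(0.9, 0.3), (1.0, 0.4), (1.1, 0.3)]  # (b-value, weight)
for pga in (0.05, 0.1, 0.2):
    lam = sum(w * annual_exceedance_rate(pga, b_value=b) for b, w in branches)
    print(f"PGA > {pga:.2f} g: annual rate = {lam:.2e}, return period = {1/lam:.0f} yr")
```

    Reading hazard at fixed return periods (e.g., 224 or 4745 years) amounts to inverting this curve: finding the PGA whose weighted annual exceedance rate equals the reciprocal of the return period.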

  3. New Hypervelocity Terminal Intercept Guidance Systems for Deflecting/Disrupting Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Lyzhoft, Joshua Richard

    Computational modeling and simulations of visual and infrared (IR) sensors are investigated for a new hypervelocity terminal guidance system for intercepting small asteroids (50 to 150 meters in diameter). Computational software tools for signal-to-noise ratio estimation of visual and IR sensors, estimation of minimum and maximum ranges of target detection, and GPU (Graphics Processing Unit)-accelerated simulations of the IR-based terminal intercept guidance systems are developed. Scaled polyhedron models of known objects, such as the Rosetta mission's Comet 67P/C-G, NASA's OSIRIS-REx target Bennu, and asteroid 433 Eros, are utilized in developing a GPU-based simulation tool for the IR-based terminal intercept guidance systems. A parallelized ray-tracing algorithm for simulating realistic surface-to-surface shadowing of irregularly shaped asteroids or comets is developed. Polyhedron solid-angle approximation is also considered. Using these computational models, digital image processing is investigated to determine single or multiple impact locations to assess the technical feasibility of new planetary defense mission concepts utilizing a Hypervelocity Asteroid Intercept Vehicle (HAIV) or a Multiple Kinetic-energy Interceptor Vehicle (MKIV). Study results indicate that the IR-based guidance system outperforms the visual-based system in asteroid detection and tracking. When using an IR sensor, predicting impact locations from filtered images resulted in less jittery spacecraft control accelerations than conducting missions with a visual sensor. Infrared sensors also have the potential to detect asteroids at greater distances and, if properly used, can aid in terminal-phase guidance for proper impact location determination for the MKIV system. 
Emerging new topics of the Minimum Orbit Intersection Distance (MOID) estimation and the Full-Two-Body Problem (F2BP) formulation are also investigated to assess a potential near-Earth object collision risk and the proximity gravity

  4. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability, and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact, and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine, and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts, and environmental degradation as the priority disaster hazards.
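
    A simple ranking exercise of the kind described above can be sketched as follows. The hazard names come from the abstract, but the ordinal scores and the composite formula are illustrative assumptions, not the study's actual data or method.

```python
# Illustrative hazard ranking: each hazard gets 1-5 scores for likelihood,
# impact, and vulnerability. The scores below are invented for illustration.
hazards = {
    "Epidemics of infectious diseases": (5, 5, 4),
    "Drought/famine": (4, 5, 4),
    "Conflict": (3, 4, 5),
    "Environmental degradation": (4, 3, 3),
}

def risk_score(likelihood, impact, vulnerability):
    # One common simple composite: likelihood x impact, weighted by vulnerability.
    return likelihood * impact * vulnerability

ranked = sorted(hazards.items(), key=lambda kv: risk_score(*kv[1]), reverse=True)
for rank, (name, scores) in enumerate(ranked, start=1):
    print(f"{rank}. {name} (score {risk_score(*scores)})")
```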

  5. RMP Guidance for Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) consists of a worst-case release scenario and alternative release scenarios. OCA is required from facilities with chemicals above threshold quantities. RMP*Comp software can be used to perform calculations described here.

  6. RMP Guidance for Offsite Consequence Analysis - Appendices

    EPA Pesticide Factsheets

    Assists owners/operators of processes covered by Chemical Accident Prevention Program rule in analysis of offsite consequences for toxic or flammable substances. Includes methods and tables for calculating worst-case/alternative scenarios and endpoints.

  7. Trauma signature analysis of the great East Japan disaster: guidance for psychological consequences.

    PubMed

    Shultz, James M; Forbes, David; Wald, David; Kelly, Fiona; Solo-Gabriele, Helena M; Rosen, Alexa; Espinel, Zelde; McLean, Andrew; Bernal, Oscar; Neria, Yuval

    2013-04-01

    On March 11, 2011, Japan experienced the largest earthquake in its history. The undersea earthquake launched a tsunami that inundated much of Japan's eastern coastline and damaged nuclear power plants, precipitating multiple reactor meltdowns. We examined open-source disaster situation reports, news accounts, and disaster-monitoring websites to gather event-specific data to conduct a trauma signature analysis of the event. The trauma signature analysis included a review of disaster situation reports; the construction of a hazard profile for the earthquake, tsunami, and radiation threats; enumeration of disaster stressors by disaster phase; identification of salient evidence-based psychological risk factors; summation of the trauma signature based on exposure to hazards, loss, and change; and review of the mental health and psychosocial support responses in relation to the analysis. Exposure to this triple-hazard event resulted in extensive damage, significant loss of life, and massive population displacement. Many citizens were exposed to multiple hazards. The extremity of these exposures was partially mitigated by Japan's timely, expert-coordinated, and unified activation of an evidence-based mental health response. The eastern Japan disaster was notable for its unique constellation of compounding exposures. Examination of the trauma signature of this event provided insights and guidance regarding optimal mental health and psychosocial responses. Japan orchestrated a model response that reinforced community resilience.

  8. Decision analysis for INEL hazardous waste storage

    SciTech Connect

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  9. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
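
    The prioritization idea in this abstract, severity and likelihood filtered by modeling difficulty, might be sketched as below. The scenario names and ordinal scores are hypothetical, and the composite formula is one plausible reading of the framework, not the authors' actual metric.

```python
# Categorize hazard scenarios by severity, likelihood, and "modeling
# difficulty", then rank the ones best suited to quantitative analysis
# (high severity/likelihood, low modeling difficulty). Scores are invented.
scenarios = [
    {"name": "wake encounter on parallel approach", "severity": 4, "likelihood": 3, "difficulty": 2},
    {"name": "runway incursion", "severity": 5, "likelihood": 2, "difficulty": 4},
    {"name": "missed-approach conflict", "severity": 3, "likelihood": 3, "difficulty": 1},
]

def priority(s):
    # Higher severity and likelihood raise priority; higher modeling
    # difficulty lowers it, since such scenarios are harder to quantify.
    return s["severity"] * s["likelihood"] / s["difficulty"]

for s in sorted(scenarios, key=priority, reverse=True):
    print(f'{s["name"]}: priority {priority(s):.1f}')
```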

  11. Economic Analysis of Hazardous Waste Minimization Alternatives

    DTIC Science & Technology

    1992-08-01

    T. Page, R. Harris, and J. Bruser, Removal of Carcinogens from Drinking Water: A Cost-Benefit Analysis (Consulting Associates, Inc., 15 June 1987), Chapter 7; Pretreatment of Hazardous Waste, EPA/600/D-87/047 (EPA, January 1987), pp. 58-70.

  12. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
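
    One topic the report covers, moment formulas for combinations of failure modes, can be illustrated with a minimal sketch: for independent Poisson failure modes, point estimates of the rates add, and so do the variances of independent estimates. The failure modes, event counts, and exposure times below are hypothetical.

```python
# Hypothetical operational event data: (event count, exposure hours) per mode.
modes = {
    "fails to start": (12, 4800.0),
    "fails to run":   (3, 9600.0),
}

estimates = {}
for name, (n, t) in modes.items():
    rate = n / t      # maximum-likelihood Poisson rate estimate (per hour)
    var = n / t**2    # variance of that estimate (Var[N]/t^2 with N Poisson)
    estimates[name] = (rate, var)

# For independent modes, combined rate and variance are simple sums.
total_rate = sum(r for r, _ in estimates.values())
total_var = sum(v for _, v in estimates.values())
print(f"combined rate = {total_rate:.2e}/h, std error = {total_var**0.5:.2e}/h")
```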

  13. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  14. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  15. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  16. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  17. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  18. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (a) of this section, which must be controlled for each process. (2) List the critical control points... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard... determine the food safety hazards reasonably likely to occur in the production process and identify...

  19. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... entitled, ``Technical Guidance for Assessing Environmental Justice in Regulatory Analysis.'' The purpose of... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis Docket, EPA/DC, EPA West, Room... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis is available in the public...

  20. Characterizing adoption of precautionary risk management guidance for nanomaterials, an emerging occupational hazard.

    PubMed

    Schubauer-Berigan, Mary K; Dahm, Matthew M; Schulte, Paul A; Hodson, Laura; Geraci, Charles L

    2015-01-01

    Exposure to engineered nanomaterials (substances with at least one dimension of 1-100 nm) has been of increased interest, with the recent growth in production and use of nanomaterials worldwide. Various organizations have recommended methods to minimize exposure to engineered nanomaterials. The purpose of this study was to evaluate available data to examine the extent to which studied U.S. companies (which represent a small fraction of all companies using certain forms of engineered nanomaterials) follow the guidelines for reducing occupational exposures to engineered nanomaterials that have been issued by the National Institute for Occupational Safety and Health (NIOSH) and other organizations. Survey data, field reports, and field notes for all NIOSH nanomaterial exposure assessments conducted between 2006 and 2011 were collected and reviewed to: (1) determine the level of adoption of precautionary guidance on engineering controls and personal protective equipment (PPE), and (2) evaluate the reliability of companies' self-reported use of engineering controls and PPE. Use of PPE was observed among 89% [95% confidence interval (CI): 76%-96%] of 46 visited companies, and use of containment-based engineering controls for at least some processes was observed among 83% (95% CI: 76%-96%). In on-site evaluations, more than 90% of the 16 engineered carbonaceous nanomaterial companies that responded to an industrywide survey were observed to be using engineering controls and PPE as reported or more stringently than reported. Since PPE use was slightly more prevalent than engineering controls, better communication may be necessary to reinforce the importance of the hierarchy of controls. These findings may also be useful in conducting exposure assessment and epidemiologic research among U.S. workers handling nanomaterials.

  1. Space Shuttle Orbiter entry guidance and control system sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Stone, H. W.; Powell, R. W.

    1976-01-01

    An approach has been developed to determine the guidance and control system sensitivity to off-nominal aerodynamics for the Space Shuttle Orbiter during entry. This approach, which uses a nonlinear six-degree-of-freedom interactive, digital simulation, has been applied to both the longitudinal and lateral-directional axes for a portion of the orbiter entry. Boundary values for each of the aerodynamic parameters have been identified, the key parameters have been determined, and system modifications that will increase system tolerance to off-nominal aerodynamics have been recommended. The simulations were judged by specified criteria and the performance was evaluated by use of key dependent variables. The analysis is now being expanded to include the latest shuttle guidance and control systems throughout the entry speed range.

  2. Preliminary hazards analysis for the National Ignition Facility

    SciTech Connect

    Brereton, S.J.

    1993-10-01

    This report documents the Preliminary Hazards Analysis (PHA) for the National Ignition Facility (NIF). In summary, it provides: a general description of the facility and its operation; identification of hazards at the facility; and details of the hazards analysis, including inventories, bounding releases, consequences, and conclusions. As part of the safety analysis procedure set forth by DOE, a PHA must be performed for the NIF. The PHA characterizes the level of intrinsic potential hazard associated with a facility, and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required, and the DOE Order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis report. The hazards of primary concern associated with NIF are radiological and toxicological in nature. The hazard classification is determined by comparing facility inventories of radionuclides and chemicals with threshold values for the various hazard classification levels and by examining postulated bounding accidents associated with the hazards of greatest significance. Such postulated bounding accidents cannot take into account active mitigative features; they must assume the unmitigated consequences of a release, taking into account only passive safety features. In this way, the intrinsic hazard level of the facility can be ascertained.
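
    The classification step described above, comparing facility inventories against threshold values for each hazard classification level, reduces to a simple lookup. The substances and threshold quantities below are arbitrary placeholders, not actual DOE classification values.

```python
# Hypothetical thresholds per substance: amounts at or above the first value
# give the higher hazard category; units are arbitrary for illustration.
thresholds = {
    "tritium": (30.0, 1.6),   # (Category 2 threshold, Category 3 threshold)
    "lead": (1.0e6, 1.0e5),
}
inventory = {"tritium": 5.0, "lead": 2.0e5}

def classify(substance, amount):
    cat2, cat3 = thresholds[substance]
    if amount >= cat2:
        return "Category 2"
    if amount >= cat3:
        return "Category 3"
    return "Below Category 3"

for substance, amount in inventory.items():
    print(f"{substance}: {amount} -> {classify(substance, amount)}")
```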

  3. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.
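
    The radar rainfall estimates discussed above rest on an empirical reflectivity-rain-rate (Z-R) power law. A minimal sketch, using the commonly cited default coefficients a = 300 and b = 1.4 (operational values vary by storm type and are an assumption here):

```python
# Radar rainfall estimation uses an empirical Z-R power law, Z = a * R**b,
# relating reflectivity Z (mm^6/m^3) to rain rate R (mm/h).
def rain_rate_from_dbz(dbz, a=300.0, b=1.4):
    """Invert Z = a * R**b for R, with reflectivity given in dBZ (10*log10 Z)."""
    z = 10.0 ** (dbz / 10.0)
    return (z / a) ** (1.0 / b)

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
```

    The factor-of-2 discrepancies the abstract reports are consistent with the sensitivity of such power laws: a modest error in the assumed coefficients or in Z itself translates into a large error in R.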

  4. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

    The estimation of extreme return levels (down to an annual exceedance probability of 10-4) for natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. The technique was introduced in the 1960s and is widely used in various domains, including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the Methodological Guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority (ASN). The aim of this presentation is to introduce the main concepts of RFA and to illustrate the latest innovations in its application, developed by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. 1. ASN, Guide pour la Protection des installations nucléaires de base contre les inondations externes. 2013, ASN. p. 44.
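
The index-flood idea behind RFA can be sketched in a few lines: annual maxima from several sites in a homogeneous region are normalized by each site's mean (the "index"), pooled into one regional sample, fitted, and rescaled back to a site of interest. The surge values, the Gumbel choice, and the method-of-moments fit below are illustrative assumptions, not the EDF R&D procedure.

```python
# Minimal index-flood RFA sketch with invented skew-surge data.
import math
from statistics import mean, stdev

EULER_GAMMA = 0.5772156649015329

def gumbel_return_level(sample, T):
    """Method-of-moments Gumbel fit; returns the T-year return level."""
    beta = stdev(sample) * math.sqrt(6) / math.pi
    mu = mean(sample) - EULER_GAMMA * beta
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Invented annual-maximum skew surges (m) at three sites of one region.
sites = [[0.8, 1.1, 0.9, 1.4, 1.0],
         [0.5, 0.7, 0.6, 0.9, 0.8],
         [1.2, 1.6, 1.3, 1.9, 1.5]]

# Normalize each site by its mean (the "index") and pool the region.
pooled = [x / mean(s) for s in sites for x in s]
growth_100 = gumbel_return_level(pooled, 100)   # regional growth factor
site_level = growth_100 * mean(sites[0])        # rescaled to site 0
print(f"100-year level at site 0: {site_level:.2f} m")
```

Pooling multiplies the effective record length, which is the benefit the abstract describes; a real analysis would also test regional homogeneity and account for inter-site correlation.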

  5. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... entitled ``Documenting Statistical Analysis Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to... guidance entitled ``Draft Guidance for Industry on Documenting Statistical Analysis Programs and Data Files...

  6. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  7. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore area of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and by the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL), in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities, and for 69 locations of nuclear plants in the Eastern United States, for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and of characterizing the uncertainties by using a set of alternate spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997), and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), and members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of
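
The hazard integration underlying a PSHA of this kind can be sketched as a sum, over earthquake scenarios, of each scenario's annual rate times the probability that it exceeds a ground-motion level. All rates, magnitudes, distances, and ground-motion coefficients below are invented for illustration; a real study uses the expert-elicited source and ground-motion models described above.

```python
# Toy Cornell-McGuire style hazard curve: lambda(a) = sum_i nu_i * P[PGA > a | m_i, r_i],
# with exceedance probabilities from a lognormal ground-motion model.
import math

def lognormal_exceedance(a, median, sigma_ln):
    """P[PGA > a] when ln(PGA) ~ Normal(ln(median), sigma_ln**2)."""
    z = (math.log(a) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2))

def toy_gmpe(m, r_km):
    """Invented ground-motion model: median PGA (g) vs magnitude/distance."""
    return math.exp(-3.0 + 0.9 * m - 1.2 * math.log(r_km + 10.0))

# (annual rate, magnitude, distance in km) for three invented scenarios.
scenarios = [(0.05, 5.5, 20.0), (0.01, 6.5, 35.0), (0.002, 7.2, 50.0)]

def hazard(a, sigma_ln=0.6):
    """Annual rate of exceeding peak ground acceleration level a (in g)."""
    return sum(nu * lognormal_exceedance(a, toy_gmpe(m, r), sigma_ln)
               for nu, m, r in scenarios)

for a in (0.1, 0.2, 0.4):
    print(f"rate of PGA > {a:.1f} g: {hazard(a):.2e} per year")
```

Epistemic uncertainty enters by repeating this calculation over the alternate source and ground-motion models elicited from the experts and weighting the resulting curves.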

  8. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  9. 327 Building fire hazards analysis implementation plan

    SciTech Connect

    BARILO, N.F.

    1999-05-10

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. A status is provided for each recommendation in this document. BWHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  10. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  11. The use of hazards analysis in the development of training

    SciTech Connect

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  12. Analysis of cell locomotion. Contact guidance of human polymorphonuclear leukocytes.

    PubMed

    Matthes, T; Gruler, H

    1988-01-01

    The methods of statistical physics have been applied to the analysis of cell movement. Human polymorphonuclear leukocytes were exposed to different surfaces possessing parallel oriented physical structures (scratched glass surface, machine drilled aluminum surface, optical grid and stretched polyethylene foil) and cell migration was observed using time-lapse photography. We demonstrate that in cell migration along physical structures, referred to as contact guidance, two subgroups can be distinguished: 1) The nematic type, where the cell size is large in relation to the grid distance of the undulate surface. 2) The smectic type, where the cell size is small in relation to the grid distance of the substrate. Nematic contact guidance is characterized by an anisotropic random walk. In all substrates investigated the diffusion process parallel to the lines was faster than the diffusion process perpendicular to them. The angle-dependent diffusion coefficient was described by an ellipse. Deviation from a circle defined an apolar order parameter, whose value was about 0.3. The amount of information which the cells collected from the undulate surface was very low, between 0.1 and 0.2 bits. We demonstrate that cells do not recognize all the details of their surroundings and that their migration can be compared to the "groping around" of a short-sighted man. The blurred environment can be described by a mean field whose strength is proportional to the apolar order parameter. It is argued that the anisotropic surface tension is the basic source for nematic contact guidance. Smectic contact guidance is characterized by an anisotropic random walk and is quantified by a density order parameter which is 0.28 in the case of the scratched glass surface of a Neubauer counting chamber. The information which the cells collect from their environment is very low (0.03 bits). The lines seen by the cell can be described by a mean field whose strength is proportional to the density order parameter.
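
The anisotropic-random-walk statistics described in this abstract can be illustrated with a small simulation: cells take larger random steps parallel to the surface structure than perpendicular to it, the two diffusion coefficients are estimated from mean squared displacements, and their anisotropy yields an apolar order parameter. The step scales and sample sizes below are illustrative assumptions, not the measured data.

```python
# Toy anisotropic 2D random walk: estimate parallel/perpendicular diffusion
# and an apolar order parameter from the anisotropy of the diffusion ellipse.
import random

def diffusion_anisotropy(n_cells=500, n_steps=200, s_par=1.0, s_perp=0.6):
    """Simulate cells with larger random displacements along the grooves;
    return (D_par, D_perp, apolar order parameter)."""
    rng = random.Random(1)
    msd_x = msd_y = 0.0
    for _ in range(n_cells):
        x = y = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, s_par)    # step along the surface structure
            y += rng.gauss(0.0, s_perp)   # step across the surface structure
        msd_x += x * x
        msd_y += y * y
    d_par = msd_x / n_cells   # proportional to the parallel diffusion coeff.
    d_perp = msd_y / n_cells  # proportional to the perpendicular coeff.
    order = (d_par - d_perp) / (d_par + d_perp)
    return d_par, d_perp, order

d_par, d_perp, order = diffusion_anisotropy()
print(f"D_par/D_perp = {d_par / d_perp:.2f}, order parameter = {order:.2f}")
```

With equal step scales the order parameter tends to zero (isotropic walk); the stronger the guidance, the closer it approaches one.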

  13. Fire hazard analysis for the fuel supply shutdown storage buildings

    SciTech Connect

    REMAIZE, J.A.

    2000-09-27

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility.

  14. 78 FR 69992 - Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of Ciguatera...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... HUMAN SERVICES Food and Drug Administration 21 CFR Part 123 Guidance for Industry on Purchasing Reef... announcing the availability of a guidance for industry entitled ``Guidance for Industry: Purchasing Reef Fish... primary seafood processors who purchase reef fish on how to minimize the risk of ciguatera fish poisoning...

  15. Fire hazards analysis of transuranic waste storage and assay facility

    SciTech Connect

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  16. Identification of chemical hazards for security risk analysis activities.

    SciTech Connect

    Jaeger, Calvin Dell

    2005-01-01

    The presentation outline of this paper is: (1) How identification of chemical hazards fits into a security risk analysis approach; (2) Techniques for target identification; and (3) Identification of chemical hazards by different organizations. The summary is: (1) There are a number of different methodologies used within the chemical industry which identify chemical hazards: (a) Some develop a manual listing of potential targets based on published lists of hazardous chemicals or chemicals of concern, 'expert opinion' or known hazards. (b) Others develop a prioritized list based on chemicals found at a facility and consequence analysis (offsite release affecting population, theft of material, product tampering). (2) Identification of chemical hazards should include not only intrinsic properties of the chemicals but also potential reactive chemical hazards and potential use for activities off-site.

  17. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    This thesis describes the design of the next generation of spaceborne digital computers and analyzes a possible multiprocessor computer configuration. For the analysis, a set of representative space computing tasks was abstracted from the Apollo program's Lunar Module Guidance Computer programs as executed during the lunar landing. That computer performed about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job-scheduling function is critical to the efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.

  18. Military hazardous wastes: an overview and analysis

    SciTech Connect

    Kawaoka, K.E.; Malloy, M.C.; Dever, G.L.; Weinberger, L.P.

    1981-12-01

    The report describes and analyzes the management activities and motivating factors of the military in dealing with its hazardous waste streams. Findings and conclusions in areas of concern are given to provide information that may be of value to the future management of military hazardous wastes.

  19. The use of hazards analysis in the development of training

    SciTech Connect

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of the accident's consequence or prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  20. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
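
The empirical hazard curve described above can be sketched in a few lines: each simulated stochastic event carries an annual occurrence rate, and the mean annual rate of exceedance at a given inundation depth is the summed rate of the events whose simulated depth exceeds it. The event rates and depths below are hypothetical, not outputs of the paper's simulations.

```python
def empirical_hazard_curve(events, depths):
    """events: list of (annual_rate, simulated_depth_m) stochastic scenarios.
    Returns [(depth, mean annual rate of exceedance), ...]."""
    return [(d, sum(rate for rate, depth in events if depth > d))
            for d in depths]

# Invented stochastic events: (annual occurrence rate, inundation depth in m).
events = [(1e-3, 0.5), (5e-4, 1.2), (2e-4, 2.8), (5e-5, 6.0)]

curve = empirical_hazard_curve(events, [0.0, 1.0, 2.0, 5.0])
for d, lam in curve:
    print(f"rate of depth > {d:.1f} m: {lam:.1e} per year")
```

The "robust" alternative in the paper replaces this direct counting with a Bayesian fit to the simulated exceedance rates, which is what allows far fewer simulations for a stable curve.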

  2. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on the ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers and the ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off of nadir producing an illuminating spot geometry that can vary from circular (at nadir) to elliptical in shape (off of nadir). The JP Innovations Cr:LiSAF crystal laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by adjusting the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted for a particular spot size at a particular distance (elevation) from the laser by adjusting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operation elevation.

  3. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  4. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  5. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  6. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  7. Fire hazards analysis for solid waste burial grounds

    SciTech Connect

    McDonald, K.M.

    1995-09-28

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes fire potential, and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation.

  8. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    SciTech Connect

    CROWE, R.D.

    2000-08-07

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazard Analysis to support the CVDF Final Safety Analysis Report and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports,'' and implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports.''

  9. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  10. Guidance strategies and analysis for low thrust navigation

    NASA Technical Reports Server (NTRS)

    Jacobson, R. A.

    1973-01-01

    A low-thrust guidance algorithm suitable for operational use was formulated. A constrained linear feedback control law was obtained using a minimum terminal miss criterion and restricting control corrections to constant changes for specified time periods. Both fixed- and variable-time-of-arrival guidance were considered. The performance of the guidance law was evaluated by applying it to the approach phase of the 1980 rendezvous mission with the comet Encke.

  11. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  12. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  13. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  15. User’s Guide - Seismic Hazard Analysis

    DTIC Science & Technology

    1993-02-01

    Earthquake Magnitude Cutoff 8.5 example 8.8 Enter Site Longitude (Degrees) 117 example 115.0 Enter Site Latitude (Degrees) 38 example 38.5 Any Changes? Y / N ... the art for assessing earthquake hazards in the United States catalogue of strong motion earthquake records, Waterways Experiment Station, Vicksburg

  16. Hazardous Substance Release Reporting Under CERCLA, EPCRA §304 and DOE Emergency Management System (EMS) and DOE Occurrence Reporting Requirements. Environmental Guidance

    SciTech Connect

    Traceski, T.T.

    1994-06-01

    Releases of various substances from DOE facilities may be subject to reporting requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Emergency Planning and Community Right-to-Know Act (EPCRA), as well as DOE's internal "Occurrence Reporting and Processing of Operations Information" and the "Emergency Management System" (EMS). CERCLA and EPCRA are Federal laws that require immediate reporting of a release of a Hazardous Substance (HS) and an Extremely Hazardous Substance (EHS), respectively, in a Reportable Quantity (RQ) or more within a 24-hour period. This guidance uses a flowchart, supplemental information, and tables to provide an overview of the process to be followed, and more detailed explanations of the actions that must be performed, when chemical releases of HSs, EHSs, pollutants, or contaminants occur at DOE facilities. This guidance should be used in conjunction with, rather than in lieu of, applicable laws, regulations, and DOE Orders. Relevant laws, regulations, and DOE Orders are referenced throughout this guidance.

  17. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation, because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperature of liquid and slush hydrogen brings other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  18. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to support a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be attributed to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds that link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
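    The threshold-based link from a hazard indicator to a hazard level described above can be sketched as a small classification routine. The HI definition and the depth/velocity thresholds below are hypothetical placeholders, not values from the study:

```python
# Illustrative hazard-level classification from a flood hazard indicator (HI).
# The HI definition and all thresholds are assumed for illustration only.
def hazard_level(depth_m: float, velocity_ms: float = 0.0) -> str:
    """Map flood depth (scaled by velocity when it exceeds 1 m/s) to a class."""
    hi = depth_m * max(velocity_ms, 1.0)
    if hi < 0.1:
        return "low"
    if hi < 0.5:
        return "medium"
    if hi < 1.0:
        return "high"
    return "extreme"
```

The subjectivity the authors highlight is visible here: shifting any threshold reclassifies part of the catchment, which is exactly the uncertainty the proposed statistical methodology targets.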

  19. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, covering everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper, along with a summary and physics discussion of the EFT-1 driving factors found by the tool.
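    A minimal version of the bin-and-compare idea behind such a sensitivity measure (assuming nothing about the actual CFT internals) is to split one dispersed input into bins and look at the spread of per-bin success rates; a large spread flags a driving factor. The simulated data and requirement below are made up:

```python
import random

# Sketch: flag input variables whose value strongly affects requirement
# success rate. The "simulation" is a stand-in for a Monte Carlo run.
def success_sensitivity(samples, passed, n_bins=4):
    """Return the spread of per-bin success probabilities for one input.
    A large spread suggests the input is a driving ("critical") factor."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0
    counts = [[0, 0] for _ in range(n_bins)]  # [passes, total] per bin
    for x, ok in zip(samples, passed):
        b = min(int((x - lo) / width), n_bins - 1)
        counts[b][0] += ok
        counts[b][1] += 1
    probs = [p / t for p, t in counts if t > 0]
    return max(probs) - min(probs)

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(2000)]       # dispersed input
noise = [random.uniform(-1, 1) for _ in range(2000)]    # unrelated input
ok = [x < 0.5 for x in xs]                              # requirement depends on xs
spread_x = success_sensitivity(xs, ok)
spread_noise = success_sensitivity(noise, ok)
```

The variable the requirement actually depends on shows a much larger spread than the unrelated one, which is the signal such a tool looks for.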

  20. Department of Defense radiation hazards testing analysis

    NASA Astrophysics Data System (ADS)

    Fortinberry, Jarrod Douglas

    The United States Department of Defense tests its weapons systems against many different real as well as man-made environments prior to deploying the systems to the troops. The intent of testing is to ensure the systems function as they are intended without adverse reactions. One of the required tests is a Radiation Hazards test, which ensures that non-ionizing radio-frequency waves generated by transmitters, such as radios and jammers, do not cause harm to personnel, ordnance, or fuel. This test is typically performed at a DoD test lab, and data taken at the lab is used to determine safe operating parameters for a particular piece of equipment. This thesis presents measurements as well as mathematical models to demonstrate methods that can be employed to take more relevant Radiation Hazards data.

  1. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    SciTech Connect

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin; Rodriguez-Marek, Adrian; Falero, Valentina Montaldo; Youngs, Robert

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  2. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    SciTech Connect

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-07-01

    It is meaningful for tsunami assessment, as for seismic design, to evaluate phenomena beyond the design basis: even after a design basis tsunami height is set, the actual tsunami height may exceed it because of uncertainties in the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
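    The mean and fractile hazard curves produced by a logic-tree PTHA can be sketched pointwise: at a given tsunami height, each logic-tree branch contributes an exceedance probability with a weight. The branch values and weights below are invented for illustration:

```python
# Sketch of combining logic-tree branch hazard curves into mean and fractile
# curves at one tsunami height. Branch probabilities and weights are made up.
def fractile(values, weights, q):
    """Weighted q-fractile of branch exceedance probabilities."""
    pairs = sorted(zip(values, weights))
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= q:
            return v
    return pairs[-1][0]

# Annual exceedance probabilities at one height, from three branches:
branch_probs = [1e-4, 3e-4, 1e-3]
weights = [0.3, 0.5, 0.2]

mean_curve_point = sum(p * w for p, w in zip(branch_probs, weights))
median_point = fractile(branch_probs, weights, 0.5)   # 50th percentile curve
p95_point = fractile(branch_probs, weights, 0.95)     # 95th percentile curve
```

Repeating this at every tsunami height traces out the mean and the 5- to 95-percentile curves the abstract describes.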

  3. Space Propulsion Hazards Analysis Manual (SPHAM), volume 1

    NASA Technical Reports Server (NTRS)

    Becker, Dorothy L. (Editor)

    1989-01-01

    The Space Propulsion Hazards Analysis Manual (SPHAM) is a compilation of methods and data directed at hazards analysis and safety for space propulsion and associated vehicles, but broadly applicable to other environments and systems. Methods are described for compiling relevant regulatory documentation, deriving design requirements and specifications, modeling accident scenarios in formal risk assessments, and correlating real-time data with risk probability models. SPHAM also provides methods for predicting post-accident blast, fragmentation, thermal, and environmental damage. Included in the appendices are an exhaustive bibliography, hazardous-properties information on selected space propulsion commodities, and system descriptions of various launch vehicles, upper stages, and spacecraft.

  4. Fire Hazards Analysis for the 200 Area Interim Storage Area

    SciTech Connect

    JOHNSON, D.M.

    2000-01-06

    This document presents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards, and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480.7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480.7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public and to ensure that the property damage potential from fire is within acceptable limits. This FHA takes a graded approach commensurate with the complexity of the structure or area and the associated fire hazards.

  5. Overview hazard analysis for the H2Fuel Bus Program

    SciTech Connect

    Hovis, G.L.

    1996-06-18

    The H2Fuel Bus project is a joint development effort to produce a safe, near-zero-emission, 32-passenger bus that is propelled by electric power with continuous on-board hydrogen-powered battery recharging. A key initiative in the hydrogen bus development effort is a rigorous evaluation of operational safety. Westinghouse Savannah River Co., the prime contractor at the Department of Energy's Savannah River Site, has developed a hazard analysis methodology designed to provide a systematic, comprehensive identification and evaluation of hazards. Although originally developed to support nuclear/chemical facility safety basis documentation, the SRS methodology has widespread applicability to operations and/or systems that utilize hazardous materials and energy. This methodology was used to perform an overview hazard analysis for the H2Fuel Bus project to focus attention on those hypothetical circumstances that pose the greatest threat to the populace and property. The hazard analysis yields a listing of all known H2Fuel Bus hazards; postulated accident scenarios describing possible hazardous releases or conditions; an assessment of the scenarios in terms of frequency of occurrence and consequence; and binning in frequency-consequence space to assess the relative severity of the postulated scenarios.

  6. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  7. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
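    For the independent and mutex relationship types defined above, combining occurrence probabilities is straightforward (the parallel and series types, which involve triggering between hazards, need more structure than this sketch provides). The probabilities used here are illustrative:

```python
# Sketch of combining occurrence probabilities of two hazards under two of
# the relationship types defined in the classification. Values are invented.
def p_both(p1: float, p2: float, relation: str) -> float:
    """Probability that both hazards occur in the same period."""
    if relation == "independent":   # neither hazard influences the other
        return p1 * p2
    if relation == "mutex":         # mutually exclusive: cannot co-occur
        return 0.0
    raise ValueError(f"unsupported relation: {relation}")

def p_either(p1: float, p2: float, relation: str) -> float:
    """Probability that at least one of the two hazards occurs."""
    return p1 + p2 - p_both(p1, p2, relation)
```

Under the mutex relationship the joint term vanishes, so the probability of "either" is simply the sum, which is the kind of calculation the classification is meant to make explicit in multi-hazard risk assessment.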

  8. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Assessment Program (GSHAP), there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this type of region, seismic hazard assessment is a very hard task because of the limited data available on seismic sources, the earthquake catalogue, and ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in South-East Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. Analysis of the S-wave Fourier spectra can help to better resolve source, path, and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance for building a catalogue that is homogeneous in terms of moment magnitude; at the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to

  9. Speed your hazard analysis with the focused what if?

    SciTech Connect

    Goodman, L.

    1996-07-01

    Chemical process industries (CPI) plants worldwide must diminish their associated hazards. CPI firms are driven by the need to protect the public, their plants' neighbors, and their operating staffs from the consequences of accidents, as well as to meet regulations. Assuring the company and its regulators of safe operation requires conducting hazard analyses to demonstrate that plants are designed and operated to be as safe as practicable. For all organizations, large and small, conducting hazard analyses can be daunting and resource-consuming. There are various acceptable methods of hazard analysis: the hazard and operability analysis (HAZOP), the what if?, failure modes and effects analysis (FMEA), and checklists, among others. Of these, the HAZOP is on its way to becoming the de facto standard, because it is seemingly the most comprehensive and detailed method. Unfortunately, a HAZOP analysis may in many cases not be a good choice for a hazard analysis. This article explains the rationale for the focused what if (FWI), describes what it is, tells where it has been applied successfully, and provides an abridged example of its use.

  10. Seismic hazard analysis for Jayapura city, Papua

    SciTech Connect

    Robiana, R.; Cipta, A.

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models for the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors obtained from a geomorphological approach are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years, for the dominant periods of 0.0, 0.2, and 1.0 seconds.
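    Hazard-map exceedance levels like the one above are usually tied to a return period through a Poisson occurrence assumption. A small sketch of that conversion (the 10%-in-50-years value used below is the common design convention, not a number from this study):

```python
import math

# Poisson relation between return period and probability of exceedance,
# commonly used to state hazard-map levels. Values here are illustrative.
def prob_exceedance(t_years: float, return_period: float) -> float:
    """P(at least one exceedance in t_years) under a Poisson model."""
    return 1.0 - math.exp(-t_years / return_period)

def return_period(t_years: float, p: float) -> float:
    """Return period implied by probability p of exceedance in t_years."""
    return -t_years / math.log(1.0 - p)

# The familiar "10% in 50 years" design level corresponds to ~475 years:
rp = return_period(50, 0.10)
```

Inverting the relation recovers the stated probability, so the two ways of quoting a hazard level are interchangeable under the Poisson assumption.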

  11. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
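    The survival-scale decomposition of a total effect into natural direct and indirect effects can be illustrated with a deliberately simplified version of the mediation formula: a binary mediator and a constant (exponential) baseline hazard, instead of the spline or fractional-polynomial baselines the paper uses. All parameter values are hypothetical:

```python
import math

# Minimal mediation-formula sketch on the survival-probability scale for a
# proportional hazards model with a binary mediator. An exponential baseline
# hazard is assumed for simplicity; all parameters are made up.
LAM0 = 0.05                 # constant baseline hazard rate
B_EXPO, B_MED = 0.4, 0.6    # log hazard ratios: exposure, mediator
P_M = {0: 0.3, 1: 0.7}      # P(M=1 | A=a): exposure shifts the mediator

def surv(t, a, m):
    """S(t | A=a, M=m) under the exponential-baseline Cox model."""
    return math.exp(-LAM0 * t * math.exp(B_EXPO * a + B_MED * m))

def counterfactual_surv(t, a, a_star):
    """E[S(t | A=a, M(a_star))], averaging over the mediator distribution."""
    return sum(surv(t, a, m) * (P_M[a_star] if m == 1 else 1 - P_M[a_star])
               for m in (0, 1))

t = 5.0
total    = counterfactual_surv(t, 1, 1) - counterfactual_surv(t, 0, 0)
indirect = counterfactual_surv(t, 1, 1) - counterfactual_surv(t, 1, 0)
direct   = counterfactual_surv(t, 1, 0) - counterfactual_surv(t, 0, 0)
```

The decomposition is exact by construction (total = direct + indirect), and with a harmful exposure all three effects reduce survival, matching the standard two-stage mediation framework described in the abstract.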

  12. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  13. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  14. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  15. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  16. Analysis of navigation and guidance requirements for commercial VTOL operations

    NASA Technical Reports Server (NTRS)

    Hoffman, W. C.; Zvara, J.; Hollister, W. M.

    1975-01-01

    The paper presents some results of a program undertaken to define navigation and guidance requirements for commercial VTOL operations in the takeoff, cruise, terminal and landing phases of flight in weather conditions up to and including Category III. Quantitative navigation requirements are given for the parameters range, coverage, operation near obstacles, horizontal accuracy, multiple landing aircraft, multiple pad requirements, inertial/radio-inertial requirements, reliability/redundancy, update rate, and data link requirements in all flight phases. A multi-configuration straw-man navigation and guidance system for commercial VTOL operations is presented. Operation of the system is keyed to a fully automatic approach for navigation, guidance and control, with pilot as monitor-manager. The system is a hybrid navigator using a relatively low-cost inertial sensor with DME updates and MLS in the approach/departure phases.

  17. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    SciTech Connect

    Adelman, D.D.; Stansbury, J.

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes at the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which can simulate the double bottom liner systems called for at hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume is to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.
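    A full-factorial run matrix like the 54-combination design described above can be enumerated directly. The split of 54 into 6 climates × 3 curve numbers × 3 LAI values is an assumption for illustration, since the abstract reports only the total:

```python
from itertools import product

# Sketch of enumerating a full-factorial sensitivity-study run matrix, as in
# the HELP3 analysis. The factor levels below are assumed, not from the paper.
climates      = ["humid", "arid", "semi-arid", "temperate", "cold", "tropical"]
curve_numbers = [70, 80, 90]          # soil surface curve numbers
leaf_area_idx = [1.0, 2.5, 4.0]       # Leaf Area Index values

runs = list(product(climates, curve_numbers, leaf_area_idx))  # 6*3*3 = 54
```

Each tuple in `runs` would then be fed to the model as one simulation case, and the leachate volumes compared across factors to judge sensitivity.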

  18. Frequency Analysis of Aircraft hazards for License Application

    SciTech Connect

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.

  19. International perspectives on social media guidance for nurses: a content analysis.

    PubMed

    Ryan, Gemma

    2016-12-01

    Aim: This article reports the results of an analysis of the content of national and international professional guidance on social media for the nursing profession. The aim was to consolidate good practice examples of social media guidelines and to inform the development of comprehensive guidance. Method: A scoping search of professional nursing bodies' and organisations' social media guidance documents was undertaken using a Google search. Results: 34 guidance documents were located, and a content analysis of these was conducted. Conclusion: The results, combined with a review of competency hearings and literature, indicate that guidance should cover the context of social media and support nurses to navigate and negotiate the differences between the real and online domains, helping them translate awareness into actions.

  20. Dispersion analysis for baseline reference mission 3A using powered explicit guidance

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission 3A. Powered explicit guidance (PEG) was used to develop closed loop steering commands for this dispersion analysis. The nominal profile for the dispersion analysis is identical to the nominal profile of Reference 1 ascent trajectory with the exception that generalized linear tangent (GLT) guidance is used in Reference 1. Nominal trajectory differences which result from using PEG instead of GLT were determined and dispersion data were developed using PEG for comparison with similar data developed using GLT guidance.

  1. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    SciTech Connect

    SHULTZ MV

    2008-05-15

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety but also for worker-safety management programs, emergency management, and environmental programs. It also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  2. FMEA, the alternative process hazard method. [Failure Mode Effects Analysis]

    SciTech Connect

    Goyal, R.K.

    1993-05-01

    Failure mode effects analysis (FMEA) is an old reliability/assurance tool finding its way into the HPI. Not yet popular, this hazard technique has some viable applications that can improve hazard assessment data. Notably, FMEA studies can identify possible areas for improvement that may not have been discovered using other methods. Also, FMEA is not as labor intensive and costly as other process hazard analysis (PHA) methods. OSHA 1910.119 set in place an informational structure whose main purpose is the reduction of potential accidents and the minimization of risks in the event of an accident. Consequently, HPI operators must evaluate their process systems, identify potential major hazards, such as fires, explosions, and accidental releases of toxic/hazardous chemicals, and protect their facilities, employees, the public, and the environment. But which PHA method(s) apply to a particular plant or process remains a difficult question. This paper describes what FMEA is; types of FMEA; how to conduct an FMEA study; comparison with HAZOP (hazard and operability study); computer software; applicability of FMEA; and examples of its use.

  3. Fire hazards analysis for the Center for National Security and Arms Control (CNSAC) Facility

    SciTech Connect

    Klamerus, E.W.; Ross, S.B.

    1993-07-01

    This Fire Hazards Analysis is sponsored by Plant Engineering and is prepared to support the Safety Assessment for the CNSAC Facility. This is a preliminary fire hazards analysis of a yet-to-be-constructed facility and is based upon the current building design and the current understanding of the potential occupancy hazards. The governing occupancy for this building is personnel offices. The CNSAC facility will be dedicated primarily to two activities: (1) arms control and verification technology and (2) intelligence. This report supplements the Safety Assessment for the CNSAC facility, follows the guidance of DOE Memorandum EH-31.3, and meets the objectives of paragraph 4 of DOE Order 5480.7A, "Fire Protection." This analysis demonstrates that under "worst case" assumptions a fire in the CNSAC facility will result in consequences below DOE offsite guidelines for accident conditions. This report is based upon preliminary design information, and any major changes to the building design may require additional analyses.

  4. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic, and tsunamigenic hazard in the city of Naples. The reason is that Naples and its neighboring area are among the most densely populated places in Italy; the risk is further increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to the present), and for each seismic event an active tectonic structure has been associated. Furthermore, a set of active faults around the study area, well known from geological investigations and capable of shaking the city but not associated with any recorded earthquake, has been taken into account. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in evaluating the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand to account explicitly for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples, and subsequently combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times (50, 10, and 5 years) on a dense grid covering the municipality of Naples, assuming bedrock soil conditions.
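The core Bayesian idea, combining prior hazard models with the historical shaking record, can be sketched with a deliberately simplified conjugate update: a Beta prior on the annual probability of exceeding one intensity level, updated with observed exceedance years. All names and numbers here are illustrative assumptions, not the paper's actual formulation, which combines full hazard curves with ShakeMaps.

```python
def posterior_exceedance(prior_alpha, prior_beta, years_observed, exceedances):
    """Beta-binomial sketch of the Bayesian update: the annual probability
    of exceeding a given shaking level is Beta(a, b) a priori (encoding
    the alternative hazard models) and is updated with the historical
    record of exceedance / non-exceedance years. Illustrative only."""
    a = prior_alpha + exceedances
    b = prior_beta + years_observed - exceedances
    return a / (a + b)  # posterior mean annual exceedance probability
```

For example, a prior of Beta(1, 99) (prior mean 0.01 per year) combined with 2 exceedances in 100 observed years gives a posterior mean of 3/200 = 0.015.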

  5. K basin sludge & water system preliminary hazard analysis

    SciTech Connect

    MARTIN, J.B.

    2001-09-06

    This report provides the initial Preliminary Hazard Analysis (PHA) for the Spent Nuclear Fuel (SNF) Project, Sludge and Water System (SWS) subproject. The purpose of the K East (KE) basins SWS is to provide a system that will transfer the KE basins sludge from in-basin storage or source locations to containment and loadout for subsequent transport to T Plant for storage. The SWS will provide a system (Phase I) to retrieve and contain the identified as-settled sludge volume in the KE basin, while maintaining basin water clarity and associated radioactive control limits in accordance with basin operation requirements, and to support the Fuel Transfer System (FTS) and Debris Removal System (DRS). The SWS will also provide a system (Phase II) to retrieve and contain the remaining as-settled sludge volume in the K East basin, while maintaining basin water clarity and associated radioactive control limits in accordance with basin operation requirements. The SWS is in the conceptual design phase. A primary purpose of this PHA is to support the Conceptual Design Document (draft). As details of the SWS process and design are developed, this initial PHA will require revision. The K Basin Hazard Analysis (HNF-3960) documents the hazard analysis previously performed on the K basins including the multi-canister overpack (MCO) and support buildings. HNF-3960 will be updated later to reflect future SWS activities for sludge and water transfer to ensure that all hazards are included. All operational activities and energy sources associated with the SWS are evaluated in this hazard analysis. Using a systematic approach, this document identifies hazards created by abnormal operating conditions, external events (e.g., range fire), and natural phenomena hazards (e.g., earthquake) with the potential for causing undesirable consequences to facility workers, on-site individuals, the public, or the environment.

  6. Pedestrian Evacuation Analysis for Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Recent catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
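A least-cost-distance travel-time model of the kind described can be sketched as a multi-source Dijkstra search over a grid, with a slope-dependent walking speed standing in for the anisotropic (directionally dependent) cost. This is a minimal illustration, not the USGS Pedestrian Evacuation Analyst itself; the uphill penalty and parameter values are assumptions.

```python
import heapq

def evacuation_times(elevation, safe_cells, cell_size=10.0, base_speed=1.1):
    """Compute pedestrian travel time (s) from every grid cell to the
    nearest safe cell on a 4-connected grid. Anisotropy is modeled by
    slowing uphill travel; base_speed (m/s) is an illustrative walking
    speed, and the factor of 5.0 below is an assumed slope penalty."""
    rows, cols = len(elevation), len(elevation[0])
    times = [[float("inf")] * cols for _ in range(rows)]
    pq = []
    for (r, c) in safe_cells:
        times[r][c] = 0.0
        heapq.heappush(pq, (0.0, r, c))
    while pq:
        t, r, c = heapq.heappop(pq)
        if t > times[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Search runs backwards from safety, so the walker's step
                # (nr,nc) -> (r,c) climbs by elevation[r][c]-elevation[nr][nc].
                slope = (elevation[r][c] - elevation[nr][nc]) / cell_size
                speed = base_speed / (1.0 + max(0.0, 5.0 * slope))
                nt = t + cell_size / speed
                if nt < times[nr][nc]:
                    times[nr][nc] = nt
                    heapq.heappush(pq, (nt, nr, nc))
    return times
```

On a flat grid each 10-m step costs 10/1.1 s, and the map of `times` is exactly the travel-time surface the tool's maps and population charts are built from.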

  7. Seismic hazard analysis of the Adelaide region, South Australia

    NASA Astrophysics Data System (ADS)

    Setiawan, Bambang

    2017-07-01

    Seismic activity in Australia is categorised as low to moderate. However, the Australian continent is deforming faster than other stable intraplate regions, such as eastern North America and Africa, and the Adelaide region is the most seismically active zone on the continent. Seismic hazard analysis of the Adelaide region is therefore needed to improve the accuracy of seismic hazard predictions. Probabilistic seismic hazard analysis (PSHA) incorporating Monte Carlo simulation is selected in the present paper; this method has several advantages, i.e., it is simple, fast, flexible, and robust. Results of the analysis are comparable with those of previous studies. Furthermore, two main threats are identified in the de-aggregation for the city of Adelaide. Due to the scarcity of seismic events, and of strong-magnitude events in particular, further work is suggested to improve the estimates by extending the seismic catalogue, i.e., by incorporating neo-tectonic and paleo-seismic studies.
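The Monte Carlo flavour of PSHA mentioned above can be sketched in a few lines: simulate many years of seismicity from a truncated Gutenberg-Richter source, push each event through a ground-motion model, and count the years in which a shaking threshold is exceeded. The source, distance, and ground-motion parameters below are illustrative placeholders, not values from the Adelaide study.

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson variate by inversion (Knuth's method)."""
    l_exp, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l_exp:
            return k
        k += 1

def annual_exceedance_prob(n_years=20000, rate=0.05,
                           mag_min=4.5, mag_max=7.0, b_value=1.0,
                           pga_threshold=0.1, seed=1):
    """Monte Carlo PSHA sketch: fraction of simulated years in which
    peak ground acceleration (in g) at the site exceeds a threshold.
    All parameter values are illustrative, not from the study."""
    rng = random.Random(seed)
    beta = b_value * math.log(10.0)
    span = 1.0 - math.exp(-beta * (mag_max - mag_min))
    exceed_years = 0
    for _ in range(n_years):
        exceeded = False
        for _ in range(poisson(rng, rate)):
            # Magnitude from a truncated Gutenberg-Richter distribution
            m = mag_min - math.log(1.0 - rng.random() * span) / beta
            r = rng.uniform(5.0, 100.0)  # source distance in km (toy)
            # Toy ground-motion model: ln PGA = a + b*M - c*ln(R) + eps
            ln_pga = -3.5 + 0.9 * m - 1.1 * math.log(r) + rng.gauss(0.0, 0.6)
            if math.exp(ln_pga) >= pga_threshold:
                exceeded = True
        if exceeded:
            exceed_years += 1
    return exceed_years / n_years
```

Sweeping `pga_threshold` over a range of values traces out the hazard curve; the simulation approach makes it straightforward to swap in alternative catalogues or ground-motion models, which is the flexibility the abstract highlights.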

  8. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    SciTech Connect

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations from the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for Safety Classification of thermal stabilization equipment.

  9. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  10. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... all people with respect to the development, implementation, and enforcement of environmental laws... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis...

  11. Sampling and Analysis Plan - Guidance and Template v.4 - General Projects - 04/2014

    EPA Pesticide Factsheets

    This Sampling and Analysis Plan (SAP) guidance and template is intended to assist organizations in documenting the procedural and analytical requirements for one-time, or time-limited, projects involving the collection of water, soil, sediment, or other

  12. Influence of analysis methods on interpretation of hazard maps.

    PubMed

    Koehler, Kirsten A; Peters, Thomas M

    2013-06-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for data collection in hazard mapping are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data be as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map; as such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher-resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines for generating accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable interpolation accuracy for some data sets.
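The variogram models compared in the study have standard closed forms; a minimal sketch follows, using the common "practical range" convention (the factor of 3 in the exponential and Gaussian models is that convention, assumed here since the paper's software is not specified).

```python
import math

def spherical(h, nugget, sill, range_):
    """Spherical variogram: rises as 1.5x - 0.5x^3, flat beyond the range.
    (Nugget handling at h = 0 is simplified for this sketch.)"""
    if h >= range_:
        return nugget + sill
    x = h / range_
    return nugget + sill * (1.5 * x - 0.5 * x ** 3)

def exponential(h, nugget, sill, range_):
    """Exponential variogram, 'practical range' convention (factor 3)."""
    return nugget + sill * (1.0 - math.exp(-3.0 * h / range_))

def gaussian(h, nugget, sill, range_):
    """Gaussian variogram; its flat, parabolic behaviour near the origin
    is one reason it can interpolate less robustly than the others."""
    return nugget + sill * (1.0 - math.exp(-3.0 * (h / range_) ** 2))
```

Plotting all three against an experimental variogram is the visual-verification step the abstract recommends: the model whose curve tracks the binned semivariances, especially at short lag distances, is the one to use for kriging.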

  13. 78 FR 41853 - Safety Advisory Guidance: Heating Rail Tank Cars To Prepare Hazardous Material for Unloading or...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    .... The waste was a combination of cyclohexanone oxime, cyclohexanone, and water. As a result of the... result of heating, and additional risk if the hazardous material reacts with air or water.; The pressure...., viscosity and temperature) for safe unloading or transloading. Certain chemicals, such as a material...

  14. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    SciTech Connect

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin; Rodriguez-Marek, Adrian; Falero, Valentina Montaldo; Youngs, Robert

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  15. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and the design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. As the design progresses, being able to identify the current information as well as the archived past information, and to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford, Inc. is in the process of the removal and disposition of sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site; the sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement, and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and shows how it was used to assist the project in ensuring that current design data were reflected in a specific version of the hazard analysis, and how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  16. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments by the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  17. Hazardous-waste analysis plan for LLNL operations

    SciTech Connect

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from waste produced by industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA; for completeness, however, the Waste Analysis Plan addresses all hazardous waste.

  18. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires, and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics, and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  19. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  20. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    SciTech Connect

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). Bounding box: xMin, yMin = -83.1407, 36.7461; xMax, yMax = -71.5175, 45.1729.

  1. Phase 2 fire hazard analysis for the canister storage building

    SciTech Connect

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  2. Implementation of hazard analysis critical control point in jameed production.

    PubMed

    Al-Saed, A K; Al-Groum, R M; Al-Dabbas, M M

    2012-06-01

    The average standard plate count and the coliform, Staphylococcus aureus, and Salmonella counts for three home-made jameed samples, a traditional fermented dairy product, before applying the hazard analysis critical control point system were 2.1 × 10³, 8.9 × 10¹, 4 × 10¹, and less than 10 cfu/g, respectively. The developed hazard analysis critical control point plan identified ten critical control points in the flow chart of jameed production, including fresh milk receiving, pasteurization, addition of starter, water and salt, straining, personnel hygiene, drying, and packaging. After applying the hazard analysis critical control point system, there was significant improvement in the microbiological quality of the home-made jameed: the standard plate count was reduced to 3.1 × 10² cfu/g, whereas coliform and Staphylococcus aureus counts were less than 10 cfu/g and Salmonella was not detected. Sensory evaluation of the color and flavor of sauce prepared from jameed showed a significant increase in the average scores given after hazard analysis critical control point application.
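The reported improvement can be expressed as a log10 reduction, a routine food-safety metric; the helper below is an illustrative sketch, not part of the study's methods.

```python
import math

def log_reduction(before_cfu, after_cfu):
    """Log10 reduction in a microbial count after a control measure
    (illustrative helper, not from the paper)."""
    return math.log10(before_cfu / after_cfu)
```

With the abstract's plate counts, `log_reduction(2.1e3, 3.1e2)` gives roughly a 0.83-log reduction in the standard plate count.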

  3. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    SciTech Connect

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  4. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
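Generation (doubling) times of the sort determined in these tests follow from counts at the start and end of exponential growth; a sketch of the standard formula is below (the example values are illustrative, not Skylab data).

```python
import math

def generation_time(n0, n_final, elapsed_hours):
    """Doubling time g = t * ln 2 / ln(N/N0), assuming exponential
    growth between the two counts (illustrative, not Skylab data)."""
    return elapsed_hours * math.log(2.0) / math.log(n_final / n0)
```

For example, a culture growing from 100 to 400 cfu/g in 2 h has undergone two doublings, so its generation time is 1 h; comparing such values across foods and temperatures against the 100/g limit is the shape of the hazard analysis described.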

  5. PO*WW*ER mobile treatment unit process hazards analysis

    SciTech Connect

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous organics into nonhazardous byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  7. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are, to a large extent, driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties through a structured expert elicitation process. The preliminary results derived from the project showed an unexpectedly large amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * The ambiguous solution of PSHA logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations from other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants as required by the U.S. NRC RG 1.165 will be evaluated. As a principal alternative for the development of a

  8. Evaluation and Application of Probabilistic Tsunami Hazard Analysis in California

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Wilson, R. I.; Miller, K.

    2014-12-01

    The California Geological Survey (CGS) and URS Corporation are in the process of generating tsunami hazard map products for land-use planning and construction through the California Seismic Hazard Mapping Act (Public Resources Code, sec 2690 et seq.). Similar to seismic hazard zonation, these products for land-use and development decision-making are typically based on a probabilistic analysis, and require information on the frequency of occurrence through a probabilistic tsunami hazard analysis (PTHA). In Phase 1 of CGS's work, the California PTHA Work Group was established to evaluate the results of PTHA demonstration projects in Crescent City and Huntington Beach. The results of this Phase 1 review of the two independent analyses indicate that PTHAs can be developed with recommended improvements in source characterization, PTHA methods, and numerical model use. An immediate significant improvement is to align the characterization of the Cascadia Subduction Zone PTHA with the seismic characterization of the National Seismic Hazard Map Program of the USGS. In addition to applying PTHA to land-use planning and the two demonstration projects, CGS and the CA-PTHA Work Group identified other potential applications for various PTHA risk levels (ARP = Average Return Period), including flood insurance (100 and 500 year ARP), building codes (2,500 year ARP), and emergency response planning (1,000 year ARP or larger). CGS is working with URS Corp., the California Office of Emergency Services, and FEMA on a Phase 2 plan to produce a single set of reliable and consistent PTHA maps for multiple risk levels and work with various end-users to determine how to use the maps. The California PTHA and the results of the Work Group review are also proposed to be used by the U.S. National Tsunami Hazard Mitigation Program to develop guidelines for production in other coastal states.

  9. D0 Detector Collision Hall Oxygen Deficiency Hazard Analysis

    SciTech Connect

    Wu, J.; /Fermilab

    1992-08-06

    EN-258, D0 Platform ODH Analysis, provided the oxygen deficiency hazard analysis for the D0 detector in the Assembly Hall. This note covers the same analysis, but revised for the Collision Hall. Liquid cryogens, released and warming to atmospheric conditions, expand to, on average, seven hundred times their liquid volume, and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector Collision Hall ODH analysis has been approached five different ways using established methods. If the low beta quad magnets are powered, and the exhaust rate is below 4220 scfm, the area is ODH class 1. In any other case, the analysis shows the area to be ODH class 0 as equipped (with ventilation fans) and requiring no special safety provisions. System designers have provided for a reduced oxygen level detection and warning system as well as emergency procedures to address fault conditions.
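    One ingredient of an ODH assessment like this is the dilution of oxygen by a released gas under forced ventilation. The sketch below shows only the well-mixed steady-state dilution step, not the full Fermilab ODH fatality-rate classification method, and the release rate used is a hypothetical number, not a figure from the note.

```python
def steady_state_o2(vent_scfm, spill_scfm, ambient_o2=0.21):
    """Steady-state oxygen fraction in a well-mixed volume when an
    oxygen-free gas is released at spill_scfm while fresh air enters
    at vent_scfm (simple flow-balance dilution model)."""
    return ambient_o2 * vent_scfm / (vent_scfm + spill_scfm)

# Hypothetical release of 500 scfm of inert gas against the 4220 scfm
# ventilation rate mentioned in the abstract.
o2 = steady_state_o2(4220.0, 500.0)   # fraction, ~0.188 (vs. 0.21 ambient)
```

In practice the resulting oxygen fraction feeds into a fatality-probability curve and exposure-rate estimate to assign the ODH class.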

  10. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three branch points representing alternative values for the b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (~0.10 g) for PGA and in the Friuli and Central Apennines regions (~0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (~0.15 g) and PGA (~0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
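    The Monte Carlo logic-tree sampling described above can be sketched as follows. The branch values, weights, and the hazard function itself are hypothetical stand-ins (the real computation integrates over sources and attenuation relations); only the sampling-and-statistics structure is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical branch sets for the three logic-tree variables.
b_values   = [0.9, 1.0, 1.1]
mmax_vals  = [6.5, 7.0, 7.5]
gmpe_scale = [0.8, 1.0, 1.25]   # stand-in for alternative attenuation relations

def toy_pga(b, mmax, scale):
    """Illustrative stand-in for a full hazard integration at one site."""
    return scale * 0.3 * (mmax / 7.0) / b

# Randomly sample one branch per variable, many times.
samples = np.array([
    toy_pga(rng.choice(b_values), rng.choice(mmax_vals), rng.choice(gmpe_scale))
    for _ in range(10_000)
])

lo, hi = np.percentile(samples, [2.5, 97.5])   # 95% confidence band
cov = samples.std() / samples.mean()           # coefficient of variation
```

Repeating this at every grid node yields the confidence-band and COV maps; fixing one variable while sampling the others isolates its contribution, which is how the sensitivity ranking in the abstract is obtained.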

  11. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
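    The core of the zoneless approach is replacing polygonal source zones with a smoothed activity-rate surface built directly from the catalogue. A minimal sketch with an isotropic Gaussian kernel is shown below; the catalogue, grid, and bandwidth are hypothetical (practical implementations use magnitude-dependent or adaptive bandwidths).

```python
import numpy as np

def smoothed_rate(epicentres, grid_xy, bandwidth_km):
    """Gaussian kernel estimate of activity rate (events per unit area)
    at each grid point, summed over all catalogue epicentres."""
    diffs = grid_xy[:, None, :] - epicentres[None, :, :]   # (ngrid, nev, 2)
    d2 = (diffs ** 2).sum(axis=2)
    k = np.exp(-d2 / (2 * bandwidth_km ** 2)) / (2 * np.pi * bandwidth_km ** 2)
    return k.sum(axis=1)

# Hypothetical catalogue of three epicentres (km coordinates) and three
# evaluation points: one inside the cluster, one far away.
events = np.array([[0.0, 0.0], [10.0, 5.0], [12.0, 4.0]])
grid = np.array([[0.0, 0.0], [11.0, 4.5], [50.0, 50.0]])
rate = smoothed_rate(events, grid, bandwidth_km=5.0)
```

No zone boundaries enter the calculation: the rate surface follows the data, which is the property the abstract emphasizes for regions with poorly delineated seismotectonic zones.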

  12. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations across all return periods. Moreover, the seismic hazard of most of the Algarve's parishes is dominated by the seismicity located
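    Extracting a controlling scenario from a 4D disaggregation reduces, computationally, to locating the bin with the largest hazard contribution. A minimal sketch with hypothetical bin contributions (not the study's values) is:

```python
import numpy as np

# Hypothetical 4D disaggregation: fractional hazard contributions binned
# over magnitude, source latitude, source longitude and epsilon.
contrib = np.zeros((3, 2, 2, 3))        # (M, X, Y, eps) bins
contrib[1, 0, 1, 2] = 0.35              # dominant scenario bin
contrib[0, 0, 0, 1] = 0.20
contrib[2, 1, 1, 0] = 0.10

# Modal value = index of the bin contributing most to the hazard.
modal = np.unravel_index(np.argmax(contrib), contrib.shape)
```

Repeating this per parish and per return period yields the scenario maps described in the abstract; mapping the modal indices back to bin centres gives the controlling M-(X,Y)-ɛ combination.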

  13. Probabilistic Tsunami Hazard Analysis - Results for the Western United States

    NASA Astrophysics Data System (ADS)

    Thio, H.; Polet, J.; Somerville, P.

    2007-12-01

    We have developed a series of probabilistic tsunami hazard maps for the coasts of western North America based on fault source characterizations of the circum-Pacific subduction zones as well as local offshore faults. The maps show the probabilistic offshore exceedance waveheights at 72, 475, 975 and 2475 year return periods, which are the return periods typically used in Probabilistic Seismic Hazard Analysis (PSHA). Our method follows similar lines to PSHA, which has become a standard practice in the evaluation and mitigation of seismic hazard, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities, variability and uncertainties of seismic activity into a manageable set of ground motion parameters greatly facilitates the planning and design of effective seismic-resistant buildings and infrastructure. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can rapidly synthesize tsunami waveforms for any slip distribution on those faults by summing the individual weighted subfault tsunami waveforms. This Green's function summation provides accurate estimates of tsunami height for probabilistic calculations, where one typically integrates over thousands of earthquake scenarios. We have carried out tsunami hazard calculations for western North America and Hawaii based on a comprehensive source model around the Pacific Ocean, including both subduction zone sources and local offshore faults. We will present the tsunami hazard maps and discuss how these results are used for probabilistic inundation mapping, including a follow-up inundation study of the San Francisco Bay area that is based on disaggregation results of the
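    The Green's function summation described above is linear superposition: precompute one waveform per subfault for unit slip, then form any scenario as a slip-weighted sum. A minimal sketch with hypothetical subfault waveforms follows.

```python
import numpy as np

def site_waveform(subfault_waveforms, slip):
    """Slip-weighted sum of precomputed unit-slip subfault waveforms.

    subfault_waveforms: (n_subfaults, n_timesteps) tsunami heights at one
    coastal point for unit slip on each subfault.
    slip: (n_subfaults,) slip distribution for one earthquake scenario.
    """
    return slip @ subfault_waveforms   # linear superposition

# Hypothetical 3-subfault rupture, 4 time samples per waveform.
unit_wf = np.array([[0.0, 0.1, 0.3, 0.2],
                    [0.0, 0.0, 0.2, 0.4],
                    [0.0, 0.0, 0.0, 0.1]])
scenario = site_waveform(unit_wf, slip=np.array([2.0, 1.0, 0.0]))
peak = scenario.max()   # peak waveheight entering the hazard integration
```

Because each scenario reduces to one matrix-vector product, integrating over thousands of earthquake scenarios for the probabilistic exceedance curves stays tractable.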

  14. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. Published by Elsevier B.V.

  15. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, and despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps in the Lanzarote and Tenerife islands, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well reported historical eruptions, as well as the scarcity of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided an important tool for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed large amounts of data that, as in the case of the Canary Islands, are sometimes not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS and helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000 scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated in the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long term volcanic hazard analysis. HADA will permit

  16. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    SciTech Connect

    R.E. Flye

    2000-10-24

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  17. A practical approach to fire hazard analysis for offshore structures.

    PubMed

    Krueger, Joel; Smith, Duncan

    2003-11-14

    Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow for cost-effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario, or design basis fire, approach calculates the consequences of a select number of credible fire scenarios, determines their potential impact on the platform's process equipment, structural members, egress routes and safety systems, and evaluates the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented include discussions from the June 2002 International Workshop for Fire Loading and Response.

  18. Guidance and Control Algorithms for the Mars Entry, Descent and Landing Systems Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Dwyer Cianciolo, Alicia M.; Powell, Richard W.; Shidner, Jeremy D.; Garcia-Llama, Eduardo

    2010-01-01

    The purpose of the Mars Entry, Descent and Landing Systems Analysis (EDL-SA) study was to identify feasible technologies that will enable human exploration of Mars, specifically to deliver large payloads to the Martian surface. This paper focuses on the methods used to guide and control two of the contending technologies, a mid-lift-to-drag (L/D) rigid aeroshell and a hypersonic inflatable aerodynamic decelerator (HIAD), through the entry portion of the trajectory. The Program to Optimize Simulated Trajectories II (POST2) is used to simulate and analyze the trajectories of the contending technologies and guidance and control algorithms. Three guidance algorithms are discussed in this paper: EDL theoretical guidance, Numerical Predictor-Corrector (NPC) guidance and Analytical Predictor-Corrector (APC) guidance. EDL-SA also considered two forms of control: bank angle control, similar to that used by Apollo and the Space Shuttle, and a center-of-gravity (CG) offset control. This paper presents the performance comparison of these guidance algorithms and summarizes the results as they impact the technology recommendations for future study.

  19. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H.; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
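    The first approach described above has the same structure as a PSHA hazard integral, with a "displacement attenuation" distribution in place of the ground-motion one. The sketch below uses a lognormal displacement distribution and a surface-rupture probability per scenario; the distribution form and all numbers are hypothetical illustrations, not the paper's empirical relations.

```python
import math

def displacement_hazard(scenarios, d_threshold):
    """Annual rate of fault displacement exceeding d_threshold, summing
    rate * P(slip reaches the surface) * P(D > d) over earthquake
    scenarios. A lognormal distribution stands in for an empirical
    displacement 'attenuation' function."""
    total = 0.0
    for rate, p_surface, median_d, sigma_ln in scenarios:
        z = (math.log(d_threshold) - math.log(median_d)) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # 1 - Phi(z)
        total += rate * p_surface * p_exceed
    return total

# Hypothetical scenarios: (annual rate, P(surface rupture), median D [m], sigma_ln)
scen = [(1e-3, 0.6, 0.5, 0.8),
        (1e-4, 0.9, 2.0, 0.6)]
nu = displacement_hazard(scen, d_threshold=0.3)   # events/yr exceeding 0.3 m
```

The second ("displacement") approach replaces the earthquake-rate terms with rates inferred directly from the fault's slip history at the site, but the exceedance-summation structure is the same.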

  20. 76 FR 23824 - Guidance for Industry: “Computer Crossmatch” (Computerized Analysis of the Compatibility Between...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... Analysis of the Compatibility Between the Donor's Cell Type and the Recipient's Serum or Plasma Type... Crossmatch' (Computerized Analysis of the Compatibility between the Donor's Cell Type and the Recipient's... donor's cell type and the recipient's serum or plasma type. The guidance describes practices that...

  1. Flood Hazard and Risk Analysis in Urban Area

    NASA Astrophysics Data System (ADS)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life, but also about the safety of life and property, so the impact of disasters on life and property is the residents' most serious concern. For the mitigation of disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated using statistics on social development factors. The hazard factors of Kaohsiung City were calculated from simulated flood depths for six different return periods and four typhoon events that caused serious flooding in Kaohsiung City. The flood risk can be obtained by combining the flood hazard and social vulnerability. The analysis results provide authorities with a basis to strengthen disaster preparedness and to allocate more resources to high-risk areas.
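    Combining a hazard index with a vulnerability index is often done as a simple product of normalized scores per district. The sketch below illustrates that composition; the scores, the multiplicative form, and the threshold are hypothetical illustrations rather than the study's actual scheme.

```python
import numpy as np

# Hypothetical normalized scores (0-1) for three districts: hazard from
# simulated flood depth, vulnerability from social development statistics.
hazard        = np.array([0.8, 0.3, 0.6])
vulnerability = np.array([0.5, 0.9, 0.2])

risk = hazard * vulnerability    # one common way to compose the two indices
high_risk = risk > 0.35          # illustrative threshold for flagging districts
```

Districts flagged this way are where the abstract suggests concentrating preparedness resources: high hazard alone (district 3) or high vulnerability alone (district 2) need not dominate the combined ranking.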

  2. A Hazard Analysis for a Generic Insulin Infusion Pump

    PubMed Central

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Devices and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  3. Lithium-thionyl chloride cell system safety hazard analysis

    NASA Astrophysics Data System (ADS)

    Dampier, F. W.

    1985-03-01

    This system safety analysis for the lithium thionyl chloride cell is a critical review of the technical literature pertaining to cell safety and draws conclusions and makes recommendations based on this data. The thermodynamics and kinetics of the electrochemical reactions occurring during discharge are discussed with particular attention given to unstable SOCl2 reduction intermediates. Potentially hazardous reactions between the various cell components and discharge products or impurities that could occur during electrical or thermal abuse are described and the most hazardous conditions and reactions identified. Design factors influencing the safety of Li/SOCl2 cells, shipping and disposal methods and the toxicity of Li/SOCl2 battery components are additional safety issues that are also addressed.

  4. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    SciTech Connect

    Sevougian, S. David

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  5. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan....

  6. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary... requirements for applying hazard analysis and critical control point (HACCP) procedures for safe and sanitary... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point...

  7. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and... at (301) 492-3446. FOR FURTHER INFORMATION CONTACT: Song-Hua Shen, Division of Risk Analysis,...

  8. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is
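    Once large numbers of synthetic surge events have been simulated, the statistical step reduces to counting exceedances per simulated year. A minimal sketch with hypothetical surge peaks (not the dissertation's data) is:

```python
import numpy as np

def exceedance_rate(surge_peaks_m, years_simulated, threshold_m):
    """Annual rate of synthetic surge events exceeding a height threshold."""
    return np.sum(np.asarray(surge_peaks_m) > threshold_m) / years_simulated

# Hypothetical synthetic surge peaks (m) drawn from 1000 simulated years.
peaks = [0.5, 1.2, 2.1, 0.8, 3.0, 1.9, 2.6, 0.4]
lam = exceedance_rate(peaks, 1000.0, threshold_m=2.0)
return_period = 1.0 / lam   # years between exceedances of 2 m, on average
```

Sweeping the threshold traces out the full surge-height probability distribution at a coastal point, which is the quantity the methodology targets for data-scarce regions.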

  9. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
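    The classical hazard integral behind such a curve can be sketched for a single source: the annual exceedance rate of a ground-motion level a is the event rate times the magnitude-weighted probability that the (lognormal) ground motion exceeds a. All numbers below, including the ground-motion model, are invented placeholders for illustration, not values from the Adjaria study or from CRISIS2007.

```python
import numpy as np
from scipy import stats

# Single-source PSHA sketch with a truncated exponential G-R distribution.
m0, mmax, b = 5.0, 7.5, 1.0     # magnitude bounds and b-value (assumed)
beta = b * np.log(10.0)
nu = 0.05                       # annual rate of events with M >= m0 (assumed)
R = 30.0                        # source-to-site distance, km (assumed)
sigma = 0.6                     # aleatory std. dev. of ln(PGA) (assumed)

m = np.linspace(m0, mmax, 200)
dm = m[1] - m[0]
# Truncated exponential Gutenberg-Richter magnitude pdf
f_m = beta * np.exp(-beta * (m - m0)) / (1.0 - np.exp(-beta * (mmax - m0)))

def median_ln_pga(mag, dist):
    # Toy ground-motion model (assumed functional form, PGA in g)
    return -3.5 + 1.2 * mag - 1.5 * np.log(dist + 10.0)

def annual_exceedance(a):
    # lambda(a) = nu * integral of f(m) * P[PGA > a | m] dm (Riemann sum)
    p = 1.0 - stats.norm.cdf((np.log(a) - median_ln_pga(m, R)) / sigma)
    return nu * np.sum(f_m * p) * dm

pga_grid = np.array([0.05, 0.1, 0.2, 0.4])          # g
hazard = np.array([annual_exceedance(a) for a in pga_grid])
return_periods = 1.0 / hazard
```

    Reading the design value off the curve then amounts to interpolating the PGA at the preselected return period.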

  10. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable models for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, 2% and 10% POE in the next 50 years of 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, were found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces within Thailand, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore, effective mitigation plans for these areas should be made. Although Bangkok was defined as being within a low seismic hazard area in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
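    POE figures like "2% and 10% in 50 years" map directly to mean return periods under the standard Poisson assumption, POE = 1 - exp(-t/RP). A minimal sketch of that conversion:

```python
import math

# Poisson relation between probability of exceedance (POE) over an
# exposure time t and the mean return period RP: POE = 1 - exp(-t/RP).
def return_period(poe, t_years):
    return -t_years / math.log(1.0 - poe)

rp_10pct = return_period(0.10, 50.0)   # ~475 years
rp_2pct = return_period(0.02, 50.0)    # ~2475 years
```

    These are the familiar 475-year and 2475-year design events quoted in most seismic codes.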

  11. Comparative analysis of hazardous household waste in two Mexican regions.

    PubMed

    Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana

    2007-01-01

    Household hazardous waste (HHW) generation in two Mexican regions was examined, a northern region (bordering the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to compare the results of both regions regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively. In order to perform this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give origin to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste; the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%), and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were represented by cleaning products (39%), self-care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of the income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables such as local climate, migration patterns and marketing coverage. Further research is needed in order to establish the effect of low quantities of HHW upon the environment and public health.

  12. Safety analysis of contained low-hazard biotechnology applications.

    PubMed

    Pettauer, D; Käppeli, O; van den Eede, G

    1998-06-01

    A technical safety analysis has been performed on a containment-level-2 pilot plant in order to assess an upgrading of the existing facility, which should comply with good manufacturing practices. The results were obtained by employing the hazard and operability (HAZOP) assessment method and are discussed in the light of the appropriateness of this procedural tool for low-hazard biotechnology applications. The potential release of micro-organisms accounts for only a minor part of the hazardous consequences. However, in certain cases the release of a large or moderate amount of micro-organisms would not be immediately identified. Most of the actions required to avoid these consequences fall into the realm of operational procedures. As a major part of the potential failures results from human error, standard operating procedures play a prominent role in establishing the concept of safety management. The HAZOP assessment method was found to be adequate for the type of process under investigation. The results also may be used for the generation of checklists which, in most cases, are sufficient for routine safety assurance.

  13. 75 FR 16202 - Office of New Reactors; Interim Staff Guidance on Ensuring Hazard-Consistent Seismic Input for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ..., ``Standard Review Plan (SRP) for the Review of Safety Analysis Reports for Nuclear Power Plants,'' March 2007... Engineering Branch 2, Division of Engineering, Office of the New Reactors, U.S. Nuclear Regulatory Commission... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY...

  14. Guidance analysis of the aeroglide plane change maneuver as a turning point problem

    NASA Technical Reports Server (NTRS)

    Gracey, Christopher

    1989-01-01

    The development of guidance approximations for the atmospheric (aeroglide) portion of the minimum fuel, orbital plane change, trajectory optimization problem is described. Asymptotic methods are used to reduce the two-point boundary-value optimization problem to a turning point problem for the bank angle control. The turning point problem solution, which yields an approximate optimal control policy, is given in terms of parabolic cylinder functions, which are tabulated, and integral expressions, which must be numerically computed. Comparisons of the former, over their region of validity, with optimal control solutions show good qualitative agreement. Additional work and analysis are needed to complete the guidance approximation.
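    The parabolic cylinder functions that appear in such turning-point solutions are available in SciPy as `scipy.special.pbdv`, which returns both the function value D_v(x) and its derivative. A small sanity check, using the closed form D_0(x) = exp(-x^2/4) for order zero:

```python
import numpy as np
from scipy import special

# Evaluate the parabolic cylinder function D_v(x) and its derivative
# on a small grid, for order v = 0.
x = np.linspace(-2.0, 2.0, 5)
Dv, Dv_prime = special.pbdv(0.0, x)

# For v = 0 the closed form is D_0(x) = exp(-x**2 / 4), with
# derivative -x/2 * exp(-x**2 / 4); both serve as checks.
check = np.exp(-x**2 / 4.0)
check_prime = -0.5 * x * check
```

    Higher orders, as needed by the tabulated solution, follow by changing the first argument.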

  15. Guidance document for preparing water sampling and analysis plans for UMTRA sites. Final [report

    SciTech Connect

    Not Available

    1993-09-01

    A water sampling and analysis plan (WSAP) is required for each Uranium Mill Tailings Remedial Action (UMTRA) site to provide rationale for groundwater and surface water sampling at disposal sites and former processing sites. The WSAP identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequencies for the groundwater and surface water monitoring stations at each site. Section 2.0 of this WSAP Guidance Document describes the WSAP format. Sections 3.0 and 4.0 provide guidance for selecting sampling frequencies and sampling locations, respectively. Section 5.0 contains criteria for selecting analytical parameters. Section 6.0 provides guidance for the contents of each site's WSAP file. Finally, Section 7.0 presents the references used to prepare this document. The purpose of this guidance document is to provide a consistent technical approach for sampling and monitoring activities performed under WSAPs and a consistent format for WSAP documents. This document is designed for use by the Technical Assistance Contractor (TAC) to prepare WSAPs and by the US Department of Energy (DOE), Nuclear Regulatory Commission, state and tribal agencies, regulatory agencies, and the public to evaluate the contents of the WSAPs. This guidance document may be updated periodically based on new or changing regulations.

  16. Performance analysis of the combined EDS maglev propulsion, levitation, and guidance system

    SciTech Connect

    He, J.L.; Coffey, H.T.; Rote, D.M.

    1993-10-01

    An analysis of the Japanese maglev system which uses only one set of coils in the guideway for combined levitation, propulsion, and guidance functions is presented in this paper. This preliminary study, using the dynamic circuit approach, indicates that the system is very promising.

  17. Protocol for the collection and analysis of volatile POHCs (Principal Organic Hazardous Constituents) using VOST (Volatile Organic Sampling Train)

    NASA Astrophysics Data System (ADS)

    Hansen, E. M.

    1984-03-01

    An operating protocol for sampling and analysis of volatile organic constituents of flue gas from hazardous waste incinerators or other similar combustor systems using the Volatile Organic Sampling Train (VOST) is presented. It is intended to be used for guidance by personnel of the regulatory groups, personnel associated with engineering research and development, and the regulated community. The document is in two parts. Part A describes the key components of the train, the procedures for preparing the sorbent materials, and procedures for sample collection using the VOST. Part B describes the procedures for analyzing VOST sorbent cartridges for volatile principal organic hazardous constituents (POHCs) using purge-trap-desorb gas chromatography/mass spectrometry (P-T-D GC/MS). Quality control procedures are presented in both parts.

  18. RMP Guidance for Warehouses - Chapter 4: Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) informs government and the public about potential consequences of an accidental toxic or flammable chemical release at your facility, and consists of a worst-case release scenario and alternative release scenarios.

  19. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB) and in particular, central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground-motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic the Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. 
Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as

  20. Fire hazards analysis for W030 tank farm ventilation upgrade

    SciTech Connect

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection (2-17-93). The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative, comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  1. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the surface subject to use limitations beyond what is necessary. The availability of a high-resolution topographic survey now makes it possible to face this task with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore is not intended in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically considering the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted.
In the second step the segments are analysed

  2. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.
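    The screening step described above, comparing each material inventory against its threshold quantity and carrying forward only the exceedances, is simple bookkeeping. A minimal sketch, with invented inventories and thresholds rather than the actual INEL Landfill data:

```python
# DOE Order 5500.3A-style screening sketch: flag only the materials
# whose inventory exceeds the threshold quantity for further analysis.
# All quantities and thresholds below are illustrative assumptions.
inventory = {
    "asbestos": {"quantity_kg": 1200.0, "threshold_kg": 1000.0},
    "lead": {"quantity_kg": 40.0, "threshold_kg": 500.0},
    "waste_oil": {"quantity_kg": 150.0, "threshold_kg": 400.0},
}

needs_analysis = sorted(
    name for name, rec in inventory.items()
    if rec["quantity_kg"] > rec["threshold_kg"]
)
# Only the exceeding material survives screening here.
```

    In this toy case only the asbestos entry would require further characterization, mirroring the outcome reported for the Asbestos Pit.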

  3. DoD Cost Analysis Guidance and Procedures

    DTIC Science & Technology

    1992-12-01

    Items 3-9 ... Tables (table, title, page): 2-1 Cost Analysis Improvement Group (CAIG) Timetable; 2-11 Defense Acquisition Program Life-Cycle Cost ... relationship to other systems. 1.1.3 System Configuration. This section identifies the equipment (hardware and software) work breakdown structure (WBS) for ... furnished commercial off-the-shelf (COTS) software should be addressed in the discussion. Where Government-furnished equipment or ...

  4. Fire hazard analysis for Plutonium Finishing Plant complex

    SciTech Connect

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  5. Six-degree-of-freedom guidance and control-entry analysis of the HL-20

    NASA Astrophysics Data System (ADS)

    Powell, Richard W.

    1993-10-01

    The ability of the HL-20 lifting body to fly has been evaluated for an automated entry from atmospheric interface to landing. This evaluation was required to demonstrate not only that successful touchdown conditions would be possible for this low lift-to-drag-ratio vehicle, but also that the vehicle would not exceed its design dynamic pressure limit of 400 psf during entry. This dynamic pressure constraint, coupled with limited available pitch-control authority at low supersonic speeds, restricts the available maneuvering capability for the HL-20 to acquire the runway. One result of this analysis was that this restricted maneuvering capability does not allow the use of a model-following atmospheric entry-guidance algorithm, such as that used by the Space Shuttle, but instead requires a more adaptable guidance algorithm. Therefore, for this analysis, a predictor-corrector guidance algorithm was developed that would provide successful touchdown conditions while not violating the dynamic pressure constraint. A flight-control system was designed and incorporated, along with the predictor-corrector guidance algorithm, into a six-DOF simulation, which showed that the HL-20 remained controllable and could reach the landing site and execute a successful landing under all off-nominal conditions simulated.

  6. Six-degree-of-freedom guidance and control-entry analysis of the HL-20

    NASA Technical Reports Server (NTRS)

    Powell, Richard W.

    1993-01-01

    The ability of the HL-20 lifting body to fly has been evaluated for an automated entry from atmospheric interface to landing. This evaluation was required to demonstrate not only that successful touchdown conditions would be possible for this low lift-to-drag-ratio vehicle, but also that the vehicle would not exceed its design dynamic pressure limit of 400 psf during entry. This dynamic pressure constraint, coupled with limited available pitch-control authority at low supersonic speeds, restricts the available maneuvering capability for the HL-20 to acquire the runway. One result of this analysis was that this restricted maneuvering capability does not allow the use of a model-following atmospheric entry-guidance algorithm, such as that used by the Space Shuttle, but instead requires a more adaptable guidance algorithm. Therefore, for this analysis, a predictor-corrector guidance algorithm was developed that would provide successful touchdown conditions while not violating the dynamic pressure constraint. A flight-control system was designed and incorporated, along with the predictor-corrector guidance algorithm, into a six-DOF simulation, which showed that the HL-20 remained controllable and could reach the landing site and execute a successful landing under all off-nominal conditions simulated.
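    The predictor-corrector idea described in these abstracts, numerically predicting the trajectory outcome for a candidate control and iterating the control until the prediction hits the target, can be sketched with a deliberately crude range model. The dynamics below are invented placeholders, far simpler than an HL-20 entry simulation; only the loop structure is the point.

```python
import math

def predicted_range(bank_deg):
    # Toy "predictor": a monotone stand-in for trajectory integration.
    # More bank angle turns lift out of plane, shortening downrange (km).
    return 2500.0 * math.cos(math.radians(bank_deg)) + 500.0

def correct_bank(target_km, b0=10.0, b1=60.0, tol=1e-6):
    # "Corrector": secant iteration on the predicted range error.
    f0 = predicted_range(b0) - target_km
    f1 = predicted_range(b1) - target_km
    for _ in range(50):
        b2 = b1 - f1 * (b1 - b0) / (f1 - f0)
        b0, f0 = b1, f1
        b1 = b2
        f1 = predicted_range(b1) - target_km
        if abs(f1) < tol:
            break
    return b1

bank_cmd = correct_bank(2000.0)   # bank angle that predicts 2000 km downrange
```

    A flight implementation would re-run this loop each guidance cycle with the current state, and check path constraints (such as the dynamic pressure limit) inside the predictor.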

  7. Source sampling and analysis guidance: A methods directory

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Baughman, K.W.; James, R.H.; Spafford, R.B.

    1991-01-01

    Sampling and analytical methodologies are needed by EPA and industry for testing stationary sources for specific organic compounds such as those listed under the Resource Conservation and Recovery Act (RCRA) Appendix 8 and Appendix 9 and the Clean Air Act of 1990. A computerized directory, the Problem POHC Reference Directory, has been developed that supplies information on available field sampling and analytical methodology for each compound in those lists. Existing EPA methods are referenced if applicable, along with their validation status. At present, the database is strongly oriented toward combustion sources. The database may be searched on the basis of several parameters including name, Chemical Abstracts Service (CAS) number, physical properties, thermal stability, combustion rank, or general problem areas in sampling or analysis. The methods directory is menu driven and requires no programming ability; however, some familiarity with dBASE III+ would be helpful.
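    The search operations the directory supports, looking up a compound by CAS number or filtering by a ranking parameter, amount to simple record filtering. A minimal stand-in with invented records (the CAS numbers are real, but the rankings and method assignments are illustrative only):

```python
# Tiny stand-in for the Problem POHC Reference Directory: a list of
# compound records searchable by CAS number or combustion rank.
# Rankings and method assignments are illustrative assumptions.
methods = [
    {"name": "benzene", "cas": "71-43-2", "combustion_rank": 1,
     "sampling_method": "VOST"},
    {"name": "toluene", "cas": "108-88-3", "combustion_rank": 2,
     "sampling_method": "VOST"},
    {"name": "chloroform", "cas": "67-66-3", "combustion_rank": 3,
     "sampling_method": "VOST"},
]

def by_cas(cas):
    return [rec for rec in methods if rec["cas"] == cas]

def rank_at_most(rank):
    return [rec["name"] for rec in methods if rec["combustion_rank"] <= rank]

hit = by_cas("108-88-3")
```

    The original tool exposed the same queries through dBASE III+ menus rather than code.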

  8. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine if their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO), or if not, to use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA for development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial, propellant use of oxygen, hydrogen, and hypergols.
This paper reports on

  9. D0 Detector Assembly Hall Platform Oxygen Deficiency Hazard Analysis

    SciTech Connect

    Clark, D.; Michael, J.; /Fermilab

    1991-01-29

    Liquid cryogens, released and warming to atmospheric conditions, expand to, on average, seven hundred times their liquid volume, and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector platform area ODH analysis has been approached four different ways using established methods. In each case, the analysis shows the platform area to be ODH class 0 as equipped (with ventilation fans) and requiring no special safety provisions. System designers have provided for a reduced-oxygen-level detection and warning system as well as emergency procedures to address fault conditions. The oxygen deficiency hazard of any particular area is defined by these parameters: the nature of the accidental supply of inert gas (probability of occurrence and quantity then released), the area's volume, the area's ventilation rate, and to a small degree the elevation of the area. Once this information is assembled, the ODH classification can be determined through standardized calculations. The platform area under the D0 detector contains much of the cryogenic and gas system piping necessary for the D0 experiment. Prior to moving the detector into the Collision Hall, the liquid argon calorimeters are cooled down and operated in the Assembly Hall. The first phase of this operation involved the cooldown of the Central Calorimeter, which was done in February 1991. This engineering note assesses the increased risk to personnel on the platform level from a reduced-oxygen atmosphere during the cooldown and subsequent operation of the calorimeters in the Assembly Hall. In addition, it outlines the steps taken to warn personnel of an emergency and to direct the subsequent evacuation. This note analyses only the Assembly Hall area. A similar engineering note, EN-332, covers the analysis of the Collision Hall area.
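    The "standardized calculations" behind an ODH classification can be sketched in a few lines, following the Fermilab-style bookkeeping: estimate the oxygen fraction a release would produce in a ventilated volume, then sum event rates times fatality factors into an hourly hazard rate phi. All rates, flows, and factors below are illustrative assumptions, not the D0 platform values.

```python
def steady_state_o2(vent_cfm, spill_cfm):
    # Well-mixed volume with ventilation flow Q and inert-gas source R:
    # the oxygen fraction approaches 0.21 * Q / (Q + R).
    return 0.21 * vent_cfm / (vent_cfm + spill_cfm)

# ODH hazard rate: phi = sum(P_i * F_i), where P_i is the expected rate
# of release event i (per hour) and F_i its fatality factor (0 to 1).
# Event list is an invented example, not the D0 cryogenic system.
events = [
    {"rate_per_hr": 1e-6, "fatality_factor": 1e-3},   # small leak
    {"rate_per_hr": 1e-8, "fatality_factor": 1.0},    # line rupture
]
phi = sum(e["rate_per_hr"] * e["fatality_factor"] for e in events)

# ODH class 0 corresponds to phi below 1e-7 per hour (further classes
# begin at 1e-7, 1e-5, and 1e-3); this toy case lands in class 0.
is_class_0 = phi < 1e-7
```

    The ventilation-fan credit taken in the note enters through the first function: a larger Q keeps the steady-state oxygen fraction, and hence each fatality factor, low.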

  10. The European ASAMPSA_E project : towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    NASA Astrophysics Data System (ADS)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or Safety Authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such a PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10^-4 per year. Today, even lower probabilities, down to 10^-8, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect a NPP, and assigning them a meaningful probability of occurrence, seems to be difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, US and Japan and aims at identifying some meaningful practices to extend the scope and the quality of the existing probabilistic safety analysis developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify if the robustness of Nuclear Power Plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project, some first lessons and introduce which type of guidance is being developed.
It will explain the need of expertise from geosciences to support the nuclear safety assessment in the different area (seismotectonic, hydrological, meteorological and biological

  11. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  12. A hazards analysis of a nuclear explosives dismantlement

    SciTech Connect

    Bott, T.F.; Eisenhawer, S.W.

    1995-07-01

    This paper describes the methodology used in a quantitative hazard assessment of a nuclear weapon disassembly process. Potential accident sequences were identified using an accident-sequence fault tree based on operational history, weapon safety studies, a hazard analysis team composed of weapons experts, and walkthroughs of the process. The experts provided an initial screening of the accident sequences to reduce the number of accident sequences that would be quantified. The accident sequences that survived the screening process were developed further using event trees. Spreadsheets were constructed for each event tree, the accident sequences associated with that event tree were entered as rows on the spreadsheet, and that spreadsheet was linked to spreadsheets with initiating-event frequencies, enabling event probabilities, and weapon response probabilities. The probability and frequency distribution estimates used in these spreadsheets were gathered from weapon process operational data, surrogate industrial data, expert judgment, and probability models. Frequency distributions were calculated for the sequences whose point-value frequency represented 99% of the total point-value frequency using a Monte Carlo simulation. Partial differential importances of events and distributions of accident frequency by weapon configuration, location, process, and other parameters were calculated.
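The spreadsheet-style quantification described above can be sketched in a few lines. This is a minimal illustration with entirely hypothetical distribution parameters (not from the study): an accident sequence frequency is modeled as the product of an initiating-event frequency and conditional event probabilities, each carrying lognormal uncertainty, propagated by Monte Carlo simulation.

```python
import random

# Hypothetical parameters for illustration only: a sequence frequency is
# the product of an initiating-event frequency (per year) and two
# conditional probabilities, each sampled from a lognormal distribution.
def sample_sequence_frequency(rng):
    init_freq = rng.lognormvariate(-9.0, 1.0)           # initiating events / yr
    enable_p = min(rng.lognormvariate(-2.0, 0.5), 1.0)  # enabling-event prob.
    resp_p = min(rng.lognormvariate(-4.0, 0.7), 1.0)    # weapon-response prob.
    return init_freq * enable_p * resp_p

rng = random.Random(0)
samples = sorted(sample_sequence_frequency(rng) for _ in range(10000))
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
print(f"median ~ {median:.2e} /yr, 95th percentile ~ {p95:.2e} /yr")
```

The sorted samples give the frequency distribution from which percentile summaries and importance measures can be read off.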

  13. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis.

    PubMed

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, so hazard assessment is necessary for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, the calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm calculates the CD when the hazard grades and their corresponding thresholds are unknown. The CD is then converted into a Euclidean distance (ED) by a simple, concise calculation, and the hazard of each sample is ranked by its ED value. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the assessment results obtained by ESPA and GSPA.
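The paper's CD calculation is not reproduced here, so the following is only a generic sketch of the final ranking step: once each station has been reduced to a numeric vector, hazard can be ranked by Euclidean distance (ED) from an ideal reference point, with larger ED meaning greater hazard. All station names and scores are hypothetical.

```python
import math

# Generic ED-based ranking (not the published ESPA algorithm).
def euclidean_distance(vec, ref):
    return math.sqrt(sum((v - r) ** 2 for v, r in zip(vec, ref)))

# hypothetical normalized hazard-factor scores for three stations
stations = {
    "station_A": [0.2, 0.1, 0.3],
    "station_B": [0.7, 0.6, 0.8],
    "station_C": [0.4, 0.5, 0.2],
}
ideal = [0.0, 0.0, 0.0]  # hypothetical "no hazard" reference point

ranking = sorted(stations,
                 key=lambda s: euclidean_distance(stations[s], ideal),
                 reverse=True)
print(ranking)  # → ['station_B', 'station_C', 'station_A']
```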

  14. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, so hazard assessment is necessary for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, the calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm calculates the CD when the hazard grades and their corresponding thresholds are unknown. The CD is then converted into a Euclidean distance (ED) by a simple, concise calculation, and the hazard of each sample is ranked by its ED value. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the assessment results obtained by ESPA and GSPA. PMID:28938011

  15. 14 CFR Appendix B to Part 417 - Flight Hazard Area Analysis for Aircraft and Ship Protection

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hazard area that encompasses the launch point from the surface of the Earth to an altitude of 100,000 ft... radial direction. A launch operator must calculate an aircraft hazard area in the vicinity of the launch... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight Hazard Area Analysis for Aircraft...

  16. Rapid microbiological methods with hazard analysis critical control point.

    PubMed

    Griffiths, M W

    1997-01-01

    The proactive approach to ensuring food safety termed hazard analysis critical control point (HACCP) was introduced in the 1960s by the Pillsbury Company, in collaboration with the U.S. Army Natick Laboratories and National Aeronautics and Space Administration, to help guarantee that astronauts would not be incapacitated by the trauma of foodborne illness during space flights. The approach has subsequently been adopted as the standard food safety management system world-wide and is seen as forming the basis for harmonization of food inspection regulations necessitated by trade agreements such as General Agreement on Tariffs and Trade and North American Free Trade Agreement as the move toward globalization of trade in food products gains momentum. The new U.S. Department of Agriculture Mega-Reg requires mandatory introduction of HACCP, and the Food Safety Enhancement Program of Agriculture and Agri-food Canada, as well as the "due diligence" legislation of the European Union, is centered on HACCP principles.

  17. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008-three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts.

  18. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This report presents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provide increased confidence in this hazard recommendation. The alternate GMAMs are EPRI (2004), USGS (2002), and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3, and 0.1 are recommended for EPRI (2004), USGS (2002), and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs. The
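The logic-tree weighting described above amounts to a weighted mean of the alternative hazard estimates. A minimal sketch using the stated GMAM weights but invented annual exceedance frequencies:

```python
# GMAM weights as recommended in the report; the annual frequencies of
# exceeding a given PGA level are hypothetical illustration values.
weights = {"EPRI2004": 0.6, "USGS2002": 0.3, "Silva2004": 0.1}
exceedance_freq = {"EPRI2004": 4.0e-4, "USGS2002": 6.0e-4, "Silva2004": 2.0e-4}

# weighted-mean annual exceedance frequency at this ground-motion level
combined = sum(weights[m] * exceedance_freq[m] for m in weights)
print(f"{combined:.2e}")  # → 4.40e-04
```

Repeating this at each ground-motion level yields the combined hazard curve.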

  19. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10⁻⁷ to 10⁻¹⁰ (mean 1.6 × 10⁻⁸). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to

  20. Statistical modeling of ground motion relations for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2013-10-01

    We introduce a new approach for ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if they have equivalent area functions. This includes local biases. An interpretation of the difference between these GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and discover discrepancies with the state of the art of stochastics and statistics (model selection and significance, tests of distribution assumptions, extreme value statistics). We criticize especially the assumption of log-normally distributed residuals of maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(ε₀) of Joyner and Boore, Bull Seism Soc Am 83(2):469-487, 1993) is the generalized extreme value. We show by numerical investigations that the actual distribution can be hidden and that a wrong distribution assumption can influence the PSHA as negatively as neglect of area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimate event-specific GMRs. Therein, the majority of the best models are based on an anisotropic point source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions, which indicate the appropriate modeling of the GMR by an anisotropic
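For reference, the generalized extreme value (GEV) distribution advocated above for the individual random component has the cumulative distribution function sketched below; the parameter values are illustrative, not estimates from the paper.

```python
import math

# GEV cumulative distribution function F(x) = exp(-t(x)^(-1/xi)) with
# t(x) = 1 + xi*(x - mu)/sigma; mu, sigma, xi here are illustrative.
def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.1):
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:
        # outside the support: below it for xi > 0, above it for xi < 0
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

print(round(gev_cdf(0.0), 4))  # at x = mu the CDF is exp(-1) ≈ 0.3679
```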

  1. NFPA 70E: Performing the electrical flash hazard analysis.

    PubMed

    Wallace, W Jon

    2005-08-01

    Arc flash hazards pose a serious risk to employee safety; proper safe work practices must be utilized. Electrical equipment operating at 50 volts or more must be deenergized and locked out/tagged out prior to servicing and maintenance unless doing so would increase hazards or is infeasible. Remember, convenience is not an acceptable reason for keeping equipment energized during servicing and maintenance. If electrical equipment must remain energized during servicing and maintenance, NFPA 70E should be consulted to determine flash hazard boundaries as well as required PPE. Finally, circuit breakers and electrical disconnects must be marked to warn qualified employees of potential arc flash hazards.

  2. Precise terrestrial time: A means for improved ballistic missile guidance analysis

    NASA Technical Reports Server (NTRS)

    Ehrsam, E. E.; Cresswell, S. A.; Mckelvey, G. R.; Matthews, F. L.

    1978-01-01

    An approach developed to improve ground instrumentation time tagging accuracy and adapted to support the Minuteman ICBM program is described. The Timing Insertion Unit (TIU) technique produces a telemetry data time tagging resolution of one tenth of a microsecond with a relative intersite accuracy after corrections. Metric position and velocity data (range, azimuth, elevation, and range rate) used in missile guidance system analysis can be correlated to within ten microseconds of the telemetry guidance data. This requires precise timing synchronization between the metric and telemetry instrumentation sites. The timing synchronization can be achieved by using radar automatic phasing system time correlation methods. Other time correlation techniques, such as Television (TV) Line-10 and the Geostationary Operational Environmental Satellite (GOES) terrestrial timing receivers, are also considered.

  3. A Descriptive Analysis of Prehospital Response to Hazardous Materials Events.

    PubMed

    Martin, Ashley J; Lohse, Christine M; Sztajnkrycer, Matthew D

    2015-10-01

    Little is known about the overall frequency of hazardous materials (HazMat) events in the United States and the nature of prehospital care for those exposed. The purpose of the current study was to perform a descriptive analysis of Emergency Medical Services (EMS) activations reported to a national EMS database. Analysis of the 2012 National EMS Information System (NEMSIS) Public Release Research Data Set v.2.2.1, containing EMS emergency response data submitted by 41 states, was conducted. Mandatory data elements E0207 (Type of Response Delay), E0208 (Type of Scene Delay), and E0209 (Type of Transport Delay) contained specific codes for HazMat events and were used to identify specific EMS activation records for subsequent analysis. Overlapping data elements were identified and combined in order to prevent duplicate entries. Descriptive analyses were generated from the NEMSIS Research Data Set. A total of 17,479,328 EMS activations were reported, of which 2,527 unique activations involved HazMat response. Mass-casualty incident was coded for 5.6% of activations. The most common level of prehospital care present on scene was Basic Life Support (BLS; 51.1%); 2.1% required aggressive Advanced Life Support (ALS) response. The most common locations for HazMat activations were homes (36.2%), streets or highways (26.3%), and health care facilities (11.6%). The primary symptoms observed by EMS personnel were pain (29.6%), breathing problems (12.2%), and change in responsiveness (9.6%). Two percent of HazMat activations involved cardiac arrest, with 21.7% occurring after EMS arrival. Delays in patient care included response delay, scene delay, and transport delay. Hazardous materials events are rare causes of EMS activation in the United States. The majority occur in non-industrial venues and involve two or fewer patients. Scene time frequently is delayed due to multiple barriers. Cardiac arrest is rare but occurred after EMS arrival in one-fifth of patients.

  4. Spatial prediction of landslide hazard using discriminant analysis and GIS

    Treesearch

    Peter V. Gorsevski; Paul Gessler; Randy B. Foltz

    2000-01-01

    Environmental attributes relevant for spatial prediction of landslides triggered by rain and snowmelt events were derived from digital elevation model (DEM). Those data in conjunction with statistics and geographic information system (GIS) provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...

  5. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  6. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
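The stationary-case claim above is easy to check numerically. In this minimal sketch (parameter values are illustrative), annual exceedances occur independently with probability p, so the sampled return period is geometric with mean 1/p, which for small p is well approximated by an exponential distribution.

```python
import random

# Monte Carlo waiting times between exceedances of a stationary process:
# each year an exceedance occurs independently with probability p.
def simulate_waiting_times(p, n_events, seed=42):
    rng = random.Random(seed)
    times = []
    year = 0
    while len(times) < n_events:
        year += 1
        if rng.random() < p:   # an exceedance occurs this year
            times.append(year)
            year = 0
    return times

p = 0.01                       # e.g. the "100-year" event
waits = simulate_waiting_times(p, 20000)
mean_T = sum(waits) / len(waits)
print(mean_T)                  # close to 1/p = 100
```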

  7. 76 FR 70768 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request...

  8. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  9. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  10. Independent Orbiter Assessment (IOA): Analysis of the guidance, navigation, and control subsystem

    NASA Technical Reports Server (NTRS)

    Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Hiott, J. M.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Guidance, Navigation, and Control (GNC) Subsystem hardware are documented. The function of the GNC hardware is to respond to guidance, navigation, and control software commands to effect vehicle control and to provide sensor and controller data to GNC software. Some of the GNC hardware for which failure modes analysis was performed includes: hand controllers; Rudder Pedal Transducer Assembly (RPTA); Speed Brake Thrust Controller (SBTC); Inertial Measurement Unit (IMU); Star Tracker (ST); Crew Optical Alignment Sight (COAS); Air Data Transducer Assembly (ADTA); Rate Gyro Assemblies; Accelerometer Assembly (AA); Aerosurface Servo Amplifier (ASA); and Ascent Thrust Vector Control (ATVC). The IOA analysis process utilized available GNC hardware drawings, workbooks, specifications, schematics, and systems briefs for defining hardware assemblies, components, and circuits. Each hardware item was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  11. National Emission Standards for Hazardous Air Pollutants (NESHAP) for Major Sources: Industrial, Commercial, and Institutional Boilers - Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    EPA Pesticide Factsheets

    The purpose of this July 2012 document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule.

  12. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers, especially in the insurance industry, the administration, and politics, on potential consequences and are the basis for appropriate risk management strategies. Thereby, results on (i) an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides this analysis, a portfolio analysis of a regional insurance company was additionally executed. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. 
The estimated mean insurance values were

  13. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    NASA Astrophysics Data System (ADS)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. With a large area of 19,065 km² and a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response times and classifying the borough by response times to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. 
We also combined the network analysis results with high resolution imagery and elevation data to determine
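The response-time classification described above rests on shortest-path computation over the road network. A minimal sketch with Dijkstra's algorithm; the road graph, place names, and travel times (in minutes) are entirely hypothetical.

```python
import heapq

# Dijkstra shortest travel times from an emergency station over a
# directed road graph given as {node: [(neighbor, minutes), ...]}.
def response_times(graph, station):
    dist = {station: 0.0}
    heap = [(0.0, station)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

roads = {
    "station": [("junction", 4.0), ("suburb", 12.0)],
    "junction": [("downtown", 3.0), ("suburb", 5.0)],
    "downtown": [("hospital", 2.0)],
    "suburb": [("hospital", 6.0)],
}
times = response_times(roads, "station")
print(times)  # downtown reached in 7 minutes via the junction
```

Binning the resulting times into classes (e.g. under 5, 5-15, over 15 minutes) yields the kind of response-time classification used to allocate emergency resources.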

  14. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  15. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2, located off the coast about 160 km west of Nanggroe Aceh Darussalam, triggered a tsunami. These events caused many casualties and heavy material losses, especially in the Provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by probabilistic methods (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps with a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh.
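
    Under the Poisson inter-event model standard in PSHA, the 2475-year return period quoted above corresponds to roughly a 2% probability of exceedance over a 50-year exposure. A minimal sketch of that arithmetic (illustrative only, not the authors' code):

```python
import math

def prob_exceedance(return_period_years: float, exposure_years: float) -> float:
    """Poisson probability that ground motion with the given mean return
    period is exceeded at least once during the exposure time."""
    rate = 1.0 / return_period_years  # mean annual exceedance rate
    return 1.0 - math.exp(-rate * exposure_years)

# 2475-year return period over a 50-year exposure ~ 2% probability of exceedance
p = prob_exceedance(2475.0, 50.0)
print(round(p, 4))  # -> 0.02
```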

  16. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Piatanesi, A.; Romano, F.; Basili, R.; Kastelic, V.; Tiberti, M. M.; Valensise, G.

    2009-12-01

    We applied the method for Probabilistic Tsunami Hazard Analysis (PTHA), first introduced by Rikitake and Aida (1988), to the coasts of eastern Sicily (Italy), focusing on important cities such as Messina, Catania, and Augusta. The method has been very recently improved by González et al. (2009) to explicitly include inundation. Our approach considers both far- and near-field sources. Based on the approach described by Lorito et al. (2008), we identified a few Source Zones (SZ), each of which is characterized by a Typical Fault (TF) floating within it and capable of generating the Maximum Credible Earthquake (MCE). Some of the MCEs recall historical events such as the 1908 Messina, the 1693 Catania, and the 365 AD Crete earthquakes. We calculated the combined probability for all considered SZs by assigning a recurrence interval to each of them and assuming a Poisson distribution of the inter-event times. We performed the PTHA for different damage metrics, such as runup, current speed, momentum, and Froude number, with the probability of exceedance of given thresholds being evaluated for different time intervals.
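
    Combining independent Poissonian source zones amounts to summing their annual rates before forming the exceedance probability. A minimal sketch (the recurrence intervals below are hypothetical, not the study's values):

```python
import math

def combined_poisson_prob(recurrence_intervals, time_window):
    """Probability that at least one source zone produces its Maximum
    Credible Earthquake within the time window, assuming independent
    Poisson processes with rate 1/T for each zone."""
    total_rate = sum(1.0 / t for t in recurrence_intervals)
    return 1.0 - math.exp(-total_rate * time_window)

# hypothetical recurrence intervals (years) for three source zones, 50-year window
print(round(combined_poisson_prob([400.0, 1000.0, 1500.0], 50.0), 3))  # -> 0.188
```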

  17. Analysis of hazardous biological material by MALDI mass spectrometry

    SciTech Connect

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evident by the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  18. Hazardous Gas Leak Analysis in the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1991-01-01

    Helium tests of the main propulsion system in the Space Shuttle and data on hydrogen leaks are examined. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes that sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine if the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were seen visually with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per sec by spectral analysis of MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.

  19. Preparation of waste oil for analysis to determine hazardous metals

    SciTech Connect

    Essling, A.M.; Huff, D.R.; Huff, E.A.; Fox, I.M.; Graczyk, D.G.

    1995-07-01

    Two methods for preparing waste-oil samples to permit measurement of their metals content were evaluated. For this evaluation, metals-in-oil standard reference materials were prepared by each method and the resulting solutions were analyzed for 20 metals, including those (As, Ba, Cd, Cr, Pb, Hg, Se, and Ag) regulated as hazardous under the Resource Conservation and Recovery Act. One preparation method involved combustion of the waste oil under oxygen at 25 atm pressure, as described in the American Society for Testing and Materials test method E926-88. As we applied it, this method gave recoveries well under 90% for most of the metals that we examined and, hence, proved unsatisfactory for routine application to waste-oil analysis. With the other method, nitric acid decomposition in a sealed vessel heated with microwave energy (analogous to US Environmental Protection Agency Method 3051), recoveries of all 20 metal contaminants were within 90 to 110% of the certified values. This microwave digestion procedure was also more efficient since it allowed six samples to be prepared together, whereas the oxygen combustion approach allowed processing of only one sample at a time.
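
    The 90 to 110% acceptance window applied above is a simple percent-recovery check against certified values. A minimal sketch of that check (the element names and concentrations are illustrative, not the report's data):

```python
def recovery_ok(measured, certified, low=90.0, high=110.0):
    """Percent recovery against a certified value, flagged against the
    90-110% acceptance window used for the digestion results."""
    recovery = 100.0 * measured / certified
    return recovery, low <= recovery <= high

# hypothetical certified vs. measured concentrations (mg/kg)
for metal, cert, meas in [("Pb", 100.0, 97.5), ("As", 50.0, 41.0)]:
    rec, ok = recovery_ok(meas, cert)
    print(f"{metal}: {rec:.1f}% {'pass' if ok else 'fail'}")
```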

  20. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each ... Sanitation controls may be included in the HACCP plan. However, to the extent that they are monitored in ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point ...

  1. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... monitoring of the critical control points and other measurements as prescribed in the HACCP plan......

  2. Scout: orbit analysis and hazard assessment for NEOCP objects

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize followup efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
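
    The systematic-ranging scan described above can be sketched as a grid laid over the poorly constrained (range, range-rate) plane; each node, combined with the observed plane-of-sky position and motion, would seed one candidate orbit. A minimal sketch (the grid bounds are hypothetical, and the orbit fitting and scoring against the astrometry are omitted):

```python
import itertools

def range_range_rate_grid(rho_min, rho_max, rhodot_min, rhodot_max, n):
    """Lay an n-by-n grid over topocentric range (rho) and range rate
    (rho-dot); each node is one candidate to be scored against the
    recorded observations."""
    rho_step = (rho_max - rho_min) / (n - 1)
    rhodot_step = (rhodot_max - rhodot_min) / (n - 1)
    rhos = [rho_min + i * rho_step for i in range(n)]
    rhodots = [rhodot_min + j * rhodot_step for j in range(n)]
    return list(itertools.product(rhos, rhodots))

# hypothetical scan: 0.01-1 au in range, +/-20 km/s in range rate, 21x21 nodes
grid = range_range_rate_grid(0.01, 1.0, -20.0, 20.0, 21)
print(len(grid))  # -> 441
```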

  3. Development, analysis, and testing of robust nonlinear guidance algorithms for space applications

    NASA Astrophysics Data System (ADS)

    Wibben, Daniel R.

    This work focuses on the analysis and application of various nonlinear, autonomous guidance algorithms that utilize sliding mode control to guarantee system stability and robustness. While the basis for the algorithms has previously been proposed, past efforts barely scratched the surface of the theoretical details and implications of these algorithms. Of the three algorithms that are the subject of this research, two are directly derived from optimal control theory and augmented using sliding mode control. Analysis of the derivation of these algorithms has shown that they are two different representations of the same result, one of which uses a simple error state model (Delta r/Delta v) and the other uses definitions of the zero-effort miss and zero-effort velocity (ZEM/ZEV) values. By investigating the dynamics of the defined sliding surfaces and their impact on the overall system, many implications have been deduced regarding the behavior of these systems, which are noted to feature time-varying sliding modes. A formal finite-time stability analysis has also been performed to theoretically demonstrate that the algorithms globally stabilize the system in finite time in the presence of perturbations and unmodeled dynamics. The third algorithm subject to analysis is derived from a direct application of higher-order sliding mode control and Lyapunov stability analysis, without consideration of optimal control theory, and has been named the Multiple Sliding Surface Guidance (MSSG). Via reinforcement learning methods, an optimal set of gains has been found that makes the guidance perform similarly to an open-loop optimal solution. Careful side-by-side inspection of the MSSG and Optimal Sliding Guidance (OSG) algorithms has shown some striking similarities. A detailed comparison of the algorithms has demonstrated that though they are nearly indistinguishable at first glance, there are some key differences between the two algorithms and they are indeed

  4. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect

    Thomas, John

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.
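
    The systematic enumeration underlying STPA can be illustrated by generating candidate unsafe control actions. The four guide phrases below are the standard STPA unsafe-control-action types; the control actions and contexts are hypothetical, and judging which candidates are actually hazardous is the analyst's (or the automated procedure's) job:

```python
import itertools

# The four standard STPA guide phrases for unsafe control actions
GUIDE_PHRASES = [
    "not provided",
    "provided",
    "provided too early/too late",
    "stopped too soon/applied too long",
]

def candidate_ucas(control_actions, contexts):
    """Enumerate every (action, context, guide phrase) triple; each is a
    candidate unsafe control action to be judged hazardous or not."""
    return [
        {"action": a, "context": c, "type": g}
        for a, c, g in itertools.product(control_actions, contexts, GUIDE_PHRASES)
    ]

ucas = candidate_ucas(["open valve"], ["tank full", "tank empty"])
print(len(ucas))  # -> 8
```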

  5. A Chemical Plant Safety and Hazard Analysis Course.

    ERIC Educational Resources Information Center

    Gupta, J. P.

    1989-01-01

    Describes a course for teaching chemical engineering students about safety and hazards. Summarizes the course content including topics for term papers and disciplines related to this course. Lists 18 references. (YP)

  6. Impact of real-time ultrasound guidance on complications of percutaneous dilatational tracheostomy: a propensity score analysis.

    PubMed

    Rajajee, Venkatakrishna; Williamson, Craig A; West, Brady T

    2015-04-29

    Recent studies have demonstrated the feasibility of real-time ultrasound guidance during percutaneous dilatational tracheostomy, including in patients with risk factors such as coagulopathy, cervical spine immobilization and morbid obesity. Use of real-time ultrasound guidance has been shown to improve the technical accuracy of percutaneous dilatational tracheostomy; however, it is unclear if there is an associated reduction in complications. Our objective was to determine whether the peri-procedural use of real-time ultrasound guidance is associated with a reduction in complications of percutaneous dilatational tracheostomy using a propensity score analysis. This study reviewed all percutaneous dilatational tracheostomies performed in an 8-year period in a neurocritical care unit. Percutaneous dilatational tracheostomies were typically performed by trainees under guidance of the attending intensivist. Bronchoscopic guidance was used for all procedures with addition of real-time ultrasound guidance at the discretion of the attending physician. Real-time ultrasound guidance was used to guide endotracheal tube withdrawal, guide tracheal puncture, identify guidewire entry level and confirm bilateral lung sliding. The primary outcome was a composite of previously defined complications including (among others) bleeding, infection, loss of airway, inability to complete procedure, need for revision, granuloma and early dislodgement. Propensity score analysis was used to ensure that the relationship of not using real-time ultrasound guidance with the probability of an adverse outcome was examined within groups of patients having similar covariate profiles. Covariates included were age, gender, body mass index, diagnosis, Acute Physiology and Chronic Health Evaluation II score, timing of tracheostomy, positive end-expiratory pressure and presence of risk factors including coagulopathy, cervical spine immobilization and prior tracheostomy. 
A total of 200 patients underwent
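
    The core idea of the propensity score analysis described above, comparing outcomes only within groups of patients with similar covariate profiles, can be sketched with stratification on the score. The cohort below is synthetic, and a real analysis would first fit the propensity score with a logistic model on the listed covariates:

```python
import random

def stratified_effect(records, n_strata=5):
    """Propensity-score stratification: sort by propensity score, cut into
    equal strata, and average the within-stratum difference in adverse
    outcome rate between patients treated without vs. with ultrasound."""
    records = sorted(records, key=lambda r: r["ps"])
    size = len(records) // n_strata
    diffs = []
    for s in range(n_strata):
        stratum = records[s * size:(s + 1) * size]
        no_us = [r["adverse"] for r in stratum if not r["ultrasound"]]
        us = [r["adverse"] for r in stratum if r["ultrasound"]]
        if no_us and us:  # skip strata missing one arm
            diffs.append(sum(no_us) / len(no_us) - sum(us) / len(us))
    return sum(diffs) / len(diffs)

# synthetic 200-patient cohort; 'ps' stands in for a fitted propensity score
random.seed(0)
cohort = [{"ps": random.random(),
           "ultrasound": random.random() < 0.5,
           "adverse": random.random() < 0.2}
          for _ in range(200)]
print(-1.0 < stratified_effect(cohort) < 1.0)  # -> True
```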

  7. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.

  8. Identification of Aircraft Hazards

    SciTech Connect

    K. Ashley

    2006-12-08

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2005 [DIRS 174235], Section 6.4.1). That determination was conservatively based upon limited knowledge of flight data in the area of concern and upon crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a monitored geologic repository (MGR) at Yucca Mountain, using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987 [DIRS 103124], Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. The intended use of this report is to provide inputs for further screening and analysis of identified aircraft hazards based upon the criteria that apply to Category 1 and Category 2 event sequence analyses as defined in 10 CFR 63.2 [DIRS 176544] (Section 4). The scope of this report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the repository at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (Section 7).

  9. IDENTIFICATION OF AIRCRAFT HAZARDS

    SciTech Connect

    K.L. Ashley

    2005-03-23

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in the "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2004, Section 6.4.1). That determination was conservatively based on limited knowledge of flight data in the area of concern and on crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a Monitored Geologic Repository (MGR) at Yucca Mountain using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987, Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. NUREG-0800 is being used here as a reference because some of the same considerations apply. The intended use of this report is to provide inputs for further screening and analysis of the identified aircraft hazards based on the criteria that apply to Category 1 and 2 event sequence analyses as defined in 10 CFR 63.2 (see Section 4). The scope of this technical report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the MGR at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (see Section 7).

  10. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S. III; Baum, J.W.

    1998-03-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique.

  11. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  12. Guidance on the implementation and reporting of a drug safety Bayesian network meta-analysis.

    PubMed

    Ohlssen, David; Price, Karen L; Xia, H Amy; Hong, Hwanhee; Kerman, Jouni; Fu, Haoda; Quartey, George; Heilmann, Cory R; Ma, Haijun; Carlin, Bradley P

    2014-01-01

    The Drug Information Association Bayesian Scientific Working Group (BSWG) was formed in 2011 with a vision to ensure that Bayesian methods are well understood and broadly utilized for design and analysis throughout the medical product development process, and to improve industrial, regulatory, and economic decision making. The group, composed of individuals from academia, industry, and regulatory agencies, has as its mission to facilitate the appropriate use and contribute to the progress of Bayesian methodology. In this paper, the safety sub-team of the BSWG explores the use of Bayesian methods when applied to drug safety meta-analysis and network meta-analysis. Guidance is presented on the conduct and reporting of such analyses. We also discuss different structural model assumptions and provide discussion on prior specification. The work is illustrated through a case study involving a network meta-analysis related to the cardiovascular safety of non-steroidal anti-inflammatory drugs.
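
    A full Bayesian network meta-analysis is beyond a short sketch, but the underlying Bayesian machinery can be illustrated with a pairwise safety comparison under conjugate Beta priors (the event counts are hypothetical, and a network analysis would additionally link treatments through common comparators):

```python
import random

def posterior_prob_higher_risk(events_a, n_a, events_b, n_b,
                               draws=20000, seed=1):
    """Pairwise Bayesian sketch (not a full network meta-analysis):
    Beta(1,1) priors on each arm's adverse-event rate, conjugate Beta
    posteriors, and a Monte Carlo estimate of P(rate_A > rate_B)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(draws):
        rate_a = rng.betavariate(1 + events_a, 1 + n_a - events_a)
        rate_b = rng.betavariate(1 + events_b, 1 + n_b - events_b)
        if rate_a > rate_b:
            count += 1
    return count / draws

# hypothetical trial: 30/1000 cardiovascular events on drug A vs. 18/1000 on drug B
p = posterior_prob_higher_risk(30, 1000, 18, 1000)
print(0.9 < p < 1.0)  # -> True
```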

  13. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    SciTech Connect

    FRANZ GR; MEICHLE RH

    2011-07-18

    This "What/If" hazards analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  14. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  15. Analysis of hazardous substances released during CFRP laser processing

    NASA Astrophysics Data System (ADS)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential for resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive, and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow and involve notable tool wear. On the other hand, thermal processing methods are critical because the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in the form of pores or delamination. An emerging innovative method for processing CFRP materials is laser technology. As a principally thermal method, laser processing involves the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of people and the environment in accordance with the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) under the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  16. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; Perram, Glen; Gross, Kevin; Smith, Greg; Feltz, Wayne; Taylor, Joe; Olson, Erik

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  17. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    SciTech Connect

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events.

    Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage.
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs.
    • Approximately 75% of SAFs analyzed address three or fewer hazards.
    • Lack of economic measures within SAFs could impact resilience and sustainability.
    • Resilience measures for flood hazards are not consistently included in SAFs.

  18. Method for data analysis in different institutions: example of image guidance of prostate cancer patients.

    PubMed

    Piotrowski, T; Rodrigues, G; Bajon, T; Yartsev, S

    2014-03-01

Multi-institutional collaborations allow more information to be analyzed, but data from different sources may vary in subgroup sizes and/or measurement conditions. Rigorous statistical analysis is required before pooling the data into a larger set. Careful comparison of all components of the data acquisition is indispensable: identical conditions allow enlargement of the database with improved statistical analysis, while clearly defined differences provide an opportunity to establish better practice. An optimal sequence of the required normality, asymptotic normality, and independence tests is proposed. An example analysis of six subgroups of position corrections in three directions, obtained during image guidance procedures for 216 prostate cancer patients from two institutions, is presented. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
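The abstract does not spell out the exact test sequence, but a minimal sketch of one such decision chain for two institutions' position-correction data (hypothetical data and thresholds, using SciPy's Shapiro-Wilk, Levene, t, and Mann-Whitney tests) might look like:

```python
# Sketch of a pooling decision for two institutions' lateral position
# corrections. Data and the 0.05 thresholds are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
inst_a = rng.normal(0.0, 2.0, size=120)   # corrections (mm), institution A
inst_b = rng.normal(0.3, 2.1, size=96)    # corrections (mm), institution B

# 1. Normality within each subgroup (Shapiro-Wilk).
norm_a = stats.shapiro(inst_a).pvalue > 0.05
norm_b = stats.shapiro(inst_b).pvalue > 0.05

# 2. Homogeneity of variance (Levene, robust to mild non-normality).
equal_var = stats.levene(inst_a, inst_b).pvalue > 0.05

# 3. Compare subgroups: parametric if assumptions hold, otherwise Mann-Whitney U.
if norm_a and norm_b:
    p = stats.ttest_ind(inst_a, inst_b, equal_var=equal_var).pvalue
else:
    p = stats.mannwhitneyu(inst_a, inst_b).pvalue

# No detectable systematic difference -> the subgroups are candidates for pooling.
poolable = p > 0.05
print(f"normal: {norm_a and norm_b}, equal var: {equal_var}, p = {p:.3f}, pool: {poolable}")
```

The actual protocol in the paper may order or choose the tests differently; the point is only that each pooling step is gated by an explicit assumption check.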

  19. Analysis of medical students' needs for development of a career guidance program.

    PubMed

    An, Hyejin; Kim, Eunjeong; Hwang, Jinyoung; Lee, Seunghee

    2014-09-01

The purpose of this study is to provide basic data for the development of a career guidance program through a needs survey. For this purpose, three questions were examined: Is there a difference between the satisfaction with and perceived importance of a career program? Does this difference vary by gender and grade level? And what types of mentors and mentoring formats do medical students want? The subjects were 380 students at Seoul National University College of Medicine. The data were analyzed by frequency analysis, paired t-test, and Borich's formula. Matched-sample t-tests of satisfaction versus importance showed statistically significant differences in all domains; the difference was greatest in the second year. According to the needs analysis, the most urgent program is meetings with seniors in various career areas. Medical students also want mentoring from clinical professors of the university and successful medical practitioners, as well as personal counseling. These results show that medical students need a career guidance program. The findings can be used to guide the development of career education programs and curricula for medical students.
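Borich's formula, used in the needs analysis above, ranks items by the importance-satisfaction discrepancy weighted by the item's mean importance. A minimal sketch with invented ratings (the study's actual items and scores are not reproduced here):

```python
# Hypothetical sketch of Borich's needs-assessment score:
# priority = mean over respondents of (importance - satisfaction),
# weighted by the item's mean importance. Ratings below are invented.
import numpy as np

def borich_score(importance, satisfaction):
    """Borich weighted discrepancy score for one survey item."""
    importance = np.asarray(importance, dtype=float)
    satisfaction = np.asarray(satisfaction, dtype=float)
    return float(np.mean(importance - satisfaction) * importance.mean())

# Two hypothetical career-program items rated on a 5-point scale.
meet_seniors = borich_score([5, 4, 5, 5, 4], [2, 3, 2, 2, 3])
aptitude_test = borich_score([3, 4, 3, 3, 4], [3, 3, 4, 3, 3])

# A higher score indicates a more urgent program need.
print(f"meeting seniors: {meet_seniors:.2f}, aptitude testing: {aptitude_test:.2f}")
```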

  20. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance, these toxic and harmful substances escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the dispersion and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was identified. Using ambient air simulation to analyze occupational exposure provides a new approach for occupational health research in the integrated circuit industry and has theoretical and practical significance.
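The full 3-D CFD model cannot be reproduced from the abstract, but the underlying transport physics can be illustrated with a 1-D advection-diffusion sketch of a continuous leak carried along a ventilation flow. All room, flow, and source parameters below are made up for illustration:

```python
# 1-D advection-diffusion sketch of a contaminant leak into ventilated room
# air: a stand-in for the paper's 3-D CFD, with assumed parameters.
import numpy as np

L, n = 10.0, 200               # domain length (m), grid cells
dx = L / n                     # 0.05 m
u, D = 0.25, 0.01              # air velocity (m/s), effective diffusivity (m^2/s)
dt = 0.05                      # s; CFL = u*dt/dx = 0.25, diffusion number = 0.2

c = np.zeros(n)                # concentration (arbitrary units)
src = 5                        # leak location (grid index)

for _ in range(1500):          # simulate 75 s
    c[src] += 1.0 * dt         # constant leak source term
    adv = -u * (c - np.roll(c, 1)) / dx                          # first-order upwind
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2   # central diffusion
    c = c + dt * (adv + dif)
    c[0], c[-1] = 0.0, c[-2]   # clean inflow, zero-gradient outflow

peak = int(np.argmax(c))
print(f"plume peak at x = {peak * dx:.2f} m (leak at {src * dx:.2f} m)")
```

The qualitative conclusion mirrors the paper's setting: concentration builds near the source and is swept downstream, so worker exposure depends strongly on the ventilation field.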

  1. Overview of Mars Sample Hazard Analysis (Requirements Workshop Series)

    NASA Technical Reports Server (NTRS)

    Rummel, John D.

    2000-01-01

    Samples returned from Mars should be contained and treated as though potentially hazardous until proven otherwise. If sample containment cannot be verified en route to Earth, the sample and spacecraft should either be sterilized in space or not returned to Earth. Integrity of sample containment should be maintained through reentry and transfer to a receiving facility. Controlled distribution of unsterilized materials should only occur if analyses determine the sample not to contain a biological hazard. Planetary protection measures adopted for the first sample return should not be relaxed for subsequent missions without thorough scientific review and concurrence by an appropriate independent body.

  2. Reporting guidelines for survey research: an analysis of published guidance and reporting practices.

    PubMed

    Bennett, Carol; Khangura, Sara; Brehaut, Jamie C; Graham, Ian D; Moher, David; Potter, Beth K; Grimshaw, Jeremy M

    2010-08-01

    Research needs to be reported transparently so readers can critically assess the strengths and weaknesses of the design, conduct, and analysis of studies. Reporting guidelines have been developed to inform reporting for a variety of study designs. The objective of this study was to identify whether there is a need to develop a reporting guideline for survey research. We conducted a three-part project: (1) a systematic review of the literature (including "Instructions to Authors" from the top five journals of 33 medical specialties and top 15 general and internal medicine journals) to identify guidance for reporting survey research; (2) a systematic review of evidence on the quality of reporting of surveys; and (3) a review of reporting of key quality criteria for survey research in 117 recently published reports of self-administered surveys. Fewer than 7% of medical journals (n = 165) provided guidance to authors on survey research despite a majority having published survey-based studies in recent years. We identified four published checklists for conducting or reporting survey research, none of which were validated. We identified eight previous reviews of survey reporting quality, which focused on issues of non-response and accessibility of questionnaires. Our own review of 117 published survey studies revealed that many items were poorly reported: few studies provided the survey or core questions (35%), reported the validity or reliability of the instrument (19%), defined the response rate (25%), discussed the representativeness of the sample (11%), or identified how missing data were handled (11%). There is limited guidance and no consensus regarding the optimal reporting of survey research. The majority of key reporting criteria are poorly reported in peer-reviewed survey research articles. Our findings highlight the need for clear and consistent reporting guidelines specific to survey research.

  3. Reporting Guidelines for Survey Research: An Analysis of Published Guidance and Reporting Practices

    PubMed Central

Bennett, Carol; Khangura, Sara; Brehaut, Jamie C.; Graham, Ian D.; Moher, David; Potter, Beth K.; Grimshaw, Jeremy M.

    2011-01-01

Background Research needs to be reported transparently so readers can critically assess the strengths and weaknesses of the design, conduct, and analysis of studies. Reporting guidelines have been developed to inform reporting for a variety of study designs. The objective of this study was to identify whether there is a need to develop a reporting guideline for survey research. Methods and Findings We conducted a three-part project: (1) a systematic review of the literature (including “Instructions to Authors” from the top five journals of 33 medical specialties and top 15 general and internal medicine journals) to identify guidance for reporting survey research; (2) a systematic review of evidence on the quality of reporting of surveys; and (3) a review of reporting of key quality criteria for survey research in 117 recently published reports of self-administered surveys. Fewer than 7% of medical journals (n = 165) provided guidance to authors on survey research despite a majority having published survey-based studies in recent years. We identified four published checklists for conducting or reporting survey research, none of which were validated. We identified eight previous reviews of survey reporting quality, which focused on issues of non-response and accessibility of questionnaires. Our own review of 117 published survey studies revealed that many items were poorly reported: few studies provided the survey or core questions (35%), reported the validity or reliability of the instrument (19%), defined the response rate (25%), discussed the representativeness of the sample (11%), or identified how missing data were handled (11%). Conclusions There is limited guidance and no consensus regarding the optimal reporting of survey research. The majority of key reporting criteria are poorly reported in peer-reviewed survey research articles. Our findings highlight the need for clear and consistent reporting guidelines specific to survey research.

  4. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    SciTech Connect

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the Transuranic drum trenches in the low level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address major hazards inherent in the facility.

  5. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383

  6. National Emission Standards for Hazardous Air Pollutants for Major Sources. Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Papar, Riyaz; Wright, Anthony; Cox, Daryl

    2012-07-01

The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., Subpart DDDDD of 40 CFR Part 63).

  7. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  8. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  9. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  10. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, Robert D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
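The report's process models are not given in this abstract, but the general safety-factor/reliability relationship it explores can be sketched with a Monte Carlo simulation under assumed lognormal capacity and load distributions (all parameters illustrative, not from the report):

```python
# Monte Carlo sketch of reliability vs. safety factor for a generic
# treatment process: reliability = P(capacity > load), with capacity and
# load lognormal. Coefficients of variation are assumptions.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

def reliability(safety_factor, cov_capacity=0.3, cov_load=0.4):
    """Fraction of trials in which process capacity exceeds the load."""
    def ln_params(m, cov):
        # Lognormal (mu, sigma) from arithmetic mean m and coeff. of variation.
        sigma2 = np.log(1 + cov**2)
        return np.log(m) - sigma2 / 2, np.sqrt(sigma2)
    mu_c, s_c = ln_params(safety_factor, cov_capacity)  # mean capacity = SF * mean load
    mu_l, s_l = ln_params(1.0, cov_load)                # mean load normalized to 1
    cap = rng.lognormal(mu_c, s_c, n)
    load = rng.lognormal(mu_l, s_l, n)
    return float(np.mean(cap > load))

for sf in (1.0, 1.5, 2.0, 3.0):
    print(f"safety factor {sf:.1f} -> reliability {reliability(sf):.3f}")
```

The sketch shows the qualitative point of such analyses: reliability rises steeply but nonlinearly with the design safety factor, so a nominal factor of 1 delivers only roughly even odds when capacity and load are both uncertain.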

  11. [Analysis and toxicological evaluation of hazardous gases in sealed cabin].

    PubMed

    He, Z; Shi, J; Yu, B; Liang, H; Yu, F

    1998-10-01

Eighty-two volatile organic compounds (VOCs) from eight organic classes and 3 target inorganic compounds were identified and quantified over 5 days in a sealed cabin simulating a spaceship in flight; the pattern of hazardous gas concentration variation was discussed, and the atmosphere toxicology was evaluated preliminarily. This provides a basis for detecting gaseous compounds and evaluating atmospheric toxicology in spaceships.

  12. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended that the estimate of the seismic hazard at DOE sites be updated whenever possible. The major differences between previous estimates and the ones proposed in this study for PPPL are in the modeling of the strong ground motion at the site and in the treatment of the total uncertainty in the estimates, which includes knowledge uncertainty, random uncertainty, and expert opinion diversity. 28 refs.

  13. Hazardous Glaciers In Switzerland: A Statistical Analysis of Inventory Data

    NASA Astrophysics Data System (ADS)

    Raymond, M.; Funk, M.; Wegmann, M.

Because of the recent increase in both occupation and economic activities in high mountain areas, a systematic overview of potential glacier hazard zones is needed to avoid the construction of settlements and infrastructure in endangered areas in the future. Historical information about glacier disasters shows that catastrophic events can happen repeatedly with the same causes and the same dramatic consequences. Past catastrophic events are not only useful for identifying potentially dangerous glaciers, but also indicate the kind of glacier hazards to expect for any given glacier. An inventory containing all known events having caused damage in the past has been compiled for Switzerland. Three different types of glacier hazards are distinguished, i.e. ice avalanches, glacier floods, and glacier length changes. Hazardous glaciers have been identified so far in the alpine cantons of Bern, Grison, Uri, Vaud, and Valais. The inventory data were analysed in terms of the periodicity of different types of events as well as of the damage that occurred.

  14. Estimating Source Recurrence Rates for Probabilistic Tsunami Hazard Analysis (PTHA)

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2004-12-01

    A critical factor in probabilistic tsunami hazard analysis (PTHA) is estimating the average recurrence rate for tsunamigenic sources. Computational PTHA involves aggregating runup values derived from numerical simulations for many far-field and local sources, primarily earthquakes, each with a specified probability of occurrence. Computational PTHA is the primary method used in the ongoing FEMA pilot study at Seaside, Oregon. For a Poissonian arrival time model, the probability for a given source is dependent on a single parameter: the mean inter-event time of the source. In other probability models, parameters such as aperiodicity are also included. In this study, we focus on methods to determine the recurrence rates for large, shallow subduction zone earthquakes. For earthquakes below about M=8, recurrence rates can be obtained from modified Gutenberg-Richter distributions that are constrained by the tectonic moment rate for individual subduction zones. However, significant runup from far-field sources is commonly associated with the largest magnitude earthquakes, for which the recurrence rates are poorly constrained by the tail of empirical frequency-magnitude relationships. For these earthquakes, paleoseismic evidence of great earthquakes can be used to establish recurrence rates. Because the number of geologic horizons representing great earthquakes along a particular subduction zone is limited, special techniques are needed to account for open intervals before the first and after the last observed events. Uncertainty in age dates for the horizons also has to be included in estimating recurrence rates and aperiodicity. A Monte Carlo simulation is performed in which a random sample of earthquake times is drawn from a specified probability distribution with varying average recurrence rates and aperiodicities. A recurrence rate can be determined from the mean rate of all random samples that fit the observations, or a range of rates can be carried through the
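For the Poissonian arrival-time model described above, the probability that a source produces at least one event in an exposure window depends only on its mean inter-event time. A sketch combining that formula with Monte Carlo sampling of an uncertain recurrence interval (illustrative values, not the Seaside study's):

```python
# Poisson step of PTHA: P(at least one event in T years) = 1 - exp(-T/mu),
# with the mean recurrence interval mu itself uncertain. The lognormal
# spread on mu below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

def poisson_prob(T, mu):
    """P(at least one event in T years | mean inter-event time mu)."""
    return 1.0 - np.exp(-T / mu)

# Paleoseismic constraint: recurrence interval around 500 yr with a
# factor-of-1.3 lognormal spread (hypothetical numbers).
mu_samples = rng.lognormal(np.log(500.0), np.log(1.3), 10_000)
probs = poisson_prob(50.0, mu_samples)       # 50-year exposure window

print(f"P(event in 50 yr): mean {probs.mean():.3f}, "
      f"90% range {np.percentile(probs, 5):.3f}-{np.percentile(probs, 95):.3f}")
```

Non-Poissonian models add an aperiodicity parameter on top of the mean rate, but the aggregation over uncertain recurrence intervals follows the same pattern.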

  15. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and to the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions, (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility, (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  16. Taxonomic analysis of perceived risk: modeling individual and group perceptions within homogeneous hazard domains

    SciTech Connect

    Kraus, N.N.; Slovic, P.

    1988-09-01

    Previous studies of risk perception have typically focused on the mean judgments of a group of people regarding the riskiness (or safety) of a diverse set of hazardous activities, substances, and technologies. This paper reports the results of two studies that take a different path. Study 1 investigated whether models within a single technological domain were similar to previous models based on group means and diverse hazards. Study 2 created a group taxonomy of perceived risk for only one technological domain, railroads, and examined whether the structure of that taxonomy corresponded with taxonomies derived from prior studies of diverse hazards. Results from Study 1 indicated that the importance of various risk characteristics in determining perceived risk differed across individuals and across hazards, but not so much as to invalidate the results of earlier studies based on group means and diverse hazards. In Study 2, the detailed analysis of railroad hazards produced a structure that had both important similarities to, and dissimilarities from, the structure obtained in prior research with diverse hazard domains. The data also indicated that railroad hazards are really quite diverse, with some approaching nuclear reactors in their perceived seriousness. These results suggest that information about the diversity of perceptions within a single domain of hazards could provide valuable input to risk-management decisions.

  17. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    PubMed

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  18. Hazard analysis for magnetic induction from electric transmission lines

    NASA Astrophysics Data System (ADS)

    Taylor, R. J.

    1983-06-01

The potential hazard of magnetic induction from electric transmission lines was investigated. A literature search was conducted, measurements were made and compared with the theories found in the literature, and a new approach was developed for estimating the hazard potential of magnetically induced voltage in fences that could be grasped by barefoot children under wet conditions. Conditions under which representative transmission lines could induce such voltages were explored. For example, a standard three-phase horizontal 500 kV transmission line carrying 1000 A was estimated to induce 10.5 V in a 900 m fence 30 m from the center of the line. It was also found that third-harmonic currents can magnetically induce significant voltages compared to those induced by 60 Hz currents under some circumstances.
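The order of magnitude of such an estimate can be checked from first principles with the infinite-wire field B = mu0*I/(2*pi*r) integrated across the fence-to-ground loop. The loop geometry below (spanning 30 to 35 m from the conductor) is an assumption for illustration, not the paper's model, though it lands near the quoted value:

```python
# Back-of-envelope sketch: RMS EMF induced in a grounded fence loop running
# parallel to a transmission line. Loop dimensions are assumed, not taken
# from the paper.
import math

MU0 = 4e-7 * math.pi   # permeability of free space (H/m)

def induced_voltage(I_rms, length_m, r_near, r_far, freq=60.0):
    """RMS EMF (V): omega * flux, flux from B = mu0*I/(2*pi*r) over the loop."""
    flux_per_amp = MU0 / (2 * math.pi) * math.log(r_far / r_near) * length_m
    return 2 * math.pi * freq * flux_per_amp * I_rms

# 1000 A line, 900 m fence, loop spanning r = 30 m to 35 m from the conductor.
v = induced_voltage(1000.0, 900.0, 30.0, 35.0)
print(f"induced voltage ~ {v:.1f} V")
```

For a three-phase line the partially cancelling fields of the three conductors would modify this single-wire estimate, which is why the assumed loop span should be read as a tuning choice rather than physics.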

  19. 77 FR 9946 - Draft Guidance for Industry on Drug Interaction Studies-Study Design, Data Analysis, Implications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ... HUMAN SERVICES Food and Drug Administration (Formerly Docket No. 2006D-0344) Draft Guidance for Industry on Drug Interaction Studies--Study Design, Data Analysis, Implications for Dosing, and Labeling... entitled ``Drug Interaction Studies--Study Design, Data Analysis, Implications for Dosing, and...

  20. A cross-hazard analysis of terse message retransmission on Twitter

    PubMed Central

    Sutton, Jeannette; Gibson, C. Ben; Phillips, Nolan Edward; Spiro, Emma S.; League, Cedar; Johnson, Britta; Fitzhugh, Sean M.; Butts, Carter T.

    2015-01-01

For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes—local network properties, message content, and message style—that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  1. A cross-hazard analysis of terse message retransmission on Twitter.

    PubMed

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes--local network properties, message content, and message style--that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates.

  2. Hazards assessment for the Waste Experimental Reduction Facility

    SciTech Connect

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. This hazards assessment describes the WERF, the area surrounding WERF, associated buildings and structures at WERF, and the processes performed at WERF. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Even though the screening process indicated that these materials could be screened from further analysis because the inventories of radiological and nonradiological hazardous materials were below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because the nonradiological screening thresholds were considered too high.

  3. Space Propulsion Hazards Analysis Manual (SPHAM). Volume 1

    DTIC Science & Technology

    1988-10-01

dioxide, carbon monoxide, Freons, hydrazine, liquid hydrogen, hydrogen peroxide, liquid fluorine, nitrogen tetroxide, monomethylhydrazine, liquid methane...why the requirement does not apply. Item 15: Column 8. Enter an exact description of the drawing, test method, operations procedure etc., wherefrom...environment - Hydrogen embrittlement *Note: This discussion on hazard reports has been extracted for the most part from MCR 82-800, Rev. B, 29 Sept 82

  4. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives-synthetic rubber, polyolefin, and linoleum-currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures.

  5. Rockfall hazard analysis using LiDAR and spatial modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  6. Spatial analysis of hazardous waste data using geostatistics

    SciTech Connect

    Zirschky, J.H.

    1984-01-01

    The objective of this investigation was to determine if geostatistics could be a useful tool for evaluating hazardous waste sites. Three sites contaminated by dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)) were investigated. The first site evaluated was a creek into which TCDD-contaminated soil had eroded. The second site was a town in which TCDD-contaminated wastes had been sprayed onto the streets. Finally, the third site was a highway whose shoulders were contaminated by dust deposition from a nearby hazardous waste site. The distribution of TCDD at the first and third sites was investigated using kriging, an optimal estimation technique. By using kriging, the areas of both sites requiring cleanup were successfully identified. At the second site, the town, satisfactory results were not obtained. The distribution of contamination in this town is believed to be very heterogeneous; thus, reasonable estimates could not be obtained. Additional sampling was therefore recommended at this site. Based upon this research, geostatistics appears to be a very useful tool for evaluating a hazardous waste site if the distribution of contaminants at the site is homogeneous, or can be divided into homogeneous areas.
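As a minimal sketch of the kriging idea used in this study (not code from the paper; the exponential variogram parameters and TCDD sample values below are invented for illustration), ordinary kriging solves a small linear system for weights that sum to one:

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=50.0, nugget=0.05):
    """Hypothetical exponential variogram model gamma(h)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_krige(xy, z, x0, variogram=exp_variogram):
    """Ordinary kriging estimate and variance at location x0."""
    n = len(z)
    # Pairwise distances between samples, and sample-to-target distances
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    G = variogram(d)
    np.fill_diagonal(G, 0.0)          # gamma(0) = 0 by definition
    # Kriging system: [[G, 1], [1, 0]] @ [w, mu] = [g0, 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = G
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w @ z, b[:n] @ w + mu      # estimate, kriging variance

# Invented TCDD concentrations (ppb) at four sample points
xy = np.array([[0.0, 0.0], [0.0, 10.0], [10.0, 0.0], [10.0, 10.0]])
z = np.array([2.0, 4.0, 4.0, 6.0])
est, var = ordinary_krige(xy, z, np.array([5.0, 5.0]))
```

By symmetry the four equidistant samples each receive weight 0.25, so the estimate is their mean; the kriging variance quantifies the interpolation uncertainty that drives cleanup-area delineation.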

  7. Analysis of temporal and spatial overlapping of hazards interactions at different scales

    NASA Astrophysics Data System (ADS)

    De Angeli, Silvia; Trasforini, Eva; Taylor, Faith; Rudari, Roberto; Rossi, Lauro

    2017-04-01

    The aim of this work is to develop a methodological framework to analyse the impact of multiple hazards on complex territorial systems, not only focusing on multi-hazard interactions but also evaluating the multi-risk, i.e. considering the impact of multiple hazards in terms of exposure and vulnerability as well. Impacts generated by natural hazards have grown in recent years, partly because many regions of the world have become subject to multiple hazards and cascading effects. The modelling of the multi-hazard dimension is a new challenge that allows stakeholders to contend with the chain effects between hazards and to model the risk in a truly holistic way. Despite the recognition of the importance of a multi-hazard approach in risk assessment, only a few multi-risk approaches have been developed to date. The examination of multiple hazards, in contrast to single-hazard cases, poses a series of challenges in each step of the risk analysis, starting from the assessment of the hazard level, passing through the vulnerability evaluation, and arriving finally at the resultant risk level. Hazard interactions and hazard contemporaneity arising from their spatial and temporal overlap may influence not only the overall hazard level but also the vulnerability of elements at risk. In the proposed approach a series of possible interactions between hazards are identified and classified. These interactions are then analysed looking at the temporal and spatial evolution of the hazards and the consequent impacts, and represented through an explicative graphical framework. Different temporal dimensions are identified. The time of the impact differs from the time of the damage because, even after the end of the impact, damages remain until recovery and restoration processes are completed. The discrepancy between the time of the impact and the time of the damage is very important for the modelling of multi-hazard damage. Whenever a certain interval of time occurs between two impacts

  8. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. That difficulty, described here, is the lack of credibility of reliability estimates obtained by analytical modeling techniques. It is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally, several examples of the application of emulation techniques are described.

  9. Laser hazard analysis for various candidate diode lasers associated with the high resolution pulsed scanner.

    SciTech Connect

    Augustoni, Arnold L.

    2004-10-01

    A laser hazard analysis and safety assessment was performed for each of the various laser diode candidates associated with the High Resolution Pulsed Scanner, based on ANSI Standard Z136.1-2000, American National Standard for the Safe Use of Lasers. A theoretical laser hazard analysis model for this system was derived, and an Excel® spreadsheet model was developed to answer the 'what if' questions associated with the various modes of operation for the various candidate diode lasers.

  10. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
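The risk formula described above (expected loss as the product of hazard, vulnerability, and exposure) can be sketched for a hypothetical lifeline portfolio; all asset names and numbers below are invented for illustration:

```python
# Hypothetical inventory: each asset has an exposure (replacement value, $),
# a vulnerability (mean damage ratio given the scenario shaking), and a
# hazard term (annual probability of that shaking at the site).
assets = [
    {"name": "bridge",   "exposure": 5_000_000, "vulnerability": 0.30, "hazard": 0.01},
    {"name": "pipeline", "exposure": 1_200_000, "vulnerability": 0.55, "hazard": 0.02},
    {"name": "terminal", "exposure": 8_000_000, "vulnerability": 0.10, "hazard": 0.01},
]

def expected_annual_loss(asset):
    # Risk (expected loss) as the product of hazard, vulnerability, exposure
    return asset["hazard"] * asset["vulnerability"] * asset["exposure"]

total_eal = sum(expected_annual_loss(a) for a in assets)
```

A ranking of `expected_annual_loss` per asset is the kind of output used to prioritize retrofit or insurance decisions.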

  11. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that
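A fragility-function dike breach simulation of the kind embedded in IHAM's Monte Carlo loop can be sketched as follows; the lognormal fragility curve and hydrograph statistics are invented placeholders, not IHAM's calibrated values:

```python
import math
import random

def fragility(load, median=4.0, beta=0.3):
    """Hypothetical lognormal fragility: P(breach | peak water level in m)."""
    if load <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf(math.log(load / median) / (beta * math.sqrt(2.0))))

def simulate(n_runs=20_000, seed=42):
    """Monte Carlo over synthetic hydrographs; returns breach frequency."""
    rng = random.Random(seed)
    breaches = 0
    for _ in range(n_runs):
        # Peak water level at the dike for one synthetic hydrograph (m)
        load = rng.lognormvariate(math.log(3.5), 0.25)
        if rng.random() < fragility(load):
            breaches += 1
    return breaches / n_runs

p_breach = simulate()
```

In a full model each failure mechanism (overtopping, piping, micro-instability) would get its own fragility function, and breach outflows would feed the 2D storage cell model.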

  12. Landslide hazard zonation assessment using GIS analysis at Golmakan Watershed, northeast of Iran

    NASA Astrophysics Data System (ADS)

    Mansouri Daneshvar, Mohammad Reza; Bagherzadeh, Ali

    2011-03-01

    Landslide hazard is one of the major environmental hazards in geomorphic studies in mountainous areas. For helping the planners in selection of suitable locations to implement development projects, a landslide hazard zonation map has been produced for the Golmakan Watershed as part of Binaloud northern hillsides (northeast of Iran). For this purpose, after preparation of a landslide inventory of the study area, some 15 major parameters were examined for integrated analysis of landslide hazard in the region. The analyses of parameters were done by geo-referencing and lateral model making, satellite imaging of the study area, and spatial analyses by using geographical information system (GIS). The produced factor maps were weighted with the analytic hierarchy process (AHP) method and then classified. The study area was classified into four classes of relative landslide hazards: negligible, low, moderate, and high. The final produced map for landslide hazard zonation in Golmakan Watershed revealed that: 1) the parameters of land slope and geologic formation have strong correlation (R² = 0.79 and 0.83, respectively) with the dependent variable landslide hazard (p < 0.05); 2) about 18.8% of the study area has low and negligible hazards to future landslides, while 81.2% of the land area of Golmakan Watershed falls into the high and moderate categories.
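The AHP weighting step described above can be sketched with a small pairwise comparison matrix; the 3×3 matrix below for three hypothetical factors (slope, geology, land use) is illustrative only, not the study's 15-parameter matrix:

```python
import numpy as np

# Hypothetical pairwise comparison matrix; A[i, j] is the judged
# importance of factor i relative to factor j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 2.0, 4.0],
    [1/2, 1.0, 3.0],
    [1/4, 1/3, 1.0],
])

# AHP priority vector = principal eigenvector, normalized to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()

# Consistency check on the judgments
n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)   # consistency index
RI = 0.58                      # Saaty's random index for n = 3
CR = CI / RI                   # consistency ratio; CR < 0.1 is acceptable
```

The resulting `weights` would multiply the classified factor maps cell by cell in the GIS overlay to produce the hazard zonation score.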

  13. Solar Glaze Hazard Analysis Tool v. 3.0

    SciTech Connect

    Ho, Clifford K.; Sims, Cianan A.

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. In addition, the maximum energy production of the solar array is estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  14. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Efficiency Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2013-02-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting emissions credits proposed in the Implementation Plan for functional areas 2) through 4). It does not include efficiency improvements related to boiler tune-ups.

  15. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2012-07-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting emissions credits proposed in the Implementation Plan for functional areas 2) through 4). It does not include efficiency improvements related to boiler tune-ups.

  16. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    PubMed Central

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station includes various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are shown by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific. PMID:28076440
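A forward normal cloud generator, the basic building block behind the cloud descriptors (expectation Ex, entropy En, hyper-entropy He) mentioned above, can be sketched as follows; the descriptor values are arbitrary examples, not those of the assessed stations:

```python
import math
import random

def normal_cloud_drops(Ex, En, He, n=5000, seed=7):
    """Forward normal cloud generator: n drops (x, membership degree)."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En_i = rng.gauss(En, He)      # entropy perturbed by hyper-entropy He
        if En_i == 0:
            En_i = 1e-12              # guard against a degenerate draw
        x = rng.gauss(Ex, abs(En_i))  # one cloud drop
        mu = math.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))
        drops.append((x, mu))
    return drops

# Arbitrary example descriptors for one assessment index
drops = normal_cloud_drops(Ex=3.0, En=0.5, He=0.05)
mean_x = sum(x for x, _ in drops) / len(drops)
```

The hyper-entropy He spreads the drops around the ideal Gaussian membership curve, which is how the method encodes the fuzziness of expert judgments.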

  18. Hazard and consequence analysis for waste emplacement at the Waste Isolation Pilot Plant

    SciTech Connect

    Gerstner, D.M.; Clayton, S.G.; Farrell, R.F.; McCormick, J.A.; Ortiz, C.; Standiford, D.L.

    1996-05-01

    The Carlsbad Area Office established and analyzed the safety bases for the design and operations as documented in the WIPP Safety Analysis Report (SAR). Additional independent efforts are currently underway to assess the hazards associated with the long-term (10,000 year) isolation period as required by 40 CFR 191. The structure of the WIPP SAR is unique due to the hazards involved, and the agreement between the State of New Mexico and the DOE regarding SAR content and format. However, the hazards and accident analysis philosophy as contained in DOE-STD-3009-94 was followed as closely as possible, while adhering to state agreements. Hazards associated with WIPP waste receipt, emplacement, and disposal operations were systematically identified using a modified Hazard and Operability Study (HAZOP) technique. The WIPP HAZOP assessed the potential internal, external, and natural phenomena events that can cause the identified hazards to develop into accidents. The hazard assessment identified deviations from the intended design and operation of the waste handling system, analyzed potential accident consequences to the public and workers, estimated the likelihood of occurrence, and evaluated associated preventative and mitigative features. It was concluded from the assessment that the proposed WIPP waste emplacement operations and design are sufficient to ensure the safety of the public, workers, and environment over the 35-year disposal phase.

  19. An Exploratory Statistical Analysis of a Planet Approach-Phase Guidance Scheme Using Angular Measurements with Significant Error

    NASA Technical Reports Server (NTRS)

    Friedlander, Alan L.; Harry, David P., III

    1960-01-01

    An exploratory analysis of vehicle guidance during the approach to a target planet is presented. The objective of the guidance maneuver is to guide the vehicle to a specific perigee distance with a high degree of accuracy and minimum corrective velocity expenditure. The guidance maneuver is simulated by considering the random sampling of real measurements with significant error and reducing this information to prescribe appropriate corrective action. The instrumentation system assumed includes optical and/or infrared devices to indicate range and a reference angle in the trajectory plane. Statistical results are obtained by Monte-Carlo techniques and are shown as the expectation of guidance accuracy and velocity-increment requirements. Results are nondimensional and applicable to any planet within limits of two-body assumptions. The problem of determining how many corrections to make and when to make them is a consequence of the conflicting requirements of accurate trajectory determination and propulsion. Optimum values were found for a vehicle approaching a planet along a parabolic trajectory with an initial perigee distance of 5 radii and a target perigee of 1.02 radii. In this example measurement errors were less than 1 minute of arc. Results indicate that four corrections applied in the vicinity of 50, 16, 15, and 1.5 radii, respectively, yield minimum velocity-increment requirements. Thrust devices capable of producing a large variation of velocity-increment size are required. For a vehicle approaching the earth, miss distances within 32 miles are obtained with 90-percent probability. Total velocity increments used in guidance are less than 3300 feet per second with 90-percent probability. It is noted that the above representative results are valid only for the particular guidance scheme hypothesized in this analysis. A parametric study is presented which indicates the effects of measurement error size, initial perigee, and initial energy on the guidance

  20. Analysis of Guidance and Counseling Program at Thika High School for the Blind, Kenya

    ERIC Educational Resources Information Center

    Irungu, Mary Wangari; Wamocho, Franciscah Irangi

    2010-01-01

    The purpose of this study was to analyze the guidance and counseling program at Thika High School for the Blind, the only residential secondary school for students with visual impairments in Kenya. The study examined the content of the existing guidance and counseling program and whether the teacher counselors were adequately trained to handle the…

  1. Meta-Analysis of Inquiry-Based Learning: Effects of Guidance

    ERIC Educational Resources Information Center

    Lazonder, Ard W.; Harmsen, Ruth

    2016-01-01

    Research has consistently shown that inquiry-based learning can be more effective than other, more expository instructional approaches as long as students are supported adequately. But what type of guidance is adequate, and for whom? These questions are difficult to answer as most previous research has only focused on one type of guidance and one…

  3. Fire hazard analysis for Project W-320 Tank 241-C-106 waste retrieval

    SciTech Connect

    Conner, J.C.

    1995-09-12

    This Fire Hazards Analysis (FHA) for Project W-320, `Tank 241-C-106 Waste Retrieval`, addresses fire hazards and fire-related concerns, in accordance with DOE 5480.7A (DOE 1998), resulting from or related to the processes and equipment to be installed or modified under Project W-320. Its purpose is to ensure that there are no undue fire hazards to site personnel and the public; that the potential for the occurrence of a fire is minimized; that process control and safety systems are not damaged by fire or related perils; and that property damage from fire and related perils does not exceed an acceptable level.

  4. Defining geologic Hazards for natural resources management using tree-ring analysis

    USGS Publications Warehouse

    DeGraff, J.V.; Agard, S.S.

    1984-01-01

    Landslides, avalanches, floods, and other geologic hazards impair natural resources management by jeopardizing public safety, damaging or restricting resource utilization, and necessitating expenditures for corrective measures. The negative impact of geologic hazard events can be reduced by tailoring resources management to the hazard potential of an area. This requires assessment of where and how frequently the events occur. National forests and other managed wildlands often lack monitoring or historical records to compute frequency of hazard occurrence. Tree-ring analysis, based on internal growth response to external events such as tilting and abrasion, can provide frequency data. Two examples of the use of tree-ring analysis to date landslide activity illustrate advantages and limitations of the technique. An example from the Fishlake National Forest in central Utah illustrates assessment for planning purposes. An example from the Sierra National Forest in east-central California shows assessment applied to project design. Many geologic hazards in addition to landslides are suited to tree-ring analysis to establish frequency of occurrence. Hazard reduction efforts in natural resources management could be enhanced by careful application of tree-ring analysis. © 1984 Springer-Verlag New York Inc.

  5. Preliminary Hazards Analysis of K-Basin Fuel Encapsulation and Storage

    SciTech Connect

    Strickland, G.C.

    1994-08-19

    This Preliminary Hazards Analysis (PHA) systematically examines the K-Basin facilities and their supporting systems for hazards created by abnormal operating conditions and external events (e.g., earthquakes) which have the potential for causing undesirable consequences to the facility worker, the onsite individual, or the public. The operational activities examined are fuel encapsulation, fuel storage and cooling. Encapsulation of sludges in the basins is not examined. A team of individuals from Westinghouse produced a set of Hazards and Operability (HAZOP) tables documenting their examination of abnormal process conditions in the systems and activities examined in K-Basins. The purpose of this report is to reevaluate and update the HAZOP in the original Preliminary Hazard Analysis of K-Basin Fuel Encapsulation and Storage originally developed in 1991.

  6. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have greater impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817] and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., 16%-ile and 84%-ile). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these
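A toy Cornell-style hazard calculation helps make the sensitivity discussion concrete: the annual rate of exceeding a given PGA is the source's activity rate times the probability that a random event exceeds it. The attenuation coefficients and source parameters below are invented for illustration, not those used for Western Liguria:

```python
import math
import random

def annual_exceedance_rate(a_target, nu=0.2, b=1.0, m_min=5.0, m_max=7.5,
                           dist_km=20.0, sigma_ln=0.6, n=50_000, seed=1):
    """Toy Cornell-style PSHA for a single source zone (Monte Carlo).

    nu: annual rate of events with M >= m_min; b: Gutenberg-Richter b-value.
    Hypothetical attenuation: ln PGA[g] = -3.5 + 0.8*M - 1.1*ln(R_km),
    with lognormal scatter sigma_ln.
    """
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    exceed = 0
    for _ in range(n):
        # Truncated-exponential (Gutenberg-Richter) magnitude sample
        u = rng.random()
        m = m_min - math.log(1 - u * (1 - math.exp(-beta * (m_max - m_min)))) / beta
        ln_pga = -3.5 + 0.8 * m - 1.1 * math.log(dist_km) + rng.gauss(0, sigma_ln)
        if math.exp(ln_pga) > a_target:
            exceed += 1
    return nu * exceed / n

lam_01 = annual_exceedance_rate(0.1)   # rate of exceeding 0.1 g
lam_03 = annual_exceedance_rate(0.3)   # rate of exceeding 0.3 g
```

Repeating such a calculation while perturbing one input at a time (b-value, m_max, attenuation model) is exactly the kind of multi-parameter sensitivity screening the paper describes for building the logic tree.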

  7. North Portal Fuel Storage System Fire Hazard Analysis-ESF Surface Design Package ID

    SciTech Connect

    N.M. Ruonavaara

    1995-01-18

    The purpose of the fire hazard analysis is to comprehensively assess the risk from fire within the individual fire areas. This document assesses the fire hazards only within the Exploratory Studies Facility (ESF) Design Package ID, which includes the fuel storage system area of the North Portal facility, and evaluates whether the following objectives are met: (1.1.1) this analysis, performed in accordance with the requirements of this document, satisfies the requirements for a fire hazard analysis in accordance with U.S. Department of Energy (DOE) Order 5480.7A; (1.1.2) property damage from fire and related perils does not exceed an acceptable level; (1.1.3) input is provided to the ESF Basis For Design (BFD) Document; and (1.1.4) input is provided to the facility Safety Analysis Report (SAR) (Paragraph 3.8).

  8. Progress report on the guidance for industry for statistical aspects of the design, analysis, and interpretation of chronic rodent carcinogenicity studies of pharmaceuticals.

    PubMed

    Lin, K K

    2000-11-01

    The U.S. Food and Drug Administration (FDA) is in the process of preparing a draft Guidance for Industry document on the statistical aspects of carcinogenicity studies of pharmaceuticals for public comment. The purpose of the document is to provide statistical guidance for the design of carcinogenicity experiments, methods of statistical analysis of study data, interpretation of study results, presentation of data and results in reports, and submission of electronic study data. This article covers the genesis of the guidance document and some statistical methods in study design, data analysis, and interpretation of results included in the draft FDA guidance document.

  9. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is rising interest in the potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, as precipitation is hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether there has been a change in magnitude or frequency in the recent past and what can be expected for the future. The 71 km² catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  10. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  11. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
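    The notional product form described above can be sketched in a few lines; the asset values, vulnerabilities, and threat probabilities below are illustrative assumptions, not figures from the CAPRA paper:

    ```python
    # First-order all-hazards risk: R = C * V * T, where C is the
    # consequence of a successful event, V the vulnerability (probability
    # the event defeats protections), and T the annual threat/hazard
    # occurrence probability. All numbers are illustrative.

    def asset_risk(consequence, vulnerability, threat):
        """First-order annualized risk for one asset/hazard pair."""
        return consequence * vulnerability * threat

    def portfolio_risk(assets):
        """Sum first-order risks over an asset portfolio."""
        return sum(asset_risk(a["C"], a["V"], a["T"]) for a in assets)

    portfolio = [
        {"C": 5e6, "V": 0.3, "T": 0.02},   # e.g. a substation vs. flood
        {"C": 2e7, "V": 0.1, "T": 0.005},  # e.g. a bridge vs. earthquake
    ]
    print(portfolio_risk(portfolio))  # expected annual loss estimate
    ```

    A detailed systems analysis would replace these point values with distributions, but the aggregation step stays the same.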

  12. Policy & Guidance

    EPA Pesticide Factsheets

    Policy documents represent EPA's official interpretation or view of specific issues. Guidance documents are published to further clarify regulations and to assist in implementation of environmental regulations.

  13. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage loss for a set of intensity measure levels. Seismic hazard at Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. The 500 m by 500 m grid-based building data were selected for the evaluation, as they are capable of providing detailed information about the location, value, and vulnerability classification of the exposed elements. The results of this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard in an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
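    The return periods quoted in such studies follow directly from the exceedance probabilities under a Poisson (memoryless) occurrence assumption; a minimal sketch of the conversion:

    ```python
    import math

    def return_period(p_exceed, horizon_years):
        """Return period implied by an exceedance probability over a
        time horizon, assuming Poisson event occurrence:
        T = -n / ln(1 - p)."""
        return -horizon_years / math.log(1.0 - p_exceed)

    print(round(return_period(0.10, 50)))  # -> 475
    print(round(return_period(0.02, 50)))  # -> 2475
    ```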

  14. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organization for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  15. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  16. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    PubMed

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively little attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights into measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
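    As a toy illustration of the attenuation effect and the regression calibration idea discussed above (a linear-model stand-in, not the paper's additive hazards estimator; all parameters are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    x = rng.normal(0.0, 1.0, n)   # true covariate
    u = rng.normal(0.0, 1.0, n)   # classical measurement error
    w = x + u                     # observed, error-prone covariate

    # The naive slope of the outcome on w is attenuated by the
    # reliability ratio lambda = var(x) / (var(x) + var(u)) = 0.5 here.
    beta = 2.0
    y = beta * x + rng.normal(0.0, 1.0, n)
    naive = np.cov(w, y)[0, 1] / np.var(w)

    # Regression calibration replaces w with E[X | W] = lambda * w,
    # which here amounts to rescaling the naive slope by 1 / lambda.
    lam = np.var(x) / np.var(w)
    corrected = naive / lam
    print(round(naive, 2), round(corrected, 2))
    ```

    The naive estimate lands near beta/2 and the calibrated one near the true beta = 2, mirroring the bias-then-correct pattern the paper studies in the survival setting.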

  17. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    PubMed

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, particulate matter smaller than 2.5 μm was segmented in this paper based on formation mechanisms and on the hazard level to human beings and the environment. Taking into account mass concentration, number concentration, the enrichment factor of Hg, and the Hg content of different coal ashes, a comprehensive model for evaluating the hazard of PM2.5 emitted by coal-fired boilers was established. Finally, using field experimental data from previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.
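    A hypothetical sketch of a composite hazard index and the proposed hazard reduction coefficient; the indicators, weights, and normalization here are invented for illustration and are not the paper's calibrated model:

    ```python
    def hazard_index(mass_conc, number_conc, hg_enrichment, hg_content,
                     weights=(0.4, 0.3, 0.2, 0.1)):
        """Toy composite hazard index for emitted PM2.5: a weighted sum
        of normalized (0-1) indicators. Weights are illustrative only."""
        indicators = (mass_conc, number_conc, hg_enrichment, hg_content)
        return sum(w, j := 0) if False else sum(
            w * x for w, x in zip(weights, indicators))

    def hazard_reduction_coefficient(index_in, index_out):
        """Fractional reduction of the hazard index across a dust remover."""
        return 1.0 - index_out / index_in

    before = hazard_index(0.9, 0.8, 0.7, 0.6)   # upstream of dust remover
    after = hazard_index(0.2, 0.5, 0.6, 0.5)    # downstream
    print(round(hazard_reduction_coefficient(before, after), 2))  # -> 0.5
    ```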

  18. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
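    Monte Carlo sampling of a weighted logic tree and the resulting COV can be sketched as follows; the branch values and weights, and the toy hazard proxy, are illustrative only and not the study's parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical logic-tree branches: (value, weight) pairs for two
    # epistemically uncertain inputs to a toy hazard estimate.
    magnitude_branches = [(7.3, 0.2), (7.7, 0.5), (8.1, 0.3)]
    recurrence_branches = [(250.0, 0.3), (500.0, 0.5), (1000.0, 0.2)]

    def sample(branches, n):
        """Draw n branch values with their logic-tree weights."""
        vals, wts = zip(*branches)
        return rng.choice(vals, size=n, p=wts)

    n = 100_000
    m = sample(magnitude_branches, n)
    t = sample(recurrence_branches, n)
    # Toy hazard proxy: annual rate scaled by a magnitude term.
    hazard = 10.0 ** (m - 7.7) / t

    # Knowledge-based uncertainty summarized as a COV, as in the paper.
    cov = hazard.std() / hazard.mean()
    print(round(cov, 2))
    ```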

  19. CHARACTERIZATION OF DEFENSE NUCLEAR WASTE USING HAZARDOUS WASTE GUIDANCE. APPLICATIONS TO HANFORD SITE ACCELERATED HIGH-LEVEL WASTE TREATMENT AND DISPOSAL MISSION

    SciTech Connect

    Hamel, William; Huffman, Lori; Lerchen, Megan; Wiemers, Karyn

    2003-02-27

    Federal hazardous waste regulations were developed for management of industrial waste. These same regulations are also applicable for much of the nation's defense nuclear wastes. At the U.S. Department of Energy's (DOE) Hanford Site in southeast Washington State, one of the nation's largest inventories of nuclear waste remains in storage in large underground tanks. The waste's regulatory designation and its composition and form constrain acceptable treatment and disposal options. Obtaining detailed knowledge of the tank waste composition presents a significant portion of the many challenges in meeting the regulatory-driven treatment and disposal requirements for this waste. Key in applying the hazardous waste regulations to defense nuclear wastes is defining the appropriate and achievable quality for waste feed characterization data and the supporting evidence demonstrating that applicable requirements have been met at the time of disposal. Application of a performance-based approach to demonstrating achievable quality standards will be discussed in the context of the accelerated high-level waste treatment and disposal mission at the Hanford Site.

  20. An analysis of the Hubble Space Telescope fine guidance sensor fine lock mode

    NASA Technical Reports Server (NTRS)

    Taff, L. G.

    1991-01-01

    There are two guiding modes of the Hubble Space Telescope (HST) used for the acquisition of astronomical data by one of its six scientific instruments; the more precise one is called Fine Lock. Command and control problems in the onboard electronics have limited Fine Lock to brighter stars, V less than 13.0 mag, instead of fulfilling its goal of V = 14.5 mag. Consequently, the less precise guiding mode of Coarse Track (approximately 40 milli-arc seconds) has to be used fairly frequently. Indeed, almost half of the scientific observations to be made with the HST would be compromised. The only realistic and extensive simulations of the Fine Lock guidance mode are reported here. The theoretical analysis underlying the Monte Carlo experiments and the numerical computations clearly shows both that the control electronics are severely under-engineered and how to adjust the various control parameters to successfully extend Fine Lock guiding performance back to V = 14.0 mag and sometimes beyond.

  1. Guidance and control analysis of the entry of a lifting body personnel launch vehicle

    NASA Astrophysics Data System (ADS)

    Powell, Richard W.; Cruz, Christopher I.

    1991-01-01

    NASA is currently involved in definition studies of a Personnel Launch System (PLS) that could be used to transport people to and from low-earth orbit. This vehicle would serve both to complement the Space Shuttle and to provide alternative access to space in the event the Space Shuttle fleet were unavailable for a prolonged period. The PLS would consist of a manned spacecraft launched by an expendable vehicle, e.g., Titan 4. One promising candidate for the manned component of the PLS is the NASA Langley Research Center HL-20 lifting body. Many studies are currently underway to assess this vehicle, and one of the main areas of study is the development of the capability to successfully enter, glide to the landing site, and land. To provide this capability, guidance and control algorithms have been developed, incorporated into a six-degree-of-freedom simulation, and evaluated in the presence of off-nominal atmospheric conditions, consisting of both density variations and steady-state winds. In addition, the impact of atmospheric turbulence was examined for the portion of flight from Mach 3.5 to touchdown. This analysis showed that the vehicle remained controllable and could successfully land even in the presence of off-nominal atmospheric conditions.

  2. [Analysis of 43 cases of cesarean scar pregnancy treated with suction curettage under ultrasound guidance].

    PubMed

    Zheng, Jinghong; He, Yuan; Liu, Suping; Xu, Congjian

    2015-08-01

    To evaluate the feasibility and safety of transabdominal ultrasound-guided suction curettage as a treatment for cesarean scar pregnancy (CSP) and to investigate factors affecting the prognosis of CSP. This was a retrospective study of 43 cases diagnosed as CSP and treated by suction curettage under transabdominal ultrasound guidance as an initial treatment in the Obstetrics and Gynecology Hospital of Fudan University over the past 7 years (from 2007 to 2013); factors affecting the prognosis of CSP were investigated. Thirty-nine of the 43 cases (91%) were successfully treated. There were no statistically significant differences between the success group and the failure group in maternal age, gravidity, abortion frequency, the time interval between the current CSP and the last cesarean delivery, or the myometrium thickness between the gestational sac and the bladder wall (all P > 0.05). A statistically significant difference was found in crown-rump length (CRL) between the two groups (medians 18.5 and 2.0 mm) by rank sum test (P = 0.047). Univariate logistic regression analysis demonstrated that CRL was strongly associated with prognosis; the OR for no complications was 18.50, comparing CRL ≤ 6 mm versus CRL > 6 mm (P = 0.020). Transabdominal ultrasound-guided suction curettage is effective and safe in the treatment of CSP with CRL ≤ 6 mm.
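    The odds ratio reported in such univariate analyses is the standard 2×2-table quantity; a minimal sketch with hypothetical counts (not the study's data):

    ```python
    def odds_ratio(a, b, c, d):
        """Odds ratio from a 2x2 table:
                    outcome+  outcome-
        exposed+       a         b
        exposed-       c         d
        """
        return (a * d) / (b * c)

    # Hypothetical counts: no-complication outcomes among CRL <= 6 mm
    # (exposed+) versus CRL > 6 mm (exposed-) cases.
    print(odds_ratio(20, 2, 10, 11))  # -> 11.0
    ```

    A logistic regression on a single binary covariate reproduces exactly this table-based OR, which is why the two are used interchangeably in abstracts like the one above.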

  3. Orion Crew Exploration Vehicle Launch Abort System Guidance and Control Analysis Overview

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Kim, Sungwan; Raney, David L.; Aubuchon, Vanessa V.; Sparks, Dean W.; Busan, Ronald C.; Proud, Ryan W.; Merritt, Deborah S.

    2008-01-01

    Aborts during the critical ascent flight phase require the design and operation of Orion Crew Exploration Vehicle (CEV) systems to escape from the Crew Launch Vehicle (CLV) and return the crew safely to the Earth. To accomplish this requirement of continuous abort coverage, CEV ascent abort modes are being designed and analyzed to accommodate the velocity, altitude, atmospheric, and vehicle configuration changes that occur during ascent. Aborts from the launch pad to early in the flight of the CLV second stage are performed using the Launch Abort System (LAS). During this type of abort, the LAS Abort Motor is used to pull the Crew Module (CM) safely away from the CLV and Service Module (SM). LAS abort guidance and control studies and design trades are being conducted so that more informed decisions can be made regarding the vehicle abort requirements, design, and operation. This paper presents an overview of the Orion CEV, an overview of the LAS ascent abort mode, and a summary of key LAS abort analysis methods and results.

  4. Hazard analysis of Arid and semi-Arid (ASAL) regions of Kenya.

    PubMed

    Tabu, J S; Otwelo, J A; Koskei, P; Makokha, P

    2013-06-01

    water scarcity, poverty and low educational levels. The region suffers from a variety of hazards, in particular drought and famine, epidemics including HIV/AIDS, and war and conflict. Environmental degradation, though given a low score, may be more of a perception. There is a need to undertake a comprehensive hazard and vulnerability analysis at the regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership levels, and efforts to empower them should be stepped up.

  5. Landslide Hazard Analysis with Multidisciplinary Approach: İstanbul example

    NASA Astrophysics Data System (ADS)

    Kılıç, Osman; Baş, Mahmut; Yahya Menteşe, Emin; Tarih, Ahmet; Duran, Kemal; Gümüş, Salim; Rıza Yapar, Evrens; Emin Karasu, Muhammed; Acar Kara, Sema; Karaman, Abdullah; Özalaybey, Serdar; Zor, Ekrem; Ediger, Vedat; Arpat, Esen; Özgül, Necdet; Polat, Feyzi; Doǧan, Uǧur; Çakır, Ziyadin

    2017-04-01

    There are several methods that can be utilized for describing landslide mechanisms. While some of them are commonly used, there are relatively new methods that have been proven to be useful. Obviously, each method has its own limitations, and thus the integrated use of these methods contributes to obtaining a realistic landslide model. The slopes of the Küçükçekmece and Büyükçekmece Lagoons, located at the Marmara Sea coast of İstanbul, Turkey, are among the most distinctive examples of complex-type landslides. The landslides in the area started developing at low sea level and appear to have ceased, or at least slowed to a minimum, after the sea-level rise, as opposed to the still-active landslides that continue to cause damage, especially on the valley slopes above the recent sea level between the two lagoons. To clarify the characteristics of these slope movements and classify them in the most accurate way, the Directorate of Earthquake and Ground Research of Istanbul Metropolitan Municipality launched a project in cooperation with the Marmara Research Center of The Scientific and Technological Research Council of Turkey (TÜBİTAK). The project draws on techniques from different disciplines such as geology, geophysics, geomorphology, hydrogeology, geotechnics, geodesy, remote sensing and meteorology. The observations include detailed mapping of topography by airborne LIDAR, deformation monitoring with more than 80 GPS stations, Ground Based Synthetic Aperture Radar measurements in 8 critical zones, 81 geological drills and more than 20 km of geophysical measurements. After three years of monitoring, the acquired data and results, such as the landslide hazard map, were integrated into a GIS database to ease the tasks of urban planners and decision makers.

  6. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, in a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rainstorms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition the available gauge data around Can Tho
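    The bivariate Q-V exceedance idea can be sketched empirically on synthetic annual maxima; the distributions, correlation, and thresholds below are illustrative assumptions, not fitted Mekong statistics:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic correlated annual maxima: discharge Q (m3/s) and flood
    # volume V (km3). Scales and the 0.7 correlation are illustrative.
    n = 50_000
    z = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
    q = 40_000 + 8_000 * z[:, 0]
    v = 300 + 60 * z[:, 1]

    def joint_exceedance(q_thr, v_thr):
        """Empirical P(Q > q_thr and V > v_thr): the AND scenario used
        to select Q-V pairs for synthetic hydrographs."""
        return np.mean((q > q_thr) & (v > v_thr))

    p = joint_exceedance(48_000, 360)
    print(round(p, 3))
    ```

    A full analysis would fit marginal distributions and a dependence model (e.g. a copula) rather than rely on empirical counts, but the exceedance definition is the same.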

  7. GUIDANCE FOR EVALUATING LANDFILL GAS EMISSIONS ...

    EPA Pesticide Factsheets

    This document provides guidance to Superfund remedial project managers, on-scene coordinators, facility owners, and potentially responsible parties for conducting an air pathway analysis for landfill gas (LFG) emissions under the Comprehensive Environmental Response, Compensation and Liability Act, the Superfund Amendments and Reauthorization Act, and the Resource Conservation and Recovery Act. The document provides procedures and a set of tools for evaluating LFG emissions to ambient air, subsurface vapor migration due to landfill gas pressure gradients, and subsurface vapor intrusion into buildings. The air pathway analysis is used to evaluate the inhalation risks to offsite receptors as well as the hazards of both onsite and offsite methane explosions and landfill fires.

  8. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of proactive, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA.

  9. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in risk analysis and assessment. Risk analysis has to consider both the hazard occurrence and the consequences. The consequences can be both direct and indirect: the former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example, national) scale, do not take into account local conditions along each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for assessing the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two
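    Expressing all hazards with a common term, their probability of occurrence, allows them to be combined per road section; assuming the hazards are independent, a minimal sketch:

    ```python
    def combined_annual_probability(p_list):
        """Annual probability that at least one of several independent
        hazards (rockfall, debris flow, wall failure, ...) occurs:
        P = 1 - prod(1 - p_i)."""
        q = 1.0
        for p in p_list:
            q *= (1.0 - p)
        return 1.0 - q

    # Illustrative per-hazard annual probabilities for one road section.
    print(round(combined_annual_probability([0.02, 0.01, 0.005]), 4))  # -> 0.0347
    ```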

  10. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed absolute power (AP), relative power (RP), and mean frequency (MF) for the beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at the frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.
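    Absolute and relative band power of the kind analyzed above can be computed from a periodogram; a minimal sketch on a synthetic trace (the sampling rate, band edges, and amplitudes are illustrative, not the study's recording parameters):

    ```python
    import numpy as np

    fs = 256                       # sampling rate (Hz), illustrative
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(7)
    # Synthetic EEG-like trace: a theta (6 Hz) and a beta (20 Hz)
    # component plus white noise.
    sig = 2.0 * np.sin(2 * np.pi * 6 * t) + 1.0 * np.sin(2 * np.pi * 20 * t)
    sig += 0.5 * rng.standard_normal(t.size)

    freqs = np.fft.rfftfreq(sig.size, 1 / fs)
    psd = np.abs(np.fft.rfft(sig)) ** 2   # raw periodogram

    def band_power(lo, hi):
        """Absolute power in [lo, hi) Hz (periodogram sum)."""
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].sum()

    theta = band_power(4, 8)
    beta = band_power(13, 30)
    total = band_power(0.5, 45)
    rel_beta = beta / total       # relative power, as in QEEG reports
    print(theta > beta, 0 < rel_beta < 1)
    ```

    Production QEEG pipelines typically use Welch averaging over artifact-free epochs rather than a single raw periodogram, but the band definitions and the absolute/relative split are the same.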

  11. FIRE HAZARDS ANALYSIS FOR THE FUEL SUPPLY SYSTEM - ESF PACKAGE 1E

    SciTech Connect

    N.M. Ruonavaara

    1995-04-12

    The purpose of the fire hazards analysis is to comprehensively assess the risk from fire within individual fire areas in accordance with U.S. Department of Energy (DOE) Order 5480.7A (Reference 4.4.7.4). This document assesses the fire hazard risk within the Exploratory Studies Facility (ESF) fuel supply system, Package 1E, and evaluates whether the following objectives are met: (1) ensure that property damage from fire and related perils does not exceed an acceptable level; and (2) provide input to the facility Safety Analysis Report (SAR).

  12. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  13. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  14. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  15. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  17. Vertical guidance performance analysis of the L1-L5 dual-frequency GPS/WAAS user avionics sensor.

    PubMed

    Jan, Shau-Shiun

    2010-01-01

    This paper investigates the potential vertical guidance performance of a global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. It first investigates the airborne receiver code noise and multipath (CNMP) confidence (σair), which is the dominant factor in the availability analysis of an L1-L5 dual-frequency GPS/WAAS user avionics sensor. The paper uses the MATLAB Algorithm Availability Simulation Tool (MAAST) to determine the required values of σair for an L1-L5 dual-frequency GPS/WAAS user avionics sensor to meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over the conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under which user configurations WAAS could serve as an APV II or CAT I system over CONUS. Furthermore, the paper examines combinations of possible improvements in signal models and the addition of Galileo to determine whether a GPS/WAAS user avionics sensor could achieve a 10 m Vertical Alert Limit (VAL) within the service volume. Finally, it presents the future vertical guidance performance of GPS user avionics sensors for the United States' WAAS, the Japanese MTSAT-based satellite augmentation system (MSAS), and the European geostationary navigation overlay service (EGNOS).
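    A much-simplified sketch of the availability test behind such an analysis: a vertical protection level (VPL) is formed from error-source sigmas and compared with the VAL. Real WAAS protection levels project per-satellite variances through the position-solution geometry; the flat root-sum-square and the sigma values here are illustrative assumptions (K_V = 5.33 is the standard WAAS vertical multiplier for precision approach):

    ```python
    import math

    K_V = 5.33  # WAAS vertical protection-level multiplier

    def vertical_protection_level(sigmas):
        """Simplified VPL = K_V * sqrt(sum of variance contributions).
        A stand-in for the geometry-weighted MOPS equation."""
        return K_V * math.sqrt(sum(s * s for s in sigmas))

    # Illustrative per-source sigmas (m): clock/ephemeris, ionosphere,
    # airborne CNMP, troposphere. Dual-frequency use shrinks the iono term.
    vpl = vertical_protection_level([0.6, 0.8, 0.7, 0.2])
    print(vpl <= 10.0)  # available against a 10 m VAL?
    ```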

  18. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... Critical Control Point Principles (HACCP); Approval of Information Collection Request AGENCY: Food and... rule entitled School Food Safety Program Based on Hazard Analysis and Critical Control Point...

  19. Fire hazards analysis for the uranium oxide (UO₃) facility

    SciTech Connect

    Wyatt, D.M.

    1994-12-06

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO₃ complex fire hazards, fire protection and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection, and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4, are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in decommission and demolition facilities.

  20. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    PubMed

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes; the identification of all hazards that are likely to occur in the production establishment; the identification of critical points in the process at which these hazards may be introduced into product and therefore should be controlled; the establishment of critical limits for control at those points; the verification of these prescribed steps; and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  1. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges and the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.
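The hazard maps described here report ground-motion levels at specified probabilities of exceedance. A minimal sketch of the underlying PSHA computation, with a single hypothetical fault source, a toy attenuation relation, and invented coefficients (none taken from the LLNL study), might look like:

```python
import math

def p_exceed_given_m(a_g, m, r_km, sigma_ln=0.6):
    # Toy attenuation (invented coefficients): lognormal PGA about a median
    # that grows with magnitude and decays with distance.
    ln_med = -4.0 + 1.0 * m - 1.3 * math.log(r_km + 10.0)
    z = (math.log(a_g) - ln_med) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # P(PGA > a_g | m, r)

def annual_exceedance_rate(a_g, sources, dm=0.1):
    """lambda(PGA > a_g): sum over sources and magnitude bins with a
    truncated Gutenberg-Richter recurrence."""
    rate = 0.0
    for nu0, b, m_min, m_max, r_km in sources:  # nu0 = annual rate of M >= m_min
        beta = b * math.log(10.0)
        den = 1.0 - math.exp(-beta * (m_max - m_min))
        m = m_min
        while m < m_max - 1e-9:
            # Probability mass of magnitudes in [m, m + dm)
            w = (math.exp(-beta * (m - m_min))
                 - math.exp(-beta * (m + dm - m_min))) / den
            rate += nu0 * w * p_exceed_given_m(a_g, m + dm / 2.0, r_km)
            m += dm
    return rate

# One hypothetical fault 20 km away, M 5-7, 0.05 events/yr at M >= 5:
lam = annual_exceedance_rate(0.2, [(0.05, 1.0, 5.0, 7.0, 20.0)])
p50 = 1.0 - math.exp(-lam * 50.0)  # Poisson probability of exceedance in 50 yr
```

A hazard curve is obtained by repeating the rate calculation over a range of ground-motion levels; mapping that curve at many sites gives the hazard maps the report describes.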

  2. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    NASA Astrophysics Data System (ADS)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    The Gulf of Jakarta is an area of active sedimentation, with a wide sediment deposition area on the north coast of Jakarta. These sediments are generally unconsolidated, so ground conditions are an important factor in determining liquefaction potential in the area. Liquefaction may occur during an earthquake that causes loss of strength and stiffness in soils. The analysis of liquefaction potential is based on SPT data taken in the Gulf of Jakarta and covers the susceptibility rating and the triggering factors. Liquefaction analysis methods are compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction analysis at the surface uses a susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition and groundwater; each factor has parameters that determine the value of the SRF. From the analysis, the Gulf of Jakarta has SRF values of 12-35, showing that the area is dominated by susceptibility ratings from medium to high. High susceptibility to liquefaction is concentrated in the coastal area.
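The abstract does not give the factor-of-safety formulas it compares; purely as a generic illustration, the widely used simplified procedure (Seed-Idriss cyclic stress ratio with the NCEER clean-sand resistance curve) can be sketched as follows, with all input values hypothetical:

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    # Simplified procedure (Seed-Idriss): CSR = 0.65 (a_max/g)(sigma_v/sigma_v') rd
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def cyclic_resistance_ratio(n1_60):
    # NCEER clean-sand CRR curve for M7.5 earthquakes (valid for (N1)60 < 30).
    return (1.0 / (34.0 - n1_60) + n1_60 / 135.0
            + 50.0 / (10.0 * n1_60 + 45.0) ** 2 - 1.0 / 200.0)

# Hypothetical layer: PGA 0.3 g, total/effective stress 100/60 kPa, 5 m depth,
# corrected SPT blow count (N1)60 = 15.
fs = cyclic_resistance_ratio(15) / cyclic_stress_ratio(0.3, 100.0, 60.0, 5.0)
# FS < 1 indicates liquefaction is likely for this layer.
```

The paper's SRF ratings layer qualitative factors (history, geology, composition, groundwater) on top of quantitative checks of this kind.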

  3. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics: "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefit of separating the two facets of uncertainty can be seen from a risk-management perspective: aleatory uncertainty, being a property of the system under study, cannot be reduced, although practical actions can be taken to circumvent the potentially dangerous effects of such variability; epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating calculation procedures with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework for representing both facets of uncertainty. Yet in the domain of geo-hazard assessments (landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, we investigate the feasibility of a more flexible uncertainty representation tool, namely possibility distributions.
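The truncated abstract stops at possibility distributions; a minimal sketch of the idea, assuming a triangular possibility distribution and interval propagation through a simple safety-factor ratio (not the authors' case study), is:

```python
def alpha_cut(tri, alpha):
    """Interval at possibility level alpha for a triangular distribution
    (a, m, b): support [a, b] at alpha = 0, core {m} at alpha = 1."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical possibilistic (epistemic) resistance vs. a crisp load:
resistance = (80.0, 100.0, 120.0)  # expert-given min / most-plausible / max
load = 70.0
lo, hi = alpha_cut(resistance, 0.5)
sf_interval = (lo / load, hi / load)  # interval-valued safety factor at alpha=0.5
```

Sweeping alpha from 0 to 1 produces nested safety-factor intervals, which is how possibility theory expresses imprecise (epistemic) knowledge without assuming a single probability distribution.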

  4. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical
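The close agreement between Monte Carlo and linear covariance results can be illustrated on a toy scalar linear system (an assumption for brevity; the paper's models are six-degree-of-freedom and nonlinear): the analytically propagated variance should match the Monte Carlo sample variance to within sampling error, while costing a single recursion instead of thousands of trajectory runs.

```python
import math
import random

F = 0.9    # scalar dynamics x_{k+1} = F x_k + w_k (hypothetical values)
Q = 0.04   # process-noise variance
P0 = 1.0   # initial state variance
steps = 50

# Linear covariance analysis: propagate the variance analytically.
P = P0
for _ in range(steps):
    P = F * P * F + Q

# Monte Carlo: propagate random samples and measure their dispersion.
random.seed(1)
n = 20000
xs = [random.gauss(0.0, math.sqrt(P0)) for _ in range(n)]
for _ in range(steps):
    xs = [F * x + random.gauss(0.0, math.sqrt(Q)) for x in xs]
mc_var = sum(x * x for x in xs) / n

# For a truly linear system the two dispersion estimates agree to within
# Monte Carlo sampling error (a few percent here).
```

For the nonlinear ascent problem, the paper linearizes the truth models and GN&C algorithms about the nominal reference trajectory first, which is why its linear covariance results stay within roughly 10% of Monte Carlo.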

  5. Guidance on risk analysis and safety implications of a large liquefied natural gas (LNG) spill over water.

    SciTech Connect

    Wellman, Gerald William; Melof, Brian Matthew; Luketa-Hanlin, Anay Josephine; Hightower, Marion Michael; Covan, John Morgan; Gritzo, Louis Alan; Irwin, Michael James; Kaneshige, Michael Jiro; Morrow, Charles W.

    2004-12-01

    While recognized standards exist for the systematic safety analysis of potential spills or releases from LNG (Liquefied Natural Gas) storage terminals and facilities on land, no equivalent set of standards or guidance exists for the evaluation of the safety or consequences from LNG spills over water. Heightened security awareness and energy surety issues have increased industry's and the public's attention to these activities. The report reviews several existing studies of LNG spills with respect to their assumptions, inputs, models, and experimental data. Based on this review and further analysis, the report provides guidance on the appropriateness of models, assumptions, and risk management to address public safety and property relative to a potential LNG spill over water.

  6. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. The analysis has yielded country-level scores and maps displaying the ranking of exposure scores (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  7. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
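The unified damage-function form itself is not reproduced in this abstract; purely as an illustration of the general shape such functions take (no damage below a hazard threshold, then monotone growth saturating at total loss), one might write:

```python
import math

def damage_fraction(intensity, threshold, scale):
    """Generic monotone damage function (illustrative form, not the authors'):
    zero below the hazard threshold, saturating toward total damage."""
    if intensity <= threshold:
        return 0.0
    return 1.0 - math.exp(-((intensity - threshold) / scale) ** 2)

# Example with hypothetical parameters: coastal flood depth of 1.5 m against
# a 0.5 m protection threshold and a 1.0 m damage scale.
d = damage_fraction(1.5, 0.5, 1.0)
```

Casting flood and storm damage in one parametric family like this is what lets the paper transfer methodology between hazards and treat the uncertainty in threshold, scale and hazard magnitude consistently.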

  8. Responses to gestational weight management guidance: a thematic analysis of comments made by women in online parenting forums

    PubMed Central

    2014-01-01

    Background The National Institute for Health and Clinical Excellence (NICE) published guidance on weight management in pregnancy in July 2010 (NICE public health guidance 27: 2010), and this received considerable press coverage across a range of media. This offered an opportunity to examine how gestational weight management guidance was received by UK women. Methods A thematic analysis was conducted of 400 posts made in UK-based parenting internet forums in the week following the publication of the NICE guidance. This allowed us to examine the naturally occurring comments from 202 women who posted about the guidance on public forums. Results Three main themes were identified and explored: i) Perceived control/responsibility ii) Risk perception iii) Confused messages. Conclusions Women differed in their perceptions of the level of control that they had over being overweight with some feeling responsible and motivated to maintain a healthy lifestyle. Others felt there were multiple factors influencing their weight issues beyond their control. There were reports of feeling guilty about the impact of weight on the growing baby and experiencing significant obesity stigma from the public and health professionals. Information about the risks of overweight and obesity in pregnancy were difficult messages for women to hear, and for health professionals to deliver. Women reported being confused by the messages that they received. Health messages need to be delivered sensitively to women, and health professionals need support and training to do this. Risk information should always be accompanied with clear advice and support to help women to manage their weight in pregnancy. PMID:24981024

  9. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  10. Hazards Analysis of Large Caliber Munitions Metal Parts Plants.

    DTIC Science & Technology

    1980-10-01

    Furnace types used: rotary hearth (4 plants); roller hearth (1 plant); induction (3 plants); hydraulic... Rotary hearth furnace with atmosphere, 1,000; roller hearth furnace with atmosphere, 1,700; induction heating, 500; batch furnace with atmosphere, 200; roller-type tempering... (3) the typical working environment in the plants. A brief analysis of each is given in the report, beginning with the rotary hearth furnace.

  11. Quality assurance guidance for field sampling and measurement assessment plans in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C, Quality Assurance (QA), defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well-established and management-supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  12. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a hazards assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility; Hanford has both types. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility hazards assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document," WHC-SD-WM-ISB-001, as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  13. Example process hazard analysis of a Department of Energy water chlorination process

    SciTech Connect

    Not Available

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Title 29 Code of Federal Regulations (CFR) Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  14. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  15. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  16. Using bioluminescent biosensors for hazard analysis and critical control point (HACCP) in wastewater control.

    PubMed

    Valat, C; Champiat, D; Degorce-Dumas, J R; Thomas, O

    2004-01-01

    Starting from a new approach to water pollution control and wastewater treatment plant management, the hazard analysis and critical control point (HACCP) quality concept, the interest in developing new rapid and sensitive methods, such as bioluminescence-based methods, is evident. After an introduction to the HACCP procedure, a bibliographic study of the potential of bioluminescence is presented and discussed.

  17. Guidance on health effects of toxic chemicals. Safety Analysis Report Update Program

    SciTech Connect

    Foust, C.B.; Griffin, G.D.; Munro, N.B.; Socolof, M.L.

    1994-02-01

    Martin Marietta Energy Systems, Inc. (MMES), and Martin Marietta Utility Services, Inc. (MMUS), are engaged in phased programs to update the safety documentation for the existing US Department of Energy (DOE)-owned facilities. The safety analysis of potential toxic hazards requires a methodology for evaluating human health effects of predicted toxic exposures. This report provides a consistent set of health effects and documents toxicity estimates corresponding to these health effects for some of the more important chemicals found within MMES and MMUS. The estimates are based on published toxicity information and apply to acute exposures for an "average" individual. The health effects (toxicological endpoints) used in this report are (1) the detection threshold; (2) the no-observed adverse effect level; (3) the onset of irritation/reversible effects; (4) the onset of irreversible effects; and (5) a lethal exposure, defined to be the 50% lethal level. An irreversible effect is defined as a significant effect on a person's quality of life, e.g., serious injury. Predicted consequences are evaluated on the basis of concentration and exposure time.

  18. Image Guidance

    EPA Pesticide Factsheets

    Guidance that explains the process for getting images approved in One EPA Web microsites and resource directories. includes an appendix that shows examples of what makes some images better than others, how some images convey meaning more than others

  19. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale and declustered to remove dependent events as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each grid cell by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all the grid points using a generic rock site with Vs = 760 m/s. Obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for return periods of 475 and 2475 years, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at a rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the third-generation building code of Ethiopia.
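The 475- and 2475-year return periods quoted above follow directly from the Poisson model linking probability of exceedance over an exposure time to an annual rate:

```python
import math

def return_period(p_exceed, years=50.0):
    # Poisson model: P = 1 - exp(-years / T)  =>  T = -years / ln(1 - P)
    return -years / math.log(1.0 - p_exceed)

t10 = return_period(0.10)  # ~475 years (10% in 50 years)
t2 = return_period(0.02)   # ~2475 years (2% in 50 years)
```

This is why hazard maps quoted at "10% in 50 years" and "475-year return period" describe the same ground-motion level.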

  20. ALARA Analysis of Radiological Control Criteria Associated with Alternatives for Disposal of Hazardous Wastes

    SciTech Connect

    Aaberg, Rosanne L.; Bilyard, Gordon R.; Branch, Kristi M.; Lavender, Jay C.; Miller, Peter L.

    2002-05-15

    This ALARA analysis of Radiological Control Criteria (RCC) considers alternatives to continued storage of certain DOE mixed wastes. It also considers the option of treating hazardous wastes generated by DOE facilities that contain very low concentrations of radionuclide contaminants as purely hazardous waste. The alternative allowable contaminant levels examined correspond to doses to an individual ranging from 0.01 mrem/yr to 10-20 mrem/yr. Generic waste inventory data and radionuclide source terms are used in the assessment. Economic issues, potential health and safety issues, and qualitative factors relating to the use of RCCs are considered.

  1. RCRA Programmatic Information Policy and Guidance

    EPA Pesticide Factsheets

    This asset includes program policy and guidance documents that are used by the EPA regions, states, tribes and private parties to implement the hazardous waste management program under the Resource Conservation and Recovery Act (RCRA). This asset does not contain facility-specific information (see RCRA Facility Information data asset). In addition to policy and guidance, included here are training materials, economic analysis, public information, and program performance and other management level data that feeds into the Performance Assessment Tool (see separate data asset). Much of the information in this asset is available through RCRA Online. The RCRA Online database is designed to enable users to locate documents, including publications and other outreach materials, that cover a wide range of RCRA issues and topics.

  2. The Variable Experiences of Becoming Retired and Seeking Retirement Guidance: A Qualitative Thematic Analysis

    ERIC Educational Resources Information Center

    Robinson, Oliver C.; Demetre, James D.; Corney, Roslyn H.

    2011-01-01

    Thirty interviews were conducted with individuals across the UK who had retired between two and ten years ago, with the aim of exploring the variability of retirement experiences in terms of (a) retirement antecedents/reasons for retirement, (b) change in wellbeing and satisfaction over time, (c) personal relationships and (d) retirement guidance.…

  4. Hazardous Waste Minimization through Life Cycle Cost Analysis at Federal Facilities.

    PubMed

    Ray, Chittaranjan; Jain, Ravi K; Donahue, Bernard A; Smith, E Dean

    1999-01-01

    In response to the Hazardous and Solid Waste Amendments, many federal facilities are carefully examining hazardous waste minimization issues. A hazardous waste minimization assessment was conducted recently at five military installations. Sources and types of waste somewhat varied among the installations. Major waste sources included motor pools and vehicle maintenance facilities; industrial maintenance and small arms shops; aviation maintenance facilities; paint shops; photography, arts and crafts shops; and hospitals, clinics, and laboratories. Typical waste streams included used engine oil, cleaning and degreasing solvents, paint thinners, antifreeze and coolants, batteries, inks, and pathological wastes. Source reduction, recycling, and treatment were considered as the three major modes of waste minimization. Through life-cycle cost analysis, feasible modes of waste minimization are presented.

  5. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    NASA Astrophysics Data System (ADS)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA provides stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, and evaluation and selection. The presentation will explain the origin, goal, steps and practical tools available at each step of CRIDA. Two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al. and "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.) explain the application of CRIDA to case studies.

  6. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    PubMed

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis; that all job holders should participate in analysing their own workplace; and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  7. Spatial temporal analysis of urban heat hazard in Tangerang City

    NASA Astrophysics Data System (ADS)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a natural phenomenon which might be caused by human activities, here represented by various types of land use across urban and non-urban areas. The aim of this study is to identify urban heat behavior in Tangerang City, as it may threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+ and Landsat OLI-TIRS, to capture urban heat behavior and to analyse the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015 and 2016. The results showed that the urban heat signature changes dynamically each month with solar radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the urban heat signature, the threshold for a threatening condition is a land surface temperature (LST) of 30 °C. The effective temperature (ET) index classifies that condition as warm and uncomfortable, increasing stress due to sweating and blood flow, and possibly causing cardiovascular disorders.

  8. CyberShake: Broadband Physics-Based Probabilistic Seismic Hazard Analysis in Southern California

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Milner, K.; Graves, R. W.; Donovan, J.; Wang, F.; Jordan, T. H.

    2012-12-01

    Researchers at the Southern California Earthquake Center (SCEC) have developed and used the CyberShake computational platform to perform probabilistic seismic hazard analysis (PSHA) in the Los Angeles region (Graves et al., 2010) using deterministic wave propagation simulations at frequencies up to 0.5 Hz, combined with stochastic methods, to produce broadband seismograms up to 10 Hz. CyberShake uses seismic reciprocity to calculate synthetic seismograms for a suite of more than 600,000 rupture realizations. From this set of seismograms we compute intensity measures, which are then combined into a PSHA hazard curve for the site of interest at various periods. With the CyberShake computational platform, we have computed broadband hazard curves for locations around Southern California, including precariously balanced rock sites and locations of Southern California Seismic Network stations. Additionally, for each location we calculated hazard curves with two different community velocity models, Community Velocity Model - Harvard (CVM-H) v11.2 and Community Velocity Model - SCEC (CVM-S) v11.2. At lower frequencies, hazard levels computed with CVM-H for sites within the deep LA basin are lower than those computed with CVM-S. On the other hand, sites within the Ventura basin show the opposite trend. We interpret these results to be related to the underlying nature of the velocity models, which we are continuing to investigate. At higher frequencies, the CyberShake results tend to be lower than hazard levels computed with traditional ground motion prediction equations (GMPEs), possibly due to the long tails of the GMPE distributions. We will report on these results. Additionally, we will describe ways these results are being used by the SCEC community, such as in earthquake early warning, precarious rock analysis, and directivity-basin coupling.
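The hazard-curve step described above, in which per-rupture intensity measures are combined into annual exceedance rates, follows the standard PSHA total-rate formulation: λ(IM > x) = Σ_k rate_k · P(IM > x | rupture k), with the conditional probability estimated empirically from the simulated seismograms. A minimal sketch; the ruptures, rates and intensity values are illustrative, not CyberShake data:

```python
def hazard_curve(im_samples, rates, x_grid):
    """Annual exceedance rate lambda(IM > x) at each level in x_grid.

    im_samples[k] -- intensity measures from the realizations of rupture k
    rates[k]      -- annual occurrence rate of rupture k
    P(IM > x | rupture k) is estimated as the fraction of realizations
    whose intensity measure exceeds x.
    """
    lam = []
    for x in x_grid:
        total = 0.0
        for ims, rate in zip(im_samples, rates):
            exceed_frac = sum(1 for im in ims if im > x) / len(ims)
            total += rate * exceed_frac
        lam.append(total)
    return lam

# Toy example: one rupture with two seismogram realizations (IM = 0.5, 0.7 g)
# and an annual rate of 0.01, evaluated at three ground-motion levels.
lam = hazard_curve([[0.5, 0.7]], [0.01], [0.1, 0.6, 1.0])
```

Real implementations loop over hundreds of thousands of ruptures and interpolate the curve at target periods; the combination rule is the same.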

  9. In silico analysis of nanomaterials hazard and risk.

    PubMed

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  10. Hazard analysis and critical control point systems applied to public health risks: the example of seafood.

    PubMed

    Williams, R A; Zorn, D J

    1997-08-01

    The authors describe the way in which the two components of risk analysis--risk assessment and risk management--can be used in conjunction with the hazard analysis and critical control points concept to determine the allocation of resources at potential critical control points. This approach is examined in the context of risks to human health associated with seafood, and in particular with regard to ciguatera poisoning.

  11. Hazardous Materials Management System Validation: A Comparison of the Hazard Analysis and Resource Inventories Conducted by Multnomah County and the City of Portland.

    DTIC Science & Technology

    1982-03-01

    Analysis and Resource Inventories Conducted by Multnomah County and the City of Portland Final Report for the Federal Emergency Management Agency Washington...analysis and resource inventory methodologies developed by the Multnomah County Office of Emergency Manage- ment. The hazard analysis and resource... emergency planning and response jurisdiction of the City of Portland. The Portland Office of Emergency Management is, therefore, in a unique posi

  12. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Food for Animals; Public... proposed rule to establish requirements for current good manufacturing practice and hazard analysis and... that will establish the foundation of, and central framework for, the modern food safety system...

  13. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-08-18

    Inadequate infrastructure networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Thus, determining areas susceptible to natural hazards and incorporating them in the initial planning process may reduce infrastructure damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in aggregate. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its sensitivity to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas of higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
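The analytic hierarchy process and weighted linear combination mentioned above can be sketched briefly: expert pairwise comparisons yield criterion weights, which are then applied to normalized factor scores per map cell. This sketch uses the row geometric mean method, a standard approximation to Saaty's principal eigenvector; the comparison matrix and factor scores are hypothetical, not the study's data:

```python
from math import prod

def ahp_weights(pairwise):
    """Criterion weights from a pairwise comparison matrix via the
    row geometric mean method (an approximation to Saaty's
    principal-eigenvector weights)."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def weighted_linear_combination(factor_scores, weights):
    """Combine normalized factor scores for one map cell into a single
    susceptibility score in [0, 1]."""
    return sum(w * x for w, x in zip(weights, factor_scores))

# Hypothetical 3-criteria comparison: hydrology vs. geology vs. land use.
# A[i][j] > 1 means criterion i is judged more important than j.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)

# One map cell with normalized scores for the three factors.
susceptibility = weighted_linear_combination([0.9, 0.4, 0.2], w)
```

Equal weighting, used as the comparison method in the study, corresponds to replacing `w` with `[1/3, 1/3, 1/3]`.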

  14. What's the evidence that NICE guidance has been implemented? Results from a national evaluation using time series analysis, audit of patients' notes, and interviews

    PubMed Central

    Sheldon, Trevor A; Cullum, Nicky; Dawson, Diane; Lankshear, Annette; Lowson, Karin; Watt, Ian; West, Peter; Wright, Dianne; Wright, John

    2004-01-01

    Objectives To assess the extent and pattern of implementation of guidance issued by the National Institute for Clinical Excellence (NICE). Design Interrupted time series analysis, review of case notes, survey, and interviews. Setting Acute and primary care trusts in England and Wales. Participants All primary care prescribing, hospital pharmacies; a random sample of 20 acute trusts, 17 mental health trusts, and 21 primary care trusts; and senior clinicians and managers from five acute trusts. Main outcome measures Rates of prescribing and use of procedures and medical devices relative to evidence based guidance. Results 6308 usable patient audit forms were returned. Implementation of NICE guidance varied by trust and by topic. Prescribing of some taxanes for cancer (P < 0.002) and orlistat for obesity (P < 0.001) significantly increased in line with guidance. Prescribing of drugs for Alzheimer's disease and prophylactic extraction of wisdom teeth showed trends consistent with, but not obviously a consequence of, the guidance. Prescribing practice often did not accord with the details of the guidance. No change was apparent in the use of hearing aids, hip prostheses, implantable cardioverter defibrillators, laparoscopic hernia repair, and laparoscopic colorectal cancer surgery after NICE guidance had been issued. Conclusions Implementation of NICE guidance has been variable. Guidance seems more likely to be adopted when there is strong professional support, a stable and convincing evidence base, and no increased or unfunded costs, in organisations that have established good systems for tracking guidance implementation and where the professionals involved are not isolated. Guidance needs to be clear and reflect the clinical context. PMID:15514342

  15. Flight Guidance System Requirements Specification

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Tribble, Alan C.; Carlson, Timothy M.; Danielson, Eric J.

    2003-01-01

    This report describes a requirements specification written in the RSML-e language for the mode logic of a Flight Guidance System of a typical regional jet aircraft. This model was created as one of the first steps in a five-year project sponsored by the NASA Langley Research Center, Rockwell Collins Inc., and the Critical Systems Research Group of the University of Minnesota to develop new methods and tools to improve the safety of avionics designs. This model will be used to demonstrate the application of a variety of methods and techniques, including safety analysis of system and subsystem requirements, verification of key properties using theorem provers and model checkers, identification of potential sources of mode confusion in system designs, partitioning of applications based on the criticality of system hazards, and autogeneration of avionics quality code. While this model is representative of the mode logic of a typical regional jet aircraft, it does not describe an actual or planned product. Several aspects of a full Flight Guidance System, such as recovery from failed sensors, have been omitted, and no claims are made regarding the accuracy or completeness of this specification.

  16. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  17. Additional guidance for including nuclear safety equivalency in the Canister Storage Building and Cold Vacuum Drying Facility final safety analysis report

    SciTech Connect

    Garvin, L.J.

    1997-05-20

    This document provides guidance for the production of safety analysis reports that must meet both DOE Order 5480.23 and STD 3009, and be in compliance with the DOE regulatory policy that imposes certain NRC requirements.

  18. A GIS-based software for lifeline reliability analysis under seismic hazard

    NASA Astrophysics Data System (ADS)

    Sevtap Selcuk-Kestel, A.; Sebnem Duzgun, H.; Oduncuoglu, Lutfi

    2012-05-01

    Lifelines are vital networks, and it is important that these networks remain functional after major natural disasters such as earthquakes. Assessing the reliability of lifelines requires spatial analysis of the lifelines with respect to a given earthquake hazard map. In this paper, a GIS-based software for the spatial assessment of lifeline reliability, developed in the GeoTools environment, is presented. The software imports seismic hazard and lifeline network layers and creates a gridded network structure. It then applies a network reliability algorithm to calculate upper and lower bounds on the system reliability of the lifeline under seismic hazard. The software enables the user to visualize the reliability values in graphical form, as well as a thematic lifeline reliability map whose colors indicate the reliability level of each link and of the overall network. It also provides functions for saving the analysis results in shapefile format. The software is tested and validated on an application taken from the literature, part of the water distribution system of Bursa, Turkey. The developed GIS-based software module, which creates a GIS-based reliability map of lifelines under seismic hazard, is user friendly, modifiable, fast in execution, illustrative, and validated against existing literature studies.
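Upper and lower bounds on network reliability of the kind mentioned above are commonly computed from minimal path sets and minimal cut sets (the Esary-Proschan bounds). The paper does not specify its exact algorithm, so this is a sketch under that assumption, with a hypothetical two-pipe network:

```python
from math import prod

def reliability_bounds(path_sets, cut_sets, p):
    """Esary-Proschan bounds on system reliability for a network with
    independent components.

    path_sets -- minimal path sets (component lists that keep the system up)
    cut_sets  -- minimal cut sets (component lists whose joint failure
                 brings the system down)
    p[i]      -- survival probability of component i under the hazard
    """
    # Lower bound: treat the minimal cut sets as independent parallel groups.
    lower = prod(1 - prod(1 - p[i] for i in cut) for cut in cut_sets)
    # Upper bound: treat the minimal path sets as independent series chains.
    upper = 1 - prod(1 - prod(p[i] for i in path) for path in path_sets)
    return lower, upper

# Hypothetical two parallel pipes between source and demand node:
# path sets {0} and {1}; single cut set {0, 1}.
p = {0: 0.9, 1: 0.8}
lo, hi = reliability_bounds([[0], [1]], [[0, 1]], p)
```

For this simple parallel system the bounds coincide with the exact reliability, 1 - (0.1)(0.2) = 0.98; on meshed lifeline networks they bracket a value that is expensive to compute exactly.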

  19. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  20. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
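Characterizing the magnitude-distance pair that best represents a probabilistic ground motion level, as investigated above, is usually done by hazard deaggregation: splitting the total exceedance rate into contributions from (M, R) bins and reporting the modal or mean bin. A minimal sketch with hypothetical contributions; the report's own procedure may differ:

```python
def deaggregate(contributions):
    """Normalize per-(magnitude, distance) contributions to the exceedance
    rate at a target ground-motion level, and pick the modal bin.

    contributions[(m, r)] -- rate contribution of the (m, r) bin
    Returns (modal_bin, normalized_weights).
    """
    total = sum(contributions.values())
    weights = {bin_: rate / total for bin_, rate in contributions.items()}
    modal = max(weights, key=weights.get)
    return modal, weights

# Hypothetical deaggregation at one site: a nearby moderate event
# contributes more to the hazard than a distant large one.
contributions = {(6.5, 20.0): 0.006,   # M 6.5 at 20 km
                 (7.5, 80.0): 0.004}   # M 7.5 at 80 km
modal, weights = deaggregate(contributions)
```

The modal (M, R) pair then serves as the representative design event that links the probabilistic hazard level back to familiar engineering practice.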

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitative characterisations of their vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented to illustrate the platform's potential.

  2. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    NASA Astrophysics Data System (ADS)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean region and the overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  3. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design; macroscopically, it is requisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
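A fuzzy comprehensive evaluation of the kind described combines index weights with a membership matrix over hazard grades and reads the result off the aggregated membership vector. A minimal sketch; the two indices, three grades, weights and memberships are illustrative stand-ins, not SAMSHI's 11 indices:

```python
def fuzzy_comprehensive_evaluation(weights, membership):
    """Weighted-average fuzzy comprehensive evaluation, B = w . R.

    weights          -- one weight per evaluation index
    membership[i][g] -- degree to which index i belongs to hazard grade g
    Returns the normalized grade membership vector B; the grade with the
    largest entry is taken as the overall evaluation result.
    """
    n_grades = len(membership[0])
    b = [sum(w * row[g] for w, row in zip(weights, membership))
         for g in range(n_grades)]
    total = sum(b)
    return [x / total for x in b]

# Two illustrative indices (e.g. fault density, historical seismicity)
# evaluated against three hazard grades: low / medium / high.
w = [0.6, 0.4]
R = [[0.1, 0.3, 0.6],
     [0.5, 0.4, 0.1]]
B = fuzzy_comprehensive_evaluation(w, R)
grade = B.index(max(B))  # 0 = low, 1 = medium, 2 = high
```

In a GIS workflow, this evaluation runs once per grid cell, with the membership matrix derived from the spatial layers.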

  4. Proposed powered explicit guidance thrust integrals derivation/implementation. Mission planning, mission analysis and software formulation

    NASA Technical Reports Server (NTRS)

    Jaggers, R. F.

    1976-01-01

    A new exoatmospheric, powered explicit guidance (PEG) thrust integral formulation and a simple method of implementation are presented. The new thrust integral formulation is significantly simpler than that currently used in PEG. Preliminary estimates indicate a computer storage savings of 220 words, which is approximately 10 percent of the current PEG ascent program. Alternate methods of implementation that could produce even more savings are noted.

  5. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both changes in seismic hazard and engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria, employed at specific analysis, decision, and comparison points, which are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do and do not lead to the need for an update, example implementations of the SHPRM are performed for nuclear facilities with target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5 and 1x10^-5.

  6. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    NASA Astrophysics Data System (ADS)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  7. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.

  8. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides and avalanches. With the rising incidence of these natural hazards, protection measures have become an important political issue; however, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. This confronts public-sector decision makers with important decision problems and demands a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can be identified by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructure. With a Geographical Information System coupled to a tool developed for risk analysis, it is possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipal level. The scheme is limited to the direct consequences of natural hazards.
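The core risk-analysis bookkeeping the framework describes, combining hazard probability, vulnerability, and exposed value per network segment, can be sketched roughly as below; the segment names, probabilities, and values are invented for illustration and are not from the Swiss BUWAL scheme:

```python
def segment_risk(annual_probability, vulnerability, exposed_value):
    # expected annual damage = P(hazard) x degree of loss x value at risk
    return annual_probability * vulnerability * exposed_value

# hypothetical network segments with invented probabilities and values (CHF)
segments = [
    {"name": "avalanche gallery", "p": 0.01,  "v": 0.5, "value": 2_000_000},
    {"name": "rockfall bridge",   "p": 0.002, "v": 0.9, "value": 10_000_000},
]
total_expected_damage = sum(segment_risk(s["p"], s["v"], s["value"])
                            for s in segments)
```

Summing expected annual damage over all exposed segments is what lets a GIS-based tool rank mitigation alternatives by risk reduction per unit cost.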

  9. [Preparation of flavonoid reference standards from Scutellariae Radix under the guidance of high performance liquid chromatography-mass spectrometry analysis].

    PubMed

    Guo, Henan; Yang, Xuedong; Liu, Jun; Zheng, Wenfeng

    2012-07-01

Flavonoid reference standards were targeted-prepared from Scutellariae Radix under the guidance of high performance liquid chromatography-mass spectrometry (HPLC-MS) analysis. With HPLC-MS analysis of Scutellariae Radix, 19 flavonoid components were identified by analyzing and comparing their retention times, ultraviolet spectra, and mass spectrometry data with literature. The separation and purification protocols of all targeted flavonoid reference standards were optimally designed according to the results of HPLC-MS analysis and related literature. The ethanol extract of Scutellariae Radix was suspended in water and extracted with petroleum ether, ethyl acetate, and n-butanol successively. The ethyl acetate extract and n-butanol extract were separately subjected to primary separation by low pressure reverse phase preparative chromatography. Then the fractions containing targeted compounds were further purified by low pressure reversed- and normal-phase preparative chromatography. Finally, baicalin and wogonoside reference standards were obtained from the n-butanol extract; baicalein, wogonin, and oroxylin A reference standards were obtained from the ethyl acetate extract. The structures of the 5 reference standards were identified by mass spectrometry (MS) and 1H nuclear magnetic resonance (1H NMR) spectroscopy. The HPLC analytical results showed that the purities of the 5 reference standards were all above 98%. It is demonstrated that the rapid targeted-preparation method under the guidance of HPLC-MS analysis is applicable for the isolation and preparation of chemical components in traditional Chinese medicines.

  10. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression.

    PubMed

    Crager, Michael R; Tang, Gong

    We propose a method for assessing an individual patient's risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data.
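The precision-weighted combination of per-study estimates that the method describes can be illustrated with a minimal fixed-effect (inverse-variance) calculation; this is a generic meta-analysis sketch with invented numbers, not the authors' implementation:

```python
def fixed_effect_combine(estimates, std_errors):
    # Inverse-variance (fixed-effect) pooling: each study's estimate of the
    # patient-specific log cumulative hazard is weighted by its precision.
    weights = [1.0 / se ** 2 for se in std_errors]
    wsum = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / wsum
    pooled_se = (1.0 / wsum) ** 0.5
    return pooled, pooled_se

# hypothetical per-study estimates for one patient (equal precision here)
pooled, pooled_se = fixed_effect_combine([0.0, 2.0], [1.0, 1.0])
```

A random-effects version would inflate each study's variance by a between-study component before weighting, but the weighted-average structure is the same.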

  11. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    SciTech Connect

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser the operating parameters of the laser had changed requiring a hazard analysis based on the new operating conditions. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  12. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation

    PubMed Central

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-01-01

The paper analyses selected optical parameters of protective filters used to protect the eyes against hazardous radiation within the visible (VIS) and near-infrared (NIR) spectral range. The indexes characterizing transmission and reflection of optical radiation incident on the filter are compared. The completed analysis shows that the newly developed interference filters block infrared radiation more effectively than the currently used protective filters. PMID:26327153


  13. Endangerment assessment guidance

    SciTech Connect

    Not Available

    1985-11-22

    The directive clarifies the requirement that an endangerment assessment be developed to support all administrative and judicial enforcement actions under Section 106 of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Section 7003 of the Resource Conservation and Recovery Act (RCRA). Before taking enforcement action under these provisions to abate the hazards or potential hazards at a site, the Environmental Protection Agency (EPA) must be able to properly document and justify its assertion that an imminent and substantial endangerment to public health or welfare or the environment may exist. The endangerment assessment provides this documentation and justification. The endangerment assessment is not necessary to support Section 104 actions. It also provides guidance on the content, timing, level of detail, format, and resources required for the preparation of endangerment assessments.

  14. Mission analysis and guidance, navigation, and control design for rendezvous and docking phase of advanced reentry vehicle mission

    NASA Astrophysics Data System (ADS)

    Strippoli, L.; Colmenarejo, P.; Strauch, H.

    2013-12-01

The Advanced Reentry Vehicle (ARV) belongs to the family of vehicles designed to perform rendezvous and docking (RvD) with the International Space Station (ISS) [1]. Unlike its predecessor, the ATV (Automated Transfer Vehicle), the ARV will transport a reentry capsule equipped with a heatshield and able to bring back cargo, experiments or, as a possible future development, even crew; the latter scenario is particularly attractive in view of the Space Shuttle retirement. GMV, as subcontractor of EADS-Astrium Germany, is in charge of the RvD and departure mission analysis and GNC (Guidance, Navigation, and Control) design for the ARV mission. This paper presents the main outcomes of the study.

  15. Analysis of Precipitation (Rain and Snow) Levels and Straight-line Wind Speeds in Support of the 10-year Natural Phenomena Hazards Review for Los Alamos National Laboratory

    SciTech Connect

    Kelly, Elizabeth J.; Dewart, Jean Marie; Deola, Regina

    2015-12-10

    This report provides site-specific return level analyses for rain, snow, and straight-line wind extreme events. These analyses are in support of the 10-year review plan for the assessment of meteorological natural phenomena hazards at Los Alamos National Laboratory (LANL). These analyses follow guidance from Department of Energy, DOE Standard, Natural Phenomena Hazards Analysis and Design Criteria for DOE Facilities (DOE-STD-1020-2012), Nuclear Regulatory Commission Standard Review Plan (NUREG-0800, 2007) and ANSI/ ANS-2.3-2011, Estimating Tornado, Hurricane, and Extreme Straight-Line Wind Characteristics at Nuclear Facility Sites. LANL precipitation and snow level data have been collected since 1910, although not all years are complete. In this report the results from the more recent data (1990–2014) are compared to those of past analyses and a 2004 National Oceanographic and Atmospheric Administration report. Given the many differences in the data sets used in these different analyses, the lack of statistically significant differences in return level estimates increases confidence in the data and in the modeling and analysis approach.
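Return-level analyses of the kind described are typically built from an extreme-value fit to annual maxima. A simplified illustration follows, using a Gumbel (EV-I) distribution fitted by the method of moments; the data series is invented, and this is a sketch of the general technique rather than the DOE-STD-1020 procedure itself:

```python
import math

def gumbel_return_level(annual_maxima, return_period_yr):
    # Gumbel (EV-I) return level, parameters fitted by the method of moments
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((v - mean) ** 2 for v in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi        # scale parameter
    mu = mean - 0.5772156649 * beta              # location (Euler-Mascheroni)
    p = 1.0 - 1.0 / return_period_yr             # annual non-exceedance prob.
    return mu - beta * math.log(-math.log(p))

# invented "annual maximum daily precipitation" series (mm)
series = [float(v) for v in range(10, 30)]
rl_100yr = gumbel_return_level(series, 100.0)
```

Comparing such return-level estimates across data periods, as the report does for 1990-2014 versus earlier analyses, checks whether the differences are statistically meaningful.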

  16. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  17. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events), and failures of engineered controls (hardware, software or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful because it can display the minimum number of events needed to cause an accident. Event trees can also address statistical dependency of events, such as a sequence of human error events committed by the same operator; in this case, dependent probabilities are used. Probabilities or frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches; an exception is when the first event represents a condition, in which case a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario is obtained by multiplying the branch probabilities together.
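The final step described, multiplying the initiating-event frequency by the conditional branch probabilities along a path, reduces to a one-liner; the frequencies and failure probabilities below are hypothetical, chosen only to show the arithmetic:

```python
def path_frequency(initiator_per_year, branch_probs):
    # end-state frequency = initiating-event frequency x product of the
    # conditional probabilities along the bold downward branches
    freq = initiator_per_year
    for p in branch_probs:
        freq *= p
    return freq

# hypothetical: operation performed 50 times/yr; procedural control fails with
# probability 1e-2; engineered control fails with probability 1e-3
f = path_frequency(50.0, [1e-2, 1e-3])
```

With dependent events, the later branch probabilities would be conditional values rather than independent ones, but the multiplication along the path is unchanged.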

  18. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.

    2012-12-01

Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers has been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake Rupture Forecast (UCERF).
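The ERF-plus-GMPM combination described above can be sketched as a total-probability calculation under a Poisson occurrence assumption; this is a schematic illustration of the standard PSHA recipe with invented rates, not the CyberShake code:

```python
import math

def exceedance_probability(rupture_rates, conditional_exceed_probs, t_years=50.0):
    # Total annual rate of exceeding a shaking level: sum over ruptures of
    # (annual rupture rate from the ERF) x P(exceedance | rupture, from the GMPM)
    rate = sum(nu * p for nu, p in zip(rupture_rates, conditional_exceed_probs))
    # Poisson occurrence converts the rate to a probability over t years
    return 1.0 - math.exp(-rate * t_years)

# hypothetical ERF rates (per year) and GMPM exceedance probabilities at one site
p50 = exceedance_probability([0.01, 0.002], [0.3, 0.8], t_years=50.0)
```

Repeating this for a grid of shaking levels yields the site's hazard curve; repeating it over a grid of sites yields the hazard map. CyberShake's contribution is replacing the empirical conditional probabilities with values computed from deterministic wave-propagation simulations.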

  19. Post-Flight Analysis of the Guidance, Navigation, and Control Performance During Orion Exploration Flight Test 1

    NASA Technical Reports Server (NTRS)

    Barth, Andrew; Mamich, Harvey; Hoelscher, Brian

    2015-01-01

The first test flight of the Orion Multi-Purpose Crew Vehicle presented additional challenges for guidance, navigation and control as compared to a typical re-entry from the International Space Station or other Low Earth Orbit. An elevated re-entry velocity and steeper flight path angle were chosen to achieve aero-thermal flight test objectives. New IMUs, a GPS receiver, and baro altimeters were flight qualified to provide the redundant navigation needed for human space flight. The guidance and control systems must manage the vehicle lift vector in order to deliver the vehicle to a precision, coastal, water landing, while operating within aerodynamic load, reaction control system, and propellant constraints. Extensive pre-flight six degree-of-freedom analysis was performed that showed mission success for the nominal mission as well as in the presence of sensor and effector failures. Post-flight reconstruction analysis of the test flight is presented in this paper to show that all performance metrics were met and to establish how well the pre-flight analysis predicted the in-flight performance.

  20. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system
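The overlap comparison between the two methods' risk lists can be quantified with simple set arithmetic; a minimal sketch, with hypothetical risk labels standing in for the workshop outputs:

```python
def overlap_stats(risks_a, risks_b):
    # Compare two hazard lists: counts unique to each method, shared hazards,
    # and the Jaccard index of the two sets.
    a, b = set(risks_a), set(risks_b)
    union = a | b
    return {
        "only_a": len(a - b),
        "only_b": len(b - a),
        "shared": len(a & b),
        "jaccard": len(a & b) / len(union) if union else 0.0,
    }

# hypothetical risk labels from the two workshops
swift = {"wrong dose", "missed INR test", "software crash"}
hfmea = {"wrong dose", "late letter", "software crash", "staff absence"}
stats = overlap_stats(swift, hfmea)
```

A low Jaccard index, as the study found between SWIFT and HFMEA outputs, is exactly the signal that no single method gives a comprehensive hazard inventory.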

  1. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

The Fukushima event of 2011 demonstrated the need to study tsunami hazard assessment for Nuclear Power Plant (NPP) sites; this need is especially pressing because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of given wave heights, and the methodology for tsunami hazard analysis is based on that of seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods; recently, the probabilistic method has received more attention than the deterministic one because the uncertainties of the hazard analysis can be treated through a logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulations were performed and the wave parameters were estimated from them. To reduce the sensitivity introduced by the location of the sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from it. The tsunami hazards for the sampling groups were calculated, and fractile curves showing the uncertainties of the input parameters were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should also be considered.
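The conversion from simulated wave heights and source recurrence intervals to an annual exceedance rate can be sketched as below; this is a simplified illustration of the general probabilistic tsunami hazard bookkeeping with invented numbers, not the study's actual computation:

```python
def tsunami_exceedance_rate(wave_heights_m, recurrence_intervals_yr, level_m):
    # Each fault source contributes its occurrence rate (1 / recurrence interval)
    # whenever its simulated wave height at the site exceeds the level of interest.
    return sum(1.0 / t for h, t in zip(wave_heights_m, recurrence_intervals_yr)
               if h > level_m)

# hypothetical simulated heights (m) and source recurrence intervals (yr)
rate = tsunami_exceedance_rate([1.2, 3.4, 5.1], [100.0, 500.0, 1000.0], 2.0)
```

Evaluating this rate over a range of wave-height levels traces out the tsunami hazard curve; repeating the exercise across logic-tree branches yields the fractile curves the abstract mentions.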

  2. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (eg. Emmer and Vilimek 2014; Wang, et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. 
These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics.

  3. Guidance on Nanomaterial Hazards and Risks

    DTIC Science & Technology

    2015-05-21

Nanomaterials covered include carbon nanotubes (CNT), fullerene, graphene, graphite, multi-walled carbon nanotubes (MWCNT), nylon, polymer, silica-coated nanotubes (SiCNT), silver nanoparticles (Ag), and PdSe, with fibrous (high aspect ratio) carbon-based materials such as CNT and MWCNT identified as a distinct class.

  4. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

    PubMed

    Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

    2007-01-01

Robotic assistance can help clinicians to improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described in the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed following the principles of HISIC, such as hazard identification, safety insurance control, safety critical limits, and monitoring and control. FMEA combined with UML can also be implemented to ensure the reliability of human operation. On the basis of a safety control index and fuzzy mathematics, a safety effectiveness value is defined to assess the validity of safety insurance control for the robotic system. The above principles and methods proved feasible and effective for hazard analysis during the development of the robotic system.
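FMEA exercises like the one described conventionally rank failure modes by a risk priority number (RPN), the product of severity, occurrence, and detection scores. A generic sketch follows; the failure modes and scores are hypothetical, not taken from the EUCLIDIAN analysis:

```python
def risk_priority_number(severity, occurrence, detection):
    # Classic FMEA ranking: each factor scored 1-10; higher RPN = higher priority
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally 1-10")
    return severity * occurrence * detection

# hypothetical failure modes with invented (severity, occurrence, detection) scores
failure_modes = [
    ("needle deviation", 8, 3, 4),
    ("seed misplacement", 9, 2, 6),
]
ranked = sorted(failure_modes,
                key=lambda fm: risk_priority_number(*fm[1:]), reverse=True)
```

Sorting by RPN tells the team which failure modes most urgently need additional safety controls or monitoring.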

  5. Ergonomics hazards analysis of linemen's power line fixing work in China.

    PubMed

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

This study used qualitative and quantitative methods, such as OWAS (Ovako Working Posture Analysis System) and behavior observation, to analyze musculoskeletal disorder (MSD) risk factors of power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities, and the key subtasks revealed the ergonomic characteristics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time), followed by bar-installing (26% of total working time). Bar-installing and insulator-fixing clearly carried the greatest risk: their OWAS action categories were higher than those of the other subtasks, and both were time-consuming, difficult, and liable to induce MSDs. Assistant linemen faced more hazardous factors than chief linemen.

  6. Probabilistic hazard analysis of Citlaltépetl (Pico de Orizaba) Volcano, eastern Mexican Volcanic Belt

    NASA Astrophysics Data System (ADS)

    De la Cruz-Reyna, Servando; Carrasco-Núñez, Gerardo

    2002-03-01

    Citlaltépetl or Pico de Orizaba is the highest active volcano in the North American continent. Although Citlaltépetl is at present in repose, its eruptive history reveals repetitive explosive eruptions in the past. Its relatively low eruption rate has favored significant population growth in areas that may be affected by a potential eruptive activity. The need of some criteria for hazards assessment and land-use planning has motivated the use of statistical methods to estimate the time and space distribution of volcanic hazards around this volcano. The analysis of past activity, from late Pleistocene to historic times, and the extent of some well-identified deposits are used to calculate the recurrence probabilities of eruptions of various size during time periods useful for land-use planning.
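Recurrence probabilities over land-use planning windows, of the kind the statistical analysis produces, can be illustrated with a simple Poisson model; the recurrence rate below is hypothetical, not the paper's estimate for Citlaltépetl:

```python
import math

def eruption_probability(rate_per_year, window_years):
    # Poisson estimate of at least one eruption within a planning window
    return 1.0 - math.exp(-rate_per_year * window_years)

# hypothetical recurrence: one large explosive eruption per ~2000 years on average
p_50yr = eruption_probability(1.0 / 2000.0, 50.0)
```

Evaluating this for several eruption size classes, each with its own recurrence rate estimated from the dated deposits, gives the time-and-size probability table used for hazard zoning.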

  7. Hazard analysis of Escherichia coli O157:H7 contamination during beef slaughtering in Calvados, France.

    PubMed

    Guyon, R; Dorey, F; Malas, J P; Leclercq, A

    2001-09-01

To identify hazard points and critical points during beef slaughtering, which is a necessary first step toward developing a hazard analysis and critical control point system to control meat contamination by Escherichia coli O157:H7, samples (n = 192) from surfaces, work tops, workers' hands, and beef carcasses were collected from a slaughterhouse in Calvados, France. Five strains of E. coli O157:H7 were isolated from a footbridge and a worker's apron at the preevisceration post and from a worker's hand at the defatting post. Three isolates carried stx2c, eae, and EHEC-hlyA genes and showed similar molecular types by random amplified polymorphic DNA, polymerase chain reaction IS3, and XbaI pulsed-field gel electrophoresis. Thus, this study has shown that the preevisceration and defatting posts and the associated workers' materials are critical points for carcass contamination by E. coli O157:H7 during beef slaughtering.

  8. Comparing proportional hazards and accelerated failure time models for survival analysis.

    PubMed

    Orbe, Jesus; Ferreira, Eva; Núñez-Antón, Vicente

    2002-11-30

    This paper describes a method proposed for a censored linear regression model that can be used in the context of survival analysis. The method has the important characteristic of allowing estimation and inference without knowing the distribution of the duration variable. Moreover, it does not need the assumption of proportional hazards. Therefore, it can be an interesting alternative to the Cox proportional hazards models when this assumption does not hold. In addition, implementation and interpretation of the results is simple. In order to analyse the performance of this methodology, we apply it to two real examples and we carry out a simulation study. We present its results together with those obtained with the traditional Cox model and AFT parametric models. The new proposal seems to lead to more precise results. Copyright 2002 John Wiley & Sons, Ltd.
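For contrast with the censored-regression alternative, the Cox model it is compared against can be sketched by writing out the partial likelihood and maximizing it by brute-force grid search on toy data; this is a minimal illustration of the standard estimator, not the paper's method:

```python
import math

def neg_log_partial_likelihood(beta, times, events, x):
    # Cox PH negative log partial likelihood, one covariate, no tied event times
    nll = 0.0
    for i, (t_i, d_i) in enumerate(zip(times, events)):
        if not d_i:
            continue  # censored subjects appear only in risk sets
        risk = sum(math.exp(beta * x[j]) for j in range(len(times))
                   if times[j] >= t_i)
        nll -= beta * x[i] - math.log(risk)
    return nll

# toy data: subjects with x = 1 tend to fail earlier, so beta_hat > 0
times  = [1.0, 4.0, 5.0, 7.0, 9.0, 12.0]
events = [1, 1, 1, 1, 1, 0]
x      = [1, 0, 1, 1, 0, 0]
beta_hat = min((b / 100 for b in range(-300, 301)),
               key=lambda b: neg_log_partial_likelihood(b, times, events, x))
```

The partial likelihood never specifies the baseline hazard, which is exactly the proportional-hazards assumption the paper's censored linear regression avoids.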

  9. 49 CFR 105.20 - Guidance and interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MATERIALS PROGRAM DEFINITIONS AND GENERAL PROCEDURES General Procedures Obtaining Guidance and Public Information § 105.20 Guidance and interpretations. (a) Hazardous materials regulations. You can obtain... infocntr@dot.gov. (3) Obtain hazardous materials safety information via the Internet at http://www.phmsa...

  10. 49 CFR 105.20 - Guidance and interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MATERIALS PROGRAM DEFINITIONS AND GENERAL PROCEDURES General Procedures Obtaining Guidance and Public Information § 105.20 Guidance and interpretations. (a) Hazardous materials regulations. You can obtain... infocntr@dot.gov. (3) Obtain hazardous materials safety information via the Internet at http://www.phmsa...

  11. Guidance for establishment and implementation of field sample management programs in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-02-18

The role of the National Sample Management Program (NSMP) proposed by the Department of Energy's Office of Environmental Management (EM) is to be a resource for EM programs and for local Field Sample Management Programs (FSMPs). It will be a source of information on sample analysis and data collection within the DOE complex. The purpose of this document is to establish the suggested scope of the FSMP activities to be performed under each Operations Office, list the drivers under which the program will operate, define terms and list references. This guidance will apply only to EM sampling and analysis activities associated with project planning, contracting, laboratory selection, sample collection, sample transportation, laboratory analysis and data management.

  12. Intraoperative Ultrasound Guidance Is Associated with Clear Lumpectomy Margins for Breast Cancer: A Systematic Review and Meta-Analysis

    PubMed Central

    Ding, Qiang; Dai, Juncheng; Ling, Lijun; Chen, Lin; Zha, Xiaoming; Liu, Xiaoan; Zhou, Wenbin; Wang, Shui

    2013-01-01

    Purpose Margin status is one of the most important predictors of local recurrence after breast conserving surgery (BCS). Intraoperative ultrasound guidance (IOUS) has the potential to improve surgical accuracy for breast cancer. The purpose of the present meta-analysis was to determine the efficacy of IOUS in breast cancer surgery and to compare the margin status to that of the more traditional Guide wire localization (GWL) or palpation-guidance. Methods We searched the database of PubMed for prospective and retrospective studies about the impact of IOUS on margin status of breast cancer, and a meta-analysis was conducted. Results Of the 13 studies included, 8 were eligible for the impact of IOUS on margin status of non-palpable breast cancers, 4 were eligible for palpable breast cancers, and 1 was for both non-palpable and palpable breast cancers. The rate of negative margins of breast cancers in IOUS group was significantly higher than that in control group without IOUS (risk ratio (RR)  = 1.37, 95% confidence interval (CI)  = 1.18–1.59 from 7 prospective studies, odds ratio (OR)  = 2.75, 95% CI  = 1.66–4.55 from 4 retrospective studies). For non-palpable breast cancers, IOUS-guidance enabled a significantly higher rate of negative margins than that of GWL-guidance (RR  = 1.26, 95% CI  = 1.09–1.46 from 6 prospective studies; OR  = 1.45, 95% CI  = 0.86–2.43 from 2 retrospective studies). For palpable breast cancers, relative to control group without IOUS, the RR for IOUS associated negative margins was 2.36 (95% CI  = 1.26–4.43) from 2 prospective studies, the OR was 2.71 (95% CI  = 1.25–5.87) from 2 retrospective studies. Conclusion This study strongly suggests that IOUS is an accurate method for localization of non-palpable and palpable breast cancers. It is an efficient method of obtaining high proportion of negative margins and optimum resection volumes in patients undergoing BCS. PMID:24073200
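The fixed-effect pooling behind reported summary estimates like these can be illustrated with inverse-variance weighting of study risk ratios on the log scale; the study counts below are invented for the sketch and do not reproduce the meta-analysis:

```python
import math

def pooled_risk_ratio(studies):
    # studies: (events_treated, n_treated, events_control, n_control)
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # large-sample var of log RR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_rr
        den += w
    return math.exp(num / den)

# invented counts: both studies observe roughly double the negative-margin rate
rr = pooled_risk_ratio([(40, 100, 20, 100), (30, 80, 15, 80)])
```

The same machinery, applied to odds ratios, produces the pooled OR figures that meta-analyses report for retrospective studies.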

  13. Analysis of Hazards Associated with a Process Involving Uranium Metal and Uranium Hydride Powders

    SciTech Connect

    Bullock, J.S.

    2000-05-01

    An analysis of the reaction chemistry and operational factors associated with processing uranium and uranium hydride powders is presented, focusing on a specific operation in the Development Division which was subjected to the Job Hazard Analysis (JHA) process. Primary emphasis is on the thermodynamic factors leading to pyrophoricity in common atmospheres. The discussion covers feed powders, cold-pressed and hot-pressed materials, and stray material resulting from the operations. The sensitivity of the various forms of material to pyrophoricity in common atmospheres is discussed. Operational recommendations for performing the work described are given.

  14. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  15. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target.
The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and

  16. Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City

    NASA Astrophysics Data System (ADS)

    Ramanna, C. K.; Dodagoudar, G. R.

    2012-01-01

The conventional method of probabilistic seismic hazard analysis (PSHA) using the Cornell-McGuire approach requires identification of homogeneous source zones as the first step. This requirement raises many issues, and several alternative methods of hazard estimation have therefore emerged in the last few years, such as zoneless (zone-free) methods and modelling of the earth's crust with numerical methods such as finite element analysis. Delineating a homogeneous source zone in regions of distributed and/or diffused seismicity is a difficult task. In this study, the zone-free method using the adaptive kernel technique for hazard estimation is explored for regions having distributed and diffused seismicity. Chennai City lies in such a region of low to moderate seismicity and has therefore been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel is varied spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation cases, it fails to perform in the case of multimodal and long-tail distributions. In such situations, the adaptive kernel technique serves the purpose and is more relevant in earthquake engineering, as the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (i.e., the Cornell-McGuire approach, fixed kernel and adaptive kernel techniques) for 10% probability of exceedance in 50 years is around 0.087 g. The uniform hazard spectra (UHS) are also provided for different structural periods.
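The hazard level quoted above (10% probability of exceedance in 50 years) corresponds to the familiar 475-year return period under the usual Poisson (time-independent) occurrence assumption. A minimal sketch of that conversion, with an illustrative function name:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by a probability of exceedance over a time window,
    assuming Poisson occurrence: p = 1 - exp(-t/T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# return_period(0.10, 50) gives roughly 475 years
```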

  17. Hazard Analysis of Pollution Abatement Techniques. Volume II. Manual of Hazard Evaluation Criteria for Implementing Pollution Abatement Processes at Various Installations.

    DTIC Science & Technology

    1974-06-01

regeneration process. B. Hazard Evaluation of the Design Concept The system analyzed consists of carbon adsorption columns, diatomite filter...as refined by ARMCOM 385-22. 2. Diatomite Filter Analysis The cylindrically-shaped filter has dished ends and is mounted vertically on...reduction of capacity to an unacceptable level. a) Source of Combustible The potential problems associated with the diatomite filter are similar to

  18. GUIDANCE FOR CONDUCTING HEALTH RISK ...

    EPA Pesticide Factsheets

While some potential environmental hazards involve significant exposure to only a single compound, most instances of environmental contamination involve concurrent or sequential exposures to a mixture of compounds that may induce similar or dissimilar effects over exposure periods ranging from short-term to lifetime. Multichemical exposures are ubiquitous, including air and soil pollution from municipal incinerators, leakage from hazardous waste facilities and uncontrolled waste sites, and drinking water containing chemical substances formed during disinfection. To address concerns over health risks from multichemical exposures, EPA issued Guidelines for Health Risk from Exposure to Chemical Mixtures in 1986. Those Guidelines described broad concepts related to mixtures exposure and toxicity and included few specific procedures. Since then, EPA has published additional mixtures guidance documents such as the 1989 guidance for the Superfund program on hazardous waste and the 1990 Technical Support Document on Health Risk Assessment of Chemical Mixtures. Because the science of environmental risk assessment continues to evolve, EPA's Risk Assessment Forum established a Technical Work Panel to ensure that advances in the area of chemical mixtures health risk assessment are reflected in the Agency's guidance materials. This document has been developed by the Technical Work Panel to supplement the earlier guidance documents and is organized according to the type of data avail

  19. RMP Guidance for Chemical Distributors - Appendix D: OSHA Guidance on PSM

    EPA Pesticide Factsheets

Guidance on the Process Safety Management standard says information (including MSDS) about chemicals, including process intermediates, must enable accurate assessment of fire/explosion characteristics, reactivity hazards, and corrosion/erosion effects.

  20. A guidance-motivated sensitivity analysis of an aero-assisted boost vehicle

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Gracey, C.; Armstrong, C. D.

    1986-01-01

A simple model of an aero-assisted booster is used to examine the contributions of propulsion system type, aerodynamic lift and flight trajectory to the efficiency with which payloads can be placed into low earth orbit. The higher propulsive efficiency of ramjet and scramjet propulsion has the potential of increasing the payload mass ratio significantly. The contributions of turbojet propulsion and aerodynamic lift are less significant. The additional complexity involved in using aerodynamic propulsion and lift requires dealing with a more comprehensive set of design variables than for rocket boosters. The approach taken is to derive a set of sensitivity functions which relate booster performance to the design variables. The problems of optimally mixing aerodynamic lift with thrust and of determining the optimal boost trajectory are treated. The potential payload capacity of a horizontal take-off air-breathing boost vehicle is examined. The optimization problem, which considers propulsive efficiency, aerodynamic configuration, and control and guidance issues, is discussed.

  1. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2015-04-01

Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a 'maximum earthquake' predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been run at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow water finite-difference SWAN code (Mader, 2004), for the magnitude range of 6.5 - Mwmax defined for that bin with a Mw increment of 0.1. Results show that not only the earthquakes resembling the
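The per-bin scenario discretization described above (magnitudes from 6.5 up to each bin's Mwmax in 0.1 steps) can be sketched as follows; the function name is illustrative, not from the study:

```python
def magnitude_scenarios(mw_max, mw_min=6.5, step=0.1):
    """Enumerate the magnitude scenarios simulated for one source bin,
    from mw_min up to mw_max in fixed increments."""
    n = int(round((mw_max - mw_min) / step)) + 1
    return [round(mw_min + i * step, 1) for i in range(n)]
```

For a bin with Mwmax = 7.0 this yields six scenarios, 6.5 through 7.0, each of which would be fed to the tsunami propagation model.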

  2. Transportation of Hazardous Evidentiary Material.

    SciTech Connect

    Osborn, Douglas.

    2005-06-01

This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included. Short- and long-term recommendations are also provided. The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazard Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis. Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, and will not react chemically with the substance being

  3. Economic Analysis of the Gypsy Moth Problem in the Northeast: IV. Forest Stand Hazard Ratings for Gypsy Moth

    Treesearch

    David A. Gansner; Owen W. Herrick; William B. White

    1978-01-01

    Provides a practical method for rating the potential hazard of impending gypsy moth attacks to forest stands. Stepwise multiple regression analysis is used to develop equations for estimating tree mortality from easy-to-measure key characteristics of stand condition.

  4. Probabilistic Rockfall Hazard Analysis in the area affected by the Christchurch Earthquakes, New Zealand

    NASA Astrophysics Data System (ADS)

    Frattini, P.; Lari, S.; Agliardi, F.; Crosta, G. B.; Salzmann, H.

    2012-04-01

To limit damage to human lives and property in the event of natural disasters, land planning and zonation, as well as the design of countermeasures, are fundamental tools, requiring however a rigorous quantitative risk analysis. As a consequence of the 3rd September 2010 (Mw 7.1) Darfield Earthquake, and the 22nd February 2011 (Mw 6.2), 16th April 2011 (Mw 5.3) and 13th June 2011 (Mw 6.2) aftershock events, about 6000 rockfalls were triggered in the Port Hills of Christchurch, New Zealand. Five people were killed by falling rocks in the area, and several hundred homes were damaged or evacuated. In this work, we present a probabilistic rockfall hazard analysis for a small area located on the south-eastern slope of Richmond Hill (0.6 km2, Sumner, Christchurch, NZ). For the analysis, we adopted a new methodology (Probabilistic Rockfall Hazard Analysis, PRHA), which allows quantification of the exceedance probability for a given slope location of being affected by a rockfall event with a specific level of kinetic energy, integrating the contribution of different rockfall magnitude (volume) scenarios. The methodology requires the calculation of onset annual frequency, rockfall runout, and spatially-varying kinetic energy. Onset annual frequencies for different magnitude scenarios were derived from a frequency-magnitude relationship adapted from the literature. The probability distribution of kinetic energy for a given slope location and volume scenario was obtained by rockfall runout modeling of non-interacting blocks through the 3D Hy-Stone simulation code. The reference simulation was calibrated by back-analysis of rockfall events that occurred during the earthquake. For each rockfall magnitude scenario, 20 rockfall trajectories were simulated for each source cell using stochastically variable values of restitution parameters. Finally, probabilistic analysis integrating over six rockfall magnitude scenarios (ranging from 0.001 m3 to 1000 m3) was carried out to produce

  5. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  6. Landslide hazard analysis for Hong Kong using landslide inventory and GIS

    NASA Astrophysics Data System (ADS)

    Chau, K. T.; Sze, Y. L.; Fung, M. K.; Wong, W. Y.; Fong, E. L.; Chan, L. C. P.

    2004-05-01

This paper presents a landslide-inventory-based and GIS-based framework for systematic landslide hazard analysis employing historical landslide data in Hong Kong, coupled with geological, geomorphological, population, climatic, and rainfall data. Based on 1448 landslide records from 1984 to 1998, the diurnal and seasonal distributions of landslides are established and compared with the seasonal rainfall variation. The cumulative fatalities and injuries caused by landslides increase with the cumulative rainfall in Hong Kong, indicating a strong correlation between rainfall and landslide consequences. The average annual fatality and injury rates in Hong Kong caused by landslides are 11.35 and 11.63, respectively. In terms of being hit by a landslide, squatter areas and roads on Hong Kong Island are at the highest risk. A frequency-volume relation for Hong Kong Island was established, and, using this relation, it was estimated that the return period of a 26,000 m3 landslide (the size of the 1995 Shum Wan Road Landslide) is about 3.12 years. A hazard zonation map for Hong Kong Island is established using historical data. The potential use of GIS technology to incorporate various layers of information is illustrated using Hong Kong Island as an example. Both landslide hazard and risk maps are proposed using raster calculation.
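Return-period estimates of this kind come from a fitted frequency-volume relation. A generic power-law version is sketched below; the coefficients `a` and `b` are placeholder fit parameters, not the Hong Kong values, which the abstract does not report:

```python
def landslide_return_period(volume_m3, a, b):
    """Return period (years) of landslides with volume >= volume_m3,
    assuming a power-law frequency-volume relation N(V) = a * V**(-b),
    where N(V) is the annual frequency of events at least that large."""
    annual_frequency = a * volume_m3 ** (-b)
    return 1.0 / annual_frequency
```

With a calibrated (a, b) pair, plugging in 26,000 m3 would reproduce an estimate like the 3.12-year figure quoted above.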

  7. Characterization of impulse noise and hazard analysis of impulse noise induced hearing loss using AHAAH modeling

    NASA Astrophysics Data System (ADS)

    Wu, Qing

Millions of people across the world suffer from noise-induced hearing loss (NIHL), especially under working conditions of either continuous Gaussian or non-Gaussian noise that can impair hearing function. Impulse noise is a typical non-Gaussian noise exposure in military and industrial settings, and can cause severe hearing loss. This study focuses on characterization of impulse noise using digital signal analysis methods and on prediction of the auditory hazard of impulse noise induced hearing loss by the Auditory Hazard Assessment Algorithm for Humans (AHAAH) model. A digital noise exposure system has been developed to produce impulse noises with peak sound pressure level (SPL) up to 160 dB. The characterization of impulse noise generated by the system has been investigated and analyzed in both time and frequency domains. Furthermore, the effects of key parameters of impulse noise on the auditory risk unit (ARU) are investigated using both simulated and experimentally measured impulse noise signals in the AHAAH model. The results showed that the ARUs increased monotonically with increasing peak pressure (both P+ and P-). With increasing time duration, the ARUs increased first and then decreased, with the peak of the ARUs appearing at about t = 0.2 ms (for both t+ and t-). In addition, the auditory hazard of the experimentally measured impulse noise signals demonstrated a monotonically increasing relationship between ARUs and system voltages.

  8. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    NASA Astrophysics Data System (ADS)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the recurrence interval could lie anywhere between 70 yrs and 17,700 yrs. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as the size of the area affected by the lava flow, the logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
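Under the Poisson temporal model adopted above, a best-estimate rate of one vent per roughly 3500 years translates directly into a probability of at least one new vent in a given time window. A minimal sketch (function name is illustrative):

```python
import math

def prob_at_least_one(rate_per_year, window_years):
    """Poisson probability of one or more events in the window:
    P(N >= 1) = 1 - exp(-lambda * t)."""
    return 1.0 - math.exp(-rate_per_year * window_years)
```

For example, at one event per 3500 years the chance of at least one new vent in a 100-year window is a little under 3%; the wide recurrence-interval bounds quoted above would shift this estimate by orders of magnitude, which is why the logic tree treatment matters.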

  9. Hazardous waste crime: a contextual analysis of the offense and the offender

    SciTech Connect

    Rebovich, D.J.

    1986-01-01

    The goal of this study is to analyze hazardous waste offense and offender characteristics. Criminal case data were collected from four sample states (Maine, Maryland, New Jersey, and Pennsylvania). Files of disposed criminal cases charged between 1977 and 1984 were content-analyzed, and interviews were conducted with prominent hazardous waste crime enforcement personnel from the sample states. Areas of analysis include methods of crime commission, skills required for crime commission, patterns of criminal network relationships, and degree of syndicate crime influence. While there has been some previous speculation that hazardous waste criminal behavior is directed through centralized racketeering, the present study of known offenders found little evidence of syndicate crime family infiltration. Crimes occurred within small, informal networks of waste generators, waste transporters, the employees of treatment/storage/disposal (TSD) facilities and certain non-industry peripheral actors. The study concludes that, while attempts have been made by syndicate crime operatives to infiltrate, these attempts have failed largely due to features of criminal commission methods and to the inherent fragmentation of hauling and TSD firm interests.

  10. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    SciTech Connect

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standard Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel, who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground base operations of the airborne AURA laser system (system alignment and calibration).
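For a simple circular beam in the far field, the NOHD referred to above follows the standard ANSI Z136.1 relation; the sketch below assumes a CW beam, and the inputs (power, divergence, aperture, MPE) are placeholders that in practice are system- and wavelength-specific:

```python
import math

def nohd_m(power_w, divergence_rad, aperture_diam_m, mpe_w_per_m2):
    """Nominal Ocular Hazard Distance (m): the range at which the beam
    irradiance falls to the MPE, per the far-field relation
    NOHD = (sqrt(4*P / (pi*MPE)) - a) / phi."""
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_per_m2))
            - aperture_diam_m) / divergence_rad
```

An airborne analysis like the one described also has to account for slant range, atmospheric attenuation, and platform motion, which is why the report works through operational altitudes and dwell times rather than a single NOHD figure.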

  11. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

    NASA Astrophysics Data System (ADS)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

    2015-04-01

A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy comprising the Orco and Soana Valleys. The area is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of the relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to reconstruct the details of the events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment, and has therefore been selected for such multi-hazard research, in which several natural processes and their damaging effects over the land have been investigated. At least 250 deaths have been recorded in the area since the 18th century in the course of 36 severe hazardous events.

  12. The hazard analysis and critical control point system in food safety.

    PubMed

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  13. Critical load analysis in hazard assessment of metals using a Unit World Model.

    PubMed

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. 
Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  14. A deterministic analysis of tsunami hazard and risk for the southwest coast of Sri Lanka

    NASA Astrophysics Data System (ADS)

    Wijetunge, J. J.

    2014-05-01

This paper describes a multi-scenario, deterministic analysis carried out as a pilot study to evaluate the tsunami hazard and risk distribution along the southwest coast of Sri Lanka. The hazard and risk assessment procedure adopted was also assessed against available field records of the impact of the Indian Ocean tsunami in 2004. An evaluation of numerically simulated nearshore tsunami amplitudes corresponding to 'maximum-credible' scenarios from different subduction segments in the Indian Ocean surrounding Sri Lanka suggests that a seismic event similar to the one that generated the 2004 tsunami can still be considered the 'worst-case' scenario for the southwest coast. Furthermore, it appears that the formation of edge waves trapped by the primary waves diffracting around the southwest significantly influences the nearshore tsunami wave field and is largely responsible for relatively higher tsunami amplitudes along certain stretches of the coastline under study. The extent of inundation from numerical simulations corresponding to the worst-case scenario shows good overall agreement with the points of maximum penetration of inundation from field measurements in the aftermath of the 2004 tsunami. It can also be seen that the inundation distribution is strongly influenced by onshore topography. The present study indicates that the mean depth of inundation could be utilised as a primary parameter to quantify the spatial distribution of the tsunami hazard. The spatial distribution of the risk of the tsunami hazard to the population and residential buildings, computed by employing the standard risk formula, shows satisfactory correlation with published statistics of the affected population and the damage to residential property during the tsunami in 2004.
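The "standard risk formula" mentioned above is commonly expressed as the product of hazard, exposure, and vulnerability. A minimal sketch, assuming all three terms are normalized to [0, 1] for illustration (the paper's exact operationalization may differ):

```python
def tsunami_risk(hazard, exposure, vulnerability):
    """Risk = Hazard x Exposure x Vulnerability, with all inputs
    normalized to [0, 1] in this illustrative form."""
    return hazard * exposure * vulnerability
```

In a gridded study such as this one, each factor would be a spatial layer (e.g., mean inundation depth for hazard, population density for exposure), and the product is evaluated cell by cell to produce the risk map.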

  15. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (e.g. Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the areas and time windows hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and to the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002 to the present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, the testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to test only with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure to periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
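The core of such an area-based test can be sketched as follows. Under the time-independence assumption, a map quoting exceedance probability p over T years implies an expected fraction of exceeded sites over a shorter observation window; comparing that to the observed fraction is a consistency check in the spirit of Ward (1995). The function names and numbers are illustrative assumptions:

```python
# Hypothetical sketch of an area-based PSHA map test (not the authors'
# code). Time independence implies the expected fraction of sites whose
# mapped IM level is exceeded in t_obs years is 1 - (1 - p_map)**(t_obs/T_map).

def expected_fraction(p_map, t_map_years, t_obs_years):
    return 1.0 - (1.0 - p_map) ** (t_obs_years / t_map_years)

def observed_fraction(observed_peak, map_threshold):
    # Per-site observed peak intensity measure vs. the mapped IM level.
    exceeded = sum(1 for obs, thr in zip(observed_peak, map_threshold) if obs > thr)
    return exceeded / len(map_threshold)

# A 10%-in-50-years map tested against 13 years of ShakeMap-style data
# at three illustrative sites (PGA in g):
exp_frac = expected_fraction(0.10, 50.0, 13.0)
obs_frac = observed_fraction([0.31, 0.05, 0.12], [0.30, 0.25, 0.10])
```

A large gap between `obs_frac` and `exp_frac`, beyond sampling uncertainty, would flag the map as inconsistent with the observations.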

  16. Ethical guidance in the era of managed care: an analysis of the American College of Healthcare Executives' Code of Ethics.

    PubMed

    Higgins, W

    2000-01-01

    Market competition and the rise of managed care are transforming the healthcare system from a physician-dominated cottage industry into a manager-dominated corporate enterprise. The managed care revolution is also undermining the safeguards offered by medical ethics and raising serious public concerns. These trends highlight the growing importance of ethical standards for managers. The most comprehensive ethical guidance for health service managers is contained in the American College of Healthcare Executives' (ACHE) Code of Ethics. An analysis of the ACHE Code suggests that it does not adequately address several ethical concerns associated with managed care. The ACHE may wish to develop a supplemental statement regarding ethical issues in managed care. A supplemental statement that provides more specific guidance in the areas of financial incentives to reduce utilization, social mission, consumer/patient information, and the health service manager's responsibility to patients could be extremely valuable in today's complex and rapidly changing environment. More specific ethical guidelines would not ensure individual or organizational compliance. However, they would provide professional standards that could guide decision making and help managers evaluate performance in managed care settings.

  17. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic change from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risk in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and the joint estimate of flood risk, like its components, is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, assigning to the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharge and the corresponding risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are correlated, and consequently multivariate statistical analyses should be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
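The bivariate step, generating correlated (flood peak, flood volume) pairs from a copula, can be sketched as below. The abstract does not specify the copula family or marginals, so a Clayton copula (which has a closed-form conditional inverse) and Gumbel marginals with illustrative parameters are assumed here:

```python
import math
import random

# Sketch under stated assumptions: Clayton copula with Gumbel (EV1)
# marginals; theta and the marginal parameters are illustrative only.

def clayton_pair(theta, rng):
    # Conditional-inversion sampling for the Clayton copula.
    u = rng.random()
    w = rng.random()
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

def gumbel_quantile(p, loc, scale):
    # Inverse CDF of the Gumbel (maximum) distribution.
    return loc - scale * math.log(-math.log(p))

rng = random.Random(42)
pairs = []
for _ in range(1000):
    u, v = clayton_pair(2.0, rng)          # theta = 2 -> Kendall's tau = 0.5
    peak = gumbel_quantile(u, 300.0, 80.0)  # peak discharge, m^3/s (illustrative)
    vol = gumbel_quantile(v, 12.0, 4.0)     # flood volume, hm^3 (illustrative)
    pairs.append((peak, vol))
```

Each pair would then parameterize a synthetic design hydrograph fed to the 2D hydraulic model.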

  18. Risk Analysis of Coastal hazard Considering Sea-level Rise and Local Environment in Coastal Area

    NASA Astrophysics Data System (ADS)

    Sangjin, P.; Lee, D. K.; KIM, H.; Ryu, J. E.; Yoo, S.; Ryoo, H.

    2014-12-01

    Recently, natural hazards have become more unpredictable, with increasing frequency and strength due to climate change. Coastal areas in particular will be more vulnerable in the future because of sea-level rise (SLR). Korea is surrounded by oceans and has many big cities in its coastal areas, so a hazard prevention plan for the coastal zone is absolutely necessary. Prior to making such a plan, however, finding the areas at risk is the first step. In order to find vulnerable areas, the local characteristics of coastal areas should be considered along with SLR. Therefore, the objective of this research is to find vulnerable areas that could be damaged by coastal hazards, considering the local environment and SLR. The spatial scope of the research was set as 1 km from the coastline according to the 'coastal management law' in Korea. The assessment was done up to the year 2050, and the highest sea-level rise scenario was used. For the risk analysis, biophysical and socioeconomic characteristics were considered so as to represent the local characteristics of the coastal area. The risk analysis was carried out by combining the 'possibility of hazard' and the 'level of damage', both of which reflect the above-mentioned regional characteristics. Since the range of inundation was narrowed down to inundation from typhoons in this research, the possibility of typhoon-caused inundation was estimated using a numerical model that calculated the height of the storm surge considering wave, tide, sea-level pressure and SLR. The level of damage was estimated by categorizing the socioeconomic characteristics into four factors: human, infrastructure, ecology and socioeconomic. Variables representing each factor were selected and used in the damage estimation with their classification and weighting values. The results show that urban coastal areas are more vulnerable and hazardous than other areas because of socioeconomic factors. The east and the south coast are

  19. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained by combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, the effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
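The data-generating side of such a Weibull AFT simulation, including right-censoring at a fixed time, can be sketched as below. This shows only a simple treatment-to-outcome path with illustrative coefficients, not the authors' mediation design:

```python
import math
import random

# Sketch (assumption: illustrative coefficients, one binary treatment,
# no mediator shown). Weibull AFT model: log T = b0 + b1*x + sigma*W,
# where W follows the standard minimum-extreme-value distribution, which
# makes T Weibull-distributed.

def weibull_aft_time(x, b0, b1, sigma, rng):
    u = rng.random()
    w = math.log(-math.log(1.0 - u))  # standard Gumbel (minimum) variate
    return math.exp(b0 + b1 * x + sigma * w)

rng = random.Random(7)
censor_at = 5.0                      # administrative censoring time
data = []
for i in range(2000):
    x = i % 2                        # 0 = control, 1 = treatment
    t = weibull_aft_time(x, 1.0, 0.5, 0.8, rng)
    observed = min(t, censor_at)
    event = t <= censor_at           # False => censored observation
    data.append((x, observed, event))

censored_frac = sum(1 for _, _, e in data if not e) / len(data)
```

Fitting both an AFT and a PH model to data generated this way is the kind of comparison the simulation study describes: the AFT coefficient on `x` directly targets the 0.5 used in generation.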

  20. Remedial Investigation/Feasibility Study (RI/FS) process, elements and techniques guidance

    SciTech Connect

    Not Available

    1993-12-01

    This manual provides detailed guidance on Remedial Investigation/Feasibility Studies (RI/FSs) conducted pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) at Department of Energy (DOE) facilities. The purpose of the RI/FS, to assess the risk posed by a hazardous waste site and to determine the best way to reduce that risk, and its structure (site characterization, risk assessment, screening and detailed analysis of alternatives, etc.) is defined in the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) and further explained in the Environmental Protection Agency's (EPA's) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA (Interim Final) 540/G-89/004, OSWER Directive 9355.3-01, October 1988. Though issued in 1988, the EPA guidance remains an excellent source of information on the conduct and structure of an RI/FS. This document makes use of supplemental RI/FS-related guidance that EPA has developed since its initial document was issued in 1988, incorporates practical lessons learned in more than 12 years of experience in CERCLA hazardous site remediation, and, drawing on those lessons, introduces the Streamlined Approach For Environmental Restoration (SAFER), developed by DOE as a way to proceed quickly and efficiently through the RI/FS process at DOE facilities. Thus, as its title implies, this guidance is intended to describe in detail the process and component elements of an RI/FS, as well as techniques to manage the RI/FS effectively.

  1. Frequency agile laser safety & hazard analysis for the Sandia Remote Sensing System LIDAR.

    SciTech Connect

    Augustoni, Arnold L.

    2009-05-01

    A laser safety and hazard analysis was performed for the Raytheon Frequency Agile Laser (FAL) to be used with the Sandia Remote Sensing System (SRSS) B-70 Trailer, based on the 2007 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The B-70 SRSS LIDAR system is a portable platform, which is used to perform laser interaction experiments and tests at various national test sites.

  2. Detection and depth profiling of hazardous elements using N-SCAN prompt gamma neutron activation analysis

    SciTech Connect

    Ruddy, F.H.; Congedo, T.V.; Dulloo, A.R.

    1995-12-31

    A low-background method of prompt gamma neutron activation analysis (PGNAA) has been developed and demonstrated. This method employs a pulsed electronic neutron generator; a high-resolution, high-purity germanium detector; and microsecond coordination of neutron pulsing and gamma detection through a computer-controlled acquisition interface module. The system has been used to detect trace amounts of hazardous elements in concrete and soils, to provide depth profiles of contaminant burden down to nearly 1 ft in packed soil, and to perform rapid identification of the contents of munitions bearing simulants of chemical weapons agents.

  3. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

    NASA Astrophysics Data System (ADS)

    Tao, Kunwang; Wang, Liang; Qian, Xinlin

    2015-04-01

    Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate their effects is to evaluate the risk of landslides and take measures to avoid losses in advance. Many factors must be considered in landslide risk assessment, so the assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment was therefore proposed and carried out in three steps: first, GIS technology was used to divide the study area into analysis grids serving as the basic analysis units; second, the available information on slope, lithology, faults, land use, etc. was taken as the set of evaluation factors; finally, based on years of observed landslide data, the risk assessment was realized with a multi-factor constrained assessment model in which the weight of each factor was determined by an information model. The region of Gongliu, located in the Ili River basin of Xinjiang at altitudes of 600 to 4000 m, was selected as the experimental area; it is elongated from east to west and narrow from north to south, and its abundant rainfall causes frequent landslides. Using a 500 m x 500 m analysis grid covering the whole study area, a comprehensive assessment of the landslide risk in this region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. Statistically, 94.04% of the observed landslide points fall in the moderately high and high risk areas, 4.64% in the low risk zone, and 1.32% in the lowest risk zone. The results thus show a high probability of landslides in the high-risk levels of the assessed region, which showed that
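The factor weighting via an information model is commonly formulated as the information value I_j = ln((N_j/N)/(S_j/S)), comparing a factor class's share of landslide cells with its share of the total area; the abstract does not give its exact formula, so this standard form is an assumption:

```python
import math

# Sketch of the information value model assumed to underlie the factor
# weights (a common formulation; not quoted from the abstract).
# I_j = ln( (N_j / N) / (S_j / S) ): positive values mean the class is
# over-represented among observed landslide cells.

def information_value(landslide_cells_in_class, total_landslide_cells,
                      cells_in_class, total_cells):
    p_class = landslide_cells_in_class / total_landslide_cells  # share of landslides
    p_area = cells_in_class / total_cells                       # share of area
    return math.log(p_class / p_area)

# Example: a slope class covering 10% of the area holds 30% of landslides.
iv = information_value(30, 100, 1000, 10000)
```

Summing the information values of a cell's factor classes gives its composite score, which is then binned into the risk classes shown on the hazard map.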

  4. Landslide and debris-flow hazard analysis and prediction using GIS in Minamata Hougawachi area, Japan

    NASA Astrophysics Data System (ADS)

    Wang, Chunxiang; Esaki, Tetsuro; Xie, Mowen; Qiu, Cheng

    2006-10-01

    On July 20, 2003, following a short duration of heavy rainfall, a debris-flow disaster occurred in the Minamata Hougawachi area, Kumamoto Prefecture, Japan. This disaster was triggered by a landslide. In order to assess the landslide and debris-flow hazard potential of this mountainous region, the study of historic landslides is critical. The objective of this study is to couple 3D slope-stability analysis models and a 2D numerical simulation of debris flow within a geographic information system (GIS) in order to identify the potential landslide-hazard area. Based on field observations, the failure mechanism of the past landslide is analyzed and the mechanical parameters for 3D slope-stability analysis are calculated from the historic landslide. Then, to locate potential new landslides, the study area is divided into slope units. Based on 3D slope-stability analysis models and on Monte Carlo simulation, the locations of potential landslides are identified. Finally, we propose a depth-averaged 2D numerical model in which the debris and water mixture is assumed to be a uniform, continuous, incompressible, unsteady Newtonian fluid. The method accurately models the historic debris flow. Based on the 2D numerical simulation, the results of the debris-flow model, including the potentially inundated areas, are analyzed, and the potentially affected houses, river and road are mapped.

  5. How much do hazard mitigation plans cost? An analysis of federal grant data.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    Under the Disaster Mitigation Act of 2000 and the Federal Emergency Management Agency's subsequent Interim Final Rule, the requirement was placed on local governments to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. Low completion percentages for HMPs--less than one-third of eligible governments--were found by an analysis conducted 3 years after the final deadline of the aforementioned legislation. Follow-up studies showed little improvement at 5 and 8 years after the deadline. It was hypothesized that the cost of an HMP is a significant factor in determining whether or not a plan is completed. A study was conducted using Boolean Matrix Analysis methods to determine what characteristics of a community, if any, most influence the cost of an HMP. The frequency of natural hazards experienced by the planning area, the number of jurisdictions participating in the HMP, the population, and the population density were found to significantly affect cost. These variables were used in a regression analysis to determine their predictive power for cost. It was found that, along with two interaction terms, the variables explain approximately half the variation in HMP cost.

  6. Simulation, design, and analysis for magnetic anchoring and guidance of instruments for minimally invasive surgery

    NASA Astrophysics Data System (ADS)

    Luo, Haifeng; Wilson, Emmanuel; Cleary, Kevin

    2012-02-01

    The exploration of natural orifice transluminal endoscopic surgery (NOTES) has brought considerable interest in magnetic anchoring of intracorporeal tools. A magnetic anchoring and guidance system (MAGS) is the concept of anchoring miniature in-vivo tools and devices to the parietal peritoneum by coupling with an external magnetic holder module placed on the skin surface. MAGS has been shown to be effective in anchoring passive tools such as in-vivo cameras or tissue retractors. The strength of the magnetic field and the magnet configurations employed depend on the size, shape and weight of the in-vivo tools, the coupling distance between the internal and external modules, and physiological concerns such as tool interaction and tissue ischemia. This paper presents our effort to develop a better understanding of the coupling dynamics between a small in-vivo robot designed for tissue manipulation and an external MAGS handle used to position the in-vivo robot. Electromagnetic simulation software (Vizimag 3.19) was used to simulate the coupling forces for a two-magnet configuration of the MAGS handle. A prototype model of the in-vivo robot and a two-magnet MAGS handle were fabricated. Based on this study, we were able to identify an optimal design solution for a MAGS module given the mechanical constraints of the internal module design.

  7. Design and analysis of field studies with bees: A critical review of the draft EFSA guidance.

    PubMed

    Bakker, Frank

    2016-07-01

    The specific protection goal, primary assessment endpoints, acceptable effect thresholds, and experimental design proposed in the European Food Safety Authority (EFSA) update of the bee guidance document are subjected to critical review. It is concluded that the negligible-effect criteria were established without sufficient regulatory definition and without convincing scientific argumentation. For the assessment endpoints, effects on hive strength lack temporal definition, and the reduction to numbers of bees is inappropriate for evaluating effects. Restricting mortality assessments to homing failure is not theoretically justified, and the specific criteria were incorrectly derived. The combination of acute effect estimates with models for chronic stressors biases the risk assessment, and a temporal basis for the acceptability of effects is missing. Effects on overwintering success cannot be experimentally assessed using the proposed criteria. The experimental methodology proposed is inappropriate, and the logistical consequences, in particular those related to replication and land use, are such that field studies are no longer a feasible option for the risk assessment. It may be necessary to explore new lines of thought for the set-up of field studies and to clearly separate experimentation from monitoring. Integr Environ Assess Manag 2016;12:422-428. © 2015 SETAC.

  8. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can be appropriately used for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and with independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPEs) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  9. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    NASA Astrophysics Data System (ADS)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the Mitigation of Natural Hazards, a Group for the Analysis of Natural Hazards within the Mediterranean area was formed. The Group has so far identified the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The results obtained must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, called the "Iblea" seismic area, is nowadays considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is thus highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  10. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude over all earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
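The performance-based combination step described in the abstract, summing the rates of the hazard disaggregation bins weighted by a conditional probability of liquefaction, can be sketched as follows. The conditional model below is a placeholder logistic form, not the Cetin et al. (2004) or Boulanger-Idriss relationships, and all numbers are illustrative:

```python
import math

# Sketch of the Kramer and Mayfield (2007) style combination: the annual
# rate of liquefaction sums, over the (PGA, M) disaggregation bins, the
# bin's annual rate times the conditional probability of liquefaction.
# conditional_p_liq is a PLACEHOLDER logistic model, illustrative only.

def conditional_p_liq(pga_g, mag):
    z = 5.0 * pga_g + 0.5 * (mag - 6.0) - 2.0  # increases with PGA and M
    return 1.0 / (1.0 + math.exp(-z))

def liquefaction_return_period(disagg_bins):
    # disagg_bins: iterable of (annual_rate, pga_g, magnitude)
    rate = sum(r * conditional_p_liq(pga, m) for r, pga, m in disagg_bins)
    return 1.0 / rate

bins = [(1e-3, 0.10, 5.5), (4e-4, 0.20, 6.0), (1e-4, 0.35, 6.5)]
tr = liquefaction_return_period(bins)  # years
```

Repeating the computation depth by depth yields the liquefaction hazard curves as a function of depth that the procedure reports.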

  11. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    USGS Publications Warehouse

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
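The two exceedance levels quoted (2% and 10% probability of exceedance in 50 years) correspond, under the standard Poisson assumption, to mean return periods of roughly 2475 and 475 years. A minimal sketch of that conversion:

```python
import math

# Standard Poisson conversion between a map's exceedance probability p
# over an exposure window t and the mean return period: T = -t / ln(1 - p).

def return_period(p_exceed, window_years):
    return -window_years / math.log(1.0 - p_exceed)

t_2pct = return_period(0.02, 50.0)   # ~2475 years
t_10pct = return_period(0.10, 50.0)  # ~475 years
```

The ratio of ground motions between the two hazard levels (about 60% here) depends on the shape of the site's hazard curve, not on this conversion alone.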

  12. Broadband CyberShake Platform: Seismogram Synthesis for Broadband Physics-Based Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Small, P.; Milner, K.; Graves, R. W.; Jordan, T. H.; CyberShake Collaboration

    2011-12-01

    Researchers at the Southern California Earthquake Center (SCEC) have developed the CyberShake computational platform to perform probabilistic seismic hazard analysis (PSHA) in the Los Angeles region (Graves et al., 2010) using deterministic wave propagation simulations at frequencies up to 0.5 Hz. CyberShake uses seismic reciprocity to calculate synthetic seismograms for a suite of more than 600,000 rupture realizations. From this set of seismograms we compute intensity measures, which are then combined into a PSHA hazard curve for the site of interest. SCEC has also developed the SCEC Broadband Ground Motion Simulation Platform, a software system that can calculate broadband seismograms at frequencies up to 10 Hz for historical and scenario earthquakes using multiple earthquake rupture generators, multiple low- and high-frequency wave propagation simulation codes, and multiple site effects modules. Here we report how we have integrated the high-frequency computational capabilities of the SCEC Broadband Platform into CyberShake, producing the Broadband CyberShake Platform. The Broadband CyberShake Platform extends the frequency range up to 10 Hz by combining low frequency deterministic synthetic seismograms with higher frequency stochastic seismograms. We can now calculate physics-based seismograms and PSHA hazard curves for intensity measures such as PGA that are strongly dependent on higher frequency ground motions. A potential benefit of this approach, particularly at higher frequencies, is that given adequate sampling of the parameter space, the physics-based model naturally limits the upper bound of the estimated ground motion response. This often leads to a reduction in hazard at longer return periods. We are applying the computational capabilities of the SCEC Broadband CyberShake Platform at southern California sites selected to support validation of this newly developed PSHA computational technique. This includes calculation of Broadband Cyber
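The final CyberShake step, combining intensity measures from the rupture realizations into a hazard curve, can be sketched as below. This is a simplified assumption: each realization is reduced to one annual rate and one intensity measure value for a single site, with Poisson combination of rates:

```python
import math

# Sketch (not CyberShake code): build a site hazard curve from a rupture
# set, where each realization carries an annual occurrence rate and the
# intensity measure (IM) it produces at the site.

def hazard_curve(realizations, im_levels):
    # realizations: iterable of (annual_rate, im_value) for one site
    curve = []
    for x in im_levels:
        rate = sum(r for r, im in realizations if im > x)  # total rate of IM > x
        curve.append((x, 1.0 - math.exp(-rate)))           # annual P(IM > x)
    return curve

# Three illustrative realizations (rates in 1/yr, IM in g):
rups = [(1e-4, 0.45), (5e-4, 0.30), (2e-3, 0.12)]
curve = hazard_curve(rups, [0.1, 0.2, 0.4])
```

The resulting curve is monotonically non-increasing in IM level, as a hazard curve must be.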

  13. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    NASA Astrophysics Data System (ADS)

    Petersen, Mark D.; Dewey, James; Hartzell, Stephan; Mueller, Charles; Harmsen, Stephan; Frankel, Arthur D.; Rukstales, Ken

    2004-10-01

    -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
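
    The two hazard levels quoted above (2% and 10% probability of exceedance in 50 years) are related through the Poisson occurrence model standard in PSHA; a minimal sketch of the conversion to annual rates and return periods:

```python
import math

def return_period(p_exceed, window_years):
    """Mean return period implied by an exceedance probability over a
    time window, under the Poisson occurrence model used in PSHA."""
    annual_rate = -math.log(1.0 - p_exceed) / window_years
    return 1.0 / annual_rate

print(round(return_period(0.02, 50)))  # ~2475 years
print(round(return_period(0.10, 50)))  # ~475 years
```

    These are the familiar "2475-year" and "475-year" ground motions; the abstract's observation that the latter is typically about 60% of the former is an empirical property of the hazard curves, not of this conversion.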

  14. Deterministic Tectonic Origin Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.

    2014-12-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori assumptions of accurate earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and the lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a ''maximum earthquake'' predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learned from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic tsunami hazard analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) in each 0.5° x 0.5° bin for 0-40 km depth (310 bins in total) and for 40-100 km depth (92 bins in total) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned by harmonizing the available databases and previous studies. These parameters have been used as inputs for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with coarse (2 arc-min) and medium (1 arc-min) grid resolutions have been run at the EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source, using the shallow-water finite-difference SWAN code (Mader, 2004), over the magnitude range 6.5 - Mwmax defined for that bin with a Mw increment of 0.1. Results show that not only the
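
    The magnitude discretization described above — every bin simulated from Mw 6.5 up to its assigned Mwmax in 0.1 increments — can be sketched as follows; the single bin and its source parameters shown here are hypothetical, not values from the study's database.

```python
def scenario_magnitudes(mw_max, mw_min=6.5, step=0.1):
    """Magnitude set simulated for one source bin:
    mw_min, mw_min + step, ..., mw_max."""
    n = int(round((mw_max - mw_min) / step)) + 1
    return [round(mw_min + i * step, 1) for i in range(n)]

# hypothetical bin inventory keyed by (lat, lon) of the bin center:
# values are (strike, dip, rake, depth_km, mw_max)
bins = {(36.0, 25.0): (290, 35, 90, 15, 8.0)}
for (lat, lon), (strike, dip, rake, depth, mw_max) in bins.items():
    mags = scenario_magnitudes(mw_max)
    print(len(mags), mags[0], mags[-1])  # 16 6.5 8.0
```

    Multiplied over the 402 bins, this discretization is what drives the scenario count — and hence the simulation cost — of the deterministic analysis.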

  15. Seismic probabilistic tsunami hazard: from regional to local analysis and use of geological and historical observations

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.

    2016-12-01

    Site-specific probabilistic tsunami hazard analyses demand very high computational effort, which is often reduced by introducing approximations on tsunami sources and/or tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily introduce significant bias into the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulation require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves into deep water (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes, and then to extrapolate maximum wave heights at the coast empirically, saving a significant amount of computational time. However, moving to local scale, such assumptions no longer hold, and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a touristic and commercial area along the south-eastern coast of Sicily, Italy. The procedure consists of using the outcomes of a regional SPTHA as input for a two-step filtering method that selects and substantially reduces the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high-resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing and highlighting the differences between regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for the selected target areas are particularly rich compared with the scarce and heterogeneous data sets usually available elsewhere. Therefore

  16. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    SciTech Connect

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000-year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System were input to the Hydrologic Engineering Center's River Analysis System hydrodynamic flood routing model.

  17. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

    NASA Astrophysics Data System (ADS)

    Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

    2005-12-01

    The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state-of-the-art in PSHA using multiple expert opinions, has been fully applied only twice, firstly in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in this latter project, the objective of which was comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10000000. Following SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground motion prediction was performed by combining several empirical ground motion models within a logic tree framework with the weights on each logic tree branch expressing the personal degree-of-belief of each ground-motion expert. In the present paper, we critically review the current state of ground motion prediction methodology in PSHA in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that in systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground motion models, a huge price has to be paid in an ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA for the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next
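
    For the mean hazard, combining several empirical ground-motion models in a logic tree with degree-of-belief weights reduces to a weighted average of the branch hazard curves. A minimal sketch; the model names, rates, and weights below are invented, not the PEGASOS branches.

```python
def combine_logic_tree(curves, weights):
    """Weighted mean hazard curve over logic-tree branches.

    curves: dict model_name -> annual exceedance rates at fixed
    ground-motion levels; weights: dict model_name -> degree-of-belief
    weight assigned by the expert (must sum to 1).
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    n = len(next(iter(curves.values())))
    return [sum(weights[m] * curves[m][i] for m in curves) for i in range(n)]

curves = {"gmpe_a": [1e-2, 1e-3, 1e-4], "gmpe_b": [2e-2, 5e-3, 5e-4]}
weights = {"gmpe_a": 0.6, "gmpe_b": 0.4}
print(combine_logic_tree(curves, weights))
```

    The abstract's central caution applies inside each branch, before this averaging: inflating the aleatory sigma of a branch GMPE flattens its hazard curve and dominates the result at low annual frequencies, regardless of the weights.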

  18. System hazards in managing laboratory test requests and results in primary care: medical protection database analysis and conceptual model

    PubMed Central

    Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John

    2015-01-01

    Objectives To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients. To integrate these data with other published evidence sources to inform design of a systems-based conceptual model of related hazards. Design A retrospective database analysis. Setting General practices in the UK and Ireland. Participants 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Main outcome measures Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. Results CRSA visits were undertaken to 778 UK and Ireland general practices of which a range of systems hazards were recorded across the laboratory test ordering and results management systems in 647 practices (83.2%). A total of 45 discrete hazard categories were identified with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was the inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the ‘postanalytical test stage’ (n=702, 43.8%), followed closely by ‘communication outcomes issues’ (n=628, 39.1%). Conclusions Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. PMID:26614621

  19. Design, analysis and presentation of multinational economic studies: the need for guidance.

    PubMed

    Pang, Francis

    2002-01-01

    Over the last decade, there has been a proliferation in the number of economic evaluations of pharmaceuticals to meet the growing demand for information about the economic benefits of healthcare technologies. The majority of these studies have been commissioned by pharmaceutical companies for the purposes of drawing attention to the resource and quality-of-life aspects of new or existing products. Such information has become important in overcoming a new obstacle, namely the demonstration of cost effectiveness (the so-called 'fourth hurdle'), in addition to the three well-established criteria of quality, tolerability and efficacy. To ensure the maintenance of standards, guidance for economic evaluations has emerged lately in the form of guidelines, regulations, principles, policies and positions. Drummond outlined three purposes of these guidelines, as follows: as a requirement prior to reimbursement, as statements of methodological standards, and as a statement of ethical standards. Such guidelines are designed to assist both the economic analyst and the decision-maker. In laying out the state of the art regarding the methodology of economic evaluation, guidelines assist the analyst in performing high-quality, scientifically valid studies, and assist the decision-maker in properly interpreting and assessing their quality. In response to these growing requirements for cost-effectiveness data globally, it has become increasingly common for economic evaluations to be conducted on an international scale. However, the recommendations in pharmacoeconomics guidelines regarding the manner in which these multinational economic evaluations should be designed, analysed and presented are too limited to be of any real value. This article examines the various issues that must be taken into consideration when conducting multinational studies, and provides a review of the techniques and approaches that have been suggested to date. 
It concludes with recommendations for potential

  20. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  1. Hazard prioritization in ecological risk assessment through spatial analysis of toxicant gradients.

    PubMed

    Preston, Benjamin L.

    2002-01-01

    The analysis of spatial relationships among the distribution of environmental stressors and observed or predicted adverse effects may be a useful method of prioritizing hazards in regional ecological risk assessment (ERA). Geographic information systems were used to compare the spatial distribution of toxicant concentrations in sediments of Chesapeake Bay with the distribution of areas in the basin where ecological impacts have historically been observed. Toxicants were then prioritized based upon the strength of their spatial association with the high impact areas. This method of hazard identification/prioritization was validated against the Chesapeake Bay Program's lists of toxics of concern and toxics of potential concern (TOC and TOPC, respectively). Of the 18 toxicants on the TOC/TOPC lists that were considered in the current study, 15 (83%) were identified as priority contaminants in the current study, 11 (73%) of which were either of primary or secondary concern. The use of spatial analysis tools in ERA may lead to more rapid and rigorous methods for prioritizing environmental risks.
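
    The prioritization scheme described above can be sketched as a ranking of toxicants by the fraction of their elevated-concentration sites that fall within historically impacted areas. This is a toy with a point-in-rectangle test standing in for the GIS overlay, and invented toxicant data rather than Chesapeake Bay measurements.

```python
def prioritize(toxicant_sites, impact_areas):
    """Rank toxicants by the fraction of their elevated-concentration
    sample sites that fall inside known high-impact areas."""
    def inside(pt, box):
        (x, y), (x0, y0, x1, y1) = pt, box
        return x0 <= x <= x1 and y0 <= y <= y1

    scores = {}
    for name, sites in toxicant_sites.items():
        hits = sum(any(inside(p, b) for b in impact_areas) for p in sites)
        scores[name] = hits / len(sites)
    # strongest spatial association with impact areas first
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

impact_areas = [(0, 0, 1, 1)]  # one hypothetical high-impact rectangle
sites = {"Cd": [(0.5, 0.5), (2, 2)], "Pb": [(0.2, 0.2), (0.8, 0.9)]}
print(prioritize(sites, impact_areas))  # Pb (1.0) ranks above Cd (0.5)
```

    A real implementation would use polygon overlays and concentration gradients rather than a binary hit test, but the ranking logic is the same.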

  2. Geomorphological analysis of sinkhole and landslide hazard in a karst area of the Venetian Prealps- Italy

    NASA Astrophysics Data System (ADS)

    Tiberi, Valentina

    2010-05-01

    In the piedmont area of the Asiago Plateau (Venetian Prealps - NE Italy), sinkholes and landslides in many cases represent a complex response to karst processes. Field surveys showed that both soil and bedrock are involved, represented mainly by colluvial-alluvial sediments and carbonate rocks. Preliminary observations also reveal the key role of piping and cave-collapse phenomena and the importance of human remedial measures. Within the study area, these processes damage mainly agricultural and pasture activities and expose people and farm animals to very high hazards. This work provides preliminary results of a geomorphological analysis carried out to define sinkhole and landslide hazard and its connections with karst processes. During the first phases of the research program, an inventory of the relevant phenomena was compiled using GIS technologies. The database has been constantly revised and enriched with new field measurements and thematic maps (i.e. geomorphological, geo-structural, hydrogeological, and cave-development maps). Specifically, the field survey focused on the morphodynamic definition of instability elements, allowing a wide range of morphotypes (mainly with regard to sinkholes) and polygenic morphologies (i.e. mixed sinkhole-landslide configurations) to be recognized. The geomorphological analysis also revealed specific evolutionary trends of the instability processes, which could be usefully employed to program more effective mitigation strategies.

  3. Site specific seismic hazard analysis at the DOE Kansas City Plant

    SciTech Connect

    Lynch, D.T.; Drury, M.A.; Meis, R.C.; Bieniawski, A.; Savy, J.B.; Llopis, J.L.; Constantino, C.; Hashimoto, P.S.; Campbell, K.W.

    1995-10-01

    A site-specific seismic hazard analysis is being conducted for the Kansas City Plant to support an ongoing structural evaluation of existing buildings. This project is part of the overall review of facilities being conducted by DOE. The seismic hazard was probabilistically defined at the theoretical rock outcrop by Lawrence Livermore National Laboratory. The U.S. Army Engineer Waterways Experiment Station conducted a subsurface site investigation to characterize in situ S-wave velocities and other subsurface physical properties related to the geology in the vicinity of the Main Manufacturing Building (MMB) at the Bannister Federal Complex. The test program consisted of crosshole S-wave testing, seismic cone penetrometer testing, and laboratory soil analyses. The information acquired from this investigation was used in a site response analysis by City College of New York to determine the earthquake motion at grade. Ground response spectra appropriate for the design and evaluation of Performance Category 1 and 2 structures, systems, and components were recommended. The effects of seismic loadings on the buildings will be used to aid in designing any structural modifications.

  4. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    SINGH, G.

    2000-09-06

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuels (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) RL Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cited, as applicable. This FHA comprehensively assesses the risk of fire at the CVDF to ascertain whether the specific objectives of DOE 5480.7A are met. These specific fire protection objectives are: (1) Minimize the potential for the occurrence of a fire. (2) Ensure that fire does not cause an onsite or offsite release of radiological and other hazardous material that will threaten the public health and safety or the environment. (3) Establish requirements that will provide an acceptable degree of life safety to DOE and contractor personnel and ensure that there are no undue hazards to the public from fire and its effects in DOE facilities. (4) Ensure that vital DOE programs will not suffer unacceptable delays as a result of fire and related perils. (5) Ensure that property damage from fire and related perils does not exceed an acceptable level. (6) Ensure that process control and safety systems are not damaged by fire or related perils. 
This FHA is based on the

  5. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    NASA Astrophysics Data System (ADS)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
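
    Once a synthetic catalog has been generated (by RSQSIM or any other simulator), its frequency-magnitude distribution is commonly summarized by a Gutenberg-Richter b-value; a sketch using Aki's (1965) maximum-likelihood estimator on an invented catalog of small induced events:

```python
import math

def aki_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965) for a
    catalog assumed complete above magnitude m_min:
    b = log10(e) / (mean(M) - m_min)."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

# hypothetical induced-seismicity catalog, magnitudes M >= 2
catalog = [2.0, 2.1, 2.0, 2.3, 2.6, 3.0, 2.2, 2.4, 2.1, 2.8]
print(round(aki_b_value(catalog, 2.0), 2))
```

    This basic estimator ignores magnitude binning corrections; the point is that extending the analysis down to M2, as the abstract proposes, makes the fitted distribution sensitive to exactly these frequent small events.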

  6. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes because they are not tied to large fault zones. While landslides have historically produced mostly local tsunamis, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and our methodologies for dealing with this hazard, we present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) of submerged landslides in the Gulf of Cadiz. The literature on LPTHA is sparse, and studies have so far fallen into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second on simplified geotechnical slope-stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope-versus-volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to the numerical tsunami model to represent a set of idealized tsunami sources, which are in turn

  7. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    PubMed

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Since this product is not cooked during processing, finally, educating the consumer, by providing information on the label such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
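
    The two measurable critical limits in the plan (nitrite in the mixture at 100-200 ppm, final pH below 4.6) lend themselves to a simple monitoring check at the corresponding critical control points; a sketch (the function and its message strings are illustrative, not part of the published HACCP model):

```python
def check_critical_limits(nitrite_ppm, ph):
    """Return a list of critical-limit violations for a Nham batch:
    nitrite must be 100-200 ppm, final pH must be below 4.6."""
    issues = []
    if not 100 <= nitrite_ppm <= 200:
        issues.append("nitrite outside 100-200 ppm")
    if ph >= 4.6:
        issues.append("pH not below 4.6")
    return issues

print(check_critical_limits(150, 4.4))  # [] -> batch within limits
print(check_critical_limits(250, 4.8))  # both limits violated
```

    The other two critical control points in the plan — visual metal inspection during stuffing and consumer labeling — are procedural rather than numerical, so they do not appear in this check.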

  8. Characterizing Guidance in Visual Analytics.

    PubMed

    Ceneda, Davide; Gschwandtner, Theresia; May, Thorsten; Miksch, Silvia; Schulz, Hans-Jörg; Streit, Marc; Tominski, Christian

    2017-01-01

    Visual analytics (VA) is typically applied in scenarios where complex data has to be analyzed. Unfortunately, there is a natural correlation between the complexity of the data and the complexity of the tools to study them. An adverse effect of complicated tools is that analytical goals are more difficult to reach. Therefore, it makes sense to consider methods that guide or assist users in the visual analysis process. Several such methods already exist in the literature, yet we are lacking a general model that facilitates in-depth reasoning about guidance. We establish such a model by extending van Wijk's model of visualization with the fundamental components of guidance. Guidance is defined as a process that gradually narrows the gap that hinders effective continuation of the data analysis. We describe diverse inputs based on which guidance can be generated and discuss different degrees of guidance and means to incorporate guidance into VA tools. We use existing guidance approaches from the literature to illustrate the various aspects of our model. As a conclusion, we identify research challenges and suggest directions for future studies. With our work we take a necessary step to pave the way to a systematic development of guidance techniques that effectively support users in the context of VA.

  9. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and the authorities responsible for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the integration and interactive visualization of spatial information through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application integrates the different layers necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as the 30-, 100- and 300-year return periods defined in the Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on this information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify the input information and parameters to create different risk scenarios. Based on the resulting risk scenarios, users can propose and visualize (preliminary) risk-reduction measures before finalizing the actual design and dimensions of such protective
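
    The simplified risk and cost-benefit calculation behind such tools can be sketched as expected annual damage summed over hazard scenarios, compared before and after a protection measure. All scenario probabilities, vulnerabilities, and monetary values below are illustrative, not EconoMe or Valdorisk parameters.

```python
def annual_risk(scenarios):
    """Expected annual damage: sum over scenarios of
    annual probability x vulnerability (damage fraction) x exposed value."""
    return sum(p * vuln * value for p, vuln, value in scenarios)

def benefit_cost_ratio(risk_before, risk_after, annual_cost):
    """Annual risk reduction achieved per unit of annualized measure cost."""
    return (risk_before - risk_after) / annual_cost

# one exposed building worth 2 MCHF, three return-period scenarios
before = [(1/30, 0.3, 2e6), (1/100, 0.5, 2e6), (1/300, 0.8, 2e6)]
after = [(1/30, 0.05, 2e6), (1/100, 0.2, 2e6), (1/300, 0.5, 2e6)]
rb, ra = annual_risk(before), annual_risk(after)
print(round(rb), round(ra), round(benefit_cost_ratio(rb, ra, 10000), 2))
```

    A ratio above 1 indicates that the measure removes more annual risk than it costs, which is the prioritization criterion the abstract refers to.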

  10. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly in Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products, such as the UHS and deaggregation values, by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE, and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground-motion records for vulnerability and liquefaction studies. A brief overview of record epsilon, which accounts for the spectral shape of the ground-motion time history, is also presented.
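
    In its basic form, epsilon is the normalized logarithmic residual between a target (or observed) ground motion and the GMPE median prediction; a sketch with hypothetical spectral-acceleration values and sigma, not figures from the thesis:

```python
import math

def epsilon(obs_ln_sa, predicted_ln_sa, sigma_ln):
    """Number of logarithmic standard deviations by which a target
    ground motion departs from the GMPE median prediction."""
    return (obs_ln_sa - predicted_ln_sa) / sigma_ln

# hypothetical: target motion 0.30 g, GMPE median 0.18 g, sigma_ln = 0.6
print(round(epsilon(math.log(0.30), math.log(0.18), 0.6), 2))
```

    A positive epsilon means the target motion lies above the GMPE median, which is typical of the rare design-level events identified by deaggregation.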

  11. Incorporating Climate Change Projections into a Hydrologic Hazard Analysis for Friant Dam

    NASA Astrophysics Data System (ADS)

    Holman, K. D.; Novembre, N.; Sankovich-Bahls, V.; England, J. F.

    2015-12-01

    The Bureau of Reclamation's Dam Safety Office has initiated a series of pilot studies focused on exploring potential impacts of climate change on hydrologic hazards at specific dam locations across the Western US. Friant Dam, located in Fresno, California, was chosen for study because the site had recently undergone a high-level hydrologic hazard analysis using the Stochastic Event Flood Model (SEFM). SEFM is a deterministic flood-event model that treats input parameters as variables, rather than fixed values. Monte Carlo sampling allows the hydrometeorological input parameters to vary according to observed relationships. In this study, we explore the potential impacts of climate change on the hydrologic hazard at Friant Dam using historical and climate-adjusted hydrometeorological inputs to the SEFM. Historical magnitude-frequency relationships of peak inflow and reservoir elevation were developed at Friant Dam for the baseline study using observed temperature and precipitation data between 1966 and 2011. Historical air temperatures, antecedent precipitation, mean annual precipitation, and the precipitation-frequency curve were adjusted for the climate change study using the delta method to create climate-adjusted hydrometeorological inputs. Historical and future climate projections are based on the Bias-Corrected Spatially-Disaggregated CMIP5 dataset (BCSD-CMIP5). The SEFM model was run thousands of times to produce magnitude-frequency relationships of peak reservoir inflow, inflow volume, and reservoir elevation, based on historical and climate-adjusted inputs. Results suggest that peak reservoir inflow and peak reservoir elevation increase (decrease) for all return periods under mean increases (decreases) in precipitation, independent of changes in surface air temperature.
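    The delta-method adjustment described above can be sketched as a multiplicative factor on precipitation and an additive offset on temperature; an illustration with hypothetical factors, not Reclamation's actual SEFM inputs:

```python
import numpy as np

def delta_adjust(precip_obs, temp_obs, precip_ratio, temp_delta_c):
    """Delta-method climate adjustment: scale observed precipitation by a
    projected/historical ratio and shift observed temperature additively."""
    precip_adj = np.asarray(precip_obs, dtype=float) * precip_ratio
    temp_adj = np.asarray(temp_obs, dtype=float) + temp_delta_c
    return precip_adj, temp_adj
```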

  12. Analytical Problems Associated with the Analysis of Metals in a Simulated Hazardous Waste

    NASA Astrophysics Data System (ADS)

    Dunnivant, F. M.

    2002-06-01

    Analysis of samples subject to physical and chemical interferences can greatly enhance the learning experience in instrumental analysis and environmental chemistry laboratories. This article describes a project-based experience in which students analyze simulated hazardous waste samples (carbonated beverages) for calcium by six techniques: (i) flame atomic absorption spectroscopy (FAAS) using external standard calibration, (ii) FAAS using external standard calibration with a releasing agent (Sr), (iii) FAAS using standard addition, (iv) FAAS using standard addition with a releasing agent (Sr), (v) ethylenediaminetetraacetic acid (EDTA) titration, and (vi) Ca-ion-specific electrode. Not surprisingly, students find that these different techniques yield conflicting results, and their assignment is to explain their data in the format of a peer-reviewed journal article. Students report that this series of lab experiments is challenging and highly rewarding. Laboratory experiences such as this one should significantly improve students' ability to analyze problematic samples and interpret experimental data.
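    The standard-addition technique in (iii) and (iv) fits signal versus spiked concentration with a straight line and extrapolates back to zero signal; a minimal sketch (the function name is ours):

```python
import numpy as np

def standard_addition(added_conc, signal):
    """Method of standard additions: fit signal vs. spiked concentration
    with a straight line and extrapolate to zero signal.  The unknown
    concentration equals the magnitude of the x-intercept, intercept/slope."""
    slope, intercept = np.polyfit(added_conc, signal, 1)
    return intercept / slope
```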

  13. Spatial analysis of climate hazards in relation to urban environments in Iran by using AHP and Kriging methods

    NASA Astrophysics Data System (ADS)

    Alijani, B.; Alijani, S.

    2010-09-01

    values. The highest value of this hazard index, between 36 and 40 percent, occurs in the southern coastal areas, and the lowest value, below 10 percent, in the northern parts of the country. We also found that the AHP method is suitable for grouping and identifying hazard indices, and that the Ordinary Kriging interpolation method was able to provide relatively precise spatial information over the very diverse terrain of Iran. We hope that the findings of this research will help urban planners and designers improve urban life across the country, especially in the hazardous areas of the South, where both torrential floods and sultry weather prevail. Key words: climate hazards, urban environment, urban designing, climate and urban environment, spatial analysis, urban planning, AHP and Kriging methods.
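    The AHP step this record relies on derives priority weights for the hazard factors from a pairwise-comparison matrix via its principal eigenvector; a minimal sketch of that standard calculation, not the authors' implementation:

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP priority weights: the normalized principal eigenvector of a
    reciprocal pairwise-comparison matrix (Saaty's eigenvector method)."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    principal = np.argmax(eigvals.real)          # largest eigenvalue
    w = np.abs(eigvecs[:, principal].real)       # its eigenvector
    return w / w.sum()                           # normalize to sum to 1
```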

  14. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can range from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining socially acceptable, it must be possible to manage the system so that potential impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the traffic-light concept. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation-medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that, on average, peak ground-motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. 
We applied the proposed
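    The moving-time-window residual analysis described above can be sketched as a sliding mean over ground-motion residuals; an illustrative version (names ours):

```python
import numpy as np

def moving_window_residuals(residuals, window):
    """Mean ground-motion residual (observed minus predicted ln PGA) in a
    sliding window of fixed length; a systematic drift in the output hints
    at a change in propagation-medium properties."""
    r = np.asarray(residuals, dtype=float)
    return np.array([r[i:i + window].mean()
                     for i in range(len(r) - window + 1)])
```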

  15. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using the artificial neural network model derived not only from the data for that area but also using the weights calculated from each of the other two areas (nine maps in all) as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the predicted hazard maps and the existing data on landslide areas.
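    The index calculation with trained weights can be sketched as a single neuron applied across the factor layers; a toy illustration of the idea, not the authors' network:

```python
import numpy as np

def hazard_index(factors, weights):
    """Hazard index per grid cell: weighted sum of factor layers squashed
    through a logistic function, i.e. the output of one trained neuron.

    factors -- array of shape (n_factors, n_cells)
    weights -- trained weight per factor, shape (n_factors,)
    """
    z = np.tensordot(weights, factors, axes=1)   # weighted sum per cell
    return 1.0 / (1.0 + np.exp(-z))              # map to (0, 1)
```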

  16. A New Insight into Probabilistic Seismic Hazard Analysis for Central India

    NASA Astrophysics Data System (ADS)

    Mandal, H. S.; Shukla, A. K.; Khan, P. K.; Mishra, O. P.

    2013-12-01

    The Son-Narmada-Tapti lineament and its surroundings in Central India (CI) form the second most important tectonic regime of the Indian sub-continent after the converging margin along the Himalayas-Myanmar-Andaman arc, and have attracted several geoscientists to assess the region's seismic hazard potential. Our study area, a part of CI, is bounded between latitudes 18°-26°N and longitudes 73°-83°E, representing a stable part of Peninsular India. Past damaging moderate-magnitude earthquakes as well as continuing microseismicity in the area provide enough data for seismological study. Our estimates based on the regional Gutenberg-Richter relationship showed b-values (between 0.68 and 0.76) lower than the average for the study area. The Probabilistic Seismic Hazard Analysis carried out over an area with a radius of ~300 km encircling Bhopal yielded a conspicuous relationship between earthquake return period (T) and peak ground acceleration (PGA). Analyses of T and PGA show that the PGA value at bedrock varies from 0.08 to 0.15 g for 10% (T = 475 years) and 2% (T = 2,475 years) probabilities of exceedance in 50 years, respectively. We establish empirical relationships between zero-period acceleration (ZPA) and shear-wave velocity up to a depth of 30 m [Vs(30)] for the two return periods. These demonstrate that ZPA values decrease with increasing shear-wave velocity, suggesting a diagnostic indicator for designing structures at a specific site of interest. The predictive design response spectra generated at a site for periods up to 4.0 s at 10 and 2% probability of exceedance of ground motion in 50 years can be used for designing duration-dependent structures of variable vertical dimension. We infer that this concept of assimilating uniform hazard response spectra and predictive design at 10 and 2% probability of exceedance in 50 years at 5% damping at bedrocks of different categories may offer potential inputs for designing earthquake resistant
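    A b-value such as the 0.68-0.76 range reported above is commonly estimated with Aki's maximum-likelihood formula; a minimal sketch of that standard estimator (the record does not state which estimator the authors used):

```python
import numpy as np

def b_value_mle(mags, m_min):
    """Aki's (1965) maximum-likelihood b-value for magnitudes >= m_min:
    b = log10(e) / (mean(M) - m_min)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)
```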

  17. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone, landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, on Feb 17, 2006, a landslide buried an entire village on the Philippine island of Leyte, with at least 1800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g., rainfall climatology, antecedent rainfall accumulation, and the intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful for assessing the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while they receive heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
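    Rainfall intensity-duration thresholds of the kind this record describes are typically power laws; a sketch using Caine's (1980) global coefficients as placeholders (the study fits its own threshold from TMPA data):

```python
def exceeds_id_threshold(intensity_mm_h, duration_h, a=14.82, b=-0.39):
    """Test a rainfall event against a power-law intensity-duration
    threshold I = a * D**b.  The default coefficients are Caine's (1980)
    global curve, used here only as illustrative placeholders."""
    return intensity_mm_h >= a * duration_h ** b
```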

  18. Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Catelli, J.; Nong, S.

    2014-12-01

    Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States, Gulf of Mexico, and East Coast are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations were performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy arrays using the Python programming language. The data are then connected through a Python library to an outside statistical package like R to fit cell values to extreme value theory distributions and return values for specified recurrence intervals. While this is not a new process, the value behind this work is the ability to keep this process in a single geospatial environment and be able to easily replicate this process for other natural hazard applications and extreme event modeling.
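    The extreme-value step applied to each cell's series of event maxima can be sketched without leaving Python; here a method-of-moments Gumbel fit (a zero-shape GEV) stands in for the R-based fitting the authors describe:

```python
import numpy as np

def gumbel_return_level(annual_maxima, return_period):
    """Method-of-moments Gumbel fit (a GEV with zero shape) to a series of
    annual maxima, then the level with exceedance probability 1/T:
    x_T = loc - scale * ln(-ln(1 - 1/T))."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi   # moment estimator
    loc = x.mean() - 0.5772156649 * scale          # Euler-Mascheroni shift
    return loc - scale * np.log(-np.log(1.0 - 1.0 / return_period))
```

Applied per grid cell, the 100-yr (1%) and 500-yr (0.2%) surfaces are just `return_period=100` and `return_period=500` evaluations over the stacked event grids.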

  19. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    PubMed

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and concern for quality of life. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process was evaluated with varying process parameters such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the concentration of aerosol increased with increasing peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 µs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emission and fire risk of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  20. Safety analysis for the use of hazardous production materials in photovoltaic applications

    SciTech Connect

    Moskowitz, P.D.; Fthenakis, V.M.; Crandall, R.S.; Nelson, B.P.

    1993-12-31

    A wide range of hazardous production materials (HPMs) are used in industrial and university facilities engaged in research and development (R&D) related to semiconductor and photovoltaic devices. Because of the nature of R&D facilities, where research activities are constantly changing, it is important for facility managers to proactively control the storage, distribution, use and disposal of these HPMs. As part of this control process, facility managers must determine the magnitude of the risk presented by their operations and the protection afforded by the administrative, engineering and personnel controls that have been implemented to reduce risks to life and property to acceptable levels. Facility auditing, combined with process hazard analysis (PHA), provides a mechanism for identifying these risks and evaluating their magnitude. In this paper, the methods and results of a PHA for a photovoltaic R&D facility handling HPMs are presented. Of the 30 potential accidents identified, none present High or even Moderate Risks; 18 present Low Risks; and 12 present Routine Risks. Administrative, engineering and personal safety controls associated with each accident are discussed. 15 refs., 2 figs., 6 tabs.
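    The High/Moderate/Low/Routine ranking reported above can be sketched as a severity-likelihood matrix; the scoring thresholds below are illustrative only, not the paper's actual criteria:

```python
def risk_category(severity, likelihood):
    """Rank an accident scenario from 1-4 severity and 1-4 likelihood
    scores into the four classes used in the paper.  The score thresholds
    are illustrative, not Brookhaven's actual matrix."""
    score = severity * likelihood
    if score >= 12:
        return "High"
    if score >= 8:
        return "Moderate"
    if score >= 4:
        return "Low"
    return "Routine"
```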