Science.gov

Sample records for hazard analysis guidance

  1. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    SciTech Connect

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state-of-the-art and improve on the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state-of-the-art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
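
    The exceedance computation at the heart of PSHA can be pictured with a small numerical sketch. The source rates, toy ground-motion model, and lognormal scatter below are invented for illustration and are not from the SSHAC report; only the structure (exceedance rate summed over sources, converted to a probability over an exposure time under a Poisson assumption) reflects standard PSHA practice.

```python
import numpy as np
from scipy.stats import norm

# Illustrative seismicity for a single source: annual rates of earthquakes
# in three magnitude bins (invented numbers, not from the SSHAC study).
magnitudes = np.array([5.0, 6.0, 7.0])
annual_rates = np.array([0.10, 0.02, 0.004])

def median_pga(m, r_km=20.0):
    """Toy ground-motion model: median peak ground acceleration in g."""
    return np.exp(-1.0 + 0.9 * m - 1.3 * np.log(r_km + 10.0))

def exceedance_rate(a, sigma_ln=0.6):
    """Annual rate of ground motion exceeding 'a', summed over magnitude bins."""
    p_exceed = norm.sf((np.log(a) - np.log(median_pga(magnitudes))) / sigma_ln)
    return float(np.sum(annual_rates * p_exceed))

# Hazard curve points: probability of exceedance in a 50-year exposure,
# assuming earthquake occurrences are Poissonian in time.
for a in (0.1, 0.2, 0.4):
    lam = exceedance_rate(a)
    print(f"PGA > {a:.1f} g: annual rate {lam:.4f}, "
          f"50-yr probability {1 - np.exp(-lam * 50):.2%}")
```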

  2. HANDBOOK: HAZARDOUS WASTE INCINERATION MEASUREMENT GUIDANCE

    EPA Science Inventory

    This publication, Volume III of the Hazardous Waste Incineration Guidance Series, contains general guidance to permit writers in reviewing hazardous waste incineration permit applications and trial burn plans. The handbook is a how-to document dealing with how incineration measure...

  3. K Basin Hazard Analysis

    SciTech Connect

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  4. K Basins Hazard Analysis

    SciTech Connect

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev. 4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  5. 76 FR 23823 - Guidance for Industry on Fish and Fishery Products Hazards and Controls, Fourth Edition...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ...The Food and Drug Administration (FDA) is announcing the availability of a guidance for industry entitled "Fish and Fishery Products Hazards and Controls Guidance, Fourth Edition." The updated guidance supports and complements FDA's regulations for the safe and sanitary processing and importing of fish and fishery products using hazard analysis and critical control point (HACCP)...

  6. GUIDANCE MANUAL ON OVERTOPPING CONTROL TECHNIQUES FOR HAZARDOUS WASTE IMPOUNDMENTS

    EPA Science Inventory

    The overall objective of the project was to provide guidance for selecting cost-effective interim management methods to control overtopping of impoundments, pits, ponds, or lagoons at uncontrolled hazardous waste sites until final remedial actions could be initiated. Hazardous wa...

  7. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  8. Automated Hazard Analysis

    Energy Science and Technology Software Center (ESTSC)

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  9. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations, and (2) Hazard Topography Database: data from the system familiarization and hazard identification.
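
    A relational sketch of the two interrelated data sets is shown below. The table and column names are hypothetical and do not reproduce the actual Tank Farms database schema; the sketch only illustrates how hazard-evaluation records can reference hazard-topography (identification) records.

```python
import sqlite3

# Hypothetical schema illustrating the two interrelated data sets described
# above; names, columns, and sample values are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE hazard_topography (          -- system familiarization / hazard identification
    node_id          INTEGER PRIMARY KEY,
    facility_area    TEXT,
    material_at_risk TEXT,
    energy_source    TEXT
);
CREATE TABLE hazard_evaluation (          -- results of the hazard evaluations
    eval_id             INTEGER PRIMARY KEY,
    node_id             INTEGER REFERENCES hazard_topography(node_id),
    hazardous_condition TEXT,
    cause               TEXT,
    frequency_bin       TEXT,             -- qualitative, e.g. 'unlikely'
    consequence_bin     TEXT              -- qualitative, e.g. 'moderate'
);
""")
con.execute("INSERT INTO hazard_topography VALUES (1, 'AN tank farm', 'waste', 'flammable gas')")
con.execute("""INSERT INTO hazard_evaluation
               VALUES (1, 1, 'flammable gas deflagration', 'gas accumulation',
                       'unlikely', 'high')""")
print(con.execute("""SELECT t.facility_area, e.hazardous_condition
                     FROM hazard_evaluation e JOIN hazard_topography t USING (node_id)
                  """).fetchall())
```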

  10. Hazard Analysis Database Report

    SciTech Connect

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database: data from the results of the hazard evaluations; and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  11. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventive features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
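
    The inventory-based categorization mentioned above amounts to a sum-of-fractions screening against nuclide-specific threshold quantities. The sketch below illustrates the logic only; the threshold numbers are placeholders, and the real Category 2 and Category 3 thresholds are tabulated per nuclide in DOE-STD-1027-92.

```python
# Placeholder threshold quantities in curies; the actual values are
# nuclide-specific and tabulated in DOE-STD-1027-92.
CAT2_THRESHOLD_CI = {"Cs-137": 8.9e4, "Pu-239": 5.6e1}
CAT3_THRESHOLD_CI = {"Cs-137": 6.0e1, "Pu-239": 5.2e-1}

def hazard_category(inventory_ci: dict) -> str:
    """Classify a facility by summing each nuclide's fraction of its threshold."""
    frac2 = sum(q / CAT2_THRESHOLD_CI[n] for n, q in inventory_ci.items())
    frac3 = sum(q / CAT3_THRESHOLD_CI[n] for n, q in inventory_ci.items())
    if frac2 >= 1.0:
        return "Category 2"
    if frac3 >= 1.0:
        return "Category 3"
    return "below Category 3 (potentially exempt)"

# A hypothetical facility inventory: 200 Ci Cs-137 and 0.1 Ci Pu-239.
print(hazard_category({"Cs-137": 2.0e2, "Pu-239": 0.1}))
```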

  12. HAZARD ANALYSIS SOFTWARE

    SciTech Connect

    Sommer, S; Tinh Tran, T

    2008-04-08

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  13. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
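
    The slip-weighted summation described above is plain linear superposition, as in this sketch. The waveform data here are random stand-ins; in practice each row would be a precomputed unit-slip tsunami waveform from a numerical wave-propagation run.

```python
import numpy as np

# Green's function summation sketch: the tsunami waveform at a coastal point
# is the slip-weighted sum of precomputed unit-slip subfault waveforms.
n_subfaults, n_samples = 4, 1000
rng = np.random.default_rng(0)

# Stand-ins for precomputed unit-slip waveforms (one row per subfault);
# a real implementation would load these from a wave-propagation solver.
unit_waveforms = rng.standard_normal((n_subfaults, n_samples))

def synthesize(slip_m: np.ndarray) -> np.ndarray:
    """Waveform for an arbitrary slip distribution via linear superposition."""
    return slip_m @ unit_waveforms

scenario_slip = np.array([1.0, 2.5, 4.0, 0.5])   # metres of slip per subfault
eta = synthesize(scenario_slip)                   # sea-surface elevation series
print(f"peak amplitude for this scenario: {np.abs(eta).max():.2f}")
```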

  14. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the U.S. Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  15. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive of a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However
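
    The 11% figure follows directly from the binomial model behind the die illustration, as this short check shows.

```python
from math import comb

# The abstract's dice illustration: a fair die rolled once a month for a
# year, with an "event" recorded whenever a six is thrown.
p_none = (5 / 6) ** 12
print(f"P(no events in a year) = {p_none:.3f}")   # 0.112, i.e. about 11%

# Full distribution of the number of events (binomial, n=12, p=1/6).
for k in range(4):
    p_k = comb(12, k) * (1 / 6) ** k * (5 / 6) ** (12 - k)
    print(f"P({k} events) = {p_k:.3f}")
```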

  16. 78 FR 41853 - Safety Advisory Guidance: Heating Rail Tank Cars To Prepare Hazardous Material for Unloading or...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ...This guidance provides safety precautions and recommended guidance for persons responsible for unloading or transloading hazardous materials from rail tank cars, specifically those persons heating a rail tank car to prepare its hazardous material contents for unloading or transloading. Further, this guidance reminds such persons of current regulatory requirements addressing this type of...

  17. Track 2 sites: Guidance for assessing low probability hazard sites at the INEL. Revision 6

    SciTech Connect

    Not Available

    1994-01-01

    This document presents guidance for assessment of Track 2 low probability hazard sites (LPHS) at the Idaho National Engineering Laboratory (INEL). The Track 2 classification was developed specifically for the INEL to streamline the implementation of the Comprehensive Environmental Response, Compensation, and Liability Act. Track 2 LPHSs are described as sites where insufficient data are available to make a decision concerning the risk level or to select or design a remedy. As such, these types of sites are not described in the National Contingency Plan or existing regulatory guidance. The goal of the Track 2 process is to evaluate LPHSs using existing qualitative and quantitative data to minimize the collection of new environmental data. To this end, this document presents a structured format consisting of a series of questions and tables. A qualitative risk assessment is used. The process is iterative, and addresses an LPHS from multiple perspectives (i.e., historical, empirical, process) in an effort to generate a reproducible and defensible method. This rigorous approach follows the data quality objective process and establishes a well-organized, logical approach to consolidate and assess existing data, and set decision criteria. If necessary, the process allows for the design of a sampling and analysis strategy to obtain new environmental data of appropriate quality to support decisions for each LPHS. Finally, the guidance expedites consensus between regulatory parties by emphasizing a team approach to Track 2 investigations.

  18. 78 FR 69992 - Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of Ciguatera...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ..., 2013 (78 FR 18273), FDA made available a draft guidance entitled "Guidance for Industry: Purchasing... Fish Species Associated With the Hazard of Ciguatera Fish Poisoning; Availability AGENCY: Food and Drug... announcing the availability of a guidance for industry entitled "Guidance for Industry: Purchasing Reef...

  19. CHARACTERIZATION OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...

  20. Preparedness of hazardous materials emergencies in railyards: Guidance for railroads and adjacent communities

    SciTech Connect

    Not Available

    1991-09-01

    Railroads are a key part of the distribution system for hazardous materials and, thus, much hazardous material passes through railyards en route to intermediate or final consumers. While the vast majority of these materials are shipped without incident, both the number of shipments and the nature of the materials themselves dictate that railyards and surrounding communities be prepared to respond quickly and effectively to emergencies. This report contains information on 11 emergency preparedness functions and 150 guidance recommendations.

  1. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes standard hazard analysis techniques of limited usefulness for software-intensive systems, a description that fits most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis
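
    One concrete STPA step is enumerating candidate unsafe control actions by crossing each control action with the standard guide words. The control actions and framing below are hypothetical, loosely inspired by the descent-engine example in the abstract, not taken from the MIT/JAXA study.

```python
from itertools import product

# Hypothetical control actions for a cargo spacecraft's descent controller.
control_actions = ["shut down descent engines", "issue abort command"]

# The four standard STPA guide words for unsafe control actions (UCAs).
guide_words = [
    "not provided when required",
    "provided when not required",
    "provided too early or too late",
    "stopped too soon or applied too long",
]

# Each (action, guide word) pair is a UCA candidate that an analyst reviews
# against the system's safety constraints and either keeps or discards.
for action, guide in product(control_actions, guide_words):
    print(f"UCA candidate: '{action}' {guide}")
```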

  2. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences, as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  3. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  4. POTW sludge sampling and analysis guidance document

    SciTech Connect

    Not Available

    1989-08-01

    In view of the variability of municipal sludge quality, appropriate procedures must be followed to collect and analyze samples that accurately represent each POTW's sludge quality. The manual was developed to provide that guidance to POTW operators, engineers, managers, chemists and permit writers. It was intended to provide guidance in developing and implementing a sampling and analysis program, to gather information on sludge quality and determine compliance with permit conditions. The manual is based on current, state-of-the-art field and laboratory practices and therefore is recommended for all sludge sampling and analysis programs.

  5. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing of a VTOL aircraft onboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is used to assess the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions, called lulls and swells. A real-time lull/swell classification algorithm based upon ship motion pattern features is developed. The classification algorithm is used to issue a go/no-go signal indicating the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go pattern-based letdown guidance strategy improves touchdown performance.
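
    A pattern-based go/no-go landing window can be sketched as below, assuming the classifier flags a lull whenever recent heave motion stays below a threshold. The paper's actual classifier uses richer ship-motion pattern features; the windowed-RMS rule, thresholds, and synthetic heave signal here are illustrative only.

```python
import numpy as np

def go_no_go(heave_m: np.ndarray, window: int = 50, rms_limit: float = 0.5):
    """Boolean 'go' signal per sample: open the window during quiescent lulls."""
    go = np.zeros(len(heave_m), dtype=bool)
    for i in range(window, len(heave_m)):
        rms = np.sqrt(np.mean(heave_m[i - window:i] ** 2))
        go[i] = rms < rms_limit          # low recent motion => lull => "go"
    return go

# Synthetic heave history with alternating swells and lulls (10 Hz, 120 s).
t = np.linspace(0, 120, 1200)
heave = 0.8 * np.sin(0.5 * t) * np.sin(0.02 * t) ** 2

print(f"landing window open {go_no_go(heave).mean():.0%} of the time")
```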

  6. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    SciTech Connect

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) the occurrence of a fire-related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees; (3) vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from a fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  7. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.7 Hazard analysis. (a) Each processor shall develop, or have developed for it, a written hazard analysis...

  8. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  9. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  10. CHARACTERISTICS OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    EPA's recently published draft Risk Burn Guidance recommends that hazardous waste combustion facilities complete a mass balance of the total organics (TOs) that may be emitted from the combustor. TOs, consisting of three distinct fractions (volatile, semivolatile, and nonvolatile...

  11. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  12. An analysis of physical guidance as reinforcement for noncompliance.

    PubMed

    Kern, Lee; Delaney, Beth A; Hilt, Alexandra; Bailin, Deborah E; Elliot, Christina

    2002-09-01

    Physical guidance is a strategy commonly used to address noncompliance. In two experiments, we evaluated the reinforcing effects of physical guidance. Experiment 1 included three individuals with developmental disabilities who were noncompliant with tasks. Anecdotal observations indicated physical contact was highly reinforcing, and a functional analysis identified attention as a reinforcer for problem behavior. Two conditions compared physical guidance following noncompliance and no physical guidance following noncompliance. Results showed noncompliance increased for all three participants when physical guidance followed noncompliance. The second experiment empirically evaluated the function of noncompliance prior to examining the effect of physical guidance on noncompliance. A functional analysis, conducted with two participants, indicated noncompliance served an escape function for one and an attention function for the other. Comparisons of physical guidance and no physical guidance indicated physical guidance resulted in decreased noncompliance for the participant with an escape function but increased noncompliance for the participant with an attention function. PMID:12205825

  13. Developing guidance for budget impact analysis.

    PubMed

    Trueman, P; Drummond, M; Hutton, J

    2001-01-01

    The role of economic evaluation in the efficient allocation of healthcare resources has been widely debated. Whilst economic evidence is undoubtedly useful to purchasers, it does not address the issue of affordability which is an increasing concern. Healthcare purchasers are concerned not just with maximising efficiency but also with the more simplistic goal of remaining within their annual budgets. These two objectives are not necessarily consistent. This paper examines the issue of affordability, the relationship between affordability and efficiency and builds the case for why there is a growing need for budget impact models to complement economic evaluation. Guidance currently available for such models is also examined and it is concluded that this guidance is currently insufficient. Some of these insufficiencies are addressed and some thoughts on what constitutes best practice in budget impact modelling are suggested. These suggestions include consideration of transparency, clarity of perspective, reliability of data sources, the relationship between intermediate and final end-points and rates of adoption of new therapies. They also include the impact of intervention by population subgroups or indications, reporting of results, probability of re-deploying resources, the time horizon, exploring uncertainty and sensitivity analysis, and decision-maker access to the model. Due to the nature of budget impact models, the paper does not deliver stringent methodological guidance on modelling. The intention was to provide some suggestions of best practice in addition to some foundations upon which future research can build. PMID:11456210
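
    At its simplest, a budget impact model of the kind discussed multiplies an eligible population by an uptake rate and a per-patient cost difference over each year of the time horizon. The sketch below is a hypothetical illustration of that structure (all numbers invented), not a model from the paper.

```python
# Hypothetical inputs reflecting the elements the paper recommends reporting:
# an eligible subgroup, uptake of the new therapy over time, a time horizon,
# and per-patient costs of the new and displaced treatments.
eligible_patients = 10_000
uptake_by_year = [0.10, 0.25, 0.40]        # adoption rate in years 1-3
cost_new, cost_old = 5_000.0, 3_200.0      # annual cost per patient

for year, uptake in enumerate(uptake_by_year, start=1):
    treated = eligible_patients * uptake
    impact = treated * (cost_new - cost_old)
    print(f"Year {year}: {treated:,.0f} patients switch, "
          f"net budget impact {impact:,.0f} per year")
```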

  14. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    EPA Science Inventory

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  15. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations who sincerely desire mission success must put as much effort in selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  16. 78 FR 18273 - Draft Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning; Availability AGENCY: Food and...: Purchasing Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning.'' The draft guidance, when finalized, will advise primary seafood processors who purchase reef fish how to minimize the...

  17. Quality Guidance: A Sectoral Analysis. NICEC Project Report.

    ERIC Educational Resources Information Center

    Watts, A. G.; Sadler, Jackie

    This report reviews the structure of the guidance field and provides a sector-by-sector analysis of current quality assurance arrangements in the United Kingdom. Part 1 presents an outline of the guidance sector, including some key concepts, structures, and roles. It defines guidance and other terms; discusses the three main categories…

  18. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  19. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  20. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.7 Hazard analysis. (a) Each processor shall develop,...

  1. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    SciTech Connect

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena or external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  2. Canister Storage Building (CSB) Hazard Analysis Report

    SciTech Connect

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: the first is a complete list of the team members involved over the two-year process; the second is the subset of those members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other

  3. Canister storage building hazard analysis report

    SciTech Connect

    POWERS, T.B.

    1999-05-11

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports," and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process." This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports."

  4. Hazard screening application guide. Safety Analysis Report Update Program

    SciTech Connect

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes, as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  5. Guidance for Safety Analysis of Other Than Nuclear Facilities/Activities at the INEEL

    SciTech Connect

    Swanson, Douglas Sidney; Perry, Scott William

    2002-06-01

    The U.S. Department of Energy Idaho Operations Office (DOE-ID) provided guidance per DOE-ID Orders 420.C, "Safety Basis Review and Approval Process," and 420.D, "Requirements and Guidance for Safety Analysis," for conducting safety analysis for facilities and activities that do not meet either the nuclear facility criteria or the criteria for not requiring additional safety analysis (NRASA). These facilities and activities are thus designated as "other than nuclear" (OTN), and hazard analyses are performed using a graded approach in accordance with DOE-ID Order 420.D. DOE-ID guidance is used to format these OTN facilities and activities into 3-chapter documents, rather than the 17-chapter format specified in DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports."

  6. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  7. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  8. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  9. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall guidance, navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  10. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified, and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
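
    Explicit-state model checking of this kind can be pictured as exhaustive reachability analysis plus a property check at every reachable state. The two-mode autopilot logic and the safety property below are invented for illustration; they are not the actual FGS requirements model or its properties.

```python
def successors(state):
    """Transitions of a toy mode logic: (flight_director_on, lateral_mode)."""
    fd_on, mode = state
    nxt = set()
    # Pilot toggles the flight director; the mode clears when it turns off.
    nxt.add((True, "ROLL") if not fd_on else (False, "OFF"))
    # Pilot presses heading select; only effective while the FD is on.
    if fd_on:
        nxt.add((True, "HDG"))
    return nxt

# Exhaustive reachability from the initial state.
reachable, frontier = set(), [(False, "OFF")]
while frontier:
    s = frontier.pop()
    if s not in reachable:
        reachable.add(s)
        frontier.extend(successors(s))

# Safety property: a lateral mode is never engaged without the flight director.
assert all(fd or mode == "OFF" for fd, mode in reachable)
print(f"checked {len(reachable)} reachable states: property holds")
```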

  11. MONITORED GEOLOGIC REPOSITORY INTERNAL HAZARDS ANALYSIS

    SciTech Connect

    S.E. Salzman

    1999-11-05

    This analysis was performed by the Management and Operating Contractor (M&O) Safety Assurance Department to identify and document the internal hazards and preliminary events associated with preclosure operations of the Monitored Geologic Repository (MGR). Internal hazards are those hazards presented by operation of the facility and associated processes. These are in contrast to external hazards which involve natural phenomena and external man-made hazards. The hazard analysis methodology used in this analysis provides a systematic means to identify facility hazards and associated events that may result in radiological consequences to the public and facility worker during the MGR preclosure period. The events are documented in a preliminary events list and are intended to be used as input to the MGR Design Basis Event (DBE) selection process. It is expected that the results from this analysis will undergo further screening and analysis based on the criteria that apply to the performance of DBE analyses for the preclosure period of repository operation. As the MGR design progresses, this analysis will be reviewed to ensure no new hazards are introduced and that previously evaluated hazards have not increased in severity.

  12. Cold Vacuum Drying Facility hazard analysis report

    SciTech Connect

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  13. Canister storage building hazard analysis report

    SciTech Connect

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  14. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY..., ``Technical Guidance for Assessing Environmental Justice in Regulatory Analysis.'' The purpose of this... regulatory analyses. This guidance takes into account EPA's past experience in integrating EJ into...

  15. Analysis of Regulatory Guidance for Health Monitoring

    NASA Technical Reports Server (NTRS)

    Munns, Thomas E.; Beard, Richard E.; Culp, Aubrey M.; Murphy, Dennis A.; Kent, Renee M.; Cooper, Eric G. (Technical Monitor)

    2000-01-01

    The purpose of this study was to assess the connection between current FAA regulations and the incorporation of Health Management (HM) systems into commercial aircraft. To address the overall objectives, ARINC (1) investigated FAA regulatory guidance, (2) investigated airline maintenance practices, (3) systematically identified regulations and practices that would be affected or could act as barriers to the introduction of HM technology, and (4) assessed regulatory and operational tradeoffs that should be considered for implementation. The assessment procedure was validated on a postulated structural HM capability for the B757 horizontal stabilizer.

  16. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point-mass, three-degree-of-freedom model is presented as a basic development tool for PC-based simulation models. The model has been used in the development of guidance algorithms as well as in other applications, such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.

  17. Hazard categorization and accident analysis techniques for compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports

    SciTech Connect

    1992-12-31

    The purpose of this DOE Standard is to establish guidance for facility managers and Program Secretarial Officers (PSOs) and thereby help them to comply consistently and more efficiently with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports. To this end, this guidance provides the following practical information: (1) the threshold quantities of radiological material inventory below which compliance with DOE Order 5480.23 is not required; (2) the level of effort to develop the program plan and schedule required in Section 9.b(2) of the Order, and information for making a preliminary assessment of facility hazards; (3) a uniform methodology for hazard categorization under the Order; and (4) insight into the "graded approach" for SAR development, especially in hazard assessment and accident analysis techniques. Individual PSOs may develop additional guidance addressing safety requirements for facilities which fall below the threshold quantities specified in this document.

  18. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  19. Multicriteria analysis in hazards assessment in Libya

    NASA Astrophysics Data System (ADS)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems can be addressed through planning studies and detailed information about the prone areas. Determining the time, location, and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods - the analytic hierarchy process, pairwise comparison, and the ranking method - are used to analyse which hazard facing Libya is the most dangerous. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternatives should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.
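
    As a concrete illustration of the pairwise-comparison step, the sketch below computes an analytic hierarchy process priority vector and consistency ratio from a judgment matrix. The matrix entries and the three hazards are hypothetical, not values from the Libyan study.

        import numpy as np

        # Illustrative pairwise comparison matrix on the Saaty 1-9 scale for
        # three hypothetical hazards; A[i, j] says how much more important
        # hazard i is judged to be than hazard j.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        # The principal right eigenvector of A gives the priority weights.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)
        ri = 0.58  # Saaty's random consistency index for n = 3
        print("weights:", w, "consistency ratio:", ci / ri)

    A consistency ratio below about 0.1 is conventionally taken to mean the judgments are acceptably coherent.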

  20. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard analysis. (1) Every official establishment shall conduct, or have conducted for it, a hazard analysis...

  1. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard analysis. (1) Every official establishment shall conduct, or have conducted for it, a hazard analysis...

  2. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard analysis. (1) Every official establishment shall conduct, or have conducted for it, a hazard analysis...

  3. WIPP waste characterization program sampling and analysis guidance manual

    SciTech Connect

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  4. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  5. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  6. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  7. Analysis of the combined maglev levitation, propulsion, and guidance system

    SciTech Connect

    He, J.L.; Coffey, H.T.; Rote, D.M.

    1995-03-01

    An analysis of a Japanese maglev system that uses only one set of coils in the guideway for combined levitation, propulsion, and guidance functions is presented. This preliminary study, using the dynamic circuit approach, indicates that the system is very promising.

  8. Ontario's Guidance and Career Education Program: A Democratic Analysis

    ERIC Educational Resources Information Center

    Hyslop-Margison, Emery J.; McKerracher, Adrian

    2008-01-01

    This article applies a democratic learning conceptual framework to an analysis of the Ontario (Canada) Guidance and Career Education program. Although the program respects some principles of democratic learning, it fails to explicitly recognise the distinction between social and natural reality. As a result, student agency and democratic…

  9. Repository Subsurface Preliminary Fire Hazard Analysis

    SciTech Connect

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features, fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  10. Fire hazard analyses and safety analysis reports relationship

    SciTech Connect

    Olson, W.W., Westinghouse Hanford

    1996-05-30

    DOE Order 5480.7A requires that Fire Hazard Analyses (FHA) be performed for new facilities, for facilities requiring a DOE 5480.23 Safety Analysis, or as directed by the DOE Program Secretarial Officer. DOE Order 5480.23 requires that a Safety Analysis Report (SAR) be prepared for DOE nuclear facilities. Both the FHA and SAR documents provide important information and direction for facility design and operation. Each of the two documents addresses the effects of postulated fire scenarios, both have common or at least consistent bases, and they have overlapping elements. However, some of the objectives of the required analyses are distinctly different. These differences have historically resulted in variations in the interpretation and understanding of the DOE Orders and associated guidance by organizations and individuals within the Westinghouse Hanford Company.

  11. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life; therefore, earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
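
    For readers unfamiliar with the mechanics behind such hazard values, the sketch below performs the core Cornell-type exceedance-rate integration for a single source and a toy ground-motion model. Every numerical value (recurrence parameters, coefficients, distance) is illustrative and unrelated to the Cairo inputs.

        import numpy as np
        from scipy.stats import norm

        # Truncated Gutenberg-Richter recurrence for a single source zone.
        m_min, m_max, b_value = 4.0, 7.0, 1.0   # illustrative values
        rate_m_min = 0.05                       # annual rate of events with M >= m_min
        beta = b_value * np.log(10.0)

        mags = np.linspace(m_min, m_max, 200)
        pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

        def ln_pga_mean(m, r_km=30.0):
            # Toy ground-motion model (made-up coefficients, fixed distance).
            return -1.0 + 0.9 * m - 1.3 * np.log(r_km)

        sigma = 0.6                             # aleatory ln-PGA standard deviation
        pga = np.logspace(-2, 0, 50)            # 0.01 g to 1 g

        # Annual exceedance rate: integrate P(PGA > a | M) over the magnitude pdf.
        rates = np.array([
            rate_m_min * np.trapz(pdf * norm.sf(np.log(a), ln_pga_mean(mags), sigma), mags)
            for a in pga
        ])

        # Poissonian probability of at least one exceedance in 50 years.
        p_50yr = 1.0 - np.exp(-rates * 50.0)

    A full study sums such rates over all sources and repeats the calculation for every logic-tree branch.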

  12. DESIGN, CONSTRUCTION, AND MAINTENANCE OF COVER SYSTEMS FOR HAZARDOUS WASTE: AN ENGINEERING GUIDANCE DOCUMENT

    EPA Science Inventory

    Engineering for cover over solid hazardous waste addresses complex interactions among many technical, environmental, and economic factors. The document emphasizes the special characteristics of solid waste management as they bear on the cover system while at the same time stres...

  13. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  14. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  15. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  16. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  17. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  18. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
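
    One staple of this kind of operational-event analysis is interval estimation for a Poisson failure rate. The sketch below uses the standard chi-squared interval and hypothetical event counts; it is an illustration of the general technique, not necessarily the specific method the report recommends. As the abstract notes for combinations of independent failure modes, the mode rates (and the moments of their estimators) simply add.

        from scipy.stats import chi2

        # Hypothetical operational event data: n failures in t component-years.
        n, t = 4, 520.0

        lam_hat = n / t                        # maximum-likelihood failure rate
        # Two-sided 90% confidence interval for a Poisson rate.
        lo = chi2.ppf(0.05, 2 * n) / (2 * t)
        hi = chi2.ppf(0.95, 2 * (n + 1)) / (2 * t)
        print(f"rate = {lam_hat:.4f}/yr, 90% CI = ({lo:.4f}, {hi:.4f})")

        # Independent failure modes: the total rate is the sum of mode rates.
        modes = [(4, 520.0), (1, 310.0)]       # (failures, exposure), hypothetical
        total_rate = sum(ni / ti for ni, ti in modes)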

  19. Decision analysis for INEL hazardous waste storage

    SciTech Connect

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft{sup 2} of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  20. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
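
    A minimal sketch of the prioritization idea, with invented scenarios and ordinal scales: rank by severity times likelihood, and use modeling difficulty to break ties toward scenarios that are tractable for quantitative analysis first. The scales and scenario names are hypothetical, not the paper's.

        # Hypothetical hazard scenarios scored on ordinal scales:
        # severity (1 = negligible .. 4 = catastrophic),
        # likelihood (1 = improbable .. 5 = frequent),
        # modeling difficulty (1 = easy .. 3 = hard).
        scenarios = {
            "wake encounter on parallel approach": (4, 2, 2),
            "runway incursion":                    (4, 3, 3),
            "go-around conflict":                  (3, 3, 1),
        }

        # Rank by risk (severity x likelihood), ties broken by lower
        # modeling difficulty, i.e., easier quantitative treatment first.
        ranked = sorted(
            scenarios.items(),
            key=lambda kv: (-(kv[1][0] * kv[1][1]), kv[1][2]),
        )
        for name, (sev, lik, dif) in ranked:
            print(f"{name}: risk={sev * lik}, modeling difficulty={dif}")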

  1. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard... processing category identified in this paragraph, if the food safety hazards, critical control points, critical limits, and procedures required to be identified and performed in paragraph (c) of this...

  2. Analysis of cell locomotion. Contact guidance of human polymorphonuclear leukocytes.

    PubMed

    Matthes, T; Gruler, H

    1988-01-01

    The methods of statistical physics have been applied to the analysis of cell movement. Human polymorphonuclear leukocytes were exposed to different surfaces possessing parallel oriented physical structures (scratched glass surface, machine drilled aluminum surface, optical grid and stretched polyethylene foil) and cell migration was observed using time-lapse photography. We demonstrate that in cell migration along physical structures, referred to as contact guidance, two subgroups can be distinguished: 1) The nematic type where the cell size is large in relation to the grid distance of the undulate surface. 2) The smectic type where the cell size is small in relation to the grid distance of the substrate. Nematic contact guidance is characterized by an anisotropic random walk. In all substrates investigated the diffusion process parallel to the lines was faster than the diffusion process perpendicular to them. The angular dependent diffusion coefficient was described by an ellipse. Deviation from a circle defined an apolar order parameter, whose value was about 0.3. The amount of information which the cells collected from, the undulate surface was very low, between 0.1 and 0.2 bits. We demonstrate that cells do not recognize all the details of their surroundings and that their migration can be compared to the "groping around" of a short sighted man. The blurred environment can be described by a mean field whose strength is proportional to the apolar order parameter. It is argued that the anisotropic surface tension is the basic source for nematic contact guidance. Smectic contact guidance is characterized by an anisotropic random walk and is quantified by a density order parameter which is 0.28 in the case of the scratched glass surface of a Neubauer counting chamber. The information which the cells collect from their environment is very low (0.03 bits). The lines seen by the cell can be described by a mean field whose strength is proportional to the density oder

  3. Characterizing adoption of precautionary risk management guidance for nanomaterials, an emerging occupational hazard.

    PubMed

    Schubauer-Berigan, Mary K; Dahm, Matthew M; Schulte, Paul A; Hodson, Laura; Geraci, Charles L

    2015-01-01

    Exposure to engineered nanomaterials (substances with at least one dimension of 1-100 nm) has been of increased interest, with the recent growth in production and use of nanomaterials worldwide. Various organizations have recommended methods to minimize exposure to engineered nanomaterials. The purpose of this study was to evaluate available data to examine the extent to which studied U.S. companies (which represent a small fraction of all companies using certain forms of engineered nanomaterials) follow the guidelines for reducing occupational exposures to engineered nanomaterials that have been issued by the National Institute for Occupational Safety and Health (NIOSH) and other organizations. Survey data, field reports, and field notes for all NIOSH nanomaterial exposure assessments conducted between 2006 and 2011 were collected and reviewed to: (1) determine the level of adoption of precautionary guidance on engineering controls and personal protective equipment (PPE), and (2) evaluate the reliability of companies' self-reported use of engineering controls and PPE. Use of PPE was observed among 89% [95% confidence interval (CI): 76%-96%] of 46 visited companies, and use of containment-based engineering controls for at least some processes was observed among 83% (95% CI: 76%-96%). In on-site evaluations, more than 90% of the 16 engineered carbonaceous nanomaterial companies that responded to an industrywide survey were observed to be using engineering controls and PPE as reported or more stringently than reported. Since PPE use was slightly more prevalent than engineering controls, better communication may be necessary to reinforce the importance of the hierarchy of controls. These findings may also be useful in conducting exposure assessment and epidemiologic research among U.S. workers handling nanomaterials. PMID:25093252
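
    Intervals like the 76%-96% CI quoted above can be reproduced with an exact binomial method. A sketch, assuming the 89% figure corresponds to 41 of 46 companies (a back-calculated, unverified count):

        from scipy.stats import beta

        def clopper_pearson(k, n, conf=0.95):
            # Exact (Clopper-Pearson) confidence interval for a binomial proportion.
            a = (1 - conf) / 2
            lo = beta.ppf(a, k, n - k + 1) if k > 0 else 0.0
            hi = beta.ppf(1 - a, k + 1, n - k) if k < n else 1.0
            return lo, hi

        # 41 of 46 visited companies observed using PPE (about 89%).
        print(clopper_pearson(41, 46))   # roughly (0.76, 0.96)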

  4. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the Apollo lunar landing. That computer performed about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is a critical one for the efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.

  5. Preliminary hazards analysis for the National Ignition Facility

    SciTech Connect

    Brereton, S.J.

    1993-10-01

    This report documents the Preliminary Hazards Analysis (PHA) for the National Ignition Facility (NIF). In summary, it provides: a general description of the facility and its operation; identification of hazards at the facility; and details of the hazards analysis, including inventories, bounding releases, consequences, and conclusions. As part of the safety analysis procedure set forth by DOE, a PHA must be performed for the NIF. The PHA characterizes the level of intrinsic potential hazard associated with a facility, and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required, and the DOE Order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis report. The hazards of primary concern associated with NIF are radiological and toxicological in nature. The hazard classification is determined by comparing facility inventories of radionuclides and chemicals with threshold values for the various hazard classification levels and by examining postulated bounding accidents associated with the hazards of greatest significance. Such postulated bounding accidents cannot take into account active mitigative features; they must assume the unmitigated consequences of a release, taking into account only passive safety features. In this way, the intrinsic hazard level of the facility can be ascertained.

  6. ANALYSIS OF GEOTHERMAL WASTES FOR HAZARDOUS COMPONENTS

    EPA Science Inventory

    Regulations governing the disposal of hazardous wastes led to an assessment of geothermal solid wastes for potentially hazardous properties. Samples were collected from three active geothermal sites in the western United States: The Geysers, Imperial Valley, and northwestern Nev...

  7. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

    The estimation of extreme return levels (down to a 10^-4 annual probability of exceedance) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. The technique was introduced in the 1960s and has been widely used in various domains including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the Methodological Guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority (ASN). The aim of this presentation is to introduce the main concepts of RFA and illustrate the latest innovations in its application, developed by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. [1] ASN, Guide pour la protection des installations nucléaires de base contre les inondations externes, 2013, p. 44.
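
    A minimal sketch of the index-flood idea underlying RFA, on synthetic annual-maximum surge data: scale each site's record by its own index (here the site mean), pool the dimensionless samples of a homogeneous region, and fit a single regional distribution. All data and parameters below are invented.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)

        # Synthetic annual-maximum skew surges (m) at three sites assumed to
        # form one homogeneous region.
        sites = [rng.gumbel(loc=mu, scale=0.15 * mu, size=40) for mu in (0.6, 0.8, 1.1)]

        # Index-flood step: normalize each record by its site mean, then pool.
        pooled = np.concatenate([x / x.mean() for x in sites])

        # Fit one regional GEV to the pooled, dimensionless sample.
        shape, loc, scale = genextreme.fit(pooled)

        # Regional growth factor for the 1000-year event, rescaled to a
        # target site by its own index (here, the first site's mean).
        growth = genextreme.isf(1e-3, shape, loc, scale)
        print("1000-yr level at site 1:", growth * sites[0].mean())

    Pooling three 40-year records gives an effective sample far larger than any single site provides, which is the source of RFA's reduced extrapolation uncertainty.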

  8. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  9. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore area of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and by the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL), in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities, and for 69 locations of nuclear plants in the Eastern United States, for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternate spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997), and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  10. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  11. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  12. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  13. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  14. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  15. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  16. 327 Building fire hazards analysis implementation plan

    SciTech Connect

    BARILO, N.F.

    1999-05-10

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. A status is provided for each recommendation in this document. BWHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  17. Fire hazard analysis for the fuel supply shutdown storage buildings

    SciTech Connect

    REMAIZE, J.A.

    2000-09-27

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility.

  18. Fire hazards analysis of transuranic waste storage and assay facility

    SciTech Connect

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  19. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  20. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-16

    Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.

  1. Fire hazards analysis for solid waste burial grounds

    SciTech Connect

    McDonald, K.M.

    1995-09-28

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes fire potential, and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation.

  2. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY... Regulatory Analysis.'' The purpose of this notice is to extend the public comment period by 60 days. The... Environmental Justice in Regulatory Analysis, Environmental Protection Agency, Mailcode: 1890T,...

  3. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    SciTech Connect

    CROWE, R.D.

    2000-08-07

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazard Analysis to support the CVDF Final Safety Analysis Report and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports," and implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports."

  4. 327 Building fire hazards analysis implementation plan

    SciTech Connect

    Eggen, C.D.

    1998-09-16

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. To date, actions for five of the 11 items have been completed. Exemption requests will be transmitted to DOE-RL for two of the items. Corrective actions have been identified for the remaining four items. The completed actions address combustible loading requirements associated with the operation of the cells and support areas. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. BWHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  5. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on the ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and the ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off of nadir, producing an illuminating spot geometry that can vary from circular (at nadir) to elliptical in shape (off of nadir). The JP Innovations crystal Cr:LiSAF laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by adjusting the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted for a particular spot size at a particular distance (elevation) from the laser by adjusting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operation elevation.
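
    As a rough companion to this kind of Z136.1-style evaluation, the sketch below computes a nominal ocular hazard distance (NOHD) from the standard far-field beam-spread relation. The power, divergence, aperture, and MPE values are placeholders, not the APOS or Cr:LiSAF parameters (which the abstract does not give).

        import math

        def nohd(power_w, mpe_w_m2, divergence_rad, exit_diameter_m=0.0):
            # Distance at which beam irradiance falls to the MPE for a
            # circular beam: NOHD = (sqrt(4*P / (pi*MPE)) - a) / divergence.
            return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_m2))
                    - exit_diameter_m) / divergence_rad

        # Illustrative numbers only: 1 W average power, 10 W/m^2 MPE,
        # 2 mrad divergence, 10 mm exit aperture.
        print(f"NOHD = {nohd(1.0, 10.0, 2e-3, 0.01):.0f} m")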

  6. 324 Building fire hazards analysis implementation plan

    SciTech Connect

    Eggen, C.D.

    1998-09-16

    In March 1998, the 324 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in six areas and provided 20 recommendations to bring the 324 Building into compliance with DOE Order 5480.7A. Additionally, one observation was provided. To date, four of the recommendations and the one observation have been completed. Actions identified for seven of the recommendations are currently in progress. Exemption requests will be transmitted to DOE-RL for three of the recommendations. Six of the recommendations are related to future shut down activities of the facility and the corrective actions are not being addressed as part of this plan. The actions for recommendations associated with the safety related part of the 324 Building and operation of the cells and support areas were evaluated using the Unreviewed Safety Question (USQ) process. Major Life Safety Code concerns have been corrected. The status of the recommendations and actions was confirmed during the July 1998 Fire Protection Assessment. BWHC will use this Implementation Plan to bring the 324 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  7. 324 Building fire hazards analysis implementation plan

    SciTech Connect

    BARILO, N.F.

    1999-05-10

    In March 1998, the 324 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B and W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in six areas and provided 20 recommendations to bring the 324 Building into compliance with DOE Order 5480.7A. Additionally, one observation was provided. A status is provided for each recommendation in this document. The actions for recommendations associated with the safety related part of the 324 Building and operation of the cells and support areas were evaluated using the Unreviewed Safety Question (USQ) process. BWHC will use this Implementation Plan to bring the 324 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  8. Optical radiation hazards analysis of ultraviolet headlamps

    NASA Astrophysics Data System (ADS)

    Sliney, David H.; Fast, Peder; Ricksand, Anders

    1995-08-01

    The potential hazards to the eye and skin associated with the use of UV-emitting automotive headlamps are considered. Wide-scale use of high-beam, near-UV headlamps to permit viewing of fluorescence in clothes and fluorescent road markers at great distances can increase automotive safety; however, the potential hazards from exposure of persons to UV radiation must be evaluated. From the study we conclude that such headlamps can be safely designed to preclude human exposure to potentially hazardous levels of UV radiation.

  9. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
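
    The Critical Factors Tool itself is not described in enough detail here to reproduce, but the flavor of a one-variable screening measure can be sketched: bin each dispersed input into quantiles, estimate the success probability per bin, and flag inputs whose conditional success rate varies most. The data, variable count, and pass/fail rule below are synthetic stand-ins.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic Monte Carlo results: dispersed inputs and a pass/fail
        # flag for one requirement (e.g., touchdown distance within limits).
        n = 5000
        X = rng.normal(size=(n, 3))                          # dispersed inputs
        success = (X[:, 0] + 0.2 * rng.normal(size=n)) < 1.0  # input 0 drives failures

        def sensitivity(x, ok, bins=10):
            # Spread of success probability across quantile bins of one
            # input; a larger spread flags a more influential input.
            edges = np.quantile(x, np.linspace(0, 1, bins + 1))
            idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
            rates = np.array([ok[idx == b].mean() for b in range(bins)])
            return rates.max() - rates.min()

        for j in range(X.shape[1]):
            print(f"input {j}: success-probability spread = "
                  f"{sensitivity(X[:, j], success):.2f}")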

  10. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  11. Safety analysis reports for multiple hazard category facilities

    SciTech Connect

    Geeting, M.W.; Gerrard, P.B.

    1995-12-31

    The Savannah River Site contains many nuclear facilities for which safety analysis reports (SARs) are required. The current requirements with which the SARs must comply are given in U.S. Department of Energy (DOE) Order 5480.23. This order requires use of the graded approach. The graded approach demands a more thoroughly documented assessment of complex, higher hazard facilities than simple, lower hazard facilities because grading is a function of both hazard potential and complexity. The treatment of different hazard category facilities in the development of the SAR for the Central Laboratory Facility at the Savannah River Site is described here.

  12. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... completed to comply with 29 CFR 1910.119(e) are acceptable to meet the requirements of this paragraph. (g...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault...

  13. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... completed to comply with 29 CFR 1910.119(e) are acceptable to meet the requirements of this paragraph. (g...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault...

  14. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... completed to comply with 29 CFR 1910.119(e) are acceptable to meet the requirements of this paragraph. (g...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis....

  15. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... completed to comply with 29 CFR 1910.119(e) are acceptable to meet the requirements of this paragraph. (g...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis....

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... completed to comply with 29 CFR 1910.119(e) are acceptable to meet the requirements of this paragraph. (g...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis....

  17. Hazardous Substance Release Reporting Under CERCLA, EPCR {section}304 and DOE Emergency Management System (EMS) and DOE Occurrence Reporting Requirements. Environmental Guidance

    SciTech Connect

    Traceski, T.T.

    1994-06-01

    Releases of various substances from DOE facilities may be subject to reporting requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Emergency Planning and Community Right-to-Know Act (EPCRA), as well as DOE's internal "Occurrence Reporting and Processing of Operations Information" and the "Emergency Management System" (EMS). CERCLA and EPCRA are Federal laws that require immediate reporting of a release of a Hazardous Substance (HS) and an Extremely Hazardous Substance (EHS), respectively, in a Reportable Quantity (RQ) or more within a 24-hour period. This guidance uses a flowchart, supplemental information, and tables to provide an overview of the process to be followed, and more detailed explanations of the actions that must be performed, when chemical releases of HSs, EHSs, pollutants, or contaminants occur at DOE facilities. This guidance should be used in conjunction with, rather than in lieu of, applicable laws, regulations, and DOE Orders. Relevant laws, regulations, and DOE Orders are referenced throughout this guidance.

  18. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary reason…

  19. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  20. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due both to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
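
    A minimal sketch of the kind of uncertainty propagation described, assuming a lognormal error on the modeled flood depth and normally dispersed user-defined depth thresholds (all numbers invented): sample both, map each sampled depth through the threshold rule, and read off the distribution of the resulting hazard level.

        import numpy as np

        rng = np.random.default_rng(2)

        # Modeled flood depth (m) at one node with lognormal model
        # uncertainty; thresholds mapping depth to hazard level are
        # user-defined and themselves treated as uncertain.
        depth = rng.lognormal(mean=np.log(0.4), sigma=0.3, size=10000)
        thresholds = rng.normal(loc=[0.25, 0.50, 1.00], scale=0.05,
                                size=(10000, 3))

        # Hazard level = number of thresholds exceeded (0 = low .. 3 = severe).
        level = (depth[:, None] > thresholds).sum(axis=1)
        for k in range(4):
            print(f"P(hazard level {k}) = {(level == k).mean():.2f}")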

  1. Dispersion analysis for baseline reference mission 3A using powered explicit guidance

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering 3-sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission 3A. Powered explicit guidance (PEG) was used to develop closed-loop steering commands for this dispersion analysis. The nominal profile for the dispersion analysis is identical to that of the Reference 1 ascent trajectory, with the exception that generalized linear tangent (GLT) guidance is used in Reference 1. Nominal trajectory differences which result from using PEG instead of GLT were determined, and dispersion data were developed using PEG for comparison with similar data developed using GLT guidance.

  2. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    SciTech Connect

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-07-01

    As in seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, because once a design-basis tsunami height is set, the actual tsunami height may still exceed it due to uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5th-, 16th-, 50th-, 84th- and 95th-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the methods for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
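
    A minimal numerical sketch of the logic-tree bookkeeping described above, assuming three hypothetical branches with invented weights and exponential hazard curves; the mean and fractile curves are computed branch-wise, in the spirit of the percentile presentation the authors describe.

```python
import numpy as np

# Tsunami heights (m) at which exceedance probabilities are evaluated.
heights = np.linspace(0.5, 10.0, 20)

# Hypothetical logic-tree branches: each has a weight and a hazard curve
# giving the annual probability that each height is exceeded.  Real
# branches would reflect alternative source and recurrence assumptions.
def hazard_curve(scale):
    return 1e-2 * np.exp(-heights / scale)

branches = [(0.3, hazard_curve(1.5)),
            (0.5, hazard_curve(2.0)),
            (0.2, hazard_curve(3.0))]

weights = np.array([w for w, _ in branches])
curves = np.vstack([c for _, c in branches])

mean_curve = weights @ curves   # weight-averaged (mean) hazard curve

def weighted_percentile(values, w, q):
    """Coarse weighted percentile across logic-tree branches."""
    order = np.argsort(values)
    cum = np.cumsum(w[order])
    return np.interp(q / 100.0, cum / cum[-1], values[order])

for q in (5, 16, 50, 84, 95):
    pct = weighted_percentile(curves[:, 0], weights, q)
    print(f"{q:2d}th-percentile exceedance prob. at {heights[0]:.1f} m: {pct:.2e}")
print(f"mean exceedance prob. at {heights[0]:.1f} m: {mean_curve[0]:.2e}")
```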

  3. Department of Defense radiation hazards testing analysis

    NASA Astrophysics Data System (ADS)

    Fortinberry, Jarrod Douglas

    The United States Department of Defense tests its weapons systems against many different real as well as man-made environments prior to deploying the systems to the troops. The intent of testing is to ensure the systems function as intended without adverse reactions. One of the required tests is a Radiation Hazards test, to ensure that nonionizing radio frequency waves generated by transmitters, such as radios and jammers, do not cause harm to personnel, ordnance or fuel. This test is typically performed at a DoD test lab, and data taken at the lab are used to determine safe operating parameters for a particular piece of equipment. This thesis presents measurements as well as mathematical models to demonstrate methods that can be employed to take more relevant Radiation Hazards data.

  4. ODH, oxygen deficiency hazard cryogenic analysis

    SciTech Connect

    Augustynowicz, S.D.

    1993-07-01

    An oxygen deficiency exists when the concentration of oxygen, by volume, drops to a level at which atmosphere-supplying respiratory protection must be provided. Since liquid cryogens can expand by factors of 700 (LN2) to 850 (LHe), an uncontrolled release into an enclosed space can easily cause an oxygen-deficient condition. The oxygen deficiency hazard (ODH) fatality rate per hour, phi, is defined as phi = sum over i of N_i P_i F_i, where N_i = number of components, P_i = probability of failure or operator error, and F_i = fatality factor. ODH classes range from "unclassified" (phi < 10^-9 per hour) to class 4, the most hazardous (phi > 10^-1 per hour). For Superconducting Super Collider Laboratory (SSCL) buildings where cryogenic systems exist, failure rate, fatality factor, reduced oxygen ratio, and fresh air circulation are examined.
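
    The fatality-rate formula above reduces to a few lines of code. This is a sketch only: the component inventory is invented, and the intermediate class boundaries are interpolated between the two endpoints quoted in the abstract rather than taken from the SSCL procedure.

```python
# Illustrative ODH class boundaries: (lower bound on phi in 1/h, class).
ODH_CLASSES = [
    (1e-1, 4),
    (1e-3, 3),
    (1e-5, 2),
    (1e-7, 1),
    (1e-9, 0),
]

def odh_fatality_rate(components):
    """phi = sum(N_i * P_i * F_i) over all components."""
    return sum(n * p * f for n, p, f in components)

def odh_class(phi):
    for bound, label in ODH_CLASSES:
        if phi >= bound:
            return label
    return "unclassified"

# Invented (N_i, P_i per hour, F_i) triples, e.g. valves, flanges, reliefs.
components = [(12, 1e-8, 0.5), (40, 3e-9, 0.1), (2, 1e-7, 1.0)]
phi = odh_fatality_rate(components)
print(f"phi = {phi:.2e} 1/h -> ODH class {odh_class(phi)}")
```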

  5. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Analysis Project (GSHAP) there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this particular type of region, seismic hazard assessment is a very hard task due to the limited amount of data available regarding the seismic sources, earthquake catalogue, or ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in southeastern Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations, currently being installed by the Universities of Brasilia and Natal, in the rest of Brazil. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. The analysis of S-wave Fourier spectra can help to better resolve source, path and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to
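
    Two of the spectral measurements mentioned above can be sketched on synthetic data: moment magnitude from a seismic moment via the standard Hanks-Kanamori relation, and kappa from the high-frequency spectral decay (Anderson-Hough style). The plateau-to-moment step is omitted, as it depends on site and model constants, and the spectrum below is synthetic, not a Brazilian recording.

```python
import numpy as np

# Synthetic S-wave Fourier amplitude spectrum with a flat source level
# and exponential high-frequency decay exp(-pi * kappa * f).
freqs = np.linspace(1.0, 25.0, 100)               # Hz
kappa_true = 0.03                                  # s
amp = 2e-3 * np.exp(-np.pi * kappa_true * freqs)

# (1) Moment magnitude from a seismic moment M0 (N*m), Hanks & Kanamori:
#     Mw = (2/3) * (log10(M0) - 9.1)
M0 = 3.5e14                                        # e.g. from the plateau
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.1)
print(f"Mw = {Mw:.2f}")

# (2) kappa from a straight-line fit to ln(amplitude) vs. frequency:
#     ln A(f) = ln A0 - pi * kappa * f
slope, intercept = np.polyfit(freqs, np.log(amp), 1)
kappa_est = -slope / np.pi
print(f"kappa = {kappa_est:.4f} s (true {kappa_true})")
```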

  6. Fire Hazards Analysis for the 200 Area Interim Storage Area

    SciTech Connect

    JOHNSON, D.M.

    2000-01-06

    This document presents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards, and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480.7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480.7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public and to ensure the property damage potential from fire is within acceptable limits. This FHA follows a graded approach commensurate with the complexity of the structure or area and the associated fire hazards.

  7. Speed your hazard analysis with the focused what if?

    SciTech Connect

    Goodman, L.

    1996-07-01

    Chemical process industries (CPI) plants worldwide must diminish their associated hazards. CPI firms are driven by their need to protect the public, their plants' neighbors, and their operating staffs from the consequences of accidents, as well as to meet regulations. Assuring the company and its regulators of safe operation requires conducting hazard analyses to demonstrate that plants are designed and operated to be as safe as practicable. For all organizations, large and small, conducting hazard analyses can be daunting and resource-consuming. There are various acceptable methods of hazard analysis: the hazard and operability analysis (HAZOP), the what-if, failure modes and effects analysis (FMEA), and checklists, among others. Of all these methods, the HAZOP is on its way to becoming the de facto standard, because it is seemingly the most comprehensive and detailed method. Unfortunately, the HAZOP analysis, in many cases, may not be a good choice for a hazard analysis. This article explains the rationale behind the focused what-if (FWI), describes what it is, tells where it has been applied successfully, and provides an abridged example of its use.

  8. Seismic hazard analysis for Jayapura city, Papua

    SciTech Connect

    Robiana, R.; Cipta, A.

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source model are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models for the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Sites in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  9. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary... collection provisions of our regulations mandating the application of hazard......

  10. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  11. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
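
    The four interaction types can be made concrete with toy probabilities. The numbers below are invented; the point is only how each relationship changes the joint probability of two hazards A and B.

```python
# Toy illustration of the four hazard-interaction relationships, for two
# hazards A and B with annual occurrence probabilities p_a and p_b.

p_a, p_b = 0.10, 0.05

# Independent: A and B share no trigger; joint probability is the product.
p_ind = p_a * p_b

# Mutex (mutually exclusive): A and B cannot occur together.
p_mutex = 0.0

# Parallel: a common trigger can set off A and B at the same time, so the
# joint probability is bounded by the marginals rather than their product;
# the simplest upper-bound case is P(A and B) = min(p_a, p_b).
p_par = min(p_a, p_b)

# Series (cascade): B is triggered by A with conditional probability
# P(B|A), so P(A and B) = P(A) * P(B|A).
p_b_given_a = 0.4
p_ser = p_a * p_b_given_a

for name, p in [("independent", p_ind), ("mutex", p_mutex),
                ("parallel", p_par), ("series", p_ser)]:
    print(f"P(A and B | {name:11s}) = {p:.4f}")
```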

  12. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    SciTech Connect

    Adelman, D.D.; Stansbury, J.

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (leaf area index) conditions. The computer model HELP3, which has the capability to simulate the double bottom liner systems called for in hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to leaf area index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.

  13. Frequency Analysis of Aircraft hazards for License Application

    SciTech Connect

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
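
    For a flavor of how such crash frequencies are typically estimated, here is a generic airway point estimate in the spirit of the NUREG-0800 (Section 3.5.1.6) formula F = N x C x A / w. The traffic, crash-rate, and geometry numbers are illustrative placeholders, not values from this analysis.

```python
# Generic point estimate of aircraft crash frequency onto a surface
# facility beneath or near an air corridor.  All inputs are invented.

N = 15_000        # flights per year along the corridor
C = 4e-10         # in-flight crashes per flight-mile
A = 0.012         # effective target area of the facility, square miles
w = 10.0          # corridor width, miles

F = N * C * A / w   # expected crashes per year onto the facility
print(f"crash frequency = {F:.2e} per year "
      f"(return period ~ {1 / F:,.0f} years)")
```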

  14. Guidance in Kagome-like photonic crystal fibres I: analysis of an ideal fibre structure.

    PubMed

    Chen, Lei; Pearce, Greg J; Birks, Timothy A; Bird, David M

    2011-03-28

    Propagation of light in a square-lattice hollow-core photonic crystal fibre is analysed as a model of guidance in a class of photonic crystal fibres that exhibit broad-band guidance without photonic bandgaps. A scalar governing equation is used and analytic solutions based on transfer matrices are developed for the full set of modes. It is found that an exponentially localised fundamental mode exists for a wide range of frequencies. These analytic solutions of an idealised structure will form the basis for analysis of guidance in a realistic structure in a following paper. PMID:21451720

  15. Fire hazards analysis for the Center for National Security and Arms Control (CNSAC) Facility

    SciTech Connect

    Klamerus, E.W.; Ross, S.B.

    1993-07-01

    This Fire Hazards Analysis is sponsored by Plant Engineering and is prepared to support the Safety Assessment for the CNSAC Facility. This is a preliminary fire hazards analysis of a yet-to-be-constructed facility and is based upon the current building design and the current understanding of the potential occupancy hazards. The governing occupancy for this building is personnel offices. The CNSAC facility will be dedicated primarily to two activities: (1) arms control and verification technology and (2) intelligence. This report supplements the Safety Assessment for the CNSAC facility and follows the guidance of DOE Memorandum EH-31.3 and meets the objectives of paragraph 4 of DOE Order 5480.7A, "Fire Protection." This analysis demonstrates that under "worst case" assumptions a fire in the CNSAC facility will result in consequences which are below DOE offsite guidelines for accident conditions. This report is based upon preliminary design information, and any major changes to the building design may require additional analyses.

  16. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    SciTech Connect

    SHULTZ MV

    2008-05-15

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of the process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite public, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  17. Slash fire hazard analysis on the Siskiyou National Forest

    NASA Astrophysics Data System (ADS)

    Radloff, David L.; Schopfer, Walter C.; Yancik, Richard F.

    1982-11-01

    Potential increase in fire hazard as a result of timber harvesting is a concern of forest managers throughout the United States. Treating fuels can help reduce unacceptable fire hazards. To evaluate alternative fuel treatments, managers need to know their effects on fire hazard. A decision analysis approach to estimating fire hazard in terms of expected burned area was applied to a watershed in the Siskiyou National Forest (Oregon). Three treatment alternatives (do nothing and two levels of yarding unmerchantable material) were evaluated, and the effects of the treatments were projected over a 90-yr period. Initially, the effects of applying a treatment are small. After 50 years of treatment, the most intense alternative can be expected to show almost a 50% reduction in burned area compared to no treatment. The procedure also estimates burned area by fire size and fire intensity classes. Managers may find this useful for estimating expected fire effects associated with a particular fuel treatment regime.
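
    The expected-burned-area bookkeeping behind such a decision analysis reduces to a probability-weighted sum over fire-size classes. The sketch below uses the standard A-D fire-size classes but invented probabilities; it is not the Siskiyou data.

```python
# Expected burned area under each fuel-treatment alternative:
# E[area] = sum over size classes of P(fire in class) * class area.

size_classes = {                 # representative burned area per fire, acres
    "A (<0.25 ac)": 0.1,
    "B (0.25-10 ac)": 3.0,
    "C (10-100 ac)": 40.0,
    "D (100-300 ac)": 180.0,
}

alternatives = {                 # annual probability of a fire in each class
    "do nothing":      [0.30, 0.15, 0.05, 0.02],
    "partial yarding": [0.30, 0.12, 0.03, 0.01],
    "full yarding":    [0.30, 0.10, 0.02, 0.005],
}

areas = list(size_classes.values())
for name, probs in alternatives.items():
    expected = sum(p * a for p, a in zip(probs, areas))
    print(f"{name:15s} expected burned area = {expected:6.2f} acres/yr")
```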

  18. FMEA, the alternative process hazard method. [Failure Mode Effects Analysis

    SciTech Connect

    Goyal, R.K. )

    1993-05-01

    Failure mode effects analysis (FMEA) is an old reliability/assurance tool finding its way into the HPI (hydrocarbon processing industry). Not yet popular, this hazard technique has some viable applications that can improve hazard assessment data. Notably, FMEA studies can identify possible areas for improvement that may not have been discovered using other methods. Also, FMEA is not as labor intensive and costly as other process hazard analysis (PHA) methods. OSHA's process safety management standard, 29 CFR 1910.119, set in place an informational structure whose main purpose is the reduction of potential accidents and the minimization of risks in the event of an accident. Consequently, HPI operators must evaluate their process systems and identify potential major hazards, such as fires, explosions, and accidental releases of toxic/hazardous chemicals, and protect their facilities, employees, the public, and the environment. But which PHA method(s) apply to a particular plant or process still remains a difficult question. This paper describes what FMEA is; types of FMEA; how to conduct an FMEA study; comparison with HAZOP (hazard and operability study); computer software; applicability of FMEA; and examples of its use.
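
    Since the paper walks through how to conduct an FMEA study, a minimal risk-priority-number (RPN) scoring sketch may help: RPN = severity x occurrence x detection on 1-10 scales is the textbook scheme, and the failure modes and ratings below are invented.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    item: str
    mode: str
    severity: int    # 1 (none) .. 10 (catastrophic)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self):
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("relief valve", "fails closed", 9, 2, 6),
    FailureMode("level sensor", "reads low", 7, 4, 3),
    FailureMode("pump seal", "external leak", 5, 6, 2),
]

# Rank failure modes so the highest-RPN items get attention first.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.item}: {fm.mode}")
```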

  19. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that Naples and its neighboring area form one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area, well known from geological investigations but not associated with any recorded earthquake, that could nevertheless shake the city has been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and, on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
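
    The core Bayesian step, reweighting alternative hazard models by the likelihood of historically observed shaking, can be sketched in a few lines. Models, prior weights, and likelihood values below are invented for illustration, not the authors' five spatio-temporal models.

```python
import numpy as np

models = ["model A", "model B", "model C"]
prior = np.array([0.4, 0.4, 0.2])

# Likelihood of the historical intensity observations under each model,
# e.g. evaluated against ShakeMaps of past felt earthquakes.
likelihood = np.array([0.02, 0.10, 0.05])

# Bayes' rule: posterior weight proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

for m, p0, p1 in zip(models, prior, posterior):
    print(f"{m}: prior {p0:.2f} -> posterior {p1:.2f}")

# Posterior hazard curve = posterior-weighted mix of each model's curve
# (two invented exceedance probabilities per model).
curves = np.array([[1e-2, 3e-3], [2e-2, 8e-3], [5e-3, 1e-3]])
print("posterior hazard curve:", posterior @ curves)
```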

  20. Pedestrian Evacuation Analysis for Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Recent catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
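
    A toy version of the anisotropic least-cost travel-time computation is sketched below: Dijkstra's algorithm propagates minimum travel times outward from safe cells over an elevation grid, with a slope-dependent (Tobler-like) walking speed. The grid, elevations, and speed rule are invented, not the Pedestrian Evacuation Analyst implementation.

```python
import heapq
import numpy as np

elev = np.array([[0.0, 1.0, 2.0, 4.0],
                 [0.0, 1.0, 3.0, 5.0],
                 [0.0, 2.0, 4.0, 6.0]])   # elevation, m
cell = 10.0                               # cell size, m
base_speed = 1.1                          # flat-ground walking speed, m/s
safe = [(0, 3), (1, 3)]                   # safe high-ground cells

def speed(dz, run):
    """Slope-dependent walking speed, slower uphill than downhill."""
    slope = dz / run
    return base_speed * np.exp(-3.5 * abs(slope + 0.05))

nrows, ncols = elev.shape
time = np.full(elev.shape, np.inf)
pq = []
for rc in safe:
    time[rc] = 0.0
    heapq.heappush(pq, (0.0, rc))

while pq:
    t, (r, c) = heapq.heappop(pq)
    if t > time[r, c]:
        continue                          # stale queue entry
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < nrows and 0 <= cc < ncols:
            dz = elev[r, c] - elev[rr, cc]   # climb when moving to (r, c)
            t_new = t + cell / speed(dz, cell)
            if t_new < time[rr, cc]:
                time[rr, cc] = t_new
                heapq.heappush(pq, (t_new, (rr, cc)))

print(np.round(time, 1))                  # seconds to reach safety
```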

  1. Hazardous material analysis and coding system (HAZMACS). Final report

    SciTech Connect

    Bryant, J.W.

    1991-06-01

    A new hazardous material classification system is being implemented. It consists of 55 Hazardous Characteristic Codes (HCC). The HCC will provide critical information needed to effectively manage, store and ship hazardous materials such as poisons, pesticides, radioactive materials, oxidizers, corrosive liquids and explosives. With the implementation of the new automated Defense Logistics Agency (DLA) Warehousing and Shipping Procedures (DWASP), DLA depot receiving personnel will be required to assign the HCC if it is missing from pertinent documents. Without the HCC, the DWASP system will not assign a depot storage location. Because the new HCC must be assigned quickly and accurately, an expert systems approach offers a feasible and practical means of providing this support. Accordingly, the Hazardous Material Analysis and Coding System (HAZMACS) was developed. HAZMACS is a PC-based expert system which queries the user about the known characteristics of suspected hazardous material and assigns an HCC based on the user's responses. HAZMACS consists of a main knowledge base file which chains to any of 13 other hazard-specific knowledge base files.

  2. Influence of analysis methods on interpretation of hazard maps.

    PubMed

    Koehler, Kirsten A; Peters, Thomas M

    2013-06-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data be as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher-resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines for generating accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable interpolation accuracy for some data sets. PMID:23258453
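
    The variogram-model comparison at the heart of this study can be reproduced in outline by fitting exponential, spherical, and Gaussian models to an experimental variogram; the lag and semivariance values below are synthetic, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential(h, nugget, sill, rng_):
    return nugget + sill * (1.0 - np.exp(-3.0 * h / rng_))

def spherical(h, nugget, sill, rng_):
    h = np.minimum(h, rng_)          # flat beyond the range
    return nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)

def gaussian(h, nugget, sill, rng_):
    return nugget + sill * (1.0 - np.exp(-3.0 * (h / rng_) ** 2))

lags = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 14.0, 18.0, 24.0])   # m
gamma = np.array([0.8, 1.9, 2.8, 3.4, 3.9, 4.4, 4.6, 4.7])      # semivariance

for model in (exponential, spherical, gaussian):
    params, _ = curve_fit(model, lags, gamma, p0=[0.5, 4.0, 10.0],
                          maxfev=10_000)
    resid = gamma - model(lags, *params)
    print(f"{model.__name__:11s} nugget={params[0]:.2f} "
          f"sill={params[1]:.2f} range={params[2]:.1f}  "
          f"RSS={np.sum(resid ** 2):.3f}")
```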

  3. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as past archived information, as the design progresses, and to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford, Inc. is in the process of the removal and disposition of sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and how it was used to help the project ensure that current design data were reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  4. 75 FR 16202 - Office of New Reactors; Interim Staff Guidance on Ensuring Hazard-Consistent Seismic Input for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... Site Response and Soil Structure Interaction Analyses AGENCY: Nuclear Regulatory Commission (NRC...) DC/COL-ISG-017 titled ``Ensuring Hazard-Consistent Seismic Input for Site Response and Soil Structure...-Consistent Seismic Input for Site Response and Soil Structure Interaction Analyses,'' (ADAMS Accession...

  5. Hazardous-waste analysis plan for LLNL operations

    SciTech Connect

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from the waste produced by industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan addresses all hazardous waste.

  6. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is understanding the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that were transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits has started to erode historical resilience gains, which brings especially small communities to a tipping point in their risk reduction efforts. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  7. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  8. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    SciTech Connect

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  9. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  10. Implementation of hazard analysis critical control point in jameed production.

    PubMed

    Al-Saed, A K; Al-Groum, R M; Al-Dabbas, M M

    2012-06-01

    The average standard plate, coliform, Staphylococcus aureus, and Salmonella counts for three home-made jameed samples, a traditional fermented dairy product, before applying the hazard analysis critical control point system were 2.1 × 10^3, 8.9 × 10^1, 4 × 10^1, and less than 10 cfu/g, respectively. The developed hazard analysis critical control point plan resulted in the identification of ten critical control points in the flow chart of jameed production. The critical control points included fresh milk receiving, pasteurization, addition of starter, water and salt, straining, personnel hygiene, drying and packaging. After applying the hazard analysis critical control point system, there was significant improvement in the microbiological quality of the home-made jameed. The standard plate count was reduced to 3.1 × 10^2 cfu/g, whereas coliform and Staphylococcus aureus counts were less than 10 cfu/g and Salmonella was not detected. Sensory evaluation results for the color and flavor of sauce prepared from jameed showed a significant increase in the average scores given after hazard analysis critical control point application. PMID:22701056

  11. Phase 2 fire hazard analysis for the canister storage building

    SciTech Connect

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  12. Time to the Doctorate: Multilevel Discrete-Time Hazard Analysis

    ERIC Educational Resources Information Center

    Wao, Hesborn O.

    2010-01-01

    Secondary data on 1,028 graduate students nested within 24 programs and admitted into either a Ph.D. or Ed.D. program between 1990 and 2006 at an American public university were used to illustrate the benefits of employing multilevel discrete-time hazard analysis in understanding the timing of doctorate completion in Education and the factors…
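
    Discrete-time hazard analysis amounts to a logistic regression on a person-period data set, with one row per student per year at risk. The sketch below simulates such data and fits the single-level version; the study's multilevel (program-nested) structure is omitted, and the covariate is invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_students, max_years = 500, 10

# Build person-period rows: one row per student per year at risk,
# with event = 1 in the year the doctorate is completed.
rows, events = [], []
for _ in range(n_students):
    funded = rng.integers(0, 2)          # a single toy covariate
    for year in range(1, max_years + 1):
        # true hazard rises with time-to-degree and with funding
        h = 1.0 / (1.0 + np.exp(-(-4.0 + 0.35 * year + 0.8 * funded)))
        done = rng.random() < h
        rows.append([year, funded])
        events.append(int(done))
        if done:
            break                        # student leaves the risk set

X, y = np.array(rows), np.array(events)
model = LogisticRegression().fit(X, y)
print("coefficients (year, funded):", model.coef_[0])

# Estimated hazard of finishing in year 5, unfunded vs. funded:
for funded in (0, 1):
    p = model.predict_proba([[5, funded]])[0, 1]
    print(f"hazard(year=5, funded={funded}) = {p:.3f}")
```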

  13. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A, Fire Protection, are met.

  14. Guidance and Control Algorithms for the Mars Entry, Descent and Landing Systems Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Dwyer Cianciolo, Alicia M.; Powell, Richard W.; Shidner, Jeremy D.; Garcia-Llama, Eduardo

    2010-01-01

    The purpose of the Mars Entry, Descent and Landing Systems Analysis (EDL-SA) study was to identify feasible technologies that will enable human exploration of Mars, specifically to deliver large payloads to the Martian surface. This paper focuses on the methods used to guide and control two of the contending technologies, a mid-lift-to-drag (L/D) rigid aeroshell and a hypersonic inflatable aerodynamic decelerator (HIAD), through the entry portion of the trajectory. The Program to Optimize Simulated Trajectories II (POST2) is used to simulate and analyze the trajectories of the contending technologies and guidance and control algorithms. Three guidance algorithms are discussed in this paper: EDL theoretical guidance, Numerical Predictor-Corrector (NPC) guidance, and Analytical Predictor-Corrector (APC) guidance. EDL-SA also considered two forms of control: bank angle control, similar to that used by Apollo and the Space Shuttle, and a center-of-gravity (CG) offset control. This paper presents the performance comparison of these guidance algorithms and summarizes the results as they impact the technology recommendations for future study.
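
    A toy numerical predictor-corrector illustrates the NPC idea: predict the landing range by integrating simplified planar entry dynamics for a candidate bank angle, then correct the bank angle iteratively to null the downrange error. The dynamics, constants, and target below are invented and far simpler than the POST2 simulations.

```python
import numpy as np

g, rho0, H = 3.71, 0.02, 11_000.0        # Mars-like gravity and atmosphere
m, S, CL, CD = 5_000.0, 60.0, 0.5, 1.4   # toy vehicle mass/area/aero

def predict_range(bank, dt=0.5):
    """Integrate planar entry until the ground; return downrange in km."""
    v, gamma, h, x = 4_000.0, np.radians(-12.0), 40_000.0, 0.0
    while h > 0.0 and v > 100.0:
        rho = rho0 * np.exp(-h / H)
        q = 0.5 * rho * v * v
        D, L = q * S * CD / m, q * S * CL / m          # accelerations
        v += (-D - g * np.sin(gamma)) * dt
        gamma += (L * np.cos(bank) / v - g * np.cos(gamma) / v) * dt
        h += v * np.sin(gamma) * dt
        x += v * np.cos(gamma) * dt
    return x / 1000.0

# Pick a target guaranteed to lie between the full-lift-up and
# lift-neutral extremes, then bisect (range shrinks as bank increases).
target = 0.5 * (predict_range(0.0) + predict_range(np.pi / 2))
lo, hi = 0.0, np.pi / 2
for _ in range(20):
    mid = 0.5 * (lo + hi)
    if predict_range(mid) > target:   # overshooting: bank more
        lo = mid
    else:                             # undershooting: bank less
        hi = mid
bank = 0.5 * (lo + hi)
print(f"commanded bank = {np.degrees(bank):.1f} deg for {target:.0f} km")
```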

  15. PO*WW*ER mobile treatment unit process hazards analysis

    SciTech Connect

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous constituents into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  16. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
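
    The generation-time arithmetic underlying the 100 cfu/g limit is easy to sketch: a population with doubling time g grows as N(t) = N0 * 2^(t/g). The generation times below are placeholders, not the measured Skylab values.

```python
def population(n0, hours, gen_time_h):
    """Exponential growth by doubling: N(t) = N0 * 2**(t / g)."""
    return n0 * 2.0 ** (hours / gen_time_h)

n0 = 100.0   # cfu/g, the Skylab limit for C. perfringens
# (temperature in C, assumed generation time in hours)
for temp, g in [(25, 3.0), (37, 0.35), (45, 0.25), (55, 8.0)]:
    n = population(n0, hours=2.0, gen_time_h=g)
    print(f"{temp} C: {n0:.0f} -> {n:,.0f} cfu/g after 2 h "
          f"(generation time {g} h)")
```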

  17. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  18. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  19. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  20. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  1. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  2. D0 Detector Collision Hall Oxygen Deficiancy Hazard Analysis

    SciTech Connect

    Wu, J. (Fermilab)

    1992-08-06

    EN-258, D0 Platform ODH Analysis, provided the oxygen deficiency hazard analysis for the D0 detector in the Assembly Hall. This note covers the same analysis, but revised for the Collision Hall. Liquid cryogens, released and warming to atmospheric conditions, expand, on average, to seven hundred times their liquid volume and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector Collision Hall ODH analysis has been approached five different ways using established methods. If the low beta quad magnets are powered and the exhaust rate is below 4220 scfm, the area is ODH class 1. In any other case, the analysis shows the area to be ODH class 0 as equipped (with ventilation fans), requiring no special safety provisions. System designers have provided for a reduced oxygen level detection and warning system as well as emergency procedures to address fault conditions.

  3. Evaluation and Application of Probabilistic Tsunami Hazard Analysis in California

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Wilson, R. I.; Miller, K.

    2014-12-01

    The California Geological Survey (CGS) and URS Corporation are in the process of generating tsunami hazard map products for land-use planning and construction through the California Seismic Hazard Mapping Act (Public Resources Code, sec. 2690 et seq.). Similar to seismic hazard zonation, these products for land-use and development decision making are typically based on a probabilistic analysis and require information on the frequency of occurrence through a probabilistic tsunami hazard analysis (PTHA). In Phase 1 of CGS's work, the California PTHA Work Group was established to evaluate the results of PTHA demonstration projects in Crescent City and Huntington Beach. The results of this Phase 1 review of the two independent analyses indicate that PTHAs can be developed, with recommended improvements in source characterization, PTHA methods, and numerical model use. An immediate significant improvement is to align the characterization of the Cascadia Subduction Zone in the PTHA with the seismic characterization of the USGS National Seismic Hazard Map Program. In addition to applying PTHA to land-use planning and the two demonstration projects, CGS and the CA-PTHA Work Group identified other potential applications for various PTHA risk levels (ARP = average return period), including flood insurance (100- and 500-year ARP), building codes (2,500-year ARP), and emergency response planning (1,000-year ARP or larger). CGS is working with URS Corp., the California Office of Emergency Services, and FEMA on a Phase 2 plan to produce a single set of reliable and consistent PTHA maps for multiple risk levels and to work with various end users to determine how to use the maps. The California PTHA and the results of the Work Group review are also proposed for use by the U.S. National Tsunami Hazard Mitigation Program to develop guidelines for production in other coastal states.

  4. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g., in Switzerland or South Tyrol-Bolzano) and software packages (e.g., ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.

  5. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, improved technologies and data availability have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc. They have begun to understand the role of uncertainty in seismic hazard analysis. However, how to handle the existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Usually attenuation curves are obtained in a statistical way, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts from classifying sites in geological terms, these site coefficients are not sharply classified at all. In the present study, this problem is addressed using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California in the conventional way. In this study, the standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results of this analysis show that, when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of less uncertainty.
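
    The fuzzy-membership idea can be sketched with trapezoidal membership functions over a site parameter such as Vs30 (average shear-wave velocity in the top 30 m), so that a site near a class boundary belongs partly to both classes. The breakpoints below are loosely NEHRP-like and purely illustrative.

```python
import numpy as np

def trap(x, a, b, c, d):
    """Trapezoidal membership: 0 at a, 1 on [b, c], 0 at d."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)),
                   0.0, 1.0)

# Illustrative membership breakpoints (m/s) per site class.
classes = {
    "D (stiff soil)": (120, 180, 300, 420),
    "C (dense soil)": (300, 420, 700, 850),
    "B (rock)":       (700, 850, 1400, 1600),
}

vs30 = 390.0   # a site close to the C/D boundary
for name, (a, b, c, d) in classes.items():
    mu = trap(vs30, a, b, c, d)
    print(f"membership in {name}: {mu:.2f}")
```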

  6. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for southern mainland Portugal. This region's seismicity is characterized by small- and moderate-magnitude events and by the sporadic occurrence of large earthquakes (e.g., the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008), and the revision of the probabilistic seismic hazard (Sousa et al., 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ε (ε = deviation of the ground motion from the median value predicted by an attenuation model). These procedures were performed for peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels at three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, the seismic hazard of most Algarve parishes is dominated by the seismicity located

  7. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. PMID:24239259
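
    To make the per-route risk aggregation concrete, a toy sketch follows. Every number is a hypothetical stand-in, not an HMTECM output, and the route is reduced to two segments with different accident rates and consequence costs.

```python
# Toy sketch of route-segment risk aggregation (all values hypothetical).
segments = [
    # (car-miles/yr, accidents per car-mile, P(release | accident), expected cleanup cost $)
    (2.0e6, 1.0e-7, 0.05, 4.0e5),
    (5.0e5, 3.0e-7, 0.05, 1.2e6),  # e.g. shallow groundwater raises consequence cost
]

annual_risk = sum(cm * rate * p_rel * cost for cm, rate, p_rel, cost in segments)
total_car_miles = sum(cm for cm, *_ in segments)
print(f"annual risk: ${annual_risk:,.0f}")
print(f"risk per car-mile: ${annual_risk / total_car_miles:.4f}")
```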

  8. Performance analysis of the combined EDS maglev propulsion, levitation, and guidance system

    SciTech Connect

    He, J.L.; Coffey, H.T.; Rote, D.M.

    1993-10-01

    An analysis of the Japanese maglev system which uses only one set of coils in the guideway for combined levitation, propulsion, and guidance functions is presented in this paper. This preliminary study, using the dynamic circuit approach, indicates that the system is very promising.

  9. Guidance analysis of the aeroglide plane change maneuver as a turning point problem

    NASA Technical Reports Server (NTRS)

    Gracey, Christopher

    1989-01-01

    The development of guidance approximations for the atmospheric (aeroglide) portion of the minimum-fuel, orbital plane change, trajectory optimization problem is described. Asymptotic methods are used to reduce the two-point boundary value optimization problem to a turning point problem for the bank angle control. The turning point problem solution, which yields an approximate optimal control policy, is given in terms of parabolic cylinder functions, which are tabulated, and integral expressions, which must be numerically computed. Comparisons of the former, over their region of validity, with optimal control solutions show good qualitative agreement. Additional work and analysis are needed to complete the guidance approximation.

  10. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, and despite the large numbers of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land use planning. The main handicap for such studies in the Canary Islands is the lack of well-documented historical eruptions, together with the scarcity of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with large amounts of data that, as in the case of the Canary Islands, are not always available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, and (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long-term volcanic hazard. HADA will permit

  11. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    SciTech Connect

    R.E. Flye

    2000-10-24

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  12. A Procedure for Rapid Localized Earthquake Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Holliday, J. R.; Rundle, J. B.

    2010-12-01

    In this presentation, we introduce various ground shaking and building response models. We then discuss the forecasting capabilities of different models submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) and show how they can be used as inputs for these models. Finally, we discuss how outputs from such multi-tiered calculations would prove invaluable for real-time and scenario-based hazard assessment and for cost-benefit analysis of possible mitigation actions.

  13. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
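
    A minimal numerical sketch of the first (earthquake) approach, with the ground-motion attenuation term replaced by a fault displacement attenuation function, might look as follows. The occurrence rates, the surface-rupture probability model, and the displacement scaling coefficients are all hypothetical stand-ins, not values from the paper.

```python
# Sketch of the "earthquake approach" to PFDHA (all parameters hypothetical).
import numpy as np
from scipy.stats import norm

mags = np.arange(6.0, 7.6, 0.1)                        # magnitude bin centres
rate = 1e-3 * 10 ** (-(mags - 6.0))                    # assumed occurrence rate per bin (1/yr)
p_surface = 1.0 / (1.0 + np.exp(-3.0 * (mags - 6.5)))  # assumed P(rupture reaches surface | m)

def p_exceed_displacement(d0, m, sigma_ln=1.0):
    """P(D > d0 | m) at the site, with an assumed lognormal displacement model."""
    median = 10 ** (-5.46 + 0.82 * m)  # illustrative median displacement scaling (m)
    return norm.sf(np.log(d0 / median) / sigma_ln)

d0 = 0.1  # design displacement of 10 cm
annual_rate = np.sum(rate * p_surface * p_exceed_displacement(d0, mags))
print(f"annual rate of displacement > {d0} m: {annual_rate:.2e}")
```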

  14. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.

  15. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan....

  16. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false What hazards analysis criteria must my SEMS... and Environmental Management Systems (SEMS) § 250.1911 What hazards analysis criteria must my SEMS program meet? You must ensure that a hazards analysis (facility level) and a JSA (operations/task...

  17. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false What hazards analysis criteria must my SEMS... and Environmental Management Systems (SEMS) § 250.1911 What hazards analysis criteria must my SEMS program meet? You must ensure that a hazards analysis (facility level) and a JSA (operations/task...

  18. 14 CFR Appendix B to Part 417 - Flight Hazard Area Analysis for Aircraft and Ship Protection

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight Hazard Area Analysis for Aircraft... Appendix B to Part 417—Flight Hazard Area Analysis for Aircraft and Ship Protection B417.1Scope This... launch site hazard area analysis that protects the public, aircraft, and ships from the...

  19. 14 CFR Appendix B to Part 417 - Flight Hazard Area Analysis for Aircraft and Ship Protection

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight Hazard Area Analysis for Aircraft... Appendix B to Part 417—Flight Hazard Area Analysis for Aircraft and Ship Protection B417.1Scope This... launch site hazard area analysis that protects the public, aircraft, and ships from the...

  20. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL SPACE..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and...

  1. A Hazard Analysis for a Generic Insulin Infusion Pump

    PubMed Central

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Devices and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  2. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    SciTech Connect

    Sevougian, S. David

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purposes are to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  3. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects—where the consequences of failure are more serious, such as dams and chemical plants—it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes from the calculation sources at great distances) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
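
    A minimal sketch of two ingredients named above (the truncated exponential Gutenberg-Richter magnitude distribution and the Poisson occurrence assumption), with illustrative parameters only:

```python
# Truncated exponential G-R magnitude pdf and Poisson probability of exceedance.
import numpy as np

def gr_truncated_pdf(m, b, m_min, m_max):
    """Truncated exponential Gutenberg-Richter density on [m_min, m_max]."""
    beta = b * np.log(10.0)
    norm = 1.0 - np.exp(-beta * (m_max - m_min))
    return beta * np.exp(-beta * (m - m_min)) / norm

def poisson_poe(annual_rate, t_years):
    """Probability of at least one exceedance in t_years under a Poisson process."""
    return 1.0 - np.exp(-annual_rate * t_years)

m = np.linspace(5.0, 7.5, 6)
print(gr_truncated_pdf(m, b=1.0, m_min=5.0, m_max=7.5))
print(poisson_poe(annual_rate=1.0 / 475.0, t_years=50.0))  # approximately 0.10
```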

  4. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2 and 10 % POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as a low-seismic-hazard area in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
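
    Under the Poisson occurrence model standard in PSHA, a probability of exceedance p over an exposure time t corresponds to a return period T = -t / ln(1 - p). A quick check of the quoted POE levels:

```python
import math

def return_period(poe, t_years):
    """Return period implied by a probability of exceedance over t_years (Poisson)."""
    return -t_years / math.log(1.0 - poe)

print(round(return_period(0.10, 50)))  # ~475 years
print(round(return_period(0.02, 50)))  # ~2475 years
```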

  5. Hazard analysis of a computer based medical diagnostic system.

    PubMed

    Chudleigh, M F

    1994-07-01

    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry. PMID:7988111

  6. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins, such as the Western Canada Sedimentary Basin (WCSB) and in particular central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that includes potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large-magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large-magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as
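
    The following is a highly simplified sketch of stochastic ground-motion simulation in this general spirit: windowed white noise shaped by an omega-squared spectral filter. The envelope shape and corner frequency are assumed values, and the study's actual site-based procedure is far more elaborate.

```python
# Highly simplified stochastic ground-motion sketch (assumed parameters throughout).
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.01, 4096
t = np.arange(n) * dt
noise = rng.standard_normal(n)
envelope = (t / 5.0) * np.exp(1.0 - t / 5.0)  # Saragoni-Hart-like window, assumed shape

# single-corner omega-squared acceleration spectral shape, corner frequency assumed
fc = 0.5
freqs = np.fft.rfftfreq(n, dt)
spec = np.fft.rfft(noise * envelope) * (freqs**2 / (1.0 + (freqs / fc) ** 2))
acc = np.fft.irfft(spec, n)
print(f"peak |a| (arbitrary units): {np.abs(acc).max():.3f}")
```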

  7. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... cloud center of mass altitude. (vi) Worst case initial source term assuming instantaneous release...

  8. Analytic Drag Controller guidance gains evaluation. Mission planning, mission analysis and software formulation

    NASA Technical Reports Server (NTRS)

    Treybig, J. H.

    1975-01-01

    Results of a study to optimize the guidance gains for the analytical drag control entry guidance system are presented. The guidance gains were optimized for study points chosen in all phases of entry.

  9. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous area should be excluded; on the other hand, it is important not to extend the surface assigned to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays makes it possible to face this task with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This would have the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with lengths chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed
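
    Constructing the DTM+DBM reference surface can be sketched as overriding bare-earth elevations with building elevations wherever a building exists. The toy arrays below assume co-registered rasters; real workflows operate on full GeoTIFF grids.

```python
# Sketch: DTM overridden by building elevations where present (co-registered rasters).
import numpy as np

dtm = np.array([[100.0, 100.5],
                [101.0, 101.2]])      # bare-earth elevations (m)
dbm = np.array([[np.nan, 104.0],
                [np.nan, np.nan]])    # building roof elevations (m), NaN = no building

reference = np.where(np.isnan(dbm), dtm, dbm)  # DTM+DBM reference elevation model
print(reference)
```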

  10. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.
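
    The threshold-quantity screening described above amounts to a simple comparison per material. A schematic sketch follows; the materials, quantities, and thresholds are hypothetical, not the values from DOE Order 5500.3A or the INEL inventory.

```python
# Schematic threshold-quantity screening (all values hypothetical).
inventory = {
    # material: (quantity on hand, screening threshold), same units per material
    "asbestos": (1200.0, 450.0),
    "lead":     (80.0, 450.0),
}

for material, (qty, threshold) in inventory.items():
    status = "requires further analysis" if qty > threshold else "screens out"
    print(f"{material}: {qty} vs threshold {threshold} -> {status}")
```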

  11. Fire hazard analysis for Plutonium Finishing Plant complex

    SciTech Connect

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, "Fire Protection" [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principle elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  12. Recent Advances on Probabilistic Seismic Hazard Analysis in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, C.; Cheng, C.; Lin, P.

    2002-12-01

    Before the occurrence of the September 21, 1999, Chi-Chi, Taiwan earthquake (MW 7.6), the results of probabilistic seismic hazard analysis (PSHA) from different agencies and different authors substantially underestimated the earthquake hazard in Central Taiwan. The shortcoming of previous PSHA in Taiwan was the lack of proper treatment of fault-source activity. Other impacts on PSHA revealed by the Chi-Chi earthquake include: the saturation of the previously used local magnitude ML at large magnitudes; the relatively low ground-motion level for an MW 7.6 earthquake; a distinct hanging-wall effect; velocity pulses from directivity and fling; and the necessity of using the closest distance to the fault both in developing ground-motion attenuation relationships and in PSHA. We first compiled the earthquake catalog and the strong-motion records, and established a main-shock catalog in moment magnitude MW, a database of strong-motion spectral accelerations (Sa) for each major earthquake and each recording station, and a database of active-fault attributes. We then carefully divided the earthquake sources into crustal earthquakes, subduction-zone interface earthquakes and subduction-zone intraslab earthquakes, divided the site conditions into hard and soft sites, and considered the hanging-wall effect for certain faults. We adopted the closest distance to the seismogenic rupture and selected the Campbell functional form for regression analysis of the Sa attenuation relationship for each combination of conditions. In total, eight sets of good-quality Sa attenuation equations, each with 18 spectral periods, were obtained. For the PSHA, our primary achievement is the assessment of fault parameters and the development of a characteristic-earthquake model for each fault. We adopted the logic-tree method to handle parameter uncertainty. The results reveal that the use of fault sources in PSHA is absolutely necessary in a tectonically active region such as Taiwan. A more meaningful hazard pattern and a more reasonable hazard level have

  13. Precise terrestrial time: A means for improved ballistic missile guidance analysis

    NASA Technical Reports Server (NTRS)

    Ehrsam, E. E.; Cresswell, S. A.; Mckelvey, G. R.; Matthews, F. L.

    1978-01-01

    An approach developed to improve ground instrumentation time-tagging accuracy and adapted to support the Minuteman ICBM program is described. The Timing Insertion Unit (TIU) technique produces a telemetry data time-tagging resolution of one tenth of a microsecond, with high relative intersite accuracy after corrections. Metric data (range, azimuth, elevation and range rate), also used in missile guidance system analysis, can be correlated to within ten microseconds of the telemetry guidance data. This requires precise timing synchronization between the metric and telemetry instrumentation sites. The timing synchronization can be achieved by using the radar automatic phasing system time correlation methods. Other time correlation techniques, such as Television (TV) Line-10 and the Geostationary Operational Environmental Satellite (GOES) terrestrial timing receivers, are also considered.

  14. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine whether their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO); if not, they must use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA in the development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial, propellant use of oxygen, hydrogen, and hypergols. This paper reports on

  15. The European ASAMPSA_E project : towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    NASA Astrophysics Data System (ADS)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or Safety Authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10⁻⁴ per year. Today, even lower probabilities, down to 10⁻⁸, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect a NPP, and assigning a meaningful probability of occurrence to them, remains difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan and aims at identifying meaningful practices to extend the scope and the quality of the existing probabilistic safety analyses developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify whether the robustness of Nuclear Power Plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project and some first lessons, and introduce the type of guidance being developed. It will explain the need for expertise from the geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological

  16. Independent Orbiter Assessment (IOA): Analysis of the guidance, navigation, and control subsystem

    NASA Technical Reports Server (NTRS)

    Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Hiott, J. M.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Guidance, Navigation, and Control (GNC) Subsystem hardware are documented. The function of the GNC hardware is to respond to guidance, navigation, and control software commands to effect vehicle control and to provide sensor and controller data to GNC software. Some of the GNC hardware for which failure modes analysis was performed includes: hand controllers; Rudder Pedal Transducer Assembly (RPTA); Speed Brake Thrust Controller (SBTC); Inertial Measurement Unit (IMU); Star Tracker (ST); Crew Optical Alignment Site (COAS); Air Data Transducer Assembly (ADTA); Rate Gyro Assemblies; Accelerometer Assembly (AA); Aerosurface Servo Amplifier (ASA); and Ascent Thrust Vector Control (ATVC). The IOA analysis process utilized available GNC hardware drawings, workbooks, specifications, schematics, and systems briefs for defining hardware assemblies, components, and circuits. Each hardware item was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  17. Seismic hazard analysis for the NTS spent reactor fuel test site

    SciTech Connect

    Campbell, K.W.

    1980-05-02

    An experiment is being directed at the Nevada Test Site to test the feasibility for storage of spent fuel from nuclear reactors in geologic media. As part of this project, an analysis of the earthquake hazard was prepared. This report presents the results of this seismic hazard assessment. Two distinct components of the seismic hazard were addressed: vibratory ground motion and surface displacement. (ACR)

  18. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  19. Development, analysis, and testing of robust nonlinear guidance algorithms for space applications

    NASA Astrophysics Data System (ADS)

    Wibben, Daniel R.

    This work focuses on the analysis and application of various nonlinear, autonomous guidance algorithms that utilize sliding mode control to guarantee system stability and robustness. While the basis for the algorithms has previously been proposed, past efforts barely scratched the surface of the theoretical details and implications of these algorithms. Of the three algorithms that are the subject of this research, two are directly derived from optimal control theory and augmented using sliding mode control. Analysis of the derivation of these algorithms has shown that they are two different representations of the same result, one of which uses a simple error state model (Δr/Δv) and the other uses definitions of the zero-effort miss and zero-effort velocity (ZEM/ZEV) values. By investigating the dynamics of the defined sliding surfaces and their impact on the overall system, many implications have been deduced regarding the behavior of these systems, which are noted to feature time-varying sliding modes. A formal finite time stability analysis has also been performed to theoretically demonstrate that the algorithms globally stabilize the system in finite time in the presence of perturbations and unmodeled dynamics. The third algorithm that has been subject to analysis is derived from a direct application of higher-order sliding mode control and Lyapunov stability analysis without consideration of optimal control theory and has been named the Multiple Sliding Surface Guidance (MSSG). Via use of reinforcement learning methods, an optimal set of gains has been found that makes the guidance perform similarly to an open-loop optimal solution. Careful side-by-side inspection of the MSSG and Optimal Sliding Guidance (OSG) algorithms has shown some striking similarities. A detailed comparison of the algorithms has demonstrated that though they are nearly indistinguishable at first glance, there are some key differences between the two algorithms and they are indeed

  20. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The
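
    Numerically, the recommended GMAM weighting is a weighted combination of alternative hazard curves. The sketch below uses the report's weights (0.6, 0.3, 0.1) but purely illustrative exceedance rates, since the actual curves are not reproduced here.

```python
# Weighted combination of alternative hazard curves (rates illustrative only).
import numpy as np

pga = np.array([0.1, 0.2, 0.4, 0.8])  # ground-motion levels (g)
curves = {                            # annual exceedance rates, hypothetical
    "EPRI2004":  np.array([2e-3, 6e-4, 1e-4, 1e-5]),
    "USGS2002":  np.array([3e-3, 8e-4, 2e-4, 2e-5]),
    "Silva2004": np.array([1e-3, 4e-4, 8e-5, 9e-6]),
}
weights = {"EPRI2004": 0.6, "USGS2002": 0.3, "Silva2004": 0.1}  # from the report

mean_hazard = sum(weights[k] * v for k, v in curves.items())
print(dict(zip(pga, mean_hazard)))
```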

  1. Statistical modeling of ground motion relations for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2013-10-01

    We introduce a new approach for ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if they have equivalent area functions. This includes local biases. An interpretation of the difference between these GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and discover discrepancies with the state of the art of stochastics and statistics (model selection and significance, tests of distribution assumptions, extreme value statistics). We criticize especially the assumption of log-normally distributed residuals of maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(ɛ0) of Joyner and Boore, Bull Seism Soc Am 83(2):469-487, 1993) is the generalized extreme value. We show by numerical investigation that the actual distribution can be hidden and that a wrong distribution assumption can influence the PSHA as negatively as neglect of area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimating event-specific GMRs. Therein, the majority of the best models are based on an anisotropic point-source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions, which indicate the appropriate modeling of the GMR by an anisotropic
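
    The paper's criticism of the lognormal residual assumption can be illustrated by comparing fits on synthetic maxima. The sketch below (synthetic data, not the PEER database) contrasts lognormal and generalized extreme value (GEV) log-likelihoods with SciPy.

```python
# Compare lognormal vs GEV fits on synthetic PGA-like maxima (illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pga = stats.genextreme.rvs(c=-0.2, loc=0.1, scale=0.05, size=500, random_state=rng)
pga = pga[pga > 0]  # keep physically meaningful (positive) values

ll_lognorm = np.sum(stats.lognorm.logpdf(pga, *stats.lognorm.fit(pga, floc=0)))
ll_gev = np.sum(stats.genextreme.logpdf(pga, *stats.genextreme.fit(pga)))
print(f"log-likelihood lognormal: {ll_lognorm:.1f}, GEV: {ll_gev:.1f}")
```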

  2. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008-three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts. PMID:22329456

  3. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  4. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
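
    A minimal Monte-Carlo sketch of this idea, assuming a lognormal flood process whose log-mean drifts upward, shows how the distribution of T departs from the exponential stationary case. The drift rate and design level are assumed values, not the paper's.

```python
# Monte-Carlo sketch of the return period T under nonstationarity (illustrative).
import numpy as np

rng = np.random.default_rng(2)
x0 = np.exp(3.0 + 1.645 * 0.5)  # ~initial 20-yr flood level of LN(mu=3.0, sigma=0.5)
trend = 0.005                   # assumed upward drift of the log-mean per year

def first_exceedance_year(max_years=2000):
    """Year of the first exceedance of x0 under the drifting lognormal process."""
    for t in range(1, max_years + 1):
        if rng.lognormal(mean=3.0 + trend * t, sigma=0.5) > x0:
            return t
    return max_years

T = np.array([first_exceedance_year() for _ in range(5000)])
print(f"mean T ~ {T.mean():.1f} yr (stationary case: ~20 yr, exponential pdf)")
```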

  5. Recent Advances in Launch Vehicle Toxic Hazard and Risk Analysis

    NASA Astrophysics Data System (ADS)

    Nyman, R. L.

    2012-01-01

    A number of widely used rocket propellants produce toxic combustion byproducts or are themselves toxic in their un-reacted state. In this paper we focus on the methodology used to evaluate early-flight catastrophic failures and nominal launch emissions that release large amounts of propellant or combustion products into the planetary boundary layer and pose a potential risk to launch area personnel, spectators, or the general public. The United States has traditionally used the Rocket Exhaust Effluent Diffusion Model (REEDM) [1] to assess the hazard zones associated with such releases. REEDM is a 1970s-vintage Gaussian atmospheric dispersion model that is limited in its ability to accurately simulate certain aspects of the initial source geometry and dynamics of a vehicle breakup and propellant fragment dispersion. The Launch Area Toxic Risk Analysis 3-Dimensional (LATRA3D) [2] computer program has been developed that addresses many of REEDM's deficiencies. LATRA3D is a probabilistic risk analysis tool that simulates both nominal vehicle flight and in-flight failure emissions.
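
    For orientation, a textbook steady-state Gaussian plume (the family of models REEDM descends from) looks as follows. The dispersion coefficients, which in practice grow with downwind distance and atmospheric stability class, are fixed here for simplicity, and all values are assumed.

```python
# Textbook ground-reflected Gaussian plume (illustrative; not REEDM or LATRA3D).
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (mass/m^3) for emission rate q (mass/s), wind speed u (m/s),
    crosswind offset y, receptor height z, effective release height h (m)."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-((z - h) ** 2) / (2.0 * sigma_z**2))
                + np.exp(-((z + h) ** 2) / (2.0 * sigma_z**2)))  # ground reflection
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# breathing-height concentration under the plume centreline (assumed values)
print(gaussian_plume(q=100.0, u=5.0, y=0.0, z=1.5, h=50.0, sigma_y=80.0, sigma_z=40.0))
```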

  6. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect

    Thomas, John

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. Although STPA has proved to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.
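
    One way to picture the systematic enumeration such a structure supports is the standard STPA pass over candidate unsafe control actions (UCAs) against the four guidewords. The guidewords are the standard STPA categories; the control actions below are hypothetical, and this sketch is not the report's formalism.

```python
# Sketch: enumerating candidate unsafe control actions (UCAs), STPA style.
control_actions = ["open relief valve", "close relief valve"]  # hypothetical
guidewords = [
    "not provided",
    "provided",
    "provided too early / too late",
    "stopped too soon / applied too long",
]

for ca in control_actions:
    for gw in guidewords:
        # each pair is a candidate UCA to be assessed against the system hazards
        print(f"candidate UCA: '{ca}' {gw}")
```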

  7. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP)...

  8. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    NASA Astrophysics Data System (ADS)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. Covering a large area of 19,065 km² with a population of approximately 97,000 residents, the borough faces a challenge in providing emergency services to residents in a timely manner. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network-analysis-based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response times and classifying the borough by response time to facilitate the allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine
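
    A toy version of the response-time classification, using NetworkX shortest paths rather than the ArcGIS network tools the study actually used, might look like the following. The graph, travel times, and zone thresholds are all hypothetical.

```python
# Toy response-time zoning via shortest paths (hypothetical network).
import networkx as nx

G = nx.Graph()
# edges: (node, node, travel time in minutes)
G.add_weighted_edges_from([
    ("station", "A", 4.0), ("A", "B", 6.0), ("B", "C", 12.0),
    ("station", "D", 9.0), ("D", "C", 7.0),
], weight="minutes")

times = nx.single_source_dijkstra_path_length(G, "station", weight="minutes")
zones = {node: ("<10 min" if t < 10 else "10-20 min" if t < 20 else ">20 min")
         for node, t in times.items()}
print(zones)
```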

  9. Analysis of rockfall hazards at Los Alamos National Laboratory

    SciTech Connect

    McLin, S.G.

    1993-09-01

    In the early years at Los Alamos National Laboratory, rockfall hazards were intuitively recognized and avoided. Hence mesa tops were selected over canyon floors for construction sites, although some canyon bottoms were still used. The Omega West reactor site was located in a narrow portion of Los Alamos Canyon adjacent to 400 foot high vertical cliffs. In 1944, a quarter-mile long rock catcher was installed above the reactor to protect the facility from occasional rockfalls. In addition, an annual rock catcher inspection was initiated. Between 1944 and 1993, 24 separate rockfall events were documented; individual rocks trapped in the catcher ranged in size from 300 to 21,000 pounds. These rockfall inspection data were arranged into an annual exceedance series, and a frequency analysis was performed. This type of analysis is routinely employed in flood studies when stream gaging records are available. Prior to this study, however, such techniques had never been used with rockfall data. This analysis indicates that the annual rockfall series is approximately log-normally distributed, and that the 500-year rockfall event will probably exceed 187 tons. In addition, a Markov generation scheme, which preserves the statistics of observed logarithms from the historical data, was used to generate a synthetic rockfall series. These synthetic data suggest that the cliff face will retreat at an average rate approximating 2 to 3 centimeters per 1000 years. This rate is comparable to independently computed rates that range from 4 to 14 centimeters per 1000 years. These cliff-face erosion processes are important because they affect mesa-top trench burial operations of low-level radioactive wastes.
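
    The frequency-analysis step can be sketched by fitting a lognormal distribution to an annual series and reading off the 500-year quantile. The data below are synthetic stand-ins for the 1944-1993 record, so the printed value will not reproduce the report's 187-ton estimate.

```python
# Sketch: lognormal frequency analysis of an annual rockfall series (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
annual_max_tons = rng.lognormal(mean=1.0, sigma=1.2, size=50)  # stand-in series

shape, loc, scale = stats.lognorm.fit(annual_max_tons, floc=0)
event_500yr = stats.lognorm.ppf(1.0 - 1.0 / 500.0, shape, loc, scale)
print(f"estimated 500-year rockfall: {event_500yr:.0f} tons")
```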

  10. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  11. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politics on potential consequences and are the basis for appropriate risk management strategies. Results based on (i) an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis to estimate spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  12. Hazardous Gas Leak Analysis in the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1991-01-01

    Helium tests of the Space Shuttle main propulsion system and hydrogen leak data are examined. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes that sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine whether the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were seen visually with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per second by spectral analysis of the MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.
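
    The 25-cycle-per-second detection lends itself to a short demonstration. The sketch below is a minimal stand-in for the spectral analysis of MS data, recovering a pulsed concentration wave from noisy samples; the sample rate, signal amplitude, and noise level are assumed values, not instrument specifications.

        import numpy as np

        fs = 200.0                               # assumed MS sample rate (Hz)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        rng = np.random.default_rng(3)

        # Noisy concentration trace with a pulsed leak signature at 25 Hz.
        trace = 1.0 + 0.2 * np.sin(2 * np.pi * 25.0 * t)
        measured = trace + rng.normal(0.0, 0.3, t.size)

        # Spectral analysis: the 25 Hz line stands out once the mean is removed.
        spectrum = np.abs(np.fft.rfft(measured - measured.mean()))
        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
        print(f"dominant concentration frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")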

  13. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Piatanesi, A.; Romano, F.; Basili, R.; Kastelic, V.; Tiberti, M. M.; Valensise, G.

    2009-12-01

    We applied the method for Probabilistic Tsunami Hazard Analysis (PTHA) first introduced by Rikitake and Aida (1988) to the coasts of eastern Sicily (Italy), focusing on important cities such as Messina, Catania, and Augusta. The method has very recently been improved by González et al. (2009) to explicitly include inundation. Our approach considers both far- and near-field sources. Based on the approach described by Lorito et al. (2008), we identified a few Source Zones (SZ), each of which is characterized by a Typical Fault (TF) floating within it and capable of generating the Maximum Credible Earthquake (MCE). Some of the MCEs recall historical events such as the 1908 Messina, the 1693 Catania, and the 365 AD Crete earthquakes. We calculated the combined probability for all considered SZs by assigning a recurrence interval to each of them and assuming a Poisson distribution of the inter-event times. We performed the PTHA for different damage metrics, such as runup, current speed, momentum, and Froude number, with the probability of exceeding given thresholds evaluated for different time intervals.
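
    A minimal sketch of the probabilistic combination step, assuming Poissonian sources: each source zone contributes a rate equal to its MCE recurrence rate times the conditional probability that its tsunami exceeds the threshold at the site. The zone names, recurrence intervals, and conditional probabilities below are illustrative placeholders, not values from the study.

        import numpy as np

        # Hypothetical source zones: mean recurrence interval (yr) of the MCE and
        # the probability that its tsunami exceeds a 1 m runup at the target site.
        source_zones = {
            "Messina Straits": {"recurrence_yr": 300.0, "p_exceed": 0.9},
            "Hyblean-Malta":   {"recurrence_yr": 600.0, "p_exceed": 0.5},
            "Hellenic Arc":    {"recurrence_yr": 100.0, "p_exceed": 0.2},
        }

        def poisson_exceedance(zones, window_yr):
            """Probability of at least one exceedance in `window_yr`,
            assuming independent Poissonian sources."""
            total_rate = sum(z["p_exceed"] / z["recurrence_yr"] for z in zones.values())
            return 1.0 - np.exp(-total_rate * window_yr)

        for t in (50, 100, 500):
            print(f"P(exceedance in {t:4d} yr) = {poisson_exceedance(source_zones, t):.3f}")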

  14. Preparation of waste oil for analysis to determine hazardous metals

    SciTech Connect

    Essling, A.M.; Huff, D.R.; Huff, E.A.; Fox, I.M.; Graczyk, D.G.

    1995-07-01

    Two methods for preparing waste-oil samples to permit measurement of their metals content were evaluated. For this evaluation, metals-in-oil standard reference materials were prepared by each method and the resulting solutions were analyzed for 20 metals, including those (As, Ba, Cd, Cr, Pb, Hg, Se, and Ag) regulated as hazardous under the Resource Conservation and Recovery Act. One preparation method involved combustion of the waste oil under oxygen at 25 atm pressure, as described in the American Society for Testing and Materials test method E926-88. As we applied it, this method gave recoveries well under 90% for most of the metals that we examined and, hence, proved unsatisfactory for routine application to waste-oil analysis. With the other method, nitric acid decomposition in a sealed vessel heated with microwave energy (analogous to US Environmental Protection Agency Method 3051), recoveries of all 20 metal contaminants were within 90 to 110% of the certified values. This microwave digestion procedure was also more efficient since it allowed six samples to be prepared together, whereas the oxygen combustion approach allowed processing of only one sample at a time.

  15. Analysis of hazardous biological material by MALDI mass spectrometry

    SciTech Connect

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community about this field of application is evident from the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on the interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and the selection of optimal experimental parameters such as solvent and matrix.

  16. Including uncertainty in hazard analysis through fuzzy measures

    SciTech Connect

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process.
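
    As a sketch of the possibility-theoretic propagation described above, and under the assumption of triangular possibility distributions with interval arithmetic on alpha-cuts, the following fragment propagates elicited frequency and consequence distributions to a risk distribution. All numbers are hypothetical.

        import numpy as np

        # Hypothetical triangular possibility distributions elicited from an HA team:
        # (low, most-plausible, high) for accident frequency (/yr) and consequence.
        freq = (1e-4, 1e-3, 5e-3)
        cons = (0.1, 1.0, 10.0)   # arbitrary consequence units

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number at membership level alpha."""
            lo, mode, hi = tri
            return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

        # Propagate through risk = frequency * consequence with interval arithmetic;
        # for positive intervals the product bounds are the endpoint products.
        for alpha in (0.0, 0.5, 1.0):
            f_lo, f_hi = alpha_cut(freq, alpha)
            c_lo, c_hi = alpha_cut(cons, alpha)
            print(f"alpha={alpha:.1f}: risk in [{f_lo * c_lo:.2e}, {f_hi * c_hi:.2e}]")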

  17. Rapid microbiological methods with hazard analysis critical control point.

    PubMed

    Griffiths, M W

    1997-01-01

    The proactive approach to ensuring food safety termed hazard analysis critical control point (HACCP) was introduced in the 1960s by the Pillsbury Company, in collaboration with the U.S. Army Natick Laboratories and National Aeronautics and Space Administration, to help guarantee that astronauts would not be incapacitated by the trauma of foodborne illness during space flights. The approach has subsequently been adopted as the standard food safety management system world-wide and is seen as forming the basis for harmonization of food inspection regulations necessitated by trade agreements such as General Agreement on Tariffs and Trade and North American Free Trade Agreement as the move toward globalization of trade in food products gains momentum. The new U.S. Department of Agriculture Mega-Reg requires mandatory introduction of HACCP, and the Food Safety Enhancement Program of Agriculture and Agri-food Canada, as well as the "due diligence" legislation of the European Union, is centered on HACCP principles. PMID:9419852

  18. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S. III; Baum, J.W.

    1998-03-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique.

  19. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  20. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    SciTech Connect

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparation of the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  1. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.

  2. IDENTIFICATION OF AIRCRAFT HAZARDS

    SciTech Connect

    K.L. Ashley

    2005-03-23

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in the "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2004, Section 6.4.1). That determination was conservatively based on limited knowledge of flight data in the area of concern and on crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a Monitored Geologic Repository (MGR) at Yucca Mountain using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987, Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. NUREG-0800 is being used here as a reference because some of the same considerations apply. The intended use of this report is to provide inputs for further screening and analysis of the identified aircraft hazards based on the criteria that apply to Category 1 and 2 event sequence analyses as defined in 10 CFR 63.2 (see Section 4). The scope of this technical report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the MGR at Yucca Mountain, with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (see Section 7).

  3. Identification of Aircraft Hazards

    SciTech Connect

    K. Ashley

    2006-12-08

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2005 [DIRS 174235], Section 6.4.1). That determination was conservatively based upon limited knowledge of flight data in the area of concern and upon crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a monitored geologic repository (MGR) at Yucca Mountain, using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987 [DIRS 103124], Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. The intended use of this report is to provide inputs for further screening and analysis of identified aircraft hazards based upon the criteria that apply to Category 1 and Category 2 event sequence analyses as defined in 10 CFR 63.2 [DIRS 176544] (Section 4). The scope of this report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the repository at Yucca Mountain, with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (Section 7).

  4. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23% maximum O2, one-g) operations of the Mark III and the associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test-specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, one-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test-specific configurations may warrant additional hazard analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  5. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is that difficulty: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally, several examples of the application of emulation techniques are described.

  6. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    SciTech Connect

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard-resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e., building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g., flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events.
    Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs
    • Approximately 75% of SAFs analyzed address three or fewer hazards
    • Lack of economic measures within SAFs could impact resilience and sustainability
    • Resilience measures for flood hazards are not consistently included in SAFs

  7. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383

  8. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; Perram, Glen; Gross, Kevin; Smith, Greg; Feltz, Wayne; Taylor, Joe; Olson, Erik

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  9. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight...

  10. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight...

  11. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight...

  12. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA's regulations in part 120 (21 CFR part 120) mandate the application of HACCP procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that...

  13. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    SciTech Connect

    Gaschott, L.J.

    1995-06-16

    This document presents the Preliminary Fire Hazards Analysis (FHA) for the Pilot Unvented TRU Drum Retrieval (PUTDR) effort and for the transuranic (TRU) drum trenches in the low-level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address the major hazards inherent in the facility.

  14. Overview of Mars Sample Hazard Analysis (Requirements Workshop Series)

    NASA Technical Reports Server (NTRS)

    Rummel, John D.

    2000-01-01

    Samples returned from Mars should be contained and treated as though potentially hazardous until proven otherwise. If sample containment cannot be verified en route to Earth, the sample and spacecraft should either be sterilized in space or not returned to Earth. Integrity of sample containment should be maintained through reentry and transfer to a receiving facility. Controlled distribution of unsterilized materials should only occur if analyses determine the sample not to contain a biological hazard. Planetary protection measures adopted for the first sample return should not be relaxed for subsequent missions without thorough scientific review and concurrence by an appropriate independent body.

  15. National Emission Standards for Hazardous Air Pollutants for Major Sources. Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Papar, Riyaz; Wright, Anthony; Cox, Daryl

    2012-07-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., Subpart DDDDD of 40 CFR Part 63).

  16. Uncertainty Analysis of Flash Flood Guidance: Topographic Data and Model Parameter Errors

    NASA Astrophysics Data System (ADS)

    Georgakakos, K. P.; Ntelekos, A. A.; Krajewski, W. F.

    2004-05-01

    Flash Flood Guidance (FFG) is the volume of rainfall required to generate bankfull flows at the outlet of a basin over a specified time interval and for given initial soil moisture conditions. Operationally, the soil moisture conditions are generated every 6 hours by the execution of the Sacramento Soil Moisture Accounting (SAC-SMA) model at the River Forecast Centers (RFCs). This guidance is used with actual radar rainfall data over the basin to assist with the production of flash flood warnings. The backbone of the FFG system is the threshold runoff (thresh-R), the calculation of which is done offline as a one-time task. Thresh-R is the volume of effective rainfall of a given duration needed to cause bankfull flows at the basin outlet. In this study, bankfull conditions from uniform steady flow and the Geomorphologic Unit Hydrograph theory are used to calculate thresh-R for a basin on the Illinois River in Oklahoma. The uncertainty associated with the GIS and channel data used in the calculation of thresh-R is introduced, and an ensemble of threshold runoff values is produced. The FFG is then modeled with the use of a time-continuous approximation of the upper zone of the SAC-SMA hydrologic model and quadratic function approximations. The thresh-R ensemble is fed into the FFG model to study the uncertainty in the FFG values due to the uncertainty in the GIS and channel data that contribute to the uncertainty of the threshold runoff. The numerical experiments are then repeated with additional uncertainty in the key parameters of the analytical Sacramento model solution, to study the synergistic effect of both uncertainties. The results of the analysis are presented, and the parameters that most affect the FFG uncertainty are identified. The need to transform the currently deterministic operational FFG system into a probabilistic or ensemble one is also discussed.
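
    A minimal sketch of the FFG computation under the quadratic-approximation idea described above: invert an assumed quadratic rainfall-to-runoff curve to find the rainfall volume that just produces the threshold runoff, then propagate a thresh-R ensemble through it. The coefficients and the lognormal ensemble are placeholders, not calibrated SAC-SMA values.

        import numpy as np

        def effective_runoff(p_mm, a=0.004, b=0.1):
            """Quadratic approximation of effective runoff (mm) from rainfall p (mm)."""
            return a * p_mm**2 + b * p_mm

        def flash_flood_guidance(thresh_r_mm, a=0.004, b=0.1):
            """Invert the quadratic runoff curve: the smallest rainfall volume whose
            effective runoff reaches thresh-R (positive root of a*p^2 + b*p - R = 0)."""
            return (-b + np.sqrt(b**2 + 4.0 * a * thresh_r_mm)) / (2.0 * a)

        # Propagate an ensemble of thresh-R values (e.g. reflecting GIS/channel-data
        # uncertainty) into an ensemble of FFG values.
        rng = np.random.default_rng(0)
        thresh_r_ensemble = rng.lognormal(mean=np.log(25.0), sigma=0.3, size=1000)  # mm
        ffg = flash_flood_guidance(thresh_r_ensemble)
        print(f"FFG 5th-95th percentile: {np.percentile(ffg, 5):.1f}-{np.percentile(ffg, 95):.1f} mm")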

  17. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance periods, these toxic and harmful substances can escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the spread and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was obtained. Using ambient-air modeling to analyze occupational exposure in this way offers a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance. PMID:23293134
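
    The transport problem can be illustrated with a much simpler model than the article's three-dimensional CFD: a one-dimensional advection-diffusion equation solved by explicit finite differences. The velocity, diffusivity, and release rate below are assumed values for illustration only.

        import numpy as np

        L, nx = 10.0, 201                 # room length (m), grid points
        dx = L / (nx - 1)
        u, D = 0.2, 0.05                  # bulk air velocity (m/s), diffusivity (m^2/s)
        dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # respect CFL and diffusion limits

        c = np.zeros(nx)                  # contaminant concentration (arbitrary units)
        src = nx // 4                     # grid index of the leaking chamber

        for _ in range(2000):
            c[src] += 1.0 * dt            # constant release rate at the source
            # upwind advection + central diffusion
            adv = -u * (c[1:-1] - c[:-2]) / dx
            dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
            c[1:-1] += dt * (adv + dif)
            c[0], c[-1] = 0.0, c[-2]      # fresh-air inlet, open outlet

        print(f"peak concentration {c.max():.2f} at x = {c.argmax() * dx:.1f} m")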

  18. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of the Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  19. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site and in the treatment of the total uncertainty in the estimates, which here includes knowledge uncertainty, random uncertainty, and expert opinion diversity. 28 refs.

  20. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... HACCP plan shall, at a minimum: (1) List the food safety hazards identified in accordance with paragraph... Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT...

  1. SOCIOECONOMIC ANALYSIS OF HAZARDOUS WASTE MANAGEMENT ALTERNATIVES: METHODOLOGY AND DEMONSTRATION

    EPA Science Inventory

    A methodology for analyzing economic and social effects of alternatives in hazardous waste management is presented and demonstrated. The approach includes the use of environmental threat scenarios and evaluation of effects on and responses by parties-at-interest. The methodology ...

  2. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
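
    A minimal sketch of the Monte Carlo approach, assuming lognormal demand (influent load) and capacity (treatment performance) with the capacity median scaled by the safety factor; the distribution parameters are illustrative, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        for safety_factor in (1.0, 1.5, 2.0, 3.0):
            # Lognormal demand and capacity; capacity median scaled by the safety factor.
            demand = rng.lognormal(mean=np.log(100.0), sigma=0.4, size=n)
            capacity = rng.lognormal(mean=np.log(100.0 * safety_factor), sigma=0.25, size=n)
            # Reliability = fraction of realizations in which capacity meets demand.
            reliability = np.mean(capacity >= demand)
            print(f"safety factor {safety_factor:.1f}: reliability ~ {reliability:.4f}")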

  3. An Exploratory Statistical Analysis of a Planet Approach-Phase Guidance Scheme Using Angular Measurements with Significant Error

    NASA Technical Reports Server (NTRS)

    Friedlander, Alan L.; Harry, David P., III

    1960-01-01

    An exploratory analysis of vehicle guidance during the approach to a target planet is presented. The objective of the guidance maneuver is to guide the vehicle to a specific perigee distance with a high degree of accuracy and minimum corrective velocity expenditure. The guidance maneuver is simulated by considering the random sampling of real measurements with significant error and reducing this information to prescribe appropriate corrective action. The instrumentation system assumed includes optical and/or infrared devices to indicate range and a reference angle in the trajectory plane. Statistical results are obtained by Monte Carlo techniques and are shown as the expectation of guidance accuracy and velocity-increment requirements. Results are nondimensional and applicable to any planet within the limits of two-body assumptions. The problem of determining how many corrections to make and when to make them is a consequence of the conflicting requirements of accurate trajectory determination and propulsion. Optimum values were found for a vehicle approaching a planet along a parabolic trajectory with an initial perigee distance of 5 radii and a target perigee of 1.02 radii. In this example measurement errors were less than 1 minute of arc. Results indicate that four corrections applied in the vicinity of 50, 16, 15, and 1.5 radii, respectively, yield minimum velocity-increment requirements. Thrust devices capable of producing a large variation of velocity-increment size are required. For a vehicle approaching the earth, miss distances within 32 miles are obtained with 90-percent probability. Total velocity increments used in guidance are less than 3300 feet per second with 90-percent probability. It is noted that the above representative results are valid only for the particular guidance scheme hypothesized in this analysis. A parametric study is presented which indicates the effects of measurement error size, initial perigee, and initial energy on the guidance

  4. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Lorito, Stefano; Piatanesi, Alessio; Romano, Fabrizio; Basili, Roberto; Kastelic, Vanja; Tiberti, Mara Monica; Valensise, Gianluca; Selva, Jacopo

    2010-05-01

    We present preliminary results of a Probabilistic Tsunami Hazard Analysis (PTHA) for the coast of eastern Sicily. We consider only earthquake-generated tsunamis. We focus on important cities such as Messina, Catania, and Augusta. We consider different potentially tsunamigenic Source Zones (SZ) in the Mediterranean basin, based on geological and seismological evidence. Considering many synthetic earthquakes for each SZ, we numerically simulate the entire tsunami propagation, from sea-floor displacement to inundation. We evaluate different tsunami damage metrics, such as maximum runup, current speed, momentum, and Froude number. We use a finite-difference scheme in the shallow-water approximation for the tsunami propagation at open sea, and a finite-volume scheme for the inundation phase. For the shoaling and inundation stages, we have built a bathy-topo model by merging the GEBCO database, multibeam soundings, and topographic data at 10 m resolution. Accounting for their relative probability of occurrence, deterministic scenarios are merged together to assess PTHA at the selected target sites, expressed as a probability of exceeding a given threshold (e.g., 1 m wave height) in a given time (e.g., 100 yr). First-order epistemic and aleatory uncertainties are assessed through a logic tree, accounting for changes in the variables judged to have a major impact on PTHA and for possible incompleteness of the SZs. The SZs are located at short, intermediate, and large distances with respect to the target coastlines. We thus highlight, for different source-target distances, the relative importance of the different source parameters and/or the role of the uncertainties in the input parameter estimation. Our results suggest that in terms of inundation extent the Hellenic Arc SZ has the highest impact on the selected target coastlines. In terms of exceedance probability instead, there is a larger variability depending not only on location and recurrence but also on

  5. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politics on potential consequences and are the basis for appropriate risk management strategies. Results based on (i) an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis to estimate spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  6. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common- Cause Failure Analysis in Event and Condition Assessment: Guidance and...-2011-0254. Discussion On November 2, 2011 (76 FR 67764), the NRC published for public comment...

  7. Monte Carlo analysis of the Titan III/Transfer Orbit Stage guidance system for the Mars Observer mission

    NASA Technical Reports Server (NTRS)

    Bell, Stephen C.; Ginsburg, Marc A.; Rao, Prabhakara P.

    1993-01-01

    An important part of space launch vehicle mission planning for a planetary mission is the integrated analysis of guidance and performance dispersions for both the booster and upper stage vehicles. For the Mars Observer mission, an integrated trajectory analysis was used to maximize the scientific payload and to minimize injection errors by optimizing the energy management of both vehicles. This was accomplished by designing the Titan III booster vehicle to inject into a hyperbolic departure plane, and the Transfer Orbit Stage (TOS) to correct any booster dispersions. An integrated Monte Carlo analysis of the performance and guidance dispersions of both vehicles provided sensitivities, an evaluation of their guidance schemes, and an injection error covariance matrix. The polynomial guidance schemes used for the Titan III variable flight azimuth computations and the TOS solid rocket motor ignition time and burn direction derivations accounted for a wide variation of launch times, performance dispersions, and target conditions. The Mars Observer spacecraft was launched on 25 September 1992 on the Titan III/TOS vehicle. The post-flight analysis indicated that a near-perfect park orbit injection was achieved, followed by a trans-Mars injection with less than 2-sigma errors.
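
    The covariance-matrix step can be sketched directly: given Monte Carlo dispersion samples of the injection state error, the injection error covariance is simply the sample covariance. The six-state error model below is synthetic and purely illustrative.

        import numpy as np

        rng = np.random.default_rng(7)
        n_runs = 5000

        # Hypothetical dispersion model: booster dispersions (position km, velocity
        # m/s) partially corrected by the upper stage, leaving small residual errors.
        booster = rng.normal(0.0, [5.0, 5.0, 8.0, 2.0, 2.0, 3.0], size=(n_runs, 6))
        residual = 0.1 * booster + rng.normal(0.0, 0.5, size=(n_runs, 6))

        cov = np.cov(residual, rowvar=False)        # 6x6 injection error covariance
        print("1-sigma injection errors:", np.round(np.sqrt(np.diag(cov)), 3))

        # Fraction of runs inside the Mahalanobis-distance-2 ellipsoid.
        d2 = np.einsum("ij,jk,ik->i", residual, np.linalg.inv(cov), residual)
        print(f"within distance-2 ellipsoid: {np.mean(d2 <= 4.0):.1%}")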

  8. A cross-hazard analysis of terse message retransmission on Twitter

    PubMed Central

    Sutton, Jeannette; Gibson, C. Ben; Phillips, Nolan Edward; Spiro, Emma S.; League, Cedar; Johnson, Britta; Fitzhugh, Sean M.; Butts, Carter T.

    2015-01-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes—local network properties, message content, and message style—that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233
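
    Analyses of this kind are often implemented as count-data regressions. The sketch below is a hypothetical Poisson GLM with made-up column names and data, not the study's dataset or model specification; it only shows the general shape of such a model.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical message-level data: retweet counts plus a few predictors
        # of the kind discussed above (follower count, hashtag, URL, impact content).
        df = pd.DataFrame({
            "retweets":     [12, 0, 45, 3, 7, 120, 1, 9],
            "followers":    [1500, 200, 9000, 800, 1200, 30000, 150, 2500],
            "has_hashtag":  [1, 0, 1, 0, 1, 1, 0, 1],
            "has_url":      [0, 1, 0, 1, 0, 0, 1, 1],
            "impact_words": [1, 0, 1, 0, 0, 1, 0, 0],
        })

        model = smf.glm(
            "retweets ~ np.log(followers) + has_hashtag + has_url + impact_words",
            data=df,
            family=sm.families.Poisson(),
        ).fit()
        print(model.summary())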

  9. A cross-hazard analysis of terse message retransmission on Twitter.

    PubMed

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes--local network properties, message content, and message style--that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  10. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    PubMed

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk. PMID:12822674

  11. Hazards assessment for the Waste Experimental Reduction Facility

    SciTech Connect

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

    This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. This hazards assessment describes the WERF, the area surrounding WERF, associated buildings and structures at WERF, and the processes performed at WERF. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Even though the screening process indicated that the hazardous materials could be screened from further analysis, because the inventories of radiological and nonradiological hazardous materials were below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because it was felt that the nonradiological hazardous material screening thresholds were too high.
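
    The screening step generalizes to a simple comparison of inventories against threshold quantities, as in the sketch below; the substances, quantities, and thresholds are hypothetical placeholders, not values from DOE Order 5500.3A.

        # Hypothetical facility inventories and screening thresholds (kg).
        inventory_kg = {"lead": 1200.0, "mercury": 8.0, "acetone": 40.0}
        threshold_kg = {"lead": 500.0, "mercury": 1.0, "acetone": 2500.0}

        # Materials whose inventory exceeds the threshold require further analysis.
        needs_analysis = {
            name: qty
            for name, qty in inventory_kg.items()
            if qty > threshold_kg.get(name, float("inf"))
        }
        for name, qty in needs_analysis.items():
            print(f"{name}: inventory {qty} kg exceeds threshold "
                  f"{threshold_kg[name]} kg -> analyze release scenarios")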

  12. Laser hazard analysis for various candidate diode lasers associated with the high resolution pulsed scanner.

    SciTech Connect

    Augustoni, Arnold L.

    2004-10-01

    A laser hazard analysis and safety assessment was performed for each of the various laser diode candidates associated with the High Resolution Pulse Scanner, based on ANSI Standard Z136.1-2000, American National Standard for the Safe Use of Lasers. A theoretical laser hazard analysis model for this system was derived, and an Excel® spreadsheet model was developed to answer the "what if" questions associated with the various modes of operation for the various candidate diode lasers.
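
    One calculation such a spreadsheet model typically performs is the Nominal Ocular Hazard Distance (NOHD). The sketch below uses the commonly cited small-source CW relation NOHD = (sqrt(4P/(pi*MPE)) - a)/phi; the diode parameters and the MPE value are illustrative, and a real assessment must take the MPE from ANSI Z136.1 for the actual wavelength and exposure duration.

        import math

        def nohd_cm(power_w, mpe_w_cm2, beam_dia_cm, divergence_rad):
            """Distance beyond which the beam irradiance falls below the MPE."""
            # Beam diameter (cm) at which average irradiance equals the MPE.
            d = math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2))
            return max(0.0, (d - beam_dia_cm) / divergence_rad)

        # Hypothetical 50 mW diode, 0.3 cm exit beam, 2 mrad divergence,
        # placeholder MPE of 1 mW/cm^2.
        print(f"NOHD ~ {nohd_cm(0.05, 1e-3, 0.3, 2e-3) / 100.0:.1f} m")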

  13. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives (synthetic rubber, polyolefin, and linoleum) currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures. PMID:21165873

  14. Rockfall hazard analysis using LiDAR and spatial modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information on the source areas of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity, and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls, and the spatial attribution of their frequency and energy.

  15. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    NASA Astrophysics Data System (ADS)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed to capture the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g., fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to find strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar to PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source, with Crete and Eastern Sicily as near-field and far-field target coasts, respectively. Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  16. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and

  17. Orion Crew Exploration Vehicle Launch Abort System Guidance and Control Analysis Overview

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Kim, Sungwan; Raney, David L.; Aubuchon, Vanessa V.; Sparks, Dean W.; Busan, Ronald C.; Proud, Ryan W.; Merritt, Deborah S.

    2008-01-01

    Aborts during the critical ascent flight phase require the design and operation of Orion Crew Exploration Vehicle (CEV) systems to escape from the Crew Launch Vehicle (CLV) and return the crew safely to the Earth. To accomplish this requirement of continuous abort coverage, CEV ascent abort modes are being designed and analyzed to accommodate the velocity, altitude, atmospheric, and vehicle configuration changes that occur during ascent. Aborts from the launch pad to early in the flight of the CLV second stage are performed using the Launch Abort System (LAS). During this type of abort, the LAS Abort Motor is used to pull the Crew Module (CM) safely away from the CLV and Service Module (SM). LAS abort guidance and control studies and design trades are being conducted so that more informed decisions can be made regarding the vehicle abort requirements, design, and operation. This paper presents an overview of the Orion CEV, an overview of the LAS ascent abort mode, and a summary of key LAS abort analysis methods and results.

  18. Guidance and control analysis of the entry of a lifting body personnel launch vehicle

    NASA Technical Reports Server (NTRS)

    Powell, Richard W.; Cruz, Christopher I.

    1991-01-01

    NASA is currently involved in definition studies of a Personnel Launch System (PLS) that could be used to transport people to and from low-earth orbit. This vehicle would serve both to complement the Space Shuttle and to provide alternative access to space in the event the Space Shuttle fleet were unavailable for a prolonged period. The PLS would consist of a manned spacecraft launched by an expendable vehicle, e.g., Titan 4. One promising candidate for the manned component of the PLS is the NASA Langley Research Center HL-20 lifting body. Many studies are currently underway to assess this vehicle, and one of the main areas of study is the development of the capability to successfully enter, glide to the landing site, and land. To provide this capability, guidance and control algorithms have been developed, incorporated into a six-degree-of-freedom simulation, and evaluated in the presence of off-nominal atmospheric conditions, consisting of both density variations and steady-state winds. In addition, the impact of atmospheric turbulence was examined for the portion of flight from Mach 3.5 to touchdown. This analysis showed that the vehicle remained controllable and could successfully land even in the presence of off-nominal atmospheric conditions.

  19. An analysis of the Hubble Space Telescope fine guidance sensor fine lock mode

    NASA Technical Reports Server (NTRS)

    Taff, L. G.

    1991-01-01

    There are two guiding modes of the Hubble Space Telescope (HST) used for the acquisition of astronomical data by one of its six scientific instruments. The more precise one is called Fine Lock. Command and control problems in the onboard electronics have limited Fine Lock to brighter stars, V less than 13.0 mag, instead of fulfilling its goal of V = 14.5 mag. Consequently, the less precise guiding mode of Coarse Track (approximately 40 milli-arc seconds) has to be used fairly frequently. Indeed, almost half of the scientific observations to be made with the HST will be compromised. The only realistic or extensive simulations of the Fine Lock guidance mode are reported here. The theoretical analysis underlying the Monte Carlo experiments and the numerical computations clearly show both that the control electronics are severely under-engineered and how to adjust the various control parameters to successfully extend Fine Lock guiding performance back to V = 14.0 mag and sometimes beyond.

  20. Solar Glare Hazard Analysis Tool v. 3.0

    Energy Science and Technology Software Center (ESTSC)

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  1. Solar Glare Hazard Analysis Tool v. 3.0

    SciTech Connect

    Ho, Clifford K.; Sims, Cianan A.

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.
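
    The core geometric test behind such a tool can be sketched as follows: approximate the sun's position from standard declination and hour-angle formulas, reflect the incident ray about the panel normal, and flag times when the reflected ray falls within a small cone around the observer direction. The site latitude, panel geometry, observer direction, and the 5-degree glare cone below are illustrative assumptions, not SGHAT's actual algorithm.

```python
import numpy as np

def sun_vector(lat_deg, day_of_year, solar_hour):
    """Unit vector to the sun (east, north, up) from standard approximations."""
    lat = np.radians(lat_deg)
    decl = np.radians(23.45) * np.sin(2 * np.pi * (284 + day_of_year) / 365.0)
    hour_angle = np.radians(15.0 * (solar_hour - 12.0))
    sin_el = np.sin(lat) * np.sin(decl) + np.cos(lat) * np.cos(decl) * np.cos(hour_angle)
    el = np.arcsin(np.clip(sin_el, -1.0, 1.0))
    cos_az = (np.sin(decl) - sin_el * np.sin(lat)) / (np.cos(el) * np.cos(lat))
    az = np.arccos(np.clip(cos_az, -1.0, 1.0))        # azimuth from north
    if hour_angle > 0:                                # afternoon: sun in the west
        az = 2 * np.pi - az
    return np.array([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)]), el

def panel_normal(tilt_deg, azimuth_deg):
    t, a = np.radians(tilt_deg), np.radians(azimuth_deg)
    return np.array([np.sin(t) * np.sin(a), np.sin(t) * np.cos(a), np.cos(t)])

# Illustrative site: 35 N, south-facing panel tilted 25 deg, elevated observer
# to the south (e.g. an approach path); direction is an assumption.
n = panel_normal(25.0, 180.0)
to_observer = np.array([0.0, -0.6, 0.8])
to_observer /= np.linalg.norm(to_observer)

for hour in np.arange(6.0, 18.01, 0.25):              # scan one day in 15-min steps
    s, el = sun_vector(35.0, 172, hour)
    if el <= 0:
        continue                                      # sun below horizon
    d = -s                                            # incident ray direction
    r = d - 2.0 * np.dot(d, n) * n                    # specular reflection
    if np.degrees(np.arccos(np.clip(np.dot(r, to_observer), -1, 1))) < 5.0:
        print(f"potential glare at solar hour {hour:.2f}")
```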

  2. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Efficiency Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2013-02-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting emissions credits proposed in the Implementation Plan for functional areas 2) through 4). This document does not include efficiency improvements related to boiler tune-ups.

  3. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2012-07-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting emissions credits proposed in the Implementation Plan for functional areas 2) through 4). This document does not include efficiency improvements related to boiler tune-ups.

  4. Defining geologic Hazards for natural resources management using tree-ring analysis

    USGS Publications Warehouse

    DeGraff, J.V.; Agard, S.S.

    1984-01-01

    Landslides, avalanches, floods, and other geologic hazards impair natural resources management by jeopardizing public safety, damaging or restricting resource utilization, and necessitating expenditures for corrective measures. The negative impact of geologic hazard events can be reduced by tailoring resources management to the hazard potential of an area. This requires assessment of where and how frequently the events occur. National forests and other managed wildlands often lack monitoring or historical records to compute frequency of hazard occurrence. Tree-ring analysis, based on internal growth response to external events such as tilting and abrasion, can provide frequency data. Two examples of the use of tree-ring analysis to date landslide activity illustrate advantages and limitations of the technique. An example from the Fishlake National Forest in central Utah illustrates assessment for planning purposes. An example from the Sierra National Forest in east-central California shows assessment applied to project design. Many geologic hazards in addition to landslides are suited to tree-ring analysis to establish frequency of occurrence. Hazard reduction efforts in natural resources management could be enhanced by careful application of tree-ring analysis. © 1984 Springer-Verlag New York Inc.

  5. Fire hazard analysis for Project W-320 Tank 241-C-106 waste retrieval

    SciTech Connect

    Conner, J.C.

    1995-09-12

    This Fire Hazards Analysis (FHA) for Project W-320, "Tank 241-C-106 Waste Retrieval," addresses fire hazards and fire-related concerns, in accordance with DOE 5480.7A (DOE 1998), resulting from or related to the processes and equipment to be installed or modified under Project W-320. Its purpose is to ensure that there are no undue fire hazards to site personnel and the public; that the potential for the occurrence of a fire is minimized; that process control and safety systems are not damaged by fire or related perils; and that property damage from fire and related perils does not exceed an acceptable level.

  6. North Portal Fuel Storage System Fire Hazard Analysis-ESF Surface Design Package ID

    SciTech Connect

    N.M. Ruonavaara

    1995-01-18

    The purpose of the fire hazard analysis is to comprehensively assess the risk from fire within the individual fire areas. This document assesses only the fire hazards within the Exploratory Studies Facility (ESF) Design Package ID, which includes the fuel storage system area of the North Portal facility, and evaluates whether the following objectives are met: (1) this analysis, performed in accordance with the requirements of this document, satisfies the requirements for a fire hazard analysis in accordance with U.S. Department of Energy (DOE) Order 5480.7A; (2) property damage from fire and related perils does not exceed an acceptable level; (3) input is provided to the ESF Basis For Design (BFD) Document; and (4) input is provided to the facility Safety Analysis Report (SAR) (Paragraph 3.8).

  7. Vertical guidance performance analysis of the L1-L5 dual-frequency GPS/WAAS user avionics sensor.

    PubMed

    Jan, Shau-Shiun

    2010-01-01

    This paper investigates the potential vertical guidance performance of global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. This paper will first investigate the airborne receiver code noise and multipath (CNMP) confidence (σair). The σair will be the dominant factor in the availability analysis of an L1-L5 dual-frequency GPS/WAAS user avionics sensor. This paper uses the MATLAB Algorithm Availability Simulation Tool (MAAST) to determine the required values for the σair, so that an L1-L5 dual-frequency GPS/WAAS user avionics sensor can meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under what user configurations WAAS could be an APV II system or a CAT I system over CONUS. Furthermore, this paper examines the combinations of possible improvements in signal models and the addition of Galileo to determine if GPS/WAAS user avionics sensor could achieve 10 m Vertical Alert Limit (VAL) within the service volume. Finally, this paper presents the future vertical guidance performance of GPS user avionics sensor for the United States' WAAS, Japanese MTSAT-based satellite augmentation system (MSAS) and European geostationary navigation overlay service (EGNOS). PMID:22319263
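
    A minimal sketch of the availability test at the heart of such an analysis: propagate assumed per-satellite error budgets through a weighted least-squares position solution to obtain the vertical confidence sigma_V, form the vertical protection level VPL = 5.33 sigma_V, and compare it against a vertical alert limit. The satellite geometry and error budgets below are invented placeholders, not MAAST or WAAS MOPS values.

```python
import numpy as np

# Illustrative satellite geometry: (elevation, azimuth) in degrees.
sats = [(15, 30), (35, 110), (60, 200), (25, 280), (75, 340), (45, 60)]

# Assumed per-satellite 1-sigma pseudorange errors (m), combining fast/long-term,
# ionospheric, airborne (CNMP), and tropospheric terms. Placeholder budgets.
sigma = np.array([1.2, 0.9, 0.7, 1.1, 0.6, 0.8])

# Geometry matrix rows: line-of-sight (east, north, up) plus receiver clock term.
G = np.array([[-np.cos(np.radians(el)) * np.sin(np.radians(az)),
               -np.cos(np.radians(el)) * np.cos(np.radians(az)),
               -np.sin(np.radians(el)), 1.0] for el, az in sats])

W = np.diag(1.0 / sigma**2)                  # weight by inverse error variance
cov = np.linalg.inv(G.T @ W @ G)             # covariance of the WLS solution
sigma_v = np.sqrt(cov[2, 2])                 # vertical component

K_V = 5.33                                   # K-factor used for the vertical bound
VPL = K_V * sigma_v
VAL = 20.0                                   # APV-II vertical alert limit (m)
print(f"sigma_V = {sigma_v:.2f} m, VPL = {VPL:.1f} m -> "
      f"{'available' if VPL <= VAL else 'unavailable'} for APV-II")
```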

  8. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have greater impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817] and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., 16%-ile and 84%-ile). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these

  9. Guidance Matters

    ERIC Educational Resources Information Center

    Gartrell, Dan

    2005-01-01

    In this column, the author shares and comments on early childhood educators' use of guidance to foster young children's development and learning. He defines guidance as the commitment a teacher makes to teaching children how to solve their problems, rather than punishing them for having problems they haven't yet learned how to solve. The focus of…

  10. A noise-reduction GWAS analysis implicates altered regulation of neurite outgrowth and guidance in autism

    PubMed Central

    2011-01-01

    coherent pathway that regulates the directional protrusion of axons and dendrites to their appropriate synaptic targets. Conclusions As statistical noise is likely to particularly affect studies of complex disorders, where genetic heterogeneity or interaction between genes may confound the ability to detect association, GWAS-NR offers a powerful method for prioritizing regions for follow-up studies. Applying this method to autism datasets, GWAS-NR analysis indicates that a large subset of genes involved in the outgrowth and guidance of axons and dendrites is implicated in the aetiology of autism. PMID:21247446

  11. Guidance on risk analysis and safety implications of a large liquefied natural gas (LNG) spill over water.

    SciTech Connect

    Wellman, Gerald William; Melof, Brian Matthew; Luketa-Hanlin, Anay Josephine; Hightower, Marion Michael; Covan, John Morgan; Gritzo, Louis Alan; Irwin, Michael James; Kaneshige, Michael Jiro; Morrow, Charles W.

    2004-12-01

    While recognized standards exist for the systematic safety analysis of potential spills or releases from LNG (Liquefied Natural Gas) storage terminals and facilities on land, no equivalent set of standards or guidance exists for the evaluation of the safety or consequences from LNG spills over water. Heightened security awareness and energy surety issues have increased industry's and the public's attention to these activities. The report reviews several existing studies of LNG spills with respect to their assumptions, inputs, models, and experimental data. Based on this review and further analysis, the report provides guidance on the appropriateness of models, assumptions, and risk management to address public safety and property relative to a potential LNG spill over water.

  12. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970's and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  13. Hazard analysis and risk assessment in the development of biomedical drug formulation equipment.

    PubMed

    Johnson, David H; Bidez, Martha W; Delucas, Lawrence J

    2012-04-01

    Hazard analysis and risk assessment techniques are utilized within many private sector industries and government agencies, including the medical device and pharmaceutical industry, within a structured process to control human injuries and environmental and property damage. In the U.S., the Food and Drug Administration (FDA) requires that a hazard analysis be performed on all medical devices. While there are biomedical engineering applications reported which deal with human hazards in clinical, patient-care environments, no previous studies extend these traditional techniques to a university-based research environment. This study applies a tiered approach to hazard analysis and risk assessment to a biomedical, university-based research environment in the design of a high-throughput platform that screens chemical excipients (additives) for their ability to increase protein solubility. Each design stage (conceptual, preliminary, system, and detailed) requires a unique hazard analysis technique based on available information. The analysis techniques applied here are evaluated for their use in a biomedical research environment where experiment accuracy is a primary concern. PMID:22068884

  14. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  15. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. Seismic hazard at Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. The 500 m by 500 m grid-based building data were selected for the evaluation because they provide detailed information about the location, value, and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
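
    A toy version of the risk convolution such an engine performs, combining a hazard curve, a vulnerability function, and an exposure value into an average annual loss; all numbers below are invented for illustration and do not come from the TEM or OpenQuake.

```python
import numpy as np

# Toy hazard curve: annual probability of exceedance for several PGA levels (g).
pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
poe = np.array([0.05, 0.02, 0.008, 0.003, 0.001])   # invented values

# Probability of ground motion falling in each PGA bin (occurrence, not exceedance).
p_bin = np.append(-np.diff(poe), poe[-1])

# Toy vulnerability function: mean loss ratio at each PGA level.
loss_ratio = np.array([0.02, 0.08, 0.20, 0.40, 0.65])

exposed_value = 5e9                                  # replacement value of a grid cell
annual_loss = exposed_value * np.sum(p_bin * loss_ratio)
print(f"average annual loss: {annual_loss/1e6:.1f} M")
```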

  16. Quality assurance guidance for laboratory assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed to support the EM (DOE Environmental Restoration and Waste Management) Analytical Services program. Its purpose is to introduce assessment plates that can be used to conduct performance assessments of an organization's or project's ability to meet quality goals for analytical laboratory activities. These assessment plates are provided as non-prescriptive guidance to EM-support organizations responsible for collection of environmental data for remediation and waste management programs at DOE facilities. The assessments evaluate objectively all components of the analytical laboratory process to determine their proper selection and use.

  17. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, as precipitation is hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether there is a change in magnitude or frequency in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  18. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  19. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    PubMed

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights into measurement error effects are revealed, as opposed to the well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods. PMID:26328545
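
    A small simulation in the spirit of the paper's regression-calibration adjustment, under simplifying assumptions of my own: a constant additive hazard, a covariate observed with classical error of known variance, and no censoring. The naive exponential MLE is attenuated toward zero, while the calibrated fit is only approximately unbiased here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n = 5000
x = rng.normal(0.0, 1.0, n)                  # true covariate
w = x + rng.normal(0.0, 0.8, n)              # observed with classical error
lam0, beta = 1.0, 0.3                        # additive hazard: lam0 + beta * x
t = rng.exponential(1.0 / np.clip(lam0 + beta * x, 1e-3, None))

def fit(cov):
    """MLE of (lam0, beta) for exponential survival with hazard lam0 + beta*cov."""
    def nll(p):
        haz = p[0] + p[1] * cov
        if np.any(haz <= 0):
            return np.inf
        return -(np.sum(np.log(haz)) - np.sum(haz * t))
    return minimize(nll, [1.0, 0.0], method="Nelder-Mead").x

# Regression calibration: replace w by E[x | w] under the classical error model.
var_u = 0.8 ** 2                             # error variance (assumed known here)
lam_rc = (np.var(w) - var_u) / np.var(w)     # attenuation factor
x_hat = np.mean(w) + lam_rc * (w - np.mean(w))

print(f"naive beta:      {fit(w)[1]:.3f}")   # attenuated toward zero
print(f"calibrated beta: {fit(x_hat)[1]:.3f}")  # closer to the true 0.3
```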

  20. Preliminary fire hazards analysis for W-211, Initial Tank Retrieval Systems

    SciTech Connect

    Huckfeldt, R.A.

    1995-03-16

    A fire hazards analysis (FHA) was performed for Project W-211, Initial Tank Retrieval System (ITRS), at the Department of Energy (DOE) Hanford site. The objectives of this FHA were to determine (1) the fire hazards that expose the Initial Tank Retrieval System or are inherent in the process, (2) the adequacy of the fire-safety features planned, and (3) the degree of compliance of the project with specific fire safety provisions in DOE orders and related engineering codes and standards. The scope included the construction, the process hazards, building fire protection, and site-wide fire protection. The results are presented in terms of the fire hazards present, the potential extent of fire damage, and the impact on employees and public safety. This study evaluated the ITRS with respect to its use at Tank 241-SY-101 only.

  1. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
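
    A toy version of such a Monte Carlo variability analysis: sample logic-tree branches according to their weights, evaluate a stand-in hazard function for each sample, and report the coefficient of variation. The branch values, weights, and hazard function below are invented for illustration, not the study's logic tree.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented logic-tree branches (values, weights) for three uncertain inputs.
mag_1811 = ([7.3, 7.7, 8.1], [0.25, 0.50, 0.25])        # mainshock magnitude
recurrence = ([250.0, 500.0, 1000.0], [0.3, 0.5, 0.2])  # mean recurrence (years)
gm_scale = ([0.8, 1.0, 1.3], [0.2, 0.6, 0.2])           # attenuation-model factor

def toy_hazard(m, t_rec, scale):
    """Stand-in hazard metric: annual rate times a magnitude-scaled ground motion."""
    return scale * (1.0 / t_rec) * np.exp(1.2 * (m - 7.0))

samples = np.array([
    toy_hazard(rng.choice(mag_1811[0], p=mag_1811[1]),
               rng.choice(recurrence[0], p=recurrence[1]),
               rng.choice(gm_scale[0], p=gm_scale[1]))
    for _ in range(10_000)
])
cov = samples.std() / samples.mean()
print(f"mean hazard {samples.mean():.4f}, COV {cov:.2f}")
```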

  2. Guidance on health effects of toxic chemicals. Safety Analysis Report Update Program

    SciTech Connect

    Foust, C.B.; Griffin, G.D.; Munro, N.B.; Socolof, M.L.

    1994-02-01

    Martin Marietta Energy Systems, Inc. (MMES), and Martin Marietta Utility Services, Inc. (MMUS), are engaged in phased programs to update the safety documentation for the existing US Department of Energy (DOE)-owned facilities. The safety analysis of potential toxic hazards requires a methodology for evaluating human health effects of predicted toxic exposures. This report provides a consistent set of health effects and documents toxicity estimates corresponding to these health effects for some of the more important chemicals found within MMES and MMUS. The estimates are based on published toxicity information and apply to acute exposures for an "average" individual. The health effects (toxicological endpoints) used in this report are (1) the detection threshold; (2) the no-observed adverse effect level; (3) the onset of irritation/reversible effects; (4) the onset of irreversible effects; and (5) a lethal exposure, defined to be the 50% lethal level. An irreversible effect is defined as a significant effect on a person's quality of life, e.g., serious injury. Predicted consequences are evaluated on the basis of concentration and exposure time.

  3. CHARACTERIZATION OF DEFENSE NUCLEAR WASTE USING HAZARDOUS WASTE GUIDANCE. APPLICATIONS TO HANFORD SITE ACCELERATED HIGH-LEVEL WASTE TREATMENT AND DISPOSAL MISSION

    SciTech Connect

    Hamel, William; Huffman, Lori; Lerchen, Megan; Wiemers, Karyn

    2003-02-27

    Federal hazardous waste regulations were developed for management of industrial waste. These same regulations are also applicable to much of the nation's defense nuclear wastes. At the U.S. Department of Energy's (DOE) Hanford Site in southeast Washington State, one of the nation's largest inventories of nuclear waste remains in storage in large underground tanks. The waste's regulatory designation and its composition and form constrain acceptable treatment and disposal options. Obtaining detailed knowledge of the tank waste composition accounts for a significant portion of the many challenges in meeting the regulatory-driven treatment and disposal requirements for this waste. Key in applying the hazardous waste regulations to defense nuclear wastes is defining the appropriate and achievable quality for waste feed characterization data and the supporting evidence demonstrating that applicable requirements have been met at the time of disposal. Application of a performance-based approach to demonstrating achievable quality standards will be discussed in the context of the accelerated high-level waste treatment and disposal mission at the Hanford Site.

  4. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA. PMID:25676999

  5. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have also recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated in the hazard analysis accordingly. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition, the available gauge data around Can Tho
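
    A sketch of one way to carry out such a bivariate frequency analysis, using Gumbel margins for Q and V tied together with a Gaussian copula; the marginal parameters and correlation below are invented placeholders, not fitted Kratie values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Invented Gumbel marginal parameters for annual maximum discharge Q (m3/s)
# and annual flood volume V (km3), plus a Gaussian-copula correlation.
q_loc, q_scale = 45_000.0, 8_000.0
v_loc, v_scale = 300.0, 60.0
rho = 0.8

# Sample the copula: correlated standard normals -> uniforms -> marginals.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=200_000)
u = stats.norm.cdf(z)
q = stats.gumbel_r.ppf(u[:, 0], loc=q_loc, scale=q_scale)
v = stats.gumbel_r.ppf(u[:, 1], loc=v_loc, scale=v_scale)

# Joint exceedance probability of a given (Q, V) pair, e.g. both 100-yr marginals.
q100 = stats.gumbel_r.ppf(0.99, loc=q_loc, scale=q_scale)
v100 = stats.gumbel_r.ppf(0.99, loc=v_loc, scale=v_scale)
p_joint = np.mean((q > q100) & (v > v100))
print(f"P(Q > Q100 and V > V100) ~ {p_joint:.4f} per year")
```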

  6. 77 FR 429 - Clarification and Further Guidance on the Fireworks Approvals Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-05

    ... June 29, 2011 (76 FR 38053). Specifically, this document provides clarification and additional guidance... Pipeline and Hazardous Materials Safety Administration 49 CFR Part 173 Clarification and Further Guidance on the Fireworks Approvals Policy AGENCY: Pipeline and Hazardous Materials Safety...

  7. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    NASA Astrophysics Data System (ADS)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta Program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning and design cycle. At each step, CRIDA provides stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation, the origin, goal, steps and practical tools available at each step of CRIDA will be explained. In two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al., "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.), the application of CRIDA to cases is explained.

  8. Natural Hazards

    NASA Astrophysics Data System (ADS)

    Bryant, Edward

    2005-02-01

    This updated new edition presents a comprehensive, inter-disciplinary analysis of the complete range of natural hazards. Edward Bryant describes and explains how hazards occur, examines prediction methods, considers recent and historical hazard events and explores the social impact of such disasters. Supported by over 180 maps, diagrams and photographs, this standard text is an invaluable guide for students and professionals in the field. First Edition Hb (1991): 0-521-37295-X First Edition Pb (1991): 0-521-37889-3

  9. System design impact of guidance and navigation analysis for a SEPS 1979 Encke flyby

    NASA Technical Reports Server (NTRS)

    Hong, P.

    1975-01-01

    The baseline guidance and navigation strategy for an ecliptic projection of the Encke flyby mission considers solar electric propulsion stage parameters in generating optimized and targeted trajectory control. Results show the Encke relative approach error to be dominated by ephemeris uncertainties, particularly the velocity components.

  10. A Reaction to an Analysis of Nine Computer-Assisted Career Guidance Systems.

    ERIC Educational Resources Information Center

    Garcia, Gonzalo, Jr.; Plansker, Christine M.

    1990-01-01

    Critiques the assessment of computer-assisted career guidance systems by Sampson et al. (CE 521 972), stating that the conceptual framework was sound and well developed and criteria were based on recognized experts. Limitations of the study are the assumption of equivalent effectiveness and the use of a checklist for comparing the systems. (SK)

  11. FIRE HAZARDS ANALYSIS FOR THE FUEL SUPPLY SYSTEM - ESF PACKAGE 1E

    SciTech Connect

    N.M. Ruonavaara

    1995-04-12

    The purpose of the fire hazards analysis is to comprehensively assess the risk from fire within individual fire areas in accordance with U.S. Department of Energy (DOE) Order 5480.7A (Reference 4.4.7.4). This document will assess the fire hazard risk within the Exploratory Studies Facility (ESF) fuel supply system, Package 1E, and evaluate whether the following objectives are met: (1) ensure that property damage from fire and related perils does not exceed an acceptable level; (2) provide input to the facility Safety Analysis Report (SAR).

  12. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, flooding, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  13. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data previous to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed to be Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters like occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
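
    A simplified, maximum-likelihood stand-in for the same Poisson-GPD hazard model (the paper's estimation is Bayesian), applied here to synthetic data: estimate the exceedance rate and the GPD tail, then invert them for a return level. All parameter values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic storm record: 40 years, ~6 threshold exceedances per year,
# excess wave heights GPD-distributed (shape -0.1, scale 0.6 m).
years, u = 40.0, 4.0                          # record length, threshold (m)
n_events = rng.poisson(6.0 * years)
excess = stats.genpareto.rvs(-0.1, scale=0.6, size=n_events, random_state=rng)

# Poisson rate of exceedances and ML fit of the GPD tail (threshold fixed).
rate = n_events / years
xi, _, sigma = stats.genpareto.fit(excess, floc=0.0)

# Return level: wave height exceeded on average once every T years.
T = 100.0
h_T = u + sigma / xi * ((rate * T) ** xi - 1.0)
print(f"rate {rate:.2f}/yr, xi {xi:.2f}, sigma {sigma:.2f} -> "
      f"{T:.0f}-yr wave height ~ {h_T:.2f} m")
```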

  14. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-01

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes closed and eyes open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at frontal and centroparietal regions in the eyes closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. Similarities could be explained considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition. PMID:26035281
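
    The band metrics reported here (AP, RP, MF) can be sketched from a Welch power spectral density as follows; the synthetic one-channel signal and the exact band edges are illustrative assumptions, not the study's recording setup.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 256.0                                    # sampling rate (Hz)
t = np.arange(0, 60.0, 1.0 / fs)              # one minute of synthetic EEG
eeg = rng.normal(0, 1, t.size) + 0.8 * np.sin(2 * np.pi * 18.0 * t)  # beta rhythm

f, psd = signal.welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4-s windows

def band_metrics(f, psd, lo, hi):
    m = (f >= lo) & (f < hi)
    ap = np.trapz(psd[m], f[m])               # absolute power in the band
    rp = ap / np.trapz(psd, f)                # relative power
    mf = np.trapz(f[m] * psd[m], f[m]) / ap   # mean (centroid) frequency
    return ap, rp, mf

for name, lo, hi in [("theta", 4.0, 8.0), ("beta", 13.0, 30.0)]:
    ap, rp, mf = band_metrics(f, psd, lo, hi)
    print(f"{name}: AP {ap:.2f}, RP {rp:.2f}, MF {mf:.1f} Hz")
```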

  15. What's the evidence that NICE guidance has been implemented? Results from a national evaluation using time series analysis, audit of patients' notes, and interviews

    PubMed Central

    Sheldon, Trevor A; Cullum, Nicky; Dawson, Diane; Lankshear, Annette; Lowson, Karin; Watt, Ian; West, Peter; Wright, Dianne; Wright, John

    2004-01-01

    Objectives To assess the extent and pattern of implementation of guidance issued by the National Institute for Clinical Excellence (NICE). Design Interrupted time series analysis, review of case notes, survey, and interviews. Setting Acute and primary care trusts in England and Wales. Participants All primary care prescribing, hospital pharmacies; a random sample of 20 acute trusts, 17 mental health trusts, and 21 primary care trusts; and senior clinicians and managers from five acute trusts. Main outcome measures Rates of prescribing and use of procedures and medical devices relative to evidence based guidance. Results 6308 usable patient audit forms were returned. Implementation of NICE guidance varied by trust and by topic. Prescribing of some taxanes for cancer (P < 0.002) and orlistat for obesity (P < 0.001) significantly increased in line with guidance. Prescribing of drugs for Alzheimer's disease and prophylactic extraction of wisdom teeth showed trends consistent with, but not obviously a consequence of, the guidance. Prescribing practice often did not accord with the details of the guidance. No change was apparent in the use of hearing aids, hip prostheses, implantable cardioverter defibrillators, laparoscopic hernia repair, and laparoscopic colorectal cancer surgery after NICE guidance had been issued. Conclusions Implementation of NICE guidance has been variable. Guidance seems more likely to be adopted when there is strong professional support, a stable and convincing evidence base, and no increased or unfunded costs, in organisations that have established good systems for tracking guidance implementation and where the professionals involved are not isolated. Guidance needs to be clear and reflect the clinical context. PMID:15514342
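
    The interrupted time series component of such an evaluation is commonly a segmented regression with terms for the pre-existing trend, a level change, and a slope change at the intervention date; the sketch below uses synthetic monthly data, not the study's prescribing series.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic monthly prescribing rate: 24 months before, 24 after guidance.
months = np.arange(48)
guidance = (months >= 24).astype(float)            # 1 after guidance issued
since = np.where(months >= 24, months - 24, 0.0)   # months since guidance
y = 10 + 0.05 * months + 4.0 * guidance + 0.3 * since + rng.normal(0, 1, 48)

# Segmented regression: pre-existing trend, step change, and slope change.
X = sm.add_constant(np.column_stack([months, guidance, since]))
fit = sm.OLS(y, X).fit()
print(fit.params)        # [baseline, trend, level change, trend change]
```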

  16. Additional guidance for including nuclear safety equivalency in the Canister Storage Building and Cold Vacuum Drying Facility final safety analysis report

    SciTech Connect

    Garvin, L.J.

    1997-05-20

    This document provides guidance for the production of safety analysis reports that must meet both DOE Order 5480.23 and STD 3009, and be in compliance with the DOE regulatory policy that imposes certain NRC requirements.

  17. Chloroform: exposure estimation, hazard characterization, and exposure-response analysis.

    PubMed

    Meek, M E; Beauchamp, R; Long, G; Moir, D; Turner, L; Walker, M

    2002-01-01

    inhalation and ingestion at lowest concentrations or doses. The mean rate of metabolism associated with a 5% increase in fatty cysts estimated on the basis of the PBPK model was compared with tissue dose measures resulting from the scenarios already described, and lowest concentrations reported to induce cellular proliferation in the nasal cavities of rats and mice were compared directly with midpoint and 95th percentile estimates of concentrations of chloroform in indoor air in Canada. The degree of confidence in the underlying database and uncertainties in estimates of exposure and in characterization of hazard and dose response are delineated. PMID:12162870

  18. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges and the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.

  19. Fire hazards analysis for the uranium oxide (UO{sub 3}) facility

    SciTech Connect

    Wyatt, D.M.

    1994-12-06

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO{sub 3} complex fire hazards, fire protection and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4 are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in decommission and demolition facilities.

  20. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard
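
    A minimal sketch of the Monte Carlo PSHA idea described here: simulate many years of seismicity from a truncated Gutenberg-Richter source, attenuate each event to the site with a toy ground-motion model, and read the 475-year PGA from the distribution of annual maxima. All source and attenuation parameters below are invented, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented source-zone and ground-motion parameters.
b, m_min, m_max = 1.0, 4.5, 7.5       # Gutenberg-Richter b-value and bounds
annual_rate = 20.0                    # events/yr with M >= m_min in the zone

def sample_magnitudes(n):
    """Invert the doubly truncated Gutenberg-Richter CDF."""
    u = rng.random(n)
    span = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return m_min - np.log10(1.0 - u * span) / b

def gmpe_pga(m, r_km):
    """Toy attenuation: ln PGA = a + b*M - c*ln(R) + sigma*eps (invented)."""
    ln_pga = -4.0 + 0.9 * m - 1.1 * np.log(r_km + 10.0) + 0.6 * rng.normal(size=m.size)
    return np.exp(ln_pga)

n_years = 50_000
annual_max = np.zeros(n_years)
for i in range(n_years):                     # one simulated catalogue year at a time
    n = rng.poisson(annual_rate)
    if n == 0:
        continue
    m = sample_magnitudes(n)
    r = rng.uniform(5.0, 150.0, n)           # source-to-site distance (km)
    annual_max[i] = gmpe_pga(m, r).max()

# PGA with a 10% chance of exceedance in 50 years (475-yr return period).
pga_475 = np.quantile(annual_max, 1.0 - 1.0 / 475.0)
print(f"475-yr PGA ~ {pga_475:.3f} g")
```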

  1. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).
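
    A schematic of how such gridded layers might be combined into a single vulnerability score is sketched below; the abstract does not give the actual weighting scheme, so the multiplicative form and all cell values here are invented for illustration.

        import numpy as np

        # Illustrative grid cells (values are made up, not JMP/CHRR data).
        population = np.array([12000.,  300., 45000.])   # people per cell
        flood_freq = np.array([0.8, 0.1, 0.4])           # events per decade
        resilience = np.array([0.9, 0.3, 0.6])           # 0 = fragile, 1 = robust
        readiness  = np.array([0.7, 0.7, 0.2])           # adaptive capacity, 0-1

        # Exposure: people times hazard frequency; vulnerability discounts
        # exposure by technology resilience and national adaptive capacity.
        exposure = population * flood_freq
        vulnerability = exposure * (1.0 - resilience) * (1.0 - readiness)

        for i, v in enumerate(vulnerability):
            print(f"cell {i}: exposure={exposure[i]:8.0f}  vulnerability={v:8.0f}")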

  2. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  3. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
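
    A minimal sketch of the kind of unified damage function and Monte Carlo uncertainty propagation discussed above, assuming a sigmoid form for the fraction of a granular portfolio damaged; the functional form and all parameter values are placeholders, not the authors' calibration.

        import numpy as np

        def damage_fraction(x, x50, k):
            """Sigmoid damage function: fraction of a granular portfolio damaged
            at hazard magnitude x (x50 = magnitude at 50% damage, k = steepness).
            Placeholder form, not the paper's exact parameterisation."""
            return 1.0 / (1.0 + np.exp(-k * (x - x50)))

        rng = np.random.default_rng(0)
        x = 2.0  # hazard magnitude, e.g. flood depth in metres

        # Propagate parameter uncertainty and hazard-magnitude uncertainty
        # by Monte Carlo sampling.
        x50 = rng.normal(2.5, 0.3, 10_000)   # uncertain half-damage point
        k   = rng.normal(2.0, 0.2, 10_000)   # uncertain steepness
        xs  = rng.normal(x, 0.2, 10_000)     # uncertain hazard magnitude

        d = damage_fraction(xs, x50, k)
        print(f"mean damage fraction {d.mean():.2f}, 90% band "
              f"[{np.quantile(d, 0.05):.2f}, {np.quantile(d, 0.95):.2f}]")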

  4. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    NASA Astrophysics Data System (ADS)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    Gulf of Jakarta is an area of active sedimentation, and a wide sediment deposition area exists on the north coast of Jakarta. These sediments have generally not been consolidated, so ground conditions are an important factor in determining liquefaction potential in this area. Liquefaction may occur during earthquakes, which can cause a loss of strength and stiffness in soils. The analysis of liquefaction potential is based on SPT data taken at the Gulf of Jakarta and includes the susceptibility rating and the triggering factors. Liquefaction analysis methods are compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction susceptibility at the surface is analyzed using the susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater, each of which has parameters that determine the value of SRF. From the analysis, the Gulf of Jakarta has an SRF value of 12 - 35, indicating that the area is dominated by medium to high liquefaction susceptibility, with the highest susceptibility concentrated along the coast.
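
    For context, the factor of safety against liquefaction from SPT data is commonly obtained with the classical simplified procedure (Seed-Idriss cyclic stress ratio against an SPT-based cyclic resistance ratio); a minimal sketch follows, using the Youd et al. (2001) clean-sand CRR curve. This is a standard method shown for illustration, not necessarily the exact procedure of this paper, and the example numbers are assumed.

        def factor_of_safety(n1_60cs, a_max_g, sigma_v, sigma_v_eff, depth_m):
            """Factor of safety against liquefaction via the classical simplified
            procedure (Seed-Idriss CSR with the Youd et al. 2001 SPT-based
            clean-sand CRR curve, valid for N1,60cs < 30)."""
            # Stress reduction coefficient (valid for depth <= 9.15 m)
            rd = 1.0 - 0.00765 * depth_m
            # Cyclic stress ratio induced by the earthquake
            csr = 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd
            # Cyclic resistance ratio for M7.5 from the corrected blow count
            n = n1_60cs
            crr = 1.0/(34.0 - n) + n/135.0 + 50.0/(10.0*n + 45.0)**2 - 1.0/200.0
            return crr / csr

        # Example: loose sand at 5 m depth (illustrative numbers: N1,60cs = 12,
        # PGA = 0.25 g, total/effective vertical stresses of 90 and 55 kPa).
        fs = factor_of_safety(12, 0.25, 90.0, 55.0, 5.0)
        print(f"FS against liquefaction: {fs:.2f}"
              f"  ({'liquefiable' if fs < 1 else 'safe'})")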

  5. Proposed powered explicit guidance thrust integrals derivation/implementation. Mission planning, mission analysis and software formulation

    NASA Technical Reports Server (NTRS)

    Jaggers, R. F.

    1976-01-01

    A new exoatmospheric, powered explicit guidance (PEG) thrust integral formulation and a simple method of implementation are presented. The new thrust integral formulation is significantly simpler than that currently used in PEG. Preliminary estimates indicate a computer storage savings of 220 words, which is approximately 10 percent of the current PEG ascent program. Alternate methods of implementation that could produce even more savings are noted.

  6. Flight Guidance System Requirements Specification

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Tribble, Alan C.; Carlson, Timothy M.; Danielson, Eric J.

    2003-01-01

    This report describes a requirements specification written in the RSML-e language for the mode logic of a Flight Guidance System of a typical regional jet aircraft. This model was created as one of the first steps in a five-year project sponsored by the NASA Langley Research Center, Rockwell Collins Inc., and the Critical Systems Research Group of the University of Minnesota to develop new methods and tools to improve the safety of avionics designs. This model will be used to demonstrate the application of a variety of methods and techniques, including safety analysis of system and subsystem requirements, verification of key properties using theorem provers and model checkers, identification of potential sources of mode confusion in system designs, partitioning of applications based on the criticality of system hazards, and autogeneration of avionics quality code. While this model is representative of the mode logic of a typical regional jet aircraft, it does not describe an actual or planned product. Several aspects of a full Flight Guidance System, such as recovery from failed sensors, have been omitted, and no claims are made regarding the accuracy or completeness of this specification.

  7. BIOLOGICAL MONITORING SYSTEMS FOR HAZARDOUS WASTE SITES (PRODUCTION AND ANALYSIS OF ANALYTICAL REFERENCE MATERIALS)

    EPA Science Inventory

    EPA programs in pesticides, toxics, and hazardous waste require analytical reference materials. This project emphasized the collection of and analysis of urine, fat, and blood for ultimate use as reference samples. Another objective of the project concerned the practicality of us...

  8. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS... the critical control points; (4) List the procedures, and the frequency with which they are to...

  9. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS... the critical control points; (4) List the procedures, and the frequency with which they are to...

  10. Example process hazard analysis of a Department of Energy water chlorination process

    SciTech Connect

    Not Available

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  11. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has in recent times increased as a result of climate change and urbanization, with the effects felt more in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has grown rapidly, because many existing substations are located in flood-prone areas. By understanding the impact of floods on its substations, TNB has provided non-structural mitigation through the integration of the flood hazard map with its substation planning. Hydrological analysis is an important part of this work, providing the runoff that serves as input to the hydraulic modelling.

  12. Ocean thermal conversion (OTEC) project bottom cable protection study: environmental characteristics and hazards analysis

    SciTech Connect

    Chern, C.; Tudor, W.

    1981-10-01

    Seafloor cable-protection criteria and technology as applied to the four proposed OTEC plant sites and cable routes at Hawaii, Puerto Rico, Guam and Florida were examined. Study of environmental characteristics for each site covered: (A) natural factors of location, tide and currents, wind and wave, bottom soil type and seafloor movement; and (B) man-made factors such as ship traffic, fishing activities, ocean mining, government regulations. These characteristics were studied to determine the hazards which are potential sources of damage to a cable system. Hazards include: chafe and corrosion, hydrodynamic forces due to wave and current action, mudslides, earthquakes, trawler and/or dredge action and ship anchors. An analysis of the history of submarine-cable failures was conducted. Included are the probabilities of damage related to water depth. Probabilities become minimal for all hazards in water depths of 1500 feet and more. Chafe and corrosion had the highest probability of causing damage to a seafloor cable compared to the other hazards. Because of the hazards present at all sites, cable burial is recommended as the best means of protection.

  13. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), a standard deviation (sigma) of the total variability is considered in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and has nothing to do with either the temporal or the spatial variation of ground motions. It should not be considered in the integration; instead, epistemic variability may be included in the logic tree. This study uses Taiwan data as an example to test a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic tree while only the aleatory variability is considered in the integration, then the hazard level is similar to that obtained using the total variability, being only slightly smaller at long return periods. Much work remains to be done on reducing the hazard level to a reasonable value.
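
    The separation the abstract describes, aleatory sigma inside the hazard integral and epistemic alternatives as weighted logic-tree branches, can be sketched as follows; the branch values are illustrative assumptions, not the Taipei study's inputs.

        import numpy as np
        from scipy.stats import norm

        # Ground-motion levels (g) for the hazard curve
        a = np.linspace(0.05, 1.5, 30)

        def exceedance_rate(a, ln_median, sigma_aleatory, nu):
            """Annual rate of exceeding 'a' for one source with rate nu;
            the aleatory sigma enters inside the hazard integral."""
            return nu * norm.sf((np.log(a) - ln_median) / sigma_aleatory)

        # Epistemic alternatives handled as weighted logic-tree branches.
        branches = [  # (weight, ln median PGA, aleatory sigma, rate/yr)
            (0.5, np.log(0.15), 0.55, 0.20),
            (0.3, np.log(0.20), 0.60, 0.15),
            (0.2, np.log(0.12), 0.50, 0.25),
        ]

        curves = np.array([exceedance_rate(a, m, s, nu)
                           for _, m, s, nu in branches])
        weights = np.array([w for w, *_ in branches])

        mean_curve = weights @ curves   # weighted mean over epistemic branches
        rp475 = a[np.argmin(np.abs(mean_curve - 1.0/475.0))]
        print(f"PGA at ~475-yr return period: {rp475:.2f} g")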

  14. A hazard and probabilistic safety analysis of a high-level waste transfer process

    SciTech Connect

    Bott, T.F.; Sasser, M.K.

    1996-09-01

    This paper describes a safety analysis of a transfer process for high-level radioactive and toxic waste. The analysis began with a hazard assessment that used elements of What If, Checklist, Failure Modes and Effects Analysis, and Hazards and Operability Study (HAZOP) techniques to identify and rough-in accident sequences. Based on this preliminary analysis, the most significant accident sequences were developed further using event trees. Quantitative frequency estimates for the accident sequences were based on operational data taken from the historical record of the site where the process is performed. Several modeling challenges were encountered in the course of the study. These included linked initiating and accident progression events, fire propagation modeling, accounting for administrative control violations, and handling mission-phase effects.

  15. SEPS guidance and navigation autonomy selection via mission analysis. [Solar Electric Propulsion Stage for Space Transportation System

    NASA Technical Reports Server (NTRS)

    Causey, W.; Sohoni, V.; Shenfish, K. L.; Wallace, C. T.

    1975-01-01

    A systematic rationale for selecting a cost-effective guidance and navigation (G & N) autonomy level for the solar electric propulsion stage (SEPS) vehicle is developed. After a definition of autonomy levels, a mission analysis is performed for representative SEPS missions using realistic G & N sensor hardware. Cost data for fabricating, integrating and refurbishing onboard avionics hardware and the ground costs corresponding to each autonomy level are generated. Results are presented that indicate performance of various G & N sensor hardware sets and the dominating factors which influence G & N autonomy level selection.

  16. Description of the Northwest hazardous waste site data base and preliminary analysis of site characteristics

    SciTech Connect

    Woodruff, D.L.; Hartz, K.E.; Triplett, M.B.

    1988-08-01

    The Northwest Hazardous Waste RD and D Center (the Center) conducts research, development, and demonstration (RD and D) activities for hazardous and radioactive mixed-waste technologies applicable to remediating sites in the states of Idaho, Montana, Oregon, and Washington. To properly set priorities for these RD and D activities and to target development efforts it is necessary to understand the nature of the sites requiring remediation. A data base of hazardous waste site characteristics has been constructed to facilitate this analysis. The data base used data from EPA's Region X Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) and from Preliminary Assessment/Site Investigation (PA/SI) forms for sites in Montana. The Center's data base focuses on two sets of sites--those on the National Priorities List (NPL) and other sites that are denoted as ''active'' CERCLIS sites. Active CERCLIS sites are those sites that are undergoing active investigation and analysis. The data base contains information for each site covering site identification and location, type of industry associated with the site, waste categories present (e.g., heavy metals, pesticides, etc.), methods of disposal (e.g., tanks, drums, land, etc.), waste forms (e.g., liquid, solid, etc.), and hazard targets (e.g., surface water, groundwater, etc.). As part of this analysis, the Northwest region was divided into three geographic subregions to identify differences in disposal site characteristics within the Northwest. 2 refs., 18 figs., 5 tabs.

  17. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility; Hanford has both types. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility hazards assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document, WHC-SD-WM-ISB-001" as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  18. Post-Flight Analysis of the Guidance, Navigation, and Control Performance During Orion Exploration Flight Test 1

    NASA Technical Reports Server (NTRS)

    Barth, Andrew; Mamich, Harvey; Hoelscher, Brian

    2015-01-01

    The first test flight of the Orion Multi-Purpose Crew Vehicle presented additional challenges for guidance, navigation and control as compared to a typical re-entry from the International Space Station or other Low Earth Orbit. An elevated re-entry velocity and steeper flight path angle were chosen to achieve aero-thermal flight test objectives. New IMUs, a GPS receiver, and baro altimeters were flight qualified to provide the redundant navigation needed for human space flight. The guidance and control systems must manage the vehicle lift vector in order to deliver the vehicle to a precision, coastal, water landing, while operating within aerodynamic load, reaction control system, and propellant constraints. Extensive pre-flight six degree-of-freedom analysis was performed that showed mission success for the nominal mission as well as in the presence of sensor and effector failures. Post-flight reconstruction analysis of the test flight is presented in this paper to show whether all performance metrics were met and to establish how well the pre-flight analysis predicted the in-flight performance.

  19. In silico analysis of nanomaterials hazard and risk.

    PubMed

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools. PMID:23138971

  20. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility for nuclear explosive operations at the Nevada Test Site, the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated on understanding those scenarios and on assuring that adequate positive measures are in place to control the risk associated with them.

  1. Incorporating Site Amplification into Seismic Hazard Analysis: A Fully Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.

    2001-12-01

    Developing site-specific amplification factors from geological, geophysical, and geotechnical information has been the state-of-practice for the last couple of decades. Now the state-of-the-art is to develop a distribution of possible site-specific amplification factors for a given input rock ground-motion. These state-of-the-art site-amplification distributions account for the uncertainty in soil properties and Vs structure at the site. Applying these site amplification distributions to a site-specific seismic hazard analysis requires a fully probabilistic approach. One such approach is to modify the generic ground-motion attenuation relations used in a probabilistic seismic hazard analysis to site-specific relations using a site amplification distribution developed for that site. The modification of the ground-motion attenuation relation is done prior to calculating probabilistic seismic hazard at the site. This approach has been implemented using the USGS National Seismic Hazard Mapping codes. Standard hazard models and hard-rock ground-motion attenuation relations are input into the modified codes along with a description of the site-specific amplification in the form of a lognormal probability-density-function (pdf). For each hard-rock ground-motion level, the pdf is specified by the median site-amplification factor and its natural-logarithmic standard deviation. The fully probabilistic ground-motion hazard curves are always above the hazard curve derived from multiplying the hard-rock hazard curve by the site's median site-amplification factors. At Paducah, Kentucky the difference is significant for 2%-in-50-year ground-motion estimates (0.7g vs. 0.5g for PGA and 1.3g vs. 0.9g for 1.0 s Sa). At Memphis, Tennessee the differences are less significant and may only be important at long periods (1.0 s and longer) on Mississippi flood-plain (lowlands) deposits (on the uplands deposits: 0.35g vs. 0.30g for PGA and 0.8g vs. 0.7g for 1.0 s Sa; on the lowlands
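
    A minimal sketch of the fully probabilistic convolution described above: the hard-rock hazard curve is differentiated into an occurrence density and integrated against a lognormal site-amplification distribution. The rock hazard curve and amplification parameters below are toy values, not the Paducah or Memphis inputs.

        import numpy as np
        from scipy.stats import norm

        # Hard-rock hazard curve: annual exceedance rate vs PGA (toy values).
        x = np.linspace(0.01, 2.0, 200)          # rock PGA (g)
        rate_rock = 1e-2 * (x / 0.05) ** -2.2    # toy power-law hazard curve

        # Site-amplification pdf per rock level: lognormal with a rock-motion-
        # dependent median (soil nonlinearity) and sigma_lnAF = 0.3.
        med_af = 2.0 * (x / 0.1) ** -0.15
        sig_af = 0.30

        # Occurrence rate of rock motion in each bin (minus the derivative
        # of the exceedance curve).
        occ = -np.gradient(rate_rock, x)

        def soil_rate(z):
            """Annual rate of soil PGA exceeding z: integrate P(AF * x > z)
            over the rock-motion occurrence density."""
            p_exceed = norm.sf((np.log(z / x) - np.log(med_af)) / sig_af)
            return np.trapz(occ * p_exceed, x)

        z = 0.5
        print(f"fully probabilistic rate at {z} g: {soil_rate(z):.2e}")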

  2. Internet Guidance.

    ERIC Educational Resources Information Center

    Savage, Frank X.

    1999-01-01

    States that children need proper guidance and boundaries to reap the benefits of the Internet. Focuses on two issues: how parents can protect their children from the Internet's potential dangers and how they can help their children use the Internet to get work done. Includes suggestions for teachers to help parents meet these challenges. (VWC)

  3. GUIDANCE DOCUMENTS

    EPA Science Inventory

    Defn: Guidance Document - A peer-reviewed document stating overarching principles and practices to be followed (also includes handbook documents).

    Uncertain Characterization of Flood Hazard Using Bivariate Analysis Based on Copulas

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2015-04-01

    This study presents a methodology to derive probabilistic flood hazard maps in flood-prone areas, taking into account uncertainties in the definition of design hydrographs. In particular, we present an innovative approach to obtain probabilistic inundation and flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from a bivariate statistical analysis through the use of copulas. This study also aims to quantify the contribution of boundary condition uncertainty, in order to explore the impact of this uncertainty on probabilistic flood hazard mapping. The uncertainty of extreme flood events is considered in terms of the different possible combinations of peak discharge and flood volume given by the copula. Further, we analyze the role of a multivariate probabilistic hydrological analysis in inundation and flood hazard mapping, highlighting the differences between deterministic and probabilistic approaches. The methodology has been applied to a study area located in Sicily that was subject to several flooding events in the past.
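
    A minimal sketch of generating dependent peak-discharge/volume pairs through a copula follows; a Gaussian copula with Gumbel margins is used only to keep the example short (the paper works with copulas generally), and the marginal parameters and correlation are assumed, not the Sicilian case-study fits.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Fitted marginal distributions for peak discharge (m^3/s) and flood
        # volume (10^6 m^3) -- illustrative Gumbel parameters.
        q_marg = stats.gumbel_r(loc=150.0, scale=60.0)
        v_marg = stats.gumbel_r(loc=12.0, scale=5.0)

        # Gaussian copula with assumed correlation between peak and volume.
        rho = 0.75
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=1000)
        u = stats.norm.cdf(z)            # dependent uniform marginals

        q = q_marg.ppf(u[:, 0])          # peak discharges
        v = v_marg.ppf(u[:, 1])          # flood volumes

        # Each (q, v) pair defines one synthetic design hydrograph to feed the
        # 2D hydraulic model; the spread of pairs carries the hydrological
        # uncertainty into the probabilistic flood hazard maps.
        print(q[:3], v[:3])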

  4. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterises their vulnerabilities. RASOR also adapts the newly-developed 12m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications for different case study sites are presented in order to illustrate the platform's potential.

  5. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  6. A GIS-based software for lifeline reliability analysis under seismic hazard

    NASA Astrophysics Data System (ADS)

    Sevtap Selcuk-Kestel, A.; Sebnem Duzgun, H.; Oduncuoglu, Lutfi

    2012-05-01

    Lifelines are vital networks, and it is important that those networks remain functional after major natural disasters such as earthquakes. Assessing the reliability of lifelines requires spatial analysis of the networks with respect to a given earthquake hazard map. In this paper, a GIS-based software for the spatial assessment of lifeline reliability, developed in the GeoTools environment, is presented. The software imports seismic hazard and lifeline network layers and creates a gridded network structure. It then applies a network reliability algorithm to calculate the upper and lower bounds for the system reliability of the lifeline under seismic hazard. The software enables the user to visualize the reliability values in graphical form as well as in a thematic lifeline reliability map, with colors indicating the reliability level of each link and of the overall network. It also provides functions for saving the analysis results in shapefile format. The software is tested and validated on an application taken from the literature, a part of the water distribution system of Bursa, Turkey. The developed module, which creates a GIS-based reliability map of lifelines under seismic hazard, is user-friendly, modifiable, fast in execution, illustrative, and validated against existing literature studies.
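
    The flavour of the reliability calculation can be sketched with a Monte Carlo estimate of source-to-sink connectivity on a toy network; the paper computes analytic upper and lower bounds, so sampling here is a simple stand-in, and the network topology and edge failure probabilities are invented.

        import networkx as nx
        import numpy as np

        rng = np.random.default_rng(1)

        # Toy gridded lifeline: nodes are junctions, each edge carries a failure
        # probability taken from the seismic hazard layer (illustrative values).
        edges = [("src", "a", 0.05), ("src", "b", 0.10), ("a", "b", 0.02),
                 ("a", "sink", 0.08), ("b", "sink", 0.04)]

        def connected(sample_fail):
            """True if source and sink are still connected after removing
            the edges that failed in this sample."""
            g = nx.Graph()
            g.add_nodes_from(["src", "a", "b", "sink"])
            for (u, v, _), failed in zip(edges, sample_fail):
                if not failed:
                    g.add_edge(u, v)
            return nx.has_path(g, "src", "sink")

        p_fail = np.array([p for *_, p in edges])
        trials = 20_000
        ok = sum(connected(rng.random(len(edges)) < p_fail)
                 for _ in range(trials))
        print(f"estimated system reliability: {ok / trials:.3f}")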

  7. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  8. Guidance for establishment and implementation of field sample management programs in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-02-18

    The role of the National Sample Management Program (NSMP) proposed by the Department of Energy's Office of Environmental Management (EM) is to be a resource for EM programs and for local Field Sample Management Programs (FSMPs). It will be a source of information on sample analysis and data collection within the DOE complex. The purpose of this document is to establish the suggested scope of the FSMP activities to be performed under each Operations Office, list the drivers under which the program will operate, define terms and list references. This guidance will apply only to EM sampling and analysis activities associated with project planning, contracting, laboratory selection, sample collection, sample transportation, laboratory analysis and data management.

  9. Endangerment assessment guidance

    SciTech Connect

    Not Available

    1985-11-22

    The directive clarifies the requirement that an endangerment assessment be developed to support all administrative and judicial enforcement actions under Section 106 of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Section 7003 of the Resource Conservation and Recovery Act (RCRA). Before taking enforcement action under these provisions to abate the hazards or potential hazards at a site, the Environmental Protection Agency (EPA) must be able to properly document and justify its assertion that an imminent and substantial endangerment to public health or welfare or the environment may exist. The endangerment assessment provides this documentation and justification; it is not necessary to support Section 104 actions. The directive also provides guidance on the content, timing, level of detail, format, and resources required for the preparation of endangerment assessments.

  10. Hazard analysis of long term viewing of visible laser light off of fluorescent diffuse reflective surfaces (post-it).

    SciTech Connect

    Augustoni, Arnold L.

    2006-10-01

    A laser hazard analysis is performed to evaluate whether the use of fluorescent diffuse reflectors to view incident laser beams (Coherent Verdi, 10 W) presents a hazard, based on the ANSI Standard Z136.1-2000, American National Standard for the Safe Use of Lasers. The use of fluorescent diffuse reflectors in the alignment process does not pose an increased hazard, because the fluorescence occurs at a different wavelength from that of the incident laser.

  11. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time, which poses important decision problems for public-sector decision makers and calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructure. With a Geographical Information System coupled to a tool developed for risk analysis, it is possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  12. Analysis of Precipitation (Rain and Snow) Levels and Straight-line Wind Speeds in Support of the 10-year Natural Phenomena Hazards Review for Los Alamos National Laboratory

    SciTech Connect

    Kelly, Elizabeth J.; Dewart, Jean Marie; Deola, Regina

    2015-12-10

    This report provides site-specific return level analyses for rain, snow, and straight-line wind extreme events. These analyses are in support of the 10-year review plan for the assessment of meteorological natural phenomena hazards at Los Alamos National Laboratory (LANL). These analyses follow guidance from Department of Energy, DOE Standard, Natural Phenomena Hazards Analysis and Design Criteria for DOE Facilities (DOE-STD-1020-2012), Nuclear Regulatory Commission Standard Review Plan (NUREG-0800, 2007) and ANSI/ANS-2.3-2011, Estimating Tornado, Hurricane, and Extreme Straight-Line Wind Characteristics at Nuclear Facility Sites. LANL precipitation and snow level data have been collected since 1910, although not all years are complete. In this report the results from the more recent data (1990–2014) are compared to those of past analyses and a 2004 National Oceanographic and Atmospheric Administration report. Given the many differences in the data sets used in these different analyses, the lack of statistically significant differences in return level estimates increases confidence in the data and in the modeling and analysis approach.
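
    Return levels of this kind are typically obtained by fitting a generalized extreme value (GEV) distribution to annual maxima; a minimal sketch follows, with synthetic data standing in for the LANL record.

        import numpy as np
        from scipy.stats import genextreme

        # Annual maximum daily precipitation (inches) -- synthetic placeholder
        # values standing in for the 1990-2014 record.
        rng = np.random.default_rng(3)
        annual_max = genextreme.rvs(c=-0.1, loc=1.8, scale=0.5, size=25,
                                    random_state=rng)

        # Fit a GEV distribution to the annual maxima and read off return
        # levels, the standard block-maxima approach.
        c, loc, scale = genextreme.fit(annual_max)
        for T in (10, 100, 500):
            level = genextreme.isf(1.0 / T, c, loc, scale)
            print(f"{T:4d}-yr return level: {level:.2f} in")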

  13. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111
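
    A minimal sketch of the fixed-effects version of the combination step, inverse-variance weighting of patient-specific log cumulative hazard estimates across studies, follows; the study estimates and standard errors are invented for illustration.

        import numpy as np

        # Patient-specific estimates of the log cumulative hazard at the time
        # horizon of interest, one per study, with their standard errors (in
        # practice each comes from a Cox model evaluated at the patient's
        # covariates).
        log_cum_hazard = np.array([-1.90, -2.20, -1.75])
        se             = np.array([ 0.30,  0.25,  0.40])

        # Fixed-effects (inverse-variance) combination across studies.
        w = 1.0 / se**2
        combined = np.sum(w * log_cum_hazard) / np.sum(w)
        combined_se = np.sqrt(1.0 / np.sum(w))

        # Convert the pooled log cumulative hazard to an event risk:
        # risk = 1 - S(t) = 1 - exp(-exp(log H)).
        risk = 1.0 - np.exp(-np.exp(combined))
        print(f"pooled logH = {combined:.2f} (SE {combined_se:.2f}), "
              f"estimated event risk = {risk:.1%}")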

  14. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    SciTech Connect

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser the operating parameters of the laser had changed requiring a hazard analysis based on the new operating conditions. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  15. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.

  16. A formal analysis of attitudes toward siting a hazardous waste incinerator

    SciTech Connect

    Hatfield, T.H. )

    1989-07-01

    This paper presents an approach to the formal analysis of attitudes, and reviews an application of the approach to the siting of a hazardous waste incinerator in Southern California. The formal analysis uses tools from the field of decision analysis to assess attitudes, and uses statistical tools to determine differences and similarities among participants. The results suggest that attitudes toward three siting criteria are not polarized by group affiliation or county residence, but statistically significant differences were found between respondents contacted by mail versus public meetings. This approach demonstrates how issues may be better defined, and how strategies may be developed to improve research, education, and negotiation.

  17. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation

    PubMed Central

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-01-01

    The paper analyses the selected optical parameters of protective optic filters used for protection of the eyes against hazardous radiation within the visible (VIS) and near infrared (NIR) spectrum range. The indexes characterizing transmission and reflection of optic radiation incident on the filter are compared. As it follows from the completed analysis, the newly developed interference filters provide more effective blocking of infrared radiation in comparison with the currently used protective filters. PMID:26327153

  18. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation.

    PubMed

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-09-01

    The paper analyses the selected optical parameters of protective optic filters used for protection of the eyes against hazardous radiation within the visible (VIS) and near infrared (NIR) spectrum range. The indexes characterizing transmission and reflection of optic radiation incident on the filter are compared. As it follows from the completed analysis, the newly developed interference filters provide more effective blocking of infrared radiation in comparison with the currently used protective filters. PMID:26327153

  19. Laser Safety and Hazardous Analysis for the ARES (Big Sky) Laser System

    SciTech Connect

    AUGUSTONI, ARNOLD L.

    2003-01-01

    A laser safety and hazard analysis was performed for the ARES laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

    1. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

      SciTech Connect

      Lambert, H; Johnson, G

      2001-07-20

      The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events) or failures of engineered controls (hardware, software or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can also address statistical dependency of events, such as a sequence of human error events performed by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
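
      A minimal sketch of that final quantification step, multiplying the initiating-event frequency by the branch probabilities along an accident path, follows; the frequencies and probabilities are illustrative, not values from the HEAF SAR.

          # Minimal event-tree quantification: the frequency of an accident end
          # state is the initiating-event frequency times the product of the
          # probabilities along the bold downward branches (illustrative numbers).
          initiating_freq = 50.0          # operations per year
          branches = {
              "operator fails to verify grounding": 1e-2,   # human error
              "hard crowbar fails":                 1e-3,   # engineered control
              "soft crowbar fails (same software)": 1e-1,   # dependent probability,
              # elevated because it shares software with the hard crowbar
          }

          end_state_freq = initiating_freq
          for event, p in branches.items():
              end_state_freq *= p

          print(f"accident end-state frequency: {end_state_freq:.1e} per year")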

    2. Ethical guidance in the era of managed care: an analysis of the American College of Healthcare Executives' Code of Ethics.

      PubMed

      Higgins, W

      2000-01-01

      Market competition and the rise of managed care are transforming the healthcare system from a physician-dominated cottage industry into a manager-dominated corporate enterprise. The managed care revolution is also undermining the safeguards offered by medical ethics and raising serious public concerns. These trends highlight the growing importance of ethical standards for managers. The most comprehensive ethical guidance for health service managers is contained in the American College of Healthcare Executives' (ACHE) Code of Ethics. An analysis of the ACHE Code suggests that it does not adequately address several ethical concerns associated with managed care. The ACHE may wish to develop a supplemental statement regarding ethical issues in managed care. A supplemental statement that provides more specific guidance in the areas of financial incentives to reduce utilization, social mission, consumer/patient information, and the health service manager's responsibility to patients could be extremely valuable in today's complex and rapidly changing environment. More specific ethical guidelines would not ensure individual or organizational compliance. However, they would provide professional standards that could guide decision making and help managers evaluate performance in managed care settings. PMID:11066951

    3. WHC-SD-W252-FHA-001, Rev. 0: Preliminary fire hazard analysis for Phase II Liquid Effluent Treatment and Disposal Facility, Project W-252

      SciTech Connect

      Barilo, N.F.

      1995-05-11

      A Fire Hazards Analysis was performed to assess the risk from fire and other related perils and the capability of the facility to withstand these hazards. This analysis will be used to support design of the facility.

    4. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

      SciTech Connect

      Klein, F.W.

      1994-03-28

      This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

    5. Inventory- or Consequence-Based Evaluation of Hazardous Chemicals: Recommendations for DOE Facility Safety Analysis

      SciTech Connect

      Blanchard, A.

      1999-06-09

      Two different methods are in use for establishing the safety of facilities, processes, or operations involving hazardous chemicals. One sets inventory limits using Occupational Safety and Health Administration (OSHA) threshold quantity (TQ), Environmental Protection Agency (EPA) Clean Air Act Amendment threshold quantity (CAA-TQ), threshold planning quantity (TPQ), or reportable quantity (RQ) values published in the Federal Register. The second method uses toxicological consequence estimates at different receptor points (e.g., facility boundary, 100 m, site boundary) against concentration limits established for this purpose, such as the EPA acute exposure guidance level (AEGL), emergency response planning guideline (ERPG), and immediately dangerous to life or health (IDLH) values. The two methods were compared by estimating the potential downwind concentrations of all chemicals on the OSHA Process Safety Management regulation TQ list. EPA CAA-TQ, TPQ, and RQ values were extracted for these chemicals. Only 61 of the 128 unique chemicals with TQs also have CAA-TQs, 60 have TPQs, and 78 have RQs, and only 8 of 60 TQs are less than the TPQ values for the same chemical. Conservative release fractions (at 25 degrees C) and dispersion conditions were used to calculate potential airborne concentrations at 100 m downwind of the assumed release of TQ quantities of each chemical. These calculations were repeated for all chemicals on the TQ list that also had CAA-TQs, TPQs or RQs, and the concentrations were compared with ERPG values wherever possible. Every TPQ-to-ERPG ratio was greater than unity. For RQs, none of 24 RQ-to-ERPG-1 ratios, 6 of 33 RQ-to-ERPG-2 ratios, and 11 of 33 RQ-to-ERPG-3 ratios were less than ten, and only one was less than unity. In other words, severe health consequences could result from potential releases of many of these chemicals. These results demonstrate the undesirability of using regulatory quantities established for different purposes to include these
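
      A generic Gaussian plume screening model is the usual way such downwind concentrations are estimated; a minimal sketch under assumed stable-atmosphere dispersion coefficients follows. It is not the report's exact dispersion tool, and the release quantity, wind speed and coefficients are assumptions.

          import numpy as np

          def plume_concentration(q_gs, u_ms, sigma_y, sigma_z, h_m, z_m=1.5):
              """Ground-level centreline concentration (g/m^3) from a Gaussian
              plume with ground reflection; a generic screening model."""
              return (q_gs / (2.0 * np.pi * u_ms * sigma_y * sigma_z)
                      * (np.exp(-(z_m - h_m) ** 2 / (2.0 * sigma_z ** 2))
                         + np.exp(-(z_m + h_m) ** 2 / (2.0 * sigma_z ** 2))))

          # Illustrative screening case: ~2.3 kg released over 10 minutes, light
          # wind, with assumed F-stability dispersion coefficients at 100 m.
          q = 2268.0 / 600.0   # grams per second
          conc = plume_concentration(q_gs=q, u_ms=1.5,
                                     sigma_y=4.0, sigma_z=2.3, h_m=1.0)
          print(f"centreline concentration at 100 m: {conc:.2e} g/m^3")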

    6. Hazard screening and proposals for prevention by occupational health service: an experiment with job load and hazard analysis at a Finnish construction company.

      PubMed

      Mattila, M; Kivi, P

      1991-01-01

      In this study a systematic method for workplace investigation was developed and then tested as part of the regular occupational health care procedures in the building trade. Workplace investigation is a concept which entails the analysis of hazards inherent in the work as well as assessment of their effects on workers' well-being. The aim of this paper is to evaluate the effectiveness of the workplace investigation method. The newly developed method, called Job Load and Hazard Analysis, has the following characteristics: a job analytic approach; the application of group problem-solving; and cooperation between occupational health professionals, occupational safety personnel, and line management. The method comprises the identification of health hazards, their assessment, and conclusions and proposals as to their prevention and follow-up. The method was tested as part of one constructor's actual occupational health care programme, over a 2.5-year period. The method worked well as a central component of preventive occupational health care. It yielded concrete data that could be applied to make the occupational health care programme better suited to preventing the hazards inherent in the building trade. The contents of the occupational health care programme were clearly enhanced, the number of preventive measures increased, and the organizational climate improved; the workers praised the increased emphasis on safety. More research is needed, eg in other production settings and to determine the most effective utilization of the data gathered by the method. PMID:2011001

    7. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

      NASA Astrophysics Data System (ADS)

      Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.

      2012-12-01

      Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers have been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake
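
      The conditional-probability composition described above combines ERF rupture probabilities with per-rupture exceedance probabilities: P(IM > x) = sum over ruptures of P(rup) * P(IM > x | rup). A minimal sketch, with invented rupture rates and lognormal ground-motion distributions standing in for CyberShake's simulated ones:

        # Toy seismic hazard curve: combine an earthquake rupture forecast (ERF)
        # with per-rupture ground-motion exceedance probabilities (GMPM).
        # Rupture rates and motion parameters are invented for illustration.
        import numpy as np
        from scipy.stats import lognorm

        ruptures = [
            # (annual rate, median spectral acceleration in g, lognormal sigma)
            (0.0100, 0.20, 0.6),
            (0.0020, 0.45, 0.6),
            (0.0005, 0.80, 0.6),
        ]

        x = np.logspace(-2, 0.3, 50)            # ground-motion levels (g)
        rate = np.zeros_like(x)                 # annual exceedance rate
        for annual_rate, median, sigma in ruptures:
            rate += annual_rate * lognorm.sf(x, s=sigma, scale=median)

        p50 = 1.0 - np.exp(-rate * 50.0)        # Poisson: P(exceed in 50 yr)
        print(f"P(SA > 0.4 g in 50 yr) = {np.interp(0.4, x, p50):.3f}")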

    8. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

      NASA Astrophysics Data System (ADS)

      Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

      2014-12-01

      The need for tsunami hazard assessment at Nuclear Power Plant (NPP) sites has been evident since the Fukushima event of 2011, and it is especially pressing in Korea because all NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave height, and the methodology follows that of seismic hazard analysis. Seismic hazard analysis has been performed with both deterministic and probabilistic methods; recently the probabilistic method has received more attention because the uncertainties of the hazard analysis can be treated with a logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from the seismic case; it is estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulations were performed and the wave parameters were estimated from them. To reduce the sensitivity introduced by the location of any single sampling point, the wave parameters were estimated from groups of sampling points. The probability density function of tsunami height was computed from the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from this density function. Tsunami hazards were calculated for each sampling group, and fractile curves expressing the uncertainties of the input parameters were estimated from these hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should be considered
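
      A minimal numeric sketch of the final computational step described above, turning a probability density over tsunami height into an exceedance curve; the Rayleigh-shaped density used here is an arbitrary stand-in for the study's simulated wave-height distribution.

        # Exceedance probability from a wave-height probability density:
        # P(H > h) = integral of the pdf from h to infinity. The pdf below
        # is an assumed stand-in, not the distribution from the study.
        import numpy as np

        h = np.linspace(0.0, 10.0, 1001)            # wave height grid (m)
        pdf = (h / 2.0) * np.exp(-(h / 2.0) ** 2)   # assumed Rayleigh-like pdf

        dh = h[1] - h[0]                            # trapezoidal cumulative integral
        cdf = np.concatenate(([0.0], np.cumsum((pdf[:-1] + pdf[1:]) / 2.0) * dh))
        exceedance = 1.0 - cdf

        print(f"P(H > 3 m) = {np.interp(3.0, h, exceedance):.3f}")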

    9. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

      PubMed Central

      2014-01-01

      Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

    10. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

      NASA Astrophysics Data System (ADS)

      Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

      2016-04-01

      The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilimek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics
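
      A toy version of such a weighted combination is sketched below; the parameters, their normalization to [0, 1], and the weights are invented for illustration and do not reproduce the study's equation.

        # Toy weighted multi-parameter hazard score for a glacial lake, in the
        # spirit of the regional assessment described above. Parameters and
        # weights are invented for illustration.

        def lake_hazard_score(params, weights):
            """Weighted sum of parameters already normalized to [0, 1]."""
            assert abs(sum(weights.values()) - 1.0) < 1e-9
            return sum(weights[k] * params[k] for k in weights)

        # Each parameter scaled to [0, 1], with 1 = most hazardous.
        lake = {
            "lake_growth_rate": 0.8,     # from multi-temporal satellite imagery
            "glacier_proximity": 0.9,    # distance from lake to glacier terminus
            "slope_above_lake": 0.6,     # potential for avalanche/rockfall entry
            "dam_freeboard": 0.4,        # from a digital elevation model
        }
        weights = {
            "lake_growth_rate": 0.2,
            "glacier_proximity": 0.3,
            "slope_above_lake": 0.3,
            "dam_freeboard": 0.2,
        }

        print(f"Hazard score: {lake_hazard_score(lake, weights):.2f}")  # 0..1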

    11. Hubble Space Telescope (HST) Fine Guidance Sensor Performance Analysis with Respect to Guide Stars

      NASA Astrophysics Data System (ADS)

      Armstrong, Amber; Taylor, Denise; Reinhart, Merle

      2016-06-01

      The Fine Guidance Sensors (FGS) onboard HST are among Hubble’s first-generation instruments still in operation today. After several servicing missions and even some refurbishment to the FGSs, the interferometers, although aged, perform at a level exceeding expectations. Since launch in April 1990, FGS3 has performed within operational standards while FGS1 & 2 underwent replacements or refurbishments during SM2, SM3A and SM4. FGS3 was used for astrometry science until 1999, when FGS1R took over that role. Currently FGS1R is the only FGS used as a primary science instrument. While very few observing programs request the FGS as their prime instrument, nearly all GO observations executed onboard HST use the interferometers to lock onto guide stars. Most observations execute successfully, but 1.93% of all orbits executed onboard HST have an issue that requires repeating the observations. Of all failed visits, roughly 65% are due to a variation of suboptimal guide star options that test the boundaries of FGS capabilities. In this poster we present the limitations of the FGS interferometers regarding guide stars and their surprising capabilities, and suggest specific guidelines for astronomers who are faced with “shared risk” opportunities for their observations. We hope to share this knowledge with the HST community to improve guide star execution onboard Hubble and to make comparisons to the future James Webb Space Telescope FGS instrument, where astronomers will take part in guide star selection.

    12. Simulation, design, and analysis for magnetic anchoring and guidance of instruments for minimally invasive surgery

      NASA Astrophysics Data System (ADS)

      Luo, Haifeng; Wilson, Emmanuel; Cleary, Kevin

      2012-02-01

      The exploration of natural orifice transluminal endoscopic surgery (NOTES) has brought considerable interest in magnetic anchoring of intracorporeal tools. The magnetic anchoring and guidance system (MAGS) is the concept of anchoring miniature in-vivo tools and devices to the parietal peritoneum by coupling with an external magnetic holder module placed on the skin surface. MAGS has been shown to be effective in anchoring passive tools such as in-vivo cameras or tissue retractors. The strength of the magnetic field and the magnet configurations employed depend on the size, shape and weight of the in-vivo tools, the coupling distance between internal and external modules, and physiological concerns such as tool interaction and tissue ischemia. This paper presents our effort to develop a better understanding of the coupling dynamics between a small in-vivo robot designed for tissue manipulation and an external MAGS handle used to position the in-vivo robot. Electromagnetic simulation software (Vizimag 3.19) was used to simulate coupling forces between a two-magnet configuration of the MAGS handle. A prototype model of the in-vivo robot and a two-magnet configuration of a MAGS handle were fabricated. Based on this study, we were able to identify an optimal design solution for a MAGS module given the mechanical constraints of the internal module design.
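
      For rough intuition about the scale of such coupling forces (the paper itself relies on field simulation, not this simplification), the axial attraction between two coaxial point dipoles falls off as the fourth power of separation, F = 3*mu0*m1*m2 / (2*pi*d^4). A small sketch under that point-dipole assumption, with illustrative magnetic moments and working distances:

        # Point-dipole estimate of the axial coupling force between an external
        # MAGS handle magnet and an in-vivo magnet across the abdominal wall.
        # Moments and separations are illustrative assumptions.
        import math

        MU0 = 4.0 * math.pi * 1e-7     # vacuum permeability (T*m/A)

        def axial_dipole_force(m1, m2, d):
            """Attractive force (N) between coaxial dipoles m1, m2 (A*m^2)
            separated by distance d (m); valid only when d >> magnet size."""
            return 3.0 * MU0 * m1 * m2 / (2.0 * math.pi * d ** 4)

        m_handle = 10.0    # external magnet moment (A*m^2), assumed
        m_tool = 1.0       # in-vivo magnet moment (A*m^2), assumed
        for d_mm in (30, 40, 50):
            f = axial_dipole_force(m_handle, m_tool, d_mm / 1000.0)
            print(f"d = {d_mm} mm: F = {f:.2f} N")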

    13. Design and analysis of field studies with bees: A critical review of the draft EFSA guidance.

      PubMed

      Bakker, Frank

      2016-07-01

      The specific protection goal, primary assessment endpoints, acceptable effect thresholds, and experimental design proposed in the European Food Safety Authority (EFSA) update of the bee guidance document are subjected to critical review. It is concluded that the negligible effect criteria were established without sufficient regulatory definition and without convincing scientific argumentation. For the assessment endpoints, effects on hive strength lack temporal definition and the reduction to numbers of bees is inappropriate to evaluate effects. Restricting mortality assessments to homing failure is not theoretically justified and specific criteria were incorrectly derived. The combination of acute effect estimates with models for chronic stressors biases the risk assessment, and a temporal basis for the acceptability of effects is missing. Effects on overwintering success cannot be experimentally assessed using the proposed criteria. The experimental methodology proposed is inappropriate, and the logistical consequences, in particular those related to replication and land use, are such that field studies are no longer a feasible option for the risk assessment. It may be necessary to explore new lines of thought for the set-up of field studies and to clearly separate experimentation from monitoring. Integr Environ Assess Manag 2016;12:422-428. © 2015 SETAC. PMID:26436642

    14. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

      PubMed

      Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

      2007-01-01

      Robotic assistance can help clinicians to improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described by the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed for the principles of HISIC, such as hazard identification, safety insurance control, safety critical limits, and monitoring and control. FMEA combined with UML can also be implemented to ensure the reliability of human operation. On the basis of a safety control index and fuzzy mathematics, a safety effectiveness value is outlined to assess the validity of safety insurance control for the robotic system. The above principles and methods are feasible and effective for hazard analysis during the development of the robotic system. PMID:18002189
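
      A minimal sketch of the FMEA bookkeeping such an analysis rests on: each failure mode is scored for severity, occurrence, and detection, and ranked by risk priority number (RPN = S * O * D). The failure modes and scores below are hypothetical, not taken from the paper.

        # Minimal Failure Mode and Effects Analysis (FMEA) table with risk
        # priority numbers (RPN = severity * occurrence * detection).
        # Failure modes and scores are hypothetical.

        failure_modes = [
            # (description, severity 1-10, occurrence 1-10, detection 1-10)
            ("Needle deviates from planned trajectory", 8, 4, 3),
            ("Encoder fault gives wrong needle depth", 9, 2, 5),
            ("Seed cartridge jams during deposition", 5, 5, 2),
        ]

        rpn_table = sorted(
            ((s * o * d, desc) for desc, s, o, d in failure_modes), reverse=True
        )
        for rpn, desc in rpn_table:
            flag = "  <-- mitigate first" if rpn >= 90 else ""
            print(f"RPN {rpn:3d}: {desc}{flag}")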

    15. Proportional-hazards models for improving the analysis of light-water-reactor-component failure data

      SciTech Connect

      Booker, J.B.; Johnson, M.E.; Easterling, R.G.

      1981-01-01

      The reliability of a power plant component may depend on a variety of factors (or covariates). If a single regression model can be specified to relate these factors to the failure rate, then all available data can be used to estimate and test for the effects of these covariates. One such model is a proportional hazards function that is specified as a product of two terms: a nominal hazard rate that is a function of time and a second term that is a function of the covariates. The purpose of this paper is to adapt two such models to LWR valve failure rate analysis, to compare the results, and to discuss the strengths and weaknesses of these applications.
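
      The two-term structure described above can be written h(t, x) = h0(t) * exp(beta . x). A minimal sketch with a Weibull baseline standing in for the nominal time-dependent rate; the covariates and coefficients are illustrative, not fitted to valve data.

        # Proportional hazards: failure rate = baseline hazard (time) times a
        # covariate term. Baseline and coefficients below are assumed values.
        import math

        def baseline_hazard(t, shape=1.5, scale=8760.0):
            """Weibull baseline hazard h0(t) (per hour), t in hours."""
            return (shape / scale) * (t / scale) ** (shape - 1.0)

        def hazard(t, covariates, betas):
            """h(t, x) = h0(t) * exp(sum(beta_i * x_i))."""
            lin = sum(b * x for b, x in zip(betas, covariates))
            return baseline_hazard(t) * math.exp(lin)

        # Hypothetical valve covariates: [operating temperature (scaled),
        # cycles per day (scaled)] and assumed regression coefficients.
        x_valve = [1.2, 0.5]
        betas = [0.4, 0.7]

        for t in (1000.0, 5000.0, 20000.0):
            print(f"t = {t:7.0f} h: h = {hazard(t, x_valve, betas):.2e} /h")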

    16. Remedial Investigation/Feasibility Study (RI/FS) process, elements and techniques guidance

      SciTech Connect

      Not Available

      1993-12-01

      This manual provides detailed guidance on Remedial Investigation/Feasibility Studies (RI/FSs) conducted pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) at Department of Energy (DOE) facilities. The purpose of the RI/FS (to assess the risk posed by a hazardous waste site and to determine the best way to reduce that risk) and its structure (site characterization, risk assessment, screening and detailed analysis of alternatives, etc.) are defined in the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) and further explained in the Environmental Protection Agency's (EPA's) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA (Interim Final) 540/G-89/004, OSWER Directive 9355.3-01, October 1988. Though issued in 1988, the EPA guidance remains an excellent source of information on the conduct and structure of an RI/FS. This document makes use of supplemental RI/FS-related guidance that EPA has developed since its initial document was issued in 1988, incorporates practical lessons learned in more than 12 years of experience in CERCLA hazardous site remediation, and, drawing on those lessons, introduces the Streamlined Approach For Environmental Restoration (SAFER), developed by DOE as a way to proceed quickly and efficiently through the RI/FS process at DOE facilities. Thus, as its title implies, this guidance is intended to describe in detail the process and component elements of an RI/FS, as well as techniques to manage the RI/FS effectively.

    17. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

      NASA Astrophysics Data System (ADS)

      Necmioglu, Ocal; Meral Ozel, Nurcan

      2015-04-01

      Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes accurate a priori assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and the lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a 'maximum earthquake' predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been performed at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each defined source, using the shallow-water finite-difference SWAN code (Mader, 2004), for the magnitude range 6.5 to the Mwmax defined for that bin, with an Mw increment of 0.1. Results show that not only the earthquakes resembling the

    18. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

      NASA Astrophysics Data System (ADS)

      Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

      2015-04-01

      A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy consisting of the Orco and Soana valleys. It is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of the relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to reconstruct the details of the events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment, and has therefore been selected for multi-hazard research in which several natural processes and their damaging effects over the land have been investigated. At least 250 deaths have been recorded in the area since the 18th century, due to 36 different severe hazardous events

    19. Hazardous waste crime: a contextual analysis of the offense and the offender

      SciTech Connect

      Rebovich, D.J.

      1986-01-01

      The goal of this study is to analyze hazardous waste offense and offender characteristics. Criminal case data were collected from four sample states (Maine, Maryland, New Jersey, and Pennsylvania). Files of disposed criminal cases charged between 1977 and 1984 were content-analyzed, and interviews were conducted with prominent hazardous waste crime enforcement personnel from the sample states. Areas of analysis include methods of crime commission, skills required for crime commission, patterns of criminal network relationships, and degree of syndicate crime influence. While there has been some previous speculation that hazardous waste criminal behavior is directed through centralized racketeering, the present study of known offenders found little evidence of syndicate crime family infiltration. Crimes occurred within small, informal networks of waste generators, waste transporters, the employees of treatment/storage/disposal (TSD) facilities and certain non-industry peripheral actors. The study concludes that, while attempts have been made by syndicate crime operatives to infiltrate, these attempts have failed largely due to features of criminal commission methods and to the inherent fragmentation of hauling and TSD firm interests.

    20. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

      SciTech Connect

      Augustoni, Arnold L.

      2004-02-01

      A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
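
      The two quantities named above have closed forms under the usual ANSI Z136.1 small-source, circular-beam assumptions: NOHD = (sqrt(4*Phi/(pi*MPE)) - a) / phi for output power Phi, exit beam diameter a, and full-angle divergence phi, and ODmin = log10(H0/MPE) for a worst-case exposure H0. A sketch with illustrative beam parameters, not the AURA system's:

        # Nominal Ocular Hazard Distance (NOHD) and minimum eyewear optical
        # density (OD) for a CW laser, using the standard small-source formulas.
        # The beam parameters below are illustrative assumptions.
        import math

        def nohd_cm(power_w, mpe_w_cm2, div_rad, exit_diam_cm):
            """Range beyond which beam irradiance falls below the MPE (cm)."""
            return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2))
                    - exit_diam_cm) / div_rad

        def od_min(power_w, beam_diam_cm, mpe_w_cm2):
            """Minimum optical density so transmitted irradiance <= MPE."""
            irradiance = power_w / (math.pi * (beam_diam_cm / 2.0) ** 2)
            return math.log10(irradiance / mpe_w_cm2)

        power = 5.0          # W, assumed CW output
        mpe = 1.0e-3         # W/cm^2, assumed MPE for wavelength/exposure time
        divergence = 0.5e-3  # rad, assumed full-angle divergence
        exit_diam = 1.0      # cm, assumed beam diameter at the aperture

        print(f"NOHD = {nohd_cm(power, mpe, divergence, exit_diam) / 1e5:.2f} km")
        print(f"OD_min at aperture = {od_min(power, exit_diam, mpe):.1f}")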

    1. A deterministic analysis of tsunami hazard and risk for the southwest coast of Sri Lanka

      NASA Astrophysics Data System (ADS)

      Wijetunge, J. J.

      2014-05-01

      This paper describes a multi-scenario, deterministic analysis carried out as a pilot study to evaluate the tsunami hazard and risk distribution on the southwest coast of Sri Lanka. The hazard and risk assessment procedure adopted was also assessed against available field records of the impact of the Indian Ocean tsunami in 2004. An evaluation of numerically simulated nearshore tsunami amplitudes corresponding to 'maximum-credible' scenarios from different subduction segments in the Indian Ocean surrounding Sri Lanka suggests that a seismic event similar to the one that generated the tsunami in 2004 can still be considered the 'worst-case' scenario for the southwest coast. Furthermore, it appears that the formation of edge waves trapped by the primary waves diffracting around the southwest significantly influences the nearshore tsunami wave field and is largely responsible for relatively higher tsunami amplitudes in certain stretches of the coastline under study. The extent of inundation from numerical simulations corresponding to the worst-case scenario shows good overall agreement with the points of maximum penetration of inundation from field measurements in the aftermath of the 2004 tsunami. It can also be seen that the inundation distribution is strongly influenced by onshore topography. The present study indicates that the mean depth of inundation could be utilised as a primary parameter to quantify the spatial distribution of the tsunami hazard. The spatial distribution of the risk of the tsunami hazard to the population and residential buildings, computed by employing the standard risk formula, shows satisfactory correlation with published statistics of the affected population and the damage to residential property during the tsunami in 2004.

    2. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

      NASA Astrophysics Data System (ADS)

      Wong, F. L.; Venturato, A. J.; Geist, E. L.

      2006-12-01

      Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. Geological Survey Open-file Report 2006

    3. The hazard analysis and critical control point system in food safety.

      PubMed

      Herrera, Anavella Gaitan

      2004-01-01

      The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series. PMID:15156035

    4. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

      NASA Astrophysics Data System (ADS)

      Candela, Angela; Tito Aronica, Giuseppe

      2014-05-01

      Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic change from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current and future flood hazard and risk, in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and both they and the joint estimate of flood risk are affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating flood discharge for a chosen probability of exceedance is to apply a rainfall-runoff model and associate with the resulting flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with both the flood discharge and the corresponding risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from (a) a classical univariate approach and (b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
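
      A minimal sketch of the bivariate step, assuming a Gaussian copula with Gumbel marginals for flood peak and volume; the correlation and marginal parameters are invented for illustration (the study itself would select the copula family and marginals from data).

        # Sample correlated (peak discharge, flood volume) pairs via a Gaussian
        # copula with Gumbel marginals. All parameters are illustrative.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        rho = 0.8                                  # assumed copula correlation
        n = 10000

        # 1. Correlated standard normals -> uniforms (the Gaussian copula).
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        u = stats.norm.cdf(z)

        # 2. Transform uniforms through assumed Gumbel marginals.
        peak = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=80.0)   # m^3/s
        volume = stats.gumbel_r.ppf(u[:, 1], loc=25.0, scale=8.0)   # 10^6 m^3

        # Empirical joint exceedance for a hypothetical design pair.
        p_joint = np.mean((peak > 500.0) & (volume > 45.0))
        print(f"P(peak > 500 m3/s and volume > 45 Mm3) ~= {p_joint:.4f}")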

    5. Seismicity and earthquake hazard analysis of the Teton-Yellowstone region, Wyoming

      NASA Astrophysics Data System (ADS)

      White, Bonnie J. Pickering; Smith, Robert B.; Husen, Stephan; Farrell, Jamie M.; Wong, Ivan

      2009-11-01

      hypocenters, unified magnitudes, and seismotectonic analysis helped refine the characterization of the background seismicity that was used as input into a probabilistic seismic hazard analysis. Our results reveal that the highest seismic hazard is associated with the Teton fault because of its high slip rate of approximately 1.3 mm/yr, compared with the highest rate of 1.4 mm/yr on the Mt. Sheridan fault in southern Yellowstone. This study demonstrates that the Teton-Yellowstone area is among the regions of highest seismic hazard in the western U.S.

    6. Risk Analysis of Coastal hazard Considering Sea-level Rise and Local Environment in Coastal Area

      NASA Astrophysics Data System (ADS)

      Sangjin, P.; Lee, D. K.; KIM, H.; Ryu, J. E.; Yoo, S.; Ryoo, H.

      2014-12-01

      Recently, natural hazards have become more unpredictable, with increasing frequency and strength due to climate change. Coastal areas in particular are expected to be more vulnerable in the future because of sea-level rise (SLR). Korea is surrounded by oceans and has many large cities in coastal areas, so a hazard prevention plan for the coastal zone is essential; prior to making such a plan, however, the first step is to find the areas at risk. To find vulnerable areas, the local characteristics of coastal areas should be considered along with SLR. The objective of this research is therefore to identify areas that could be damaged by coastal hazards, considering the local environment of coastal areas and SLR. The spatial scope of the research was set to 1 km from the coastline, in accordance with the 'coastal management law' in Korea. The assessment was carried out up to the year 2050 using the highest sea-level rise scenario. For the risk analysis, biophysical and socioeconomic characteristics were considered so as to represent the local character of the coastal area. Risk was analyzed as the combination of the 'possibility of hazard' and the 'level of damage', both of which reflect the above-mentioned regional characteristics. Since the scope was narrowed to inundation from typhoons, the possibility of inundation caused by a typhoon was estimated using a numerical model that calculates the height of the storm surge considering wave, tide, sea-level pressure, and SLR. The level of damage was estimated by categorizing the socioeconomic character into four factors: human, infrastructure, ecology, and socioeconomic. Variables representing each factor were selected and used in the damage estimation, with their classification and weighting values. The results show that urban coastal areas are more vulnerable and hazardous than other areas because of socioeconomic factors. The east and the south coast are

    7. Transportation of Hazardous Evidentiary Material.

      SciTech Connect

      Osborn, Douglas.

      2005-06-01

      This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included, and short- and long-term recommendations are provided. The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazard Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these hazard classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis. Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, and will not react chemically with the substance being

    8. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

      PubMed Central

      Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

      2016-01-01

      Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
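
      As a reminder of the structural difference the comparison turns on, the two models can be written as follows (standard survival-analysis notation, assumed here rather than taken from the paper):

        % Proportional hazards (PH): covariates scale the hazard multiplicatively.
        \[ h(t \mid x) = h_0(t)\, \exp(\beta_{\mathrm{PH}}\, x) \]
        % Accelerated failure time (AFT): covariates shift log survival time.
        \[ \log T = \beta_0 + \beta_{\mathrm{AFT}}\, x + \sigma \varepsilon \]
        % For Weibull-distributed times with shape parameter k the two models
        % coincide, with beta_PH = -k * beta_AFT.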

    9. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

      NASA Astrophysics Data System (ADS)

      Tao, Kunwang; Wang, Liang; Qian, Xinlin

      2015-04-01

      Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate their effects is to evaluate the risk of landslides and take measures to avoid losses in advance. Various factors must be considered in landslide risk assessment, so the assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment is proposed here, carried out in three steps: first, GIS technology is used to divide the area into an analysis grid serving as the basic analysis unit; second, the available information (slope, lithology, faults, land use, etc.) is taken as the set of evaluation factors; finally, based on years of observed landslide data, the risk assessment is realized with a multi-factor constrained assessment model in which the weight of every factor is determined by an information model. The region of Gongliu, located in the Ili River basin of Xinjiang at altitudes of 600 to 4000 m, was selected as the experimental area; it is elongated east to west, narrow north to south, and its abundant rainfall causes frequent landslides. Using a 500 m x 500 m analysis grid covering the whole study area, a comprehensive assessment of landslide risk in the region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. Statistically, 94.04% of the observed landslide hazard points fall in the moderately high risk and high risk areas, 4.64% in the low risk zone, and 1.32% in the lowest risk zone. The results show a high probability of landslides in the areas assessed as high risk, which shows that
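
      A small sketch of the information-value weighting idea mentioned above: each factor class is scored by I = ln((N_i/N) / (S_i/S)), comparing the landslide density inside the class with the overall density. The counts below are invented for illustration.

        # Information value model for weighting landslide factor classes:
        # I = ln( (landslide cells in class / all landslide cells) /
        #         (cells in class          / all cells) ).
        # Positive I -> class is more landslide-prone than average.
        import math

        total_cells = 40000        # all 500 m x 500 m grid cells in the region
        total_slides = 500         # all observed landslide cells

        factor_classes = {
            # class name: (cells in class, landslide cells in class)
            "slope 0-15 deg": (18000, 40),
            "slope 15-30 deg": (15000, 210),
            "slope > 30 deg": (7000, 250),
        }

        for name, (cells, slides) in factor_classes.items():
            info = math.log((slides / total_slides) / (cells / total_cells))
            print(f"{name:>16s}: I = {info:+.2f}")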

    10. Frequency agile laser safety & hazard analysis for the Sandia Remote Sensing System LIDAR.

      SciTech Connect

      Augustoni, Arnold L.

      2009-05-01

      A laser safety and hazard analysis was performed for the Raytheon Frequency Agile Laser (FAL) to be used with the Sandia Remote Sensing System (SRSS) B-70 Trailer, based on the 2007 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The B-70 SRSS LIDAR system is a portable platform, which is used to perform laser interaction experiments and tests at various national test sites.

    11. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

      PubMed

      Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

      2004-01-01

      The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Convention for Vegetables Protection (ICPV), among others, contributes to ensuring the innocuity of food along the agro-alimentary chain and requires Good Manufacturing Practices (GMP), which are legislated in most countries, for its implementation. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of enforcing the legislation and suggests some policy implications for food safety. PMID:15656068

    12. School food safety program based on hazard analysis and critical control point principles. Final rule.

      PubMed

      2009-12-15

      This final rule implements a legislative provision which requires school food authorities participating in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) to develop a school food safety program for the preparation and service of school meals served to children. The school food safety program must be based on the hazard analysis and critical control point (HACCP) system established by the Secretary of Agriculture. The food safety program will enable schools to take systematic action to prevent or minimize the risk of foodborne illness among children participating in the NSLP and SBP. PMID:20169679

    13. Application of hazard analysis and critical control point system in the dairy industry.

      PubMed

      Kassem, M; Salem, E; Ahwal, A M; Saddik, M; Gomaa, N F

      2002-01-01

      This study aimed to assess the hygiene quality of some packaged milk (pasteurized or sterilized) and dairy products before and after application of a hazard analysis and critical control point (HACCP) system at a milk and dairy products company in Cairo, Egypt. The steps taken to put HACCP in place are described and the process was monitored to assess its impact. Assessment of the hygiene quality of the milk and dairy products before and after HACCP showed an improvement in quality and an overall improvement in the conditions at the company. PMID:15330567

    14. Formal analysis of values under conditions of uncertainty applied to the siting of hazardous waste facilities

      SciTech Connect

      Hatfield, T.H.

      1985-01-01

      A major impediment to the siting of hazardous waste facilities is the reaction of the public to such a locally unwanted land use. The siting controversy is exacerbated by uncertainty in the information available to evaluate candidate sites. In response to this problem, a formal analysis of values has been developed that treats the siting of hazardous waste facilities as a multiattribute group decision under conditions of uncertainty. The design phase of the formal analysis includes the selection of participants, the structuring of objectives and criteria, and the verification of the independence conditions that determine the aggregation rules for the evaluation criteria. The evaluation phase of the formal analysis uses a questionnaire to assess the following attitudes of each participant for each criterion: the relative weighting and ranking of each criterion; the participant's values under conditions of certainty for the various levels of a criterion, given as a measurable value function v(x); values under conditions of uncertainty, given as a von Neumann-Morgenstern utility function u(x); and risk attitudes towards the uncertain performance of a site, given as the relative risk premium RRP, where RRP = u(x) - v(x). The aggregation of individual evaluations is also a part of the evaluation phase.
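
      A small numeric sketch of the risk-attitude measure defined above, comparing an assumed concave (risk-averse) utility u(x) with a linear value function v(x); both curves are illustrative stand-ins for functions elicited from participants.

        # Relative risk premium RRP(x) = u(x) - v(x), with both functions scaled
        # to [0, 1] over the criterion range. The shapes of u and v below are
        # assumed, standing in for curves elicited from questionnaire responses.
        import math

        def v(x):
            """Value under certainty: linear in the criterion level."""
            return x

        def u(x, r=2.0):
            """Utility under uncertainty: exponential (risk-averse for r > 0),
            normalized so that u(0) = 0 and u(1) = 1."""
            return (1.0 - math.exp(-r * x)) / (1.0 - math.exp(-r))

        for x in (0.25, 0.5, 0.75):
            rrp = u(x) - v(x)   # positive for a risk-averse participant
            print(f"x = {x:.2f}: v = {v(x):.2f}, u = {u(x):.2f}, RRP = {rrp:+.2f}")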

    15. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

      NASA Astrophysics Data System (ADS)

      Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

      2016-04-01

      Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of such cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

    16. A hazards-model analysis of the covariates of infant and child mortality in Sri Lanka.

      PubMed

      Trussell, J; Hammerslough, C

      1983-02-01

      The purpose of this paper is twofold: (a) to provide a complete self-contained exposition of estimating life tables with covariates through the use of hazards models, and (b) to illustrate this technique with a substantive analysis of child mortality in Sri Lanka, thereby demonstrating that World Fertility Survey data are a valuable source for the study of child mortality. We show that life tables with covariates can be easily estimated with standard computer packages designed for analysis of contingency tables. The substantive analysis confirms and supplements an earlier study of infant and child mortality in Sri Lanka by Meegama. Those factors found to be strongly associated with mortality are mother's and father's education, time period of birth, urban/rural/estate residence, ethnicity, sex, birth order, age of the mother at the birth, and type of toilet facility. PMID:6832431

    17. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

      SciTech Connect

      Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

      2010-11-01

      Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000-year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System were input to the Hydrologic Engineering Center's River Analysis System hydrodynamic flood routing model.

    18. Uncertainty in the estimates of peak ground acceleration in seismic hazard analysis

      NASA Astrophysics Data System (ADS)

      Pavlenko, V. A.

      2015-11-01

      Probabilistic seismic hazard analysis has become a standard procedure preceding antiseismic construction. An important component of the relevant calculations is the allowance for uncertainty in the strong-motion parameters (e.g., peak ground acceleration (PGA)). In present-day approaches to probabilistic analysis, this uncertainty is modeled by a random variable (a residual) which has a lognormal distribution. With this model, extrapolation into the range of long return periods yields nonzero probabilities of unrealistically high PGA. In the present work, the distribution of the logarithmic PGA residuals is modeled by different parametric distributions. From this set of distributions, the one which provides the closest approximation to the empirical data is selected by statistical criteria. The analysis shows that the generalized extreme value distribution (GEVD) most accurately reproduces the residuals of the logarithmic PGA, and that the tail of the distribution is approximated by the generalized Pareto distribution (GPD).
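
      A minimal sketch of the tail-modeling step, with synthetic residuals standing in for the study's data: exceedances over a high threshold are fitted with a generalized Pareto distribution.

        # Fit a generalized Pareto distribution (GPD) to the upper tail of
        # log-PGA residuals (peaks-over-threshold). The residuals here are
        # synthetic stand-ins for the empirical ones analyzed in the study.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        residuals = rng.normal(0.0, 0.7, size=5000)   # synthetic log-PGA residuals

        threshold = np.quantile(residuals, 0.95)      # high threshold for the tail
        excess = residuals[residuals > threshold] - threshold

        # Fit the GPD to the excesses, holding the location at zero.
        shape, loc, scale = stats.genpareto.fit(excess, floc=0.0)
        print(f"threshold = {threshold:.3f}, shape = {shape:.3f}, scale = {scale:.3f}")

        # Tail exceedance probability beyond a level y > threshold; the factor
        # 0.05 is P(residual > threshold) by construction of the 95% quantile.
        y = threshold + 1.0
        p_tail = 0.05 * stats.genpareto.sf(y - threshold, shape, loc=0.0, scale=scale)
        print(f"P(residual > {y:.2f}) ~= {p_tail:.2e}")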

    19. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

      NASA Astrophysics Data System (ADS)

      Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

      2015-08-01

      Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at analysing fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by

    20. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

      USGS Publications Warehouse

      Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

      2004-01-01

      -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.

    1. A first hazard analysis of the Quaternary Harrat Al-Madinah volcanic field, Saudi Arabia

      NASA Astrophysics Data System (ADS)

      El Difrawy, M. A.; Runge, M. G.; Moufti, M. R.; Cronin, S. J.; Bebbington, M.

      2013-11-01

      The northern portion of the 20,000 km2 Harrat Rahat basaltic field in NW Saudi Arabia (Harrat Al-Madinah) has hosted some of the most recent volcanic eruptions in the country. Rapid growth of the city of Al-Madinah has spread out onto the lava flows and scoria cones of the Harrat, increasing exposure to any potential renewed volcanism. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from this volcanic field. The 501 visible eruption vent sites were integrated with aeromagnetic survey data (as representative of potential regions of buried volcanic vents) to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard terminating < 10 km south of the city. Using the properties of the AD 1256 eruption lava flows and the spatial PDF, an analysis of lava hazard was carried out. Assuming a future lava-producing eruption, around 25% of the city of Al-Madinah is exposed to a probability of 0.001 to 0.005 of lava inundation. The temporal eruption recurrence rate is estimated at approximately one vent per 3300 years, but the temporal record of the field is so poorly constrained that the lower and upper bounds for the recurrence interval are 70 yrs and 13,300 yrs, respectively. Applying a Poisson temporal model, this results in a worst-case lava inundation recurrence interval of approximately 14,300 years.
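
      A sketch of the spatial-intensity step described above, with synthetic vent coordinates standing in for the 501 mapped vents: a 2-D Gaussian kernel density estimate over vent locations gives the relative likelihood of a new vent opening at any point.

        # Spatial probability density of new vent locations via 2-D Gaussian
        # kernel smoothing of past vent sites. The coordinates are synthetic.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        # Synthetic vents clustered along a roughly N-S trend (km coordinates).
        vents = np.vstack([
            rng.normal(0.0, 3.0, 500),     # easting scatter about the trend
            rng.normal(0.0, 20.0, 500),    # elongated northing scatter
        ])

        kde = gaussian_kde(vents)          # bandwidth chosen by Scott's rule

        # Relative likelihood of a new vent at two candidate sites (km).
        for site in [(0.0, 5.0), (15.0, 5.0)]:
            density = kde(np.array(site).reshape(2, 1))[0]
            print(f"site {site}: relative vent density = {density:.2e}")

        # Combined with a Poisson rate (about one vent per 3300 yr), this
        # density scales into a per-area probability of vent opening in time.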

    2. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

      NASA Astrophysics Data System (ADS)

      Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

      2005-12-01

      The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state of the art in PSHA using multiple expert opinions, has been fully applied only twice: first in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in the latter project, the objective of which was a comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10,000,000. Following the SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground motion prediction was performed by combining several empirical ground motion models within a logic tree framework, with the weights on each logic tree branch expressing the personal degree-of-belief of each ground-motion expert. In the present paper, we critically review the current state of ground motion prediction methodology in PSHA, in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground motion models exacts a huge price in ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA for the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge, which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next
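
      The logic-tree treatment described here - several empirical GMPEs combined with degree-of-belief weights - reduces, for a single scenario and intensity level, to a weighted sum of branch exceedance probabilities. A minimal sketch with invented medians, sigmas, and weights:

      ```python
      import numpy as np
      from scipy.stats import norm

      # Hypothetical ln-median PGA (g), aleatory sigmas (ln units), and expert
      # degree-of-belief weights for three GMPE branches of a logic tree.
      ln_medians = np.array([-2.30, -2.10, -2.45])
      sigmas = np.array([0.60, 0.55, 0.70])
      weights = np.array([0.5, 0.3, 0.2])          # must sum to 1

      def p_exceed(ln_level):
          """Weight-averaged probability that ln(PGA) exceeds ln_level."""
          branch_p = norm.sf(ln_level, loc=ln_medians, scale=sigmas)
          return float(np.dot(weights, branch_p))

      pga = 0.3  # g
      print(f"P(PGA > {pga} g | scenario) = {p_exceed(np.log(pga)):.4f}")
      ```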

    3. Communicating Pacific Rim Risk: A GIS Analysis of Hazard, Vulnerability, Population, and Infrastructure

      NASA Astrophysics Data System (ADS)

      Yurkovich, E. S.; Howell, D. G.

      2002-12-01

      Exploding population and unprecedented urban development within the last century have helped fuel an increase in the severity of natural disasters. Not only has the world become more populated, but people, information and commodities now travel greater distances to service larger concentrations of people. While many of the earth's natural hazards remain relatively constant, understanding the risk to increasingly interconnected and large populations requires an expanded analysis. To improve mitigation planning we propose a model that is accessible to planners and implemented with public domain data and industry standard GIS software. The model comprises 1) the potential impact of five significant natural hazards (earthquake, flood, tropical storm, tsunami and volcanic eruption), assessed by a comparative index of risk; 2) population density; 3) infrastructure distribution, represented by a proxy; 4) the vulnerability of the elements at risk (population density and infrastructure distribution); and 5) the connections and dependencies of our increasingly 'globalized' world, portrayed by a relative linkage index. We depict this model with the equation Risk = f(H, E, V, I), where H is an index normalizing the impact of the five major categories of natural hazards; E is one element at risk, population or infrastructure; V is a measure of the vulnerability of the elements at risk; and I is a measure of the interconnectivity of the elements at risk as a result of economic and social globalization. We propose that future risk analysis include the variable I to better define and quantify risk. Each assessment reflects different repercussions from natural disasters: losses of life or economic activity. Because population and infrastructure are distributed heterogeneously across the Pacific region, two contrasting representations of risk emerge from this study.
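
      One way to make the Risk = f(H, E, V, I) formulation concrete is to combine normalized indices per location. The functional form below (a product of the local terms, with I acting as an amplifier) is purely an illustrative assumption, not the authors' model:

      ```python
      import numpy as np

      # Hypothetical normalized indices (0-1) for three locations: hazard H,
      # element at risk E, vulnerability V, and interconnectivity I.
      H = np.array([0.8, 0.5, 0.3])
      E = np.array([0.9, 0.4, 0.7])
      V = np.array([0.6, 0.7, 0.2])
      I = np.array([0.9, 0.3, 0.5])

      # One possible instantiation of Risk = f(H, E, V, I): a product of the
      # local terms, amplified by interconnectivity. Illustrative only.
      risk = H * E * V * (1.0 + I)
      for city, r in zip(("A", "B", "C"), risk):
          print(f"location {city}: relative risk index {r:.2f}")
      ```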

    4. Mission operations data analysis tools for Mars Observer guidance and control

      NASA Technical Reports Server (NTRS)

      Kan, Edwin P.

      1994-01-01

      Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented, with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

    5. System hazards in managing laboratory test requests and results in primary care: medical protection database analysis and conceptual model

      PubMed Central

      Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John

      2015-01-01

      Objectives To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients. To integrate these data with other published evidence sources to inform design of a systems-based conceptual model of related hazards. Design A retrospective database analysis. Setting General practices in the UK and Ireland. Participants 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Main outcome measures Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. Results CRSA visits were undertaken to 778 UK and Ireland general practices; a range of system hazards was recorded across the laboratory test ordering and results management systems in 647 of these practices (83.2%). A total of 45 discrete hazard categories were identified, with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was an inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the ‘postanalytical test stage’ (n=702, 43.8%), followed closely by ‘communication outcomes issues’ (n=628, 39.1%). Conclusions Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. PMID:26614621

    6. Teacher Guidance of Algebraic Formula Building: Functional Grammatical Analysis of a Whole-Class Conversation

      ERIC Educational Resources Information Center

      Zolkower, Betina; Shreyar, Sam; Pérez, Silvia

      2015-01-01

      How does teacher-guided whole-class interaction contribute to expanding students' potential for making and exchanging mathematical meanings? We address this question through an interpretative analysis of a whole-group conversation in a sixth grade class taught by an experienced teacher in a school in Southern Argentina. The extended interaction…

    7. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

      NASA Technical Reports Server (NTRS)

      Massie, Michael J.; Morris, A. Terry

      2010-01-01

      Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

    8. Seismic hazard analysis of Western Venezuela methodology and data collection process

      SciTech Connect

      Savy, J.

      1988-06-01

      The purpose of this project, initiated by INTEVEP, is to develop a seismic hazard characterization for the region of the Lake Maracaibo basin in Western Venezuela (WV), with an emphasis on the sites of dikes built on the eastern side of the lake. One task of the project was to assess the seismicity of this region and to describe it in a form which can be used as input to a seismic hazard analysis. The seismic parameters of interest were seismo-tectonic zonation, distribution of earthquake magnitudes, and the largest earthquake, i.e., upper magnitude cutoff. Because it is difficult, or perhaps impossible, to precisely quantify such seismic parameters using only the sparse historical record, expert judgment is crucial. Thus two panels of experts were assembled to supplement the sparse record with opinions on possible models of zonation and seismicity (S-panel) and ground motion attenuation (G-panel). In addition to the ground motion models described in the ground motion questionnaire handed to the G-panel experts, we performed some analysis at LLNL to develop a new breed of models in which the parameter of interest is the number of cycles of a given fraction of the peak ground acceleration, as a function of magnitude and distance from the source of an earthquake, rather than the peak ground acceleration (PGA) as a function of magnitude and distance. 20 refs.

    9. Seismic source models for probabilistic hazard analysis of Georgia (Southern Caucasus)

      NASA Astrophysics Data System (ADS)

      Javakhishvili, Z.; Godoladze, T.; Gamkrelidze, E.; Sokhadze, G.

      2014-12-01

      A seismic source model is one of the main components of probabilistic seismic hazard analysis. Active faults and the tectonics of Georgia (Southern Caucasus) have been investigated in numerous scientific studies. The Caucasus consists of different geological structures with complex interactions. The major structures trend WNW-ESE, and focal mechanisms indicate primarily thrust faults striking parallel to the mountains. The region is part of the Alpine-Himalayan collision belt and is well known for its high seismicity. Although the geodynamic activity of the region, caused by the convergence of the Arabian and Eurasian plates at a rate of several cm/year, is well established, different tectonic models have been proposed to explain the seismic process in the region. The most recent model of seismic sources for the Caucasus derives from seismotectonic studies performed in Georgia in the framework of different international projects. We have analyzed previous studies and recent investigations on the basis of new seismic data (spatial distribution, moment tensor solutions, etc.), GPS, and other data. As a result, a database of seismic source models was compiled. Seismic sources are modeled as lines representing the surface projection of active faults or as wide areas (source zones), where earthquakes can occur randomly. Each structure or zone was quantified on the basis of different parameters. Recent experience in the harmonization of cross-border structures was used. As a result, a new seismic source model of Georgia (Southern Caucasus) for hazard analysis was created.

    10. Fire hazards analysis for W-413, West Area Tank Farm Storage and Staging Facility

      SciTech Connect

      Huckfeldt, R.A.; Lott, D.T.

      1994-12-14

      In accordance with DOE Order 5480.7A, a Fire Hazards Analysis must be performed for all new facilities. The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of the Order are met. The Order acknowledges a graded approach commensurate with the hazards involved. Tank Farms Operations must store/stage material and equipment such as pipes, fittings, conduit, instrumentation, and other related items until work packages are ready to work. Consumable materials, such as nuts, bolts, and welding rod, are also required to be stored for routine and emergency work. Connex boxes and open storage are currently used for much of the storage because of the limited space at and 272WA. Safety issues arising from poor housekeeping and material deterioration due to weather damage have resulted from this inadequate storage space. It has been determined that a storage building in close proximity to the Tank Farm work force would be cost effective. This facility is classified as a safety class 4 building.

    11. Analysis, biomedicine, collaboration, and determinism challenges and guidance: wish list for biopharmaceuticals on the interface of computing and statistics.

      PubMed

      Goodman, Arnold F

      2011-11-01

      I have personally witnessed processing advance from desk calculators and mainframes, through timesharing and PCs, to supercomputers and cloud computing. I have also witnessed resources grow from too little data into almost too much data, and from theory dominating data into data beginning to dominate theory while needing new theory. Finally, I have witnessed problems advance from simple in a lone discipline into becoming almost too complex in multiple disciplines, as well as approaches evolve from analysis driving solutions into solutions by data mining beginning to drive the analysis itself. How we do all of this has transitioned from competition overcoming collaboration into collaboration starting to overcome competition, as well as what is done being more important than how it is done has transitioned into how it is done becoming as important as what is done. In addition, what or how we do it being more important than what or how we should actually do it has shifted into what or how we should do it becoming just as important as what or how we do it, if not more so. Although we have come a long way in both our methodology and technology, are they sufficient for our current or future complex and multidisciplinary problems with their massive databases? Since the apparent answer is not a resounding yes, we are presented with tremendous challenges and opportunities. This personal perspective adapts my background and experience to be appropriate for biopharmaceuticals. In these times of exploding change, informed perspectives on what challenges should be explored with accompanying guidance may be even more valuable than the far more typical literature reviews in conferences and journals of what has already been accomplished without challenges or guidance. Would we believe that an architect who designs a skyscraper determines the skyscraper's exact exterior, interior and furnishings or only general characteristics? Why not increase dependability of conclusions in

    12. Hazards analysis and prediction from remote sensing and GIS using spatial data mining and knowledge discovery: a case study for landslide hazard zonation

      NASA Astrophysics Data System (ADS)

      Hsu, Pai-Hui; Su, Wen-Ray; Chang, Chy-Chang

      2011-11-01

      Due to its particular geographical location and geological conditions, Taiwan suffers from many natural hazards which often cause serious property damage and loss of life. To reduce the damage and casualties, an effective real-time system for hazard prediction and mitigation is necessary. In this study, a case study for Landslide Hazard Zonation (LHZ) is tested using Spatial Data Mining and Knowledge Discovery (SDMKD) from a database. Many different kinds of geospatial data, such as terrain elevation, land cover types, distance to roads and rivers, geology maps, NDVI, and monitored rainfall data, are collected into the database for SDMKD. In order to guarantee the data quality, spatial data cleaning is essential to remove the noise, errors, outliers, and inconsistencies hiding in the input spatial data sets. In this paper, Kriging interpolation is used to calibrate the QPESUMS rainfall data against the rainfall observations from rain gauge stations to remove the data inconsistency. After the data cleaning, an artificial neural network (ANN) is applied to generate the LHZ map throughout the test area. The experiment results show that the accuracy of the LHZ is about 92.3% with the ANN analysis, and that landslides induced by heavy rainfall can be mapped efficiently from remotely sensed images and geospatial data using SDMKD technologies.
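
      The calibration step described above - interpolating the gauge-versus-QPESUMS bias over the study area - can be approximated with Gaussian-process regression, a close relative of kriging. A minimal sketch with hypothetical gauge locations and biases (scikit-learn used purely for illustration, not the authors' implementation):

      ```python
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(1)
      # Hypothetical gauge locations (km) and the gauge-minus-QPESUMS bias (mm).
      gauge_xy = rng.uniform(0.0, 50.0, size=(25, 2))
      bias = 5.0 * np.sin(gauge_xy[:, 0] / 10.0) + rng.normal(0.0, 0.5, 25)

      # Interpolate the bias field (kriging-style) over the study area grid.
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), alpha=0.25)
      gp.fit(gauge_xy, bias)

      gx, gy = np.meshgrid(np.linspace(0, 50, 11), np.linspace(0, 50, 11))
      bias_field = gp.predict(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
      # Calibrated rainfall = gridded QPESUMS estimate + interpolated bias.
      print("interpolated bias range:",
            round(float(bias_field.min()), 2), "to", round(float(bias_field.max()), 2))
      ```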

    13. Earthquake Cluster Analysis for Turkey and its Application for Seismic Hazard Assessment

      NASA Astrophysics Data System (ADS)

      Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

      2015-04-01

      Earthquake clusters are an important element in general seismology and also for applications in seismic hazard assessment. In probabilistic seismic hazard assessment, the occurrence of earthquakes is often linked to an independent Monte Carlo process, following a stationary Poisson model. But earthquakes are dependent and constrained, especially in terms of earthquake swarms, fore- and aftershocks, or even larger sequences as observed for the Landers sequence in California or the Darfield-Christchurch sequence in New Zealand. For earthquake catalogues, declustering is an important step to capture earthquake frequencies by avoiding a bias towards small magnitudes due to aftershocks. On the other hand, declustered catalogues for independent probabilistic seismic activity will underestimate the total number of earthquakes by neglecting dependent seismicity. In this study, the effect of clusters on probabilistic seismic hazard assessment is investigated in detail. To capture the features of earthquake clusters, a uniform framework for earthquake cluster analysis is introduced using methodologies of geostatistics and machine learning. These features represent important cluster characteristics like cluster b-values, temporal decay, rupture orientations and many more. Cluster parameters are mapped in space using kriging. Furthermore, a detailed data analysis is undertaken to provide magnitude-dependent relations for various cluster parameters. The acquired features are used to introduce dependent seismicity within stochastic earthquake catalogues. In addition, the development of smooth seismicity maps based on historic databases is in general biased toward the more complete recent decades. A filling methodology is introduced which adds dependent seismicity to catalogues where none has been recorded, to avoid the above-mentioned bias. As a case study, Turkey has been chosen due to its inherent seismic activity and well-recorded data coverage. Clustering

    14. Fire Hazard Analysis for the Cold Vacuum Drying facility (CVD) Facility

      SciTech Connect

      SINGH, G.

      2000-09-06

      The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuel (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel and transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cited, as applicable. This FHA comprehensively assesses the risk of fire at the CVDF to ascertain whether the specific objectives of DOE 5480.7A are met. These specific fire protection objectives are: (1) Minimize the potential for the occurrence of a fire. (2) Ensure that fire does not cause an onsite or offsite release of radiological and other hazardous material that will threaten the public health and safety or the environment. (3) Establish requirements that will provide an acceptable degree of life safety to DOE and contractor personnel and ensure that there are no undue hazards to the public from fire and its effects in DOE facilities. (4) Ensure that vital DOE programs will not suffer unacceptable delays as a result of fire and related perils. (5) Ensure that property damage from fire and related perils does not exceed an acceptable level. (6) Ensure that process control and safety systems are not damaged by fire or related perils. This FHA is based on the

    15. Mission planning, mission analysis and software formulation. Level C requirements for the shuttle mission control center orbital guidance software

      NASA Technical Reports Server (NTRS)

      Langston, L. J.

      1976-01-01

      The formulation of Level C requirements for guidance software was reported. Requirements for a PEG supervisor which controls all input/output interfaces with other processors and determines which PEG mode is to be utilized were studied in detail. A description of the two guidance modes for which Level C requirements have been formulated was presented. Functions required for proper execution of the guidance software were defined. The requirements for a navigation function that is used in the prediction logic of PEG mode 4 were discussed. It is concluded that this function is extracted from the current navigation FSSR.

    16. Current practices in cancer spatial data analysis: a call for guidance

      PubMed Central

      Pickle, Linda Williams; Waller, Lance A; Lawson, Andrew B

      2005-01-01

      There has long been a recognition that place matters in health, from recognition of clusters of yellow fever and cholera in the 1800s to modern day analyses of regional and neighborhood effects on cancer patterns. Here we provide a summary of discussions about current practices in the spatial analysis of georeferenced cancer data by a panel of experts recently convened at the National Cancer Institute. PMID:15649320

    17. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

      NASA Astrophysics Data System (ADS)

      Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

      2016-04-01

      Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the integration and visualization of spatial information interactively through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application integrates the different layers which are necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as return periods of 30, 100 and 300 years, based on Swiss guidelines), exposed objects (such as buildings), and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify this input information and these parameters to create different risk scenarios. Based on the resulting risk scenarios, users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective

    18. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

      NASA Astrophysics Data System (ADS)

      Elkady, Ahmed Mohamed Ahmed

      The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly in Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilon, which accounts for the spectral shape of the ground motion time history, is also presented.
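
      The epsilon parameter discussed above has a simple closed form: the number of logarithmic standard deviations separating the target ground motion from the GMPE median. A minimal sketch with invented numbers:

      ```python
      import numpy as np

      def epsilon(ln_sa_target, ln_sa_median, sigma_ln):
          """Number of (natural-log) standard deviations separating the target
          ground motion from the GMPE median prediction."""
          return (ln_sa_target - ln_sa_median) / sigma_ln

      # Hypothetical numbers: a UHS target of 0.35 g against a GMPE median of
      # 0.20 g with sigma = 0.65 in ln units.
      eps = epsilon(np.log(0.35), np.log(0.20), 0.65)
      print(f"epsilon = {eps:.2f}")   # ~0.86: target sits ~0.9 sigma above the median
      ```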

    19. Guidance for performing multivariate data analysis of bioprocessing data: pitfalls and recommendations.

      PubMed

      Rathore, Anurag S; Mittal, Shachi; Pathak, Mili; Arora, Arushi

      2014-01-01

      Biotech unit operations are often characterized by a large number of inputs (operational parameters) and outputs (performance parameters) along with complex correlations among them. A typical biotech process starts with the vial of the cell bank, ends with the final product, and has anywhere from 15 to 30 such unit operations in series. Besides the above-mentioned operational parameters, raw material attributes can also impact process performance and product quality as well as interact among each other. Multivariate data analysis (MVDA) offers an effective approach to gather process understanding from such complex datasets. Review of the literature suggests that the use of MVDA is rapidly increasing, fuelled by the gradual acceptance of quality by design (QbD) and process analytical technology (PAT) among the regulators and the biotech industry. Implementation of QbD and PAT requires enhanced process and product understanding. In this article, we first discuss the most critical issues that a practitioner needs to be aware of while performing MVDA of bioprocessing data. Next, we present a step-by-step procedure for performing such analysis. Industrial case studies are used to elucidate the various underlying concepts. With the increasing usage of MVDA, we hope that this article will be a useful resource for present and future practitioners of MVDA. PMID:24778085

    20. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

      NASA Astrophysics Data System (ADS)

      Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

      This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using an artificial neural network model derived not only from the data for that area but also using the weights for each parameter calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
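
      A hedged sketch of the core loop - back-propagation training of a small network on conditioning factors, with the resulting per-cell probability serving as the hazard index. The data are synthetic and scikit-learn stands in for whatever ANN implementation the authors used:

      ```python
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      # Synthetic stand-ins for the ten conditioning factors (slope, aspect,
      # curvature, distances, geology, landuse, soil, precipitation, NDVI, ...)
      # at 2000 grid cells, plus binary landslide-occurrence labels.
      X = rng.normal(size=(2000, 10))
      y = (X[:, 0] + 0.5 * X[:, 8] + rng.normal(0.0, 1.0, 2000) > 1.0).astype(int)

      Xs = StandardScaler().fit_transform(X)
      net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      net.fit(Xs, y)                       # back-propagation training

      # Hazard index per cell = predicted probability of landslide occurrence.
      # Cross-application would reuse weights trained on one area for another.
      hazard_index = net.predict_proba(Xs)[:, 1]
      print("mean hazard index:", round(float(hazard_index.mean()), 3))
      ```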

    1. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

      NASA Astrophysics Data System (ADS)

      Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

      2011-12-01

      The effects of seismicity induced by geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to be designed for the greatest efficiency from the social point of view, the system must be manageable in a way that reduces possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed
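
      Relaxing the homogeneous-Poisson assumption as described above amounts to letting the rate and b-value vary over the forecast window and integrating. A minimal numeric sketch with invented stimulation parameters:

      ```python
      import numpy as np

      # Hypothetical time-varying induced-seismicity rate (events/day above the
      # catalog floor m0) during a 30-day window, with a mid-sequence b-value shift.
      days = np.arange(30)
      rate = 2.0 * np.exp(-days / 10.0) + 0.2
      b = np.where(days < 10, 1.3, 1.0)

      m0, m = 0.5, 2.5
      frac_ge_m = 10.0 ** (-b * (m - m0))     # Gutenberg-Richter fraction with M >= m

      # Expected count of M >= m events over the window, and the corresponding
      # probability of at least one such event (inhomogeneous Poisson).
      n_expected = float(np.sum(rate * frac_ge_m))
      p_any = 1.0 - np.exp(-n_expected)
      print(f"E[N(M>={m})] = {n_expected:.3f}; P(at least one) = {p_any:.3f}")
      ```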

    2. An analysis of approach navigation accuracy and guidance requirements for the grand tour mission to the outer planets

      NASA Technical Reports Server (NTRS)

      Jones, D. W.

      1971-01-01

      The navigation and guidance process for the Jupiter, Saturn and Uranus planetary encounter phases of the 1977 Grand Tour interior mission was simulated. Reference approach navigation accuracies were defined and the relative information content of the various observation types were evaluated. Reference encounter guidance requirements were defined, sensitivities to assumed simulation model parameters were determined and the adequacy of the linear estimation theory was assessed. A linear sequential estimator was used to provide an estimate of the augmented state vector, consisting of the six state variables of position and velocity plus the three components of a planet position bias. The guidance process was simulated using a nonspherical model of the execution errors. Computation algorithms which simulate the navigation and guidance process were derived from theory and implemented into two research-oriented computer programs, written in FORTRAN.

    3. Development of guidance for states transitioning to new safety analysis tools

      NASA Astrophysics Data System (ADS)

      Alluri, Priyanka

      With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of the SAFETEA-LU (Safe, Accountable, Flexible, Efficient Transportation Equity Act---a Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway Safety Plan (SHSP) and incorporate data-driven approaches to prioritize and evaluate program outcomes; failure to do so resulted in funding sanctions. In conjunction with the legislation, research efforts have also been progressing toward the development of new safety analysis tools such as IHSDM (Interactive Highway Safety Design Model), SafetyAnalyst, and the HSM (Highway Safety Manual). These software and analysis tools are comparatively more advanced in statistical theory and level of accuracy, and tend to be more data-intensive. A review of the 2009 five-percent reports and excerpts from a nationwide survey revealed the continuing use of traditional methods, including crash frequencies and rates, for site selection and prioritization. The intense data requirements and statistical complexity of advanced safety tools are considered a hindrance to their adoption. In this context, this research aims at identifying the data requirements and data availability for SafetyAnalyst and the HSM by working with both tools. This research sets the stage for working with the Empirical Bayes approach by highlighting some of the biases and issues associated with traditional methods of selecting projects, such as the greater emphasis on traffic volume and the regression-to-the-mean phenomenon. Further, the not-so-obvious issue with shorter segment lengths, which affect the results independent of the methods used, is also discussed. The more reliable and statistically acceptable Empirical Bayes methodology requires safety performance functions (SPFs), regression equations predicting the relation between crashes
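
      The Empirical Bayes approach referenced here combines the SPF prediction with the site's observed count, weighted by the SPF's overdispersion. The formulation below is the standard one; the site numbers are invented:

      ```python
      def empirical_bayes_estimate(observed, predicted_per_yr, overdispersion, years):
          """EB-adjusted expected crash frequency (crashes/yr) for one site.

          predicted_per_yr: SPF prediction; overdispersion: negative-binomial
          parameter of the SPF. Weight w shifts toward the SPF when the SPF is
          precise and toward the observed count when it is not.
          """
          w = 1.0 / (1.0 + overdispersion * predicted_per_yr * years)
          return w * predicted_per_yr + (1.0 - w) * (observed / years)

      # Hypothetical site: 14 crashes in 3 years; SPF predicts 3.1 crashes/yr.
      eb = empirical_bayes_estimate(14, 3.1, overdispersion=0.24, years=3)
      print(f"EB estimate: {eb:.2f} crashes/yr")   # pulled between 3.10 and 4.67
      ```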

    4. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

      NASA Technical Reports Server (NTRS)

      Hong, Yang; Adler, Robert F.; Huffman, George J.

      2007-01-01

      Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone, landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to the lack of financial resources and inadequate disaster management ability. Recently, a landslide buried an entire village on the Philippine island of Leyte on February 17, 2006, with at least 1800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g., rainfall climatology, antecedent rainfall accumulation, and the intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful for assessing the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while receiving heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
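
      Empirical rainfall intensity-duration thresholds of the Caine type, I = a·D^(-b), are the usual way such satellite rainfall estimates are screened for landslide-triggering potential. A minimal sketch with illustrative coefficients (not the paper's calibrated values):

      ```python
      # Caine-style intensity-duration threshold I = A * D**(-B); storms plotting
      # above the curve are flagged as candidate landslide triggers. Coefficients
      # are illustrative assumptions.
      A, B = 12.0, 0.6   # mm/h, dimensionless

      def exceeds_threshold(intensity_mm_h, duration_h):
          return intensity_mm_h > A * duration_h ** (-B)

      # Hypothetical satellite-derived storms: (mean intensity mm/h, duration h).
      for intensity, duration in [(20.0, 2.0), (4.0, 6.0), (3.0, 48.0)]:
          flag = "ALERT" if exceeds_threshold(intensity, duration) else "ok"
          print(f"I={intensity:5.1f} mm/h, D={duration:5.1f} h -> {flag}")
      ```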

    5. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

      NASA Astrophysics Data System (ADS)

      Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

      2005-12-01

      Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating probabilistic seismic hazard curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that use empirically based attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in SHA will rely on the use of more physics-based waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies, including high performance computing and grid-based scientific workflows, in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERFs), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200 km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one or more Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are
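
      However the per-rupture intensity distributions are obtained - from empirical attenuation relationships or, as here, from waveform simulation - the hazard curve itself is assembled the same way: annual exceedance rate as a rate-weighted sum over ruptures. A minimal sketch with an invented three-rupture set and the simulated IM distribution idealized as lognormal:

      ```python
      import numpy as np
      from scipy.stats import norm

      # Hypothetical rupture set within 200 km: annual rates plus the median and
      # sigma of ln(SA) at the site, idealized here as lognormal per rupture.
      rates = np.array([1e-3, 4e-4, 2e-4])
      ln_medians = np.array([-2.5, -1.9, -1.6])
      sigmas = np.array([0.5, 0.5, 0.5])

      sa_levels = np.logspace(-2, 0, 200)       # 0.01 g to 1 g

      # Annual exceedance rate: sum over ruptures of rate_i * P(SA > x | rupture i).
      lam = np.zeros_like(sa_levels)
      for r, mu, s in zip(rates, ln_medians, sigmas):
          lam += r * norm.sf(np.log(sa_levels), loc=mu, scale=s)

      p50 = 1.0 - np.exp(-lam * 50.0)           # Poissonian 50-year exceedance prob.
      sa_2in50 = sa_levels[np.argmin(np.abs(p50 - 0.02))]
      print(f"SA with 2% in 50 yr: ~{sa_2in50:.2f} g")
      ```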

    6. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

      PubMed

      Clark, Renee M; Besterfield-Sacre, Mary E

      2009-03-01

      We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk. PMID:19087232

    7. Explosion hazard analysis for an enclosure partially filled with a flammable gas

      SciTech Connect

      Ogle, R.A.

      1999-11-01

      The constant volume explosion of a flammable gas is one of the more common accidental explosions. The explosion pressure at the stoichiometric condition is approximately 50 times greater than the failure pressure of most industrial structures. Observations from accident scenes suggest that some explosions are caused by a quantity of fuel significantly less than the stoichiometric amount required to fill an enclosure. This paper presents a method for analyzing the explosion hazard in an enclosure which is only partially filled with a flammable gas. The method, called the adiabatic mixing model, is based on thermodynamics and can be used to calculate the minimum fuel quantity which will yield a specified explosion pressure. Results are presented for a set of representative fuels and are compared with alternative explosion models. The results demonstrate that catastrophic structural damage can be achieved with a volume of flammable gas which is less than one percent of the enclosure volume. The method can be a useful tool for both hazard analysis and accident investigations.
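
      In the spirit of the adiabatic mixing model (the exact formulation is the paper's; the one-line pressure balance below is a first-order stand-in), the minimum fuel-air volume fraction needed to reach a target pressure can be estimated as follows, assuming ideal-gas mixing of the burned and unburned regions at constant total volume:

      ```python
      # First-order stand-in for an adiabatic mixing estimate: a fraction f of the
      # enclosure holds a stoichiometric mixture whose confined combustion would
      # raise pressure by a factor R; after ideal-gas mixing with the rest of the
      # air at constant total volume, P_final ~ P0 * (1 + f * (R - 1)).
      def min_mixture_fraction(p_target_bar, p0_bar=1.013, R=8.0):
          """Minimum fuel-air volume fraction reaching p_target_bar (absolute)."""
          return (p_target_bar / p0_bar - 1.0) / (R - 1.0)

      # Hypothetical enclosure failing at ~0.14 bar overpressure (~2 psig):
      f = min_mixture_fraction(p_target_bar=1.013 + 0.14)
      print(f"stoichiometric mixture needed: {f:.1%} of enclosure volume")  # ~2%
      ```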

    8. Safety analysis for the use of hazardous production materials in photovoltaic applications

      SciTech Connect

      Moskowitz, P.D.; Fthenakis, V.M.; Crandall, R.S.; Nelson, B.P.

      1993-12-31

      A wide range of hazardous production materials (HPMs) are used in industrial and university facilities engaged in research and development (R&D) related to semiconductor and photovoltaic devices. Because of the nature of R&D facilities where research activities are constantly changing, it is important for facility managers to pro-actively control the storage, distribution, use and disposal of these HPMs. As part of this control process, facility managers must determine the magnitude of the risk presented by their operations and the protection afforded by the administrative, engineering and personnel controls that have been implemented to reduce risks to life and property to acceptable levels. Facility auditing combined with process hazard analysis (PHA), provides a mechanism for identifying these risks and evaluating their magnitude. In this paper, the methods and results of a PHA for a photovoltaic R&D facility handling HPMs are presented. Of the 30 potential accidents identified, none present High or even Moderate Risks; 18 present Low Risks; and, 12 present Routine Risks. Administrative, engineering and personal safety controls associated with each accident are discussed. 15 refs., 2 figs., 6 tabs.

    9. Safety analysis for the use of hazardous production materials in photovoltaic applications

      SciTech Connect

      Moskowitz, P.D.; Fthenakis, V.M.; Crandall, R.S.; Nelson, B.P.

      1993-11-01

      A wide range of hazardous production materials (HPMs) are used in industrial and university facilities engaged in research and development (R and D) related to semiconductor and photovoltaic devices. Because of the nature of R and D facilities where research activities are constantly changing, it is important for facility managers to pro-actively control the storage, distribution, use and disposal of these HPMs. As part of this control process, facility managers must determine the magnitude of the risk presented by their operations and the protection afforded by the administrative, engineering and personnel controls that have been implemented to reduce risks to life and property to acceptable levels. Facility auditing combined with process hazard analysis (PHA), provides a mechanism for identifying these risks and evaluating their magnitude. In this paper, the methods and results of a PHA for a photovoltaic R and D facility handling HPMs are presented. Of the 30 potential accidents identified, none present High or even Moderate Risks; 18 present Low Risks; and, 12 present Routine Risks. Administrative, engineering and personal safety controls associated with each accident are discussed.

    10. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

      NASA Astrophysics Data System (ADS)

      Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

      2013-12-01

      During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with fieldwork for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multichannel Analysis of Surface Waves (MASW) analysis at
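
      Of the site-characterization techniques taught at the Institute, HVSR is the simplest to sketch: smoothed horizontal amplitude spectra divided by the vertical. The single-window toy below uses white noise in place of a real recording and boxcar smoothing in place of the usual Konno-Ohmachi smoothing; it is an illustration, not the Institute's processing chain:

      ```python
      import numpy as np

      def smooth(spec, width=64):
          """Boxcar-smooth an amplitude spectrum (stand-in for Konno-Ohmachi)."""
          kernel = np.ones(width) / width
          return np.convolve(spec, kernel, mode="same")

      def hvsr(north, east, vertical, fs):
          """Single-window Horizontal-to-Vertical Spectral Ratio sketch."""
          freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
          n, e, v = (smooth(np.abs(np.fft.rfft(x))) for x in (north, east, vertical))
          h = np.sqrt(n * e)              # geometric mean of the two horizontals
          return freqs, h / v

      # White-noise stand-in for a three-component ambient noise recording.
      rng = np.random.default_rng(3)
      fs, n_samp = 100.0, 2 ** 14
      north, east, vertical = (rng.normal(size=n_samp) for _ in range(3))

      freqs, ratio = hvsr(north, east, vertical, fs)
      band = (freqs > 0.5) & (freqs < 20.0)   # typical site-response band
      peak = np.argmax(ratio[band])
      print(f"HVSR peak {ratio[band][peak]:.1f} at {freqs[band][peak]:.2f} Hz")
      ```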

    11. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

      SciTech Connect

      Bloom, R.R.

      1996-04-01

      The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

    12. Meta-Analysis-Based Guidance for Dietary Management in Eosinophilic Esophagitis.

      PubMed

      Lucendo, Alfredo J

      2015-10-01

      Eosinophilic esophagitis (EoE) is a prevalent chronic esophageal disorder, triggered and maintained by immunologically mediated responses against dietary antigens. EoE represents the most recent form of food allergy, and its control by avoiding offending foods has increasingly appeared as a therapeutic alternative to achieve and maintain remission. Dietary therapies have proved equally effective in pediatric and adult EoE patients, among whom various types of interventions to eliminate or reduce food antigens exposure have been evaluated. A recent meta-analysis showed elemental diet as the most effective option to induce disease remission, but with a limited application in clinical practice. Inconsistency and wide variability in results from skin allergy testing-directed food restriction contributed to dissatisfaction with implementation of this option, which subsequently was displaced by empiric elimination of common food allergens. Such empiric elimination of common food allergens is now recognized as the best alternative for dietary treatment, with moderate-to-high efficiency and reproducible results. This review provides evidence-based insights into the dietary management of EoE. PMID:26292666

    13. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

      NASA Astrophysics Data System (ADS)

      Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.

      2014-03-01

      In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects which are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the location of the inferior ostia have higher variability than the superior ostia and the variability of the left atrial appendage is similar to the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.

    14. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

      NASA Astrophysics Data System (ADS)

      Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

      2011-12-01

      Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for the PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which in general occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

    15. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

      NASA Astrophysics Data System (ADS)

      Fuchs, Sven; Zischg, Andreas

      2014-05-01

      We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based nation-wide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenges of data accuracy, scale and uncertainties. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as being located within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those people compulsorily listed in the population register, are located in these areas. The analysis by building category yielded 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Out of the 140,500 commercial buildings, 8,000 (5 %) are exposed. Considerable spatial variation was detectable within the communities and Federal States. In general, above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings.
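
      A rule-based exposure query of this kind can be sketched with standard GIS tooling. The following illustrates the overlay logic only and is not the authors' SQL workflow; the file names and the `category`, `process` and `building_id` columns are assumptions:

      ```python
      import geopandas as gpd

      # Hypothetical inputs: building register points and hazard-zone polygons.
      buildings = gpd.read_file("building_register.gpkg")
      hazard = gpd.read_file("hazard_zones.gpkg")   # torrent/avalanche polygons

      # A building counts as exposed if it lies within any hazard polygon.
      exposed = gpd.sjoin(buildings, hazard[["geometry", "process"]],
                          predicate="within", how="inner")

      # Counts by building category and hazard process, plus the exposed share.
      print(exposed.groupby(["category", "process"]).size())
      share = exposed["building_id"].nunique() / len(buildings)
      print(f"exposed share of building stock: {share:.1%}")
      ```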

    16. Rapid, reliable geodetic data analysis for hazard response: Results from the Advanced Rapid Imaging and Analysis (ARIA) project

      NASA Astrophysics Data System (ADS)

      Owen, S. E.; Simons, M.; Hua, H.; Yun, S.; Cruz, J.; Webb, F.; Rosen, P. A.; Fielding, E. J.; Moore, A. W.; Polet, J.; Liu, Z.; Agram, P. S.; Lundgren, P.

      2013-12-01

      ARIA is a joint JPL/Caltech coordinated project to automate InSAR and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. Geodetic imaging's unique ability to capture surface deformation in high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with earthquakes in high spatial and temporal detail. In certain cases, it can be complementary to seismic data, providing constraints on location, geometry, or magnitude that are difficult to determine with seismic data alone. In addition, remote sensing with SAR provides change detection and damage assessment capabilities for earthquakes, floods and other disasters that can image even at night or through clouds. We have built an end-to-end prototype geodetic imaging data system that forms the foundation for a hazard response and science analysis capability that integrates InSAR, high-rate GPS, seismology, and modeling to deliver monitoring, science, and situational awareness products. This prototype incorporates state-of-the-art InSAR and GPS analysis algorithms from technologists and scientists. The products have been designed, and a feasibility study conducted, in collaboration with USGS scientists in the earthquake and volcano science programs. We will present results that show the capabilities of this data system in terms of latency, data processing capacity, quality of automated products, and feasibility of use for analysis of large SAR and GPS data sets and for earthquake response activities.

    17. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

      NASA Astrophysics Data System (ADS)

      van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

      2015-04-01

      Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim to analyse the effect of risk reduction planning alternatives on reducing risk now and in the future, and to support decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular; the various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to
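
      For the range of analyses mentioned above, a common summary produced from scenario-based risk analysis is the expected annual loss, obtained by integrating scenario losses over annual exceedance probability. A minimal sketch with hypothetical numbers, not the SDSS's actual routine:

      ```python
      def average_annual_loss(scenarios):
          """Expected annual loss from (annual exceedance probability, loss)
          pairs, one per modelled hazard scenario, via trapezoidal
          integration over exceedance probability."""
          pts = sorted(scenarios)                  # ascending probability
          aal = 0.0
          for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
              aal += 0.5 * (l0 + l1) * (p1 - p0)
          return aal   # mass beyond the rarest scenario is ignored

      # Hypothetical losses for 1000-, 100- and 10-year events (currency units).
      print(average_annual_loss([(0.001, 2.0e7), (0.01, 8.0e6), (0.1, 1.2e6)]))
      ```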

    18. Application of Hazard Analysis Critical Control Point (HACCP) to the handling of expressed breast milk on a neonatal unit.

      PubMed

      Hunter, P R

      1991-02-01

      Hazard Analysis Critical Control Point (HACCP) is a powerful procedure for ensuring quality and safety and has gained widespread use in industry. This paper describes the basic method, which is a four-stage process: process analysis, hazard identification, identification of critical control points, and devising effective control options. Suggestions are made as to how the method can be adapted to the control of hospital infection. A HACCP analysis of the supply of expressed human breast milk to babies on a Special Care Baby Unit is presented. Although only the mother's own milk was given to babies on the unit, several potential hazards were identified and suggestions have been made for their control. PMID:1674260

    19. Vertical Guidance Performance Analysis of the L1–L5 Dual-Frequency GPS/WAAS User Avionics Sensor

      PubMed Central

      Jan, Shau-Shiun

      2010-01-01

      This paper investigates the potential vertical guidance performance of a global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. The paper first investigates the airborne receiver code noise and multipath (CNMP) confidence (σair), which is the dominant factor in the availability analysis of an L1–L5 dual-frequency GPS/WAAS user avionics sensor. The MATLAB Algorithm Availability Simulation Tool (MAAST) is used to determine the values of σair required for an L1–L5 dual-frequency GPS/WAAS user avionics sensor to meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over the conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under which user configurations WAAS could be an APV II system or a CAT I system over CONUS. Furthermore, this paper examines combinations of possible improvements in signal models and the addition of Galileo to determine whether a GPS/WAAS user avionics sensor could achieve a 10 m Vertical Alert Limit (VAL) within the service volume. Finally, this paper presents the future vertical guidance performance of GPS user avionics sensors for the United States' WAAS, the Japanese MTSAT-based satellite augmentation system (MSAS) and the European geostationary navigation overlay service (EGNOS). PMID:22319263
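
      The availability logic being simulated can be illustrated with the standard SBAS protection-level check: each epoch, the vertical protection level VPL = K_V·σ_V is compared against the VAL. A hedged sketch with hypothetical satellite geometry and error budgets, where each per-satellite σ bundles the broadcast UDRE/UIRE terms with the airborne CNMP confidence σair:

      ```python
      import numpy as np

      K_V = 5.33   # fault-free vertical multiplier used in SBAS VPL equations

      def vpl(s_v, sigma):
          """Vertical protection level from the vertical row s_v of the
          weighted least-squares projection matrix and per-satellite
          error sigmas in metres (both hypothetical here)."""
          return K_V * np.sqrt(np.sum((s_v * sigma) ** 2))

      def availability(vpl_series, val=10.0):
          """Fraction of epochs meeting the Vertical Alert Limit,
          e.g. the 10 m VAL examined in the paper."""
          return float(np.mean(np.asarray(vpl_series) <= val))

      # One hypothetical epoch: 8 satellites, equal geometry and budgets.
      s_v = np.full(8, 0.35)
      sigma = np.full(8, 0.60)
      print(vpl(s_v, sigma), availability([vpl(s_v, sigma)]))
      ```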

    20. Use of bioassays in assessing health hazards from complex mixtures: A RASH analysis

      SciTech Connect

      Jones, T.D.

      1993-10-14

      The Finney harmonic mean model for joint toxicity of ingredients in mixtures can be used to estimate the toxicity of the neat compound if one component can be substituted in potency-adjusted doses for each of the other components. Chemical analysis data and relative potency values (computed according to the Rapid Screening of Hazard (RASH) method) were used to compare the toxicities as predicted from ingredients of cigarette smoke, PAHs in diesel exhaust, asphalt, coal tar, pitch, and creosote with the measured toxicities of the neat mixtures. Predictions for cigarette smoke condensate, coal tar, pitch, and creosote were accurate within a factor of three; asphalt within a factor of 18; but the PAC content of diesel particulate was inadequate to accurately describe the toxicity of diesel emissions.
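
      The Finney harmonic-mean model referenced here has a compact form: under similar joint action, the mixture's ED50 is the fraction-weighted harmonic mean of the component ED50s. A small illustration with invented numbers (in practice, RASH-derived relative potencies and measured concentrations supply the inputs):

      ```python
      def mixture_ed50(components):
          """Finney harmonic-mean model for similar joint action:
          ED50_mix = 1 / sum(f_i / ED50_i), with mass fractions f_i
          summing to 1 and ED50s in common dose units."""
          return 1.0 / sum(f / ed50 for f, ed50 in components.values())

      # Hypothetical three-ingredient mixture: (mass fraction, ED50 mg/kg).
      mix = {"potent_minor_component": (0.02, 5.0),
             "bulk_component_1":       (0.48, 200.0),
             "bulk_component_2":       (0.50, 400.0)}
      print(mixture_ed50(mix))   # ~131: dominated by the potent minor part
      ```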

    1. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

      SciTech Connect

      Suzette Payne

      2007-08-01

      This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

    2. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

      NASA Astrophysics Data System (ADS)

      Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

      2014-12-01

      We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. The redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust for the effort. However, as work began we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

    3. Use of bioassays in assessing health hazards from complex mixtures: a rash analysis.

      PubMed

      Jones, T D

      1995-07-01

      The Finney harmonic mean model for joint toxicity of ingredients in mixtures can be used to estimate the toxicity of the neat compound if each component can be substituted in potency-adjusted doses for any of the other components. Chemical analysis of constituent substances and their associated concentrations, together with relative toxicological potency values (computed according to the RApid Screening of Hazard (RASH) method), were used to compare the toxicities as predicted from ingredients of cigarette smoke, PAHs in diesel exhaust, asphalt, coal tar, pitch, and creosote with the measured toxicities of the corresponding neat mixtures. Predictions for cigarette smoke condensate, coal tar, pitch, and creosote were accurate within a factor of three based on the PAH fraction; asphalt was within a factor of 18; but the PAH fraction of diesel particulate was again found to be inadequate to describe the composite toxicity of diesel emissions. PMID:7545524

    4. Human body impedance for electromagnetic hazard analysis in the VLF to MF band

      SciTech Connect

      Kanai, H.; Chatterjee, I.; Gandhi, O.P.

      1984-08-01

      A knowledge of the average electrical impedance of the human body is essential for the analysis of electromagnetic hazards in the VLF to MF band. The purpose of the measurements was to determine the average body impedance of several human subjects as a function of frequency. Measurements were carried out with the subjects standing barefoot on a ground plane and touching various metal electrodes with the hand or index finger. The measured impedance includes the electrode polarization and skin impedances, spread impedance near the electrode, body impedance, stray capacitance between the body surface and ground, and inductance due to the body and grounding strap. These components are separated and simplified equivalent circuits are presented for body impedance of humans exposed to free-space electromagnetic waves as well as in contact with large ungrounded metallic objects therein.
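
      The equivalent-circuit reduction can be exercised numerically. A minimal sketch assuming a simplified topology (skin resistance shunted by skin capacitance, in series with the body resistance, all shunted by a stray capacitance to ground) with invented component values rather than the paper's measured ones:

      ```python
      import numpy as np

      def body_impedance(f, r_body=500.0, r_skin=10e3,
                         c_skin=20e-9, c_stray=100e-12):
          """|Z| in ohms of a simplified body equivalent circuit across
          the VLF-MF band; all component values are illustrative."""
          w = 2 * np.pi * np.asarray(f, float)
          z_skin = r_skin / (1 + 1j * w * r_skin * c_skin)   # parallel RC
          z_series = r_body + z_skin
          z_stray = 1.0 / (1j * w * c_stray)
          return np.abs(z_series * z_stray / (z_series + z_stray))

      freqs = np.logspace(4, 6, 5)   # 10 kHz to 1 MHz
      for f, z in zip(freqs, body_impedance(freqs)):
          print(f"{f:9.0f} Hz  {z:8.0f} ohm")
      ```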

    5. Expressed breast milk on a neonatal unit: a hazard analysis and critical control points approach.

      PubMed

      Cossey, Veerle; Jeurissen, Axel; Thelissen, Marie-José; Vanhole, Chris; Schuermans, Annette

      2011-12-01

      With the increasing use of human milk and growing evidence of the benefits of mother's milk for preterm and ill newborns, guidelines to ensure its quality and safety are an important part of daily practice in neonatal intensive care units. Operating procedures based on hazard analysis and critical control points can standardize the handling of mother's expressed milk, thereby improving nutrition and minimizing the risk of breast milk-induced infection in susceptible newborns. Because breast milk is not sterile, microorganisms can multiply when the milk is not handled properly. Additional exogenous contamination should be prevented. Strict hygiene and careful temperature and time control are important during the expression, collection, transport, storage, and feeding of maternal milk. In contrast to formula milk, no legal standards exist for the use of expressed maternal milk. The need for additional measures, such as bacteriological screening or heat treatment, remains unresolved. PMID:21640434

    6. 230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis

      USGS Publications Warehouse

      Paces, James B.

      2014-01-01

      This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

    7. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

      SciTech Connect

      Paces, James B.

      2014-08-31

      This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

    8. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

      SciTech Connect

      Suzette Payne

      2006-04-01

      This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

    9. Preliminary analysis of hazardous air pollutant emission inventories from three major urban areas

      SciTech Connect

      Jones, J.W.; Campbell, D.; Murphy, P.; Smith, R.

      1993-01-01

      The paper reports EPA/AEERL's progress on emissions inventory evaluation and improvement under a hazardous air pollutant (HAP) emissions research program in support of the Urban Area Source Program required under Title III of the Clean Air Act Amendments of 1990 (CAAA). The paper summarizes results of three current projects and indicates HAP emissions inventory needs. HAP inventories for three urban areas--Chicago, San Francisco, and Seattle/Tacoma--were analyzed to identify area sources as defined in the CAAA. One inventory focused on area sources; the other two were basically point source inventories that had facilities that met the area source definition. The HAPs that contribute most of the area source emissions in each inventory were identified, and 22 HAPs that were common to the inventories were selected for further analysis.

    10. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

      NASA Astrophysics Data System (ADS)

      Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

      2016-04-01

      Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructure networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase due to a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis, which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each expert perspective and its impact on the resulting susceptibility. A least-cost path function was used to compare new alternative road lines with existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historical incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
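
      The core SMCA computation is a weighted linear combination of normalized criterion rasters, with the weights supplied by expert judgement; sensitivity testing then amounts to perturbing those weights and re-running. A minimal sketch with hypothetical criteria and weights:

      ```python
      import numpy as np

      def susceptibility(criteria, weights):
          """Weighted linear combination for spatial multi-criteria analysis.
          `criteria` maps factor name -> 2D raster scaled to [0, 1];
          `weights` maps factor name -> expert weight. Returns a
          susceptibility raster in [0, 1]."""
          total = sum(weights.values())
          return sum(weights[k] * criteria[k] for k in weights) / total

      # Hypothetical 3x3 rasters for an inundation assessment.
      rng = np.random.default_rng(1)
      criteria = {k: rng.random((3, 3)) for k in ("slope", "soil", "land_use")}
      weights = {"slope": 0.5, "soil": 0.3, "land_use": 0.2}  # expert judgement
      print(susceptibility(criteria, weights))
      ```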

    11. An OSHA based approach to safety analysis for nonradiological hazardous materials

      SciTech Connect

      Yurconic, M.

      1992-08-01

      The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will only be those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations like petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

    12. An OSHA based approach to safety analysis for nonradiological hazardous materials

      SciTech Connect

      Yurconic, M.

      1992-08-01

      The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will only be those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations like petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

    13. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran

      SciTech Connect

      Sharifi, Mozafar Hadidi, Mosslem Vessali, Elahe Mosstafakhani, Parasto Taheri, Kamal Shahoie, Saber Khodamoradpour, Mehran

      2009-10-15

      The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which can be handled by GIS, an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of the GIS stage was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and based on the numerical evaluation provided by the MCDA the most suitable sites were determined.

    14. The value of integrating information from multiple hazards for flood risk analysis and management

      NASA Astrophysics Data System (ADS)

      Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

      2014-02-01

      This article presents a methodology for estimating flood risk in urban areas integrating pluvial flooding, river flooding and failure of both small and large dams. The first part includes a review of basic concepts on flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national levels; however, a general methodology to perform a quantitative flood risk analysis including different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.
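
      When the flood sources are treated as independent, their contributions at a location combine as P = 1 − Π(1 − p_i). A one-function sketch with hypothetical annual probabilities for the four sources considered in the paper:

      ```python
      def combined_annual_probability(p_sources):
          """Annual probability of flooding from at least one of several
          independent sources: P = 1 - prod(1 - p_i)."""
          q = 1.0
          for p in p_sources:
              q *= 1.0 - p
          return 1.0 - q

      # Hypothetical: pluvial, river, small-dam and large-dam failure.
      print(combined_annual_probability([0.02, 0.01, 1e-3, 1e-4]))  # ~0.031
      ```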

    15. DOE Waste Treatability Group Guidance

      SciTech Connect

      Kirkpatrick, T.D.

      1995-01-01

      This guidance presents a method and definitions for aggregating U.S. Department of Energy (DOE) waste into streams and treatability groups based on characteristic parameters that influence waste management technology needs. Adaptable to all DOE waste types (i.e., radioactive waste, hazardous waste, mixed waste, sanitary waste), the guidance establishes categories and definitions that reflect variations within the radiological, matrix (e.g., bulk physical/chemical form), and regulated contaminant characteristics of DOE waste. Beginning at the waste container level, the guidance presents a logical approach to implementing the characteristic parameter categories as part of the basis for defining waste streams and as the sole basis for assigning streams to treatability groups. Implementation of this guidance at each DOE site will facilitate the development of technically defined, site-specific waste stream data sets to support waste management planning and reporting activities. Consistent implementation at all of the sites will enable aggregation of the site-specific waste stream data sets into comparable national data sets to support these activities at a DOE complex-wide level.

    16. Guidance on priority setting in health care (GPS-Health): the inclusion of equity criteria not captured by cost-effectiveness analysis

      PubMed Central

      2014-01-01

      This Guidance for Priority Setting in Health Care (GPS-Health), initiated by the World Health Organization, offers a comprehensive map of equity criteria that are relevant to health care priority setting and should be considered in addition to cost-effectiveness analysis. The guidance, in the form of a checklist, is especially targeted at decision makers who set priorities at national and sub-national levels, and those who interpret findings from cost-effectiveness analysis. It is also targeted at researchers conducting cost-effectiveness analysis to improve reporting of their results in the light of these other criteria. The guidance was developed through a series of expert consultation meetings and involved three steps: i) methods and normative concepts were identified through a systematic review; ii) the review findings were critically assessed in the expert consultation meetings, which resulted in a draft checklist of normative criteria; iii) the checklist was validated through an extensive hearing process with input from a range of relevant stakeholders. The GPS-Health incorporates criteria related to the disease an intervention targets (severity of disease, capacity to benefit, and past health loss); characteristics of social groups an intervention targets (socioeconomic status, area of living, gender, race, ethnicity, religion and sexual orientation); and non-health consequences of an intervention (financial protection, economic productivity, and care for others). PMID:25246855

    17. Limited variance control in statistical low thrust guidance analysis. [stochastic algorithm for SEP comet Encke flyby mission

      NASA Technical Reports Server (NTRS)

      Jacobson, R. A.

      1975-01-01

      Difficulties arise in guiding a solar electric propulsion spacecraft due to nongravitational accelerations caused by random fluctuations in the magnitude and direction of the thrust vector. These difficulties may be handled by using a low thrust guidance law based on the linear-quadratic-Gaussian problem of stochastic control theory with a minimum terminal miss performance criterion. Explicit constraints are imposed on the variances of the control parameters, and an algorithm based on the Hilbert space extension of a parameter optimization method is presented for calculation of gains in the guidance law. The terminal navigation of a 1980 flyby mission to the comet Encke is used as an example.

    18. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

      NASA Astrophysics Data System (ADS)

      Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

      2016-06-01

      We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
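
      Step 4, the ensemble modelling, can be sketched as a weighted combination of hazard curves from the alternative model formulations, with weighted percentile curves expressing the epistemic spread. A minimal illustration with hypothetical curves and weights, not the study's models:

      ```python
      import numpy as np

      def ensemble_hazard(curves, weights, percentiles=(16, 50, 84)):
          """Weighted mean and weighted percentile hazard curves.
          `curves`: (n_models, n_levels) exceedance probabilities for a
          common set of tsunami intensity levels; `weights`: model
          credibilities (normalized internally)."""
          curves = np.asarray(curves, float)
          w = np.asarray(weights, float)
          w = w / w.sum()
          mean = w @ curves
          bands = {}
          for q in percentiles:
              pc = np.empty(curves.shape[1])
              for j in range(curves.shape[1]):
                  order = np.argsort(curves[:, j])
                  pc[j] = np.interp(q / 100.0,
                                    np.cumsum(w[order]), curves[order, j])
              bands[q] = pc
          return mean, bands

      # Three hypothetical alternative models over three intensity levels.
      curves = [[0.10, 0.010, 1e-3], [0.20, 0.020, 2e-3], [0.05, 0.005, 5e-4]]
      mean, bands = ensemble_hazard(curves, weights=[0.5, 0.3, 0.2])
      print(mean, bands[84])
      ```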

    19. The dilemma in prioritizing chemicals for environmental analysis: known versus unknown hazards.

      PubMed

      Sobek, Anna; Bejgarn, Sofia; Rudén, Christina; Breitholtz, Magnus

      2016-08-10

      A major challenge for society is to manage the risks posed by the many chemicals continuously emitted to the environment. Not all chemicals in production and use can be monitored, and science-based strategies for prioritization are essential. In this study we review available data to investigate which substances are included in environmental monitoring programs and published research studies reporting analyses of chemicals in Baltic Sea fish between 2000 and 2012. Our aim is to contribute to the discussion of priority settings in environmental chemical monitoring and research, which is closely linked to chemical management. In total, 105 different substances or substance groups were analyzed in Baltic Sea fish. Polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans (PCDD/Fs) and polychlorinated biphenyls (PCBs) were the most studied substances or substance groups. The majority, 87%, of all analyses comprised 20% of the substances or substance groups, whereas 46 substance groups (44%) were analyzed only once. Almost three quarters of all analyses concerned a POP substance (persistent organic pollutant). These results demonstrate that the majority of analyses of environmental contaminants in Baltic Sea fish concern a small number of already regulated chemicals. Legacy pollutants such as POPs pose a high risk to the Baltic Sea due to their hazardous properties. Yet, there is a risk that prioritizations for chemical analyses are biased toward the knowns of the past. Such biases may lead society to fail in identifying risks posed by yet unknown hazardous chemicals. Alternative and complementary ways to identify priority chemicals are needed. More transparent communication between risk assessments performed within REACH and monitoring programs, together with information on chemicals contained in consumer articles, would offer ways to identify chemicals for environmental analysis. PMID:27222376

    20. Use of hazard analysis critical control point and alternative treatments in the production of apple cider.

      PubMed

      Senkel, I A; Henderson, R A; Jolbitado, B; Meng, J

      1999-07-01

      The purpose of this study was to evaluate the practices of Maryland cider producers and determine whether implementing hazard analysis critical control point (HACCP) would reduce the microbial contamination of cider. Cider producers (n = 11) were surveyed to determine existing manufacturing practices and sanitation. A training program was then conducted to inform operators of safety issues, including contamination with Escherichia coli O157:H7, and teach HACCP concepts and principles, sanitation procedures, and good manufacturing practice (GMP). Although all operators used a control strategy from one of the model HACCP plans provided, only one developed a written HACCP plan. None developed specific GMP, sanitation standard operating procedures, or sanitation monitoring records. Six operators changed or added production controls, including the exclusion of windfall apples, sanitizing apples chemically and by hot dip, and cider treatment with UV light or pasteurization. Facility inspections indicated improved sanitation and hazard control but identified ongoing problems. Microbiological evaluation of bottled cider before and after training, in-line apples, pomace, cider, and inoculated apples was conducted. E. coli O157:H7, Salmonella, or Staphylococcus aureus were not found in samples of in-line apple, pomace, and cider, or bottled cider. Generic E. coli was not isolated on in-coming apples but was found in 4 of 32 (13%) in-line samples and 3 of 17 (18%) bottled fresh cider samples, suggesting that E. coli was introduced during in-plant processing. To produce pathogen-free cider, operators must strictly conform to GMP and sanitation procedures in addition to HACCP controls. Controls aimed at preventing or eliminating pathogens on source apples are critical but alone may not be sufficient for product safety. PMID:10419272

    1. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

      NASA Astrophysics Data System (ADS)

      Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

      2016-03-01

      We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: 1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; 2) numerical computation of tsunami generation and propagation up to a given offshore isobath; 3) (optional) site-specific quantification of inundation; 4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1), the event tree, specifically for SPTHA, focussing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that: i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

    2. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

      NASA Astrophysics Data System (ADS)

      Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

      2012-12-01

      ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation in high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters that can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough for informing decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on development of the prototype data system and demonstration data products, and example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki and M6.3 Christchurch earthquakes, the 2011 M7.1 Van earthquake, and several simulated earthquakes.

    3. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

      SciTech Connect

      Augustoni, Arnold L.

      2005-09-01

      A laser safety and hazard analysis is presented for the Coherent® driven Acculite® laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter the laser's NHZ during testing outside the trailer.
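
      The eyewear and standoff quantities follow standard ANSI Z136-style relations: the minimum optical density is OD_min = log10(H/MPE), and the NOHD for a circular CW beam is the range at which irradiance has fallen to the MPE. A hedged sketch with illustrative numbers, not the SRSS laser's actual parameters:

      ```python
      import math

      def od_min(exposure, mpe):
          """Minimum eyewear optical density: OD = log10(H / MPE),
          with worst-case exposure H and MPE in the same units."""
          return math.log10(exposure / mpe)

      def nohd_cm(power_w, mpe_w_cm2, beam_dia_cm, divergence_rad):
          """Nominal Ocular Hazard Distance (cm) for a circular CW beam,
          small-source form: where beam irradiance decays to the MPE."""
          return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2))
                  - beam_dia_cm) / divergence_rad

      # Illustrative values only.
      print(od_min(5.0, 1e-6))                       # OD ~ 6.7
      print(nohd_cm(1.0, 2.55e-3, 0.7, 1e-3) / 100)  # NOHD in metres, ~217 m
      ```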

    4. Guidance for performing site inspections under CERCLA

      SciTech Connect

      Not Available

      1992-09-01

      This guidance presents EPA's site inspection (SI) strategy. The strategy discusses procedural guidelines to investigate potential Superfund (CERCLA) sites for evaluation pursuant to the Hazard Ranking System (HRS), revised in accordance with the Superfund Amendments and Reauthorization Act of 1986. The HRS is the primary means by which EPA evaluates sites for Superfund's National Priorities List (NPL).

    5. Risk analysis procedure for post-wildfire natural hazards in British Columbia

      NASA Astrophysics Data System (ADS)

      Jordan, Peter

      2010-05-01

      Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

    6. Hazard function theory for nonstationary natural hazards

      NASA Astrophysics Data System (ADS)

      Read, L.; Vogel, R. M.

      2015-12-01

      Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
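
      The link between a nonstationary exceedance probability and the failure-time series can be written down directly: with exceedance probability p_t in year t, the first-exceedance pmf is f(t) = p_t·Π_{s<t}(1 − p_s), and its mean generalizes the stationary average return period 1/p. A minimal sketch using a Bernoulli-per-year approximation, not the paper's Poisson-GP derivation:

      ```python
      import numpy as np

      def waiting_time_stats(p):
          """pmf of the first exceedance year and the average return period
          for yearly exceedance probabilities p[0..n-1] (years 1..n).
          Reduces to 1/p under stationarity."""
          p = np.asarray(p, float)
          survive = np.cumprod(1.0 - p)
          pmf = p * np.concatenate(([1.0], survive[:-1]))   # f(t) = p_t S(t-1)
          years = np.arange(1, len(p) + 1)
          # Residual survival mass is lumped at year n+1 (slight underestimate).
          mean_wait = np.sum(years * pmf) + (len(p) + 1) * survive[-1]
          return pmf, mean_wait

      _, rp = waiting_time_stats(np.full(2000, 0.01))
      print(rp)    # ~100 years, the stationary check
      _, rp = waiting_time_stats(0.01 * (1 + 1e-3 * np.arange(2000)))
      print(rp)    # an upward trend in p shortens the average return period
      ```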

    7. A Critical Analysis of UNESCO's International Technical Guidance on School-Based Education for Puberty and Sexuality

      ERIC Educational Resources Information Center

      Goldman, Juliette D. G.

      2012-01-01

      Preparing children and adolescents for sexual safety and reproductive responsibility lies at the heart and purpose of puberty/sexuality education. The document of International Technical Guidance released by UNESCO in December 2009 aims to provide an evidence-based and rights-based platform offering children and adolescents vital knowledge about…

    8. 77 FR 9946 - Draft Guidance for Industry on Drug Interaction Studies-Study Design, Data Analysis, Implications...

      Federal Register 2010, 2011, 2012, 2013, 2014

      2012-02-21

      ... Federal Register of September 12, 2006 (71 FR 53696), FDA announced the availability of a draft guidance... in vivo studies of drug metabolism, drug transport, and drug-drug, or drug-therapeutic protein... metabolism and/or drug transport abruptly in individuals who previously had been receiving and tolerating...

    9. Analysis of targeting method for closed-loop guidance of a multi-stage space launch vehicle

      NASA Astrophysics Data System (ADS)

      Song, Eun-Jung; Cho, Sangbum; Roh, Woong-Rae

      2016-04-01

      This study considers the orbit insertion targeting problem for guidance of a multi-stage space launch vehicle. To design guidance algorithm for a particular mission target, the orbit designation method should be determined first, as the target orbit of closed-loop guidance during the upper stage flight is usually specified at the end of the final stage. We investigate some advantages of this single target approach over the one with intermediate targets in addition to the final target by comparing them in various flight conditions using an explicit guidance algorithm. We notice that in some cases application of the single targeting can be rather tricky and thus propose here an extended version of the single targeting algorithm that can be applied to problems where the target orbit plane has to be changed during the upper stages flight. We also demonstrate that this pseudo-single target method can be applied to these kinds of problems smoothly in contrast to the conventional single target algorithms by conducting numerical simulations for disturbed off-nominal conditions as well as for the nominal condition.

    10. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

      Federal Register 2010, 2011, 2012, 2013, 2014

      2010-02-24

      ... food authorities participating in the National School Lunch Program (NSLP) or the School Breakfast... participating in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) to develop a... Service 7 CFR Parts 210 and 220 RIN 0584-AD65 School Food Safety Program Based on Hazard Analysis...

    11. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

      Federal Register 2010, 2011, 2012, 2013, 2014

      2013-10-29

      ... Meeting on Proposed Rule AGENCY: Food and Drug Administration, HHS. ACTION: Notification of public meeting... proposed rule to establish requirements for current good manufacturing practice and hazard analysis and risk-based preventive controls for animal food. This proposed rule is one of several proposed...

    12. PROTOCOL FOR THE COLLECTION AND ANALYSIS OF VOLATILE POHCS (PRINCIPAL ORGANIC HAZARDOUS CONSTITUENTS) USING VOST (VOLATILE ORGANIC SAMPLING TRAIN)

      EPA Science Inventory

      The document is a state-of-the-art operating protocol for sampling and analysis of volatile organic constituents of flue gas from hazardous waste incinerators or other similar combustor systems using the Volatile Organic Sampling Train (VOST). It is intended to be used for guidan...

    13. Ranking of several ground-motion models for seismic hazard analysis in Iran

      NASA Astrophysics Data System (ADS)

      Ghasemi, H.; Zare, M.; Fukushima, Y.

      2008-09-01

      In this study, six attenuation relationships are classified with respect to the ranking scheme proposed by Scherbaum et al (2004 Bull. Seismol. Soc. Am. 94 1-22). First, the strong motions recorded during the 2002 Avaj, 2003 Bam, 2004 Kojour and 2006 Silakhor earthquakes are consistently processed. Then the normalized residual sets are determined for each selected ground-motion model, considering the strong-motion records chosen. The main advantage of these records is that the causative fault planes of the selected events have been well studied. Such information is used to estimate several control parameters which are essential inputs for attenuation relations. The selected relations (Zare et al (1999 Soil Dyn. Earthq. Eng. 18 101-23); Fukushima et al (2003 J. Earthq. Eng. 7 573-98); Sinaeian (2006 PhD Thesis, International Institute of Earthquake Engineering and Seismology, Tehran, Iran); Boore and Atkinson (2007 PEER, Report 2007/01); Campbell and Bozorgnia (2007 PEER, Report 2007/02); and Chiou and Youngs (2006 PEER Interim Report for USGS Review)) have been deemed suitable for predicting peak ground-motion amplitudes in the Iranian plateau. Several graphical techniques and goodness-of-fit measures are also applied for statistical analysis of the distributions of the normalized residual sets. Such analysis reveals the ground-motion models developed using Iranian strong-motion records to be the most appropriate ones in the Iranian context. The results of the present study are applicable in seismic hazard assessment projects in Iran.
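
      The residual analysis underlying such rankings can be sketched simply: normalized residuals z = (ln obs − ln pred)/σ should be approximately standard normal when a model fits the data. A toy version in the spirit of, though far cruder than, the Scherbaum et al scheme, with hypothetical observations and candidate models:

      ```python
      import numpy as np

      def normalized_residuals(obs, pred, sigma_ln):
          """z-values for a ground-motion model whose aleatory scatter
          sigma is given in natural-log units."""
          return (np.log(obs) - np.log(pred)) / sigma_ln

      def rank_models(obs, models):
          """Rank models by how close their z-values come to a standard
          normal (mean ~0, spread ~1). `models` maps name -> (pred, sigma)."""
          score = {}
          for name, (pred, s) in models.items():
              z = normalized_residuals(obs, pred, s)
              score[name] = abs(z.mean()) + abs(z.std() - 1.0)
          return sorted(score, key=score.get)

      # Hypothetical PGA observations (g) and two candidate models.
      obs = np.array([0.12, 0.30, 0.08, 0.22])
      models = {"model_A": (np.array([0.10, 0.25, 0.09, 0.20]), 0.6),
                "model_B": (np.array([0.30, 0.60, 0.20, 0.50]), 0.6)}
      print(rank_models(obs, models))   # model_A should rank first
      ```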

    14. Damage functions for climate-related hazards: unification and uncertainty analysis

      NASA Astrophysics Data System (ADS)

      Prahl, B. F.; Rybski, D.; Boettle, M.; Kropp, J. P.

      2015-11-01

      Most climate change impacts manifest in the form of natural hazards. For example, sea-level rise and changes in storm climatology are expected to increase the frequency and magnitude of flooding events. In practice there is a need for comprehensive damage assessment at an intermediate level of complexity. Answering this need, we reveal the common grounds of macroscale damage functions employed in storm damage, coastal-flood damage, and heat mortality assessment. The universal approach offers both bottom-up and top-down damage evaluation, employing either an explicit or an implicit portfolio description. Putting emphasis on the treatment of data uncertainties, we perform a sensitivity analysis across different scales. We find that the behaviour of intrinsic uncertainties at the microscale level (i.e. single item) still persists at the macroscale level (i.e. portfolio). Furthermore, the analysis of uncertainties can reveal their specific relevance, allowing for simplification of the modelling chain. Our results shed light on the role of uncertainties and provide useful insight for the application of a unified damage function.
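
      The shape of such macroscale damage functions and the propagation of their parameter uncertainty can be sketched generically; the sigmoid form and the parameter priors below are illustrative assumptions, not the paper's calibrated functions:

      ```python
      import numpy as np

      rng = np.random.default_rng(42)

      def damage_fraction(x, x50, k):
          """Generic sigmoidal macroscale damage function: damaged share of
          a portfolio versus hazard intensity x (e.g. inundation depth, m),
          with half-damage intensity x50 and steepness k."""
          return 1.0 / (1.0 + np.exp(-k * (x - x50)))

      def mc_damage(x, n=10_000):
          """Monte Carlo propagation of (hypothetical) parameter priors."""
          x50 = rng.normal(1.5, 0.2, n)            # uncertain half-damage depth
          k = rng.lognormal(np.log(2.0), 0.3, n)   # uncertain steepness
          d = damage_fraction(x, x50, k)
          return d.mean(), np.percentile(d, [5, 95])

      mean, (lo, hi) = mc_damage(x=2.0)
      print(f"expected damage fraction {mean:.2f}, 90% band [{lo:.2f}, {hi:.2f}]")
      ```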

    15. Bayesian analysis of uncertainty in predisposing and triggering factors for landslides hazard assessment

      NASA Astrophysics Data System (ADS)

      Sandric, I.; Petropoulos, Y.; Chitu, Z.; Mihai, B.

      2012-04-01

      The landslide hazard analysis model takes into consideration both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. The latter is expressed not as land use classes, as for example CORINE, but as leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI was derived from Landsat time series images from 1984 up to 2011. All the images available for the Panatau administrative unit in Buzau County, Romania, have been downloaded from http://earthexplorer.usgs.gov, including the images with cloud cover. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created.
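
      The elementary update step behind such a network can be sketched as repeated Bayesian revision of a monthly landslide probability; the likelihood values below are invented for illustration and stand in for the full Bayesian temporal network.

      ```python
      def bayes_update(prior, likelihood_given_slide, likelihood_given_stable):
          """Posterior P(slide | evidence) from the prior and the two likelihoods."""
          num = likelihood_given_slide * prior
          return num / (num + likelihood_given_stable * (1.0 - prior))

      prior = 0.02                      # a priori monthly landslide probability
      monthly_evidence = [              # (P(obs | slide), P(obs | stable)) per month
          (0.30, 0.10),                 # wet month, low LAI -> evidence for sliding
          (0.05, 0.20),                 # dry month, high LAI -> evidence against
          (0.40, 0.08),
      ]
      for month, (l_s, l_n) in enumerate(monthly_evidence, start=1):
          prior = bayes_update(prior, l_s, l_n)
          print(f"month {month}: posterior P(slide) = {prior:.3f}")
      ```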

    16. Hazardous Waste

      MedlinePlus

      ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

    17. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

      NASA Astrophysics Data System (ADS)

      Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

      2016-04-01

      Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims to analyse the fluvial and the pluvial flood hazard individually, but also to develop a method for the analysis of the combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for the 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of
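
      Under the independence assumption stated above, the combination step reduces to multiplying annual occurrence probabilities; a toy calculation (with illustrative return periods and a crude seasonal-overlap factor) might look like:

      ```python
      # Back-of-envelope sketch of combining independent fluvial and pluvial
      # events: the annual probability that both a T_f-year flood and a
      # T_p-year rainstorm occur in the same year follows from multiplying
      # the individual probabilities (all values are illustrative).
      def annual_prob(return_period_years):
          return 1.0 / return_period_years

      def combined_prob(t_fluvial, t_pluvial, seasonal_overlap=1.0):
          """P(both events in one year), scaled by the fraction of the year
          in which the two hazard seasons overlap."""
          return annual_prob(t_fluvial) * annual_prob(t_pluvial) * seasonal_overlap

      p = combined_prob(t_fluvial=50, t_pluvial=20, seasonal_overlap=0.25)
      print(f"combined annual probability = {p:.2e} (~{1/p:.0f}-year event)")
      ```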

    18. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA): towards PTHA assessment for the coasts of Italy

      NASA Astrophysics Data System (ADS)

      Selva, Jacopo; Tonini, Roberto; Molinari, Irene; Tiberti, Mara M.; Romano, Fabrizio; Grezio, Anita; Melini, Daniele; Piatanesi, Alessio; Basili, Roberto; Lorito, Stefano

      2016-04-01

      We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes. Unlike classical approaches, which commonly adopt the hazard integral and a logic tree, we use an event tree approach and ensemble modelling. The procedure was developed in the framework of the EC projects ASTARTE and STREST, of the Italian National Flagship project RITMARE, and of the agreement between Italian Civil Protection and INGV. A total of about 2 × 10^7 different potential seismic sources covering the entire Mediterranean Sea, and more than 1 × 10^5 alternative model implementations, have been considered to quantify both the aleatory variability and the epistemic uncertainty. A set of hazard curves is obtained along the coasts of the entire Italian territory. They are the prototype of the first homogeneous Italian national SPTHA map.
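
      The ensemble-modelling step can be illustrated by summarising many alternative hazard curves with percentiles; the synthetic exponential curves below merely stand in for the roughly 10^5 real model implementations.

      ```python
      import numpy as np

      rng = np.random.default_rng(2)
      intensity = np.linspace(0.1, 5.0, 50)          # e.g. wave height in metres

      # Hypothetical ensemble: each member is an exceedance curve from one
      # alternative model implementation.
      n_members = 1000
      slopes = rng.lognormal(mean=0.0, sigma=0.3, size=n_members)
      curves = np.exp(-np.outer(slopes, intensity))  # shape (members, levels)

      # Ensemble mean and an epistemic-uncertainty band:
      mean_curve = curves.mean(axis=0)
      p16, p84 = np.percentile(curves, [16, 84], axis=0)
      idx = np.searchsorted(intensity, 2.0)
      print(f"P(exceed 2 m): mean={mean_curve[idx]:.3f}, "
            f"16-84% band=({p16[idx]:.3f}, {p84[idx]:.3f})")
      ```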

    19. Hazard analysis of compressed natural gas fueling systems and fueling procedures used at retail gasoline service stations. Final report

      SciTech Connect

      1995-04-28

      An evaluation of the hazards associated with operations of a typical compressed natural gas (CNG) fueling station is presented. The evaluation includes identification of a typical CNG fueling system; a comparison of the typical system with ANSI/NFPA (American National Standards Institute/National Fire Protection Association) Standard 52, Compressed Natural Gas (CNG) Vehicular Fuel System, requirements; a review of CNG industry safety experience as identified in current literature; hazard identification of potential internal (CNG system-specific causes) and external (interface of co-located causes) events leading to potential accidents; and an analysis of potential accident scenarios as determined from the hazard evaluation. The study considers CNG dispensing equipment and associated equipment, including the compressor station, storage vessels, and fill pressure sensing system.

    20. Flood frequency analysis with uncertainty estimation and its application for hazard assessment - a case study in the Mekong Delta

      NASA Astrophysics Data System (ADS)

      Nguyen Viet, Dung; Apel, Heiko; Merz, Bruno; Bárdossy, András

      2015-04-01

      In many flood-prone regions on earth, the nature of the floods calls for a multivariate approach when analyzing flood frequency, which provides a basis for a sound flood hazard and risk assessment. That is because flood severity is determined not only by the peak flood discharge, as usually considered, but also by other aspects such as the volume and even the hydrograph shape of the flood. However, multivariate flood frequency analysis that takes its associated uncertainty sources into account has rarely been studied. The Mekong Delta is one of the largest and most densely populated deltas on Earth. It witnesses annual large-scale inundations which are associated with the SE Asian monsoon. These floods are the basis for the livelihoods of the population of the Delta, but they are also the major hazard. This hazard has, however, not been studied within the frame of a probabilistic flood hazard analysis. Thus this study focuses on the identification of a suitable statistical model for the estimation of flood frequencies considering two important flood aspects, peak discharge Q and volume V, and exemplifies its applicability for a sound flood hazard assessment for the case study in the Mekong Delta. A copula-based bivariate statistical model with bootstrapping-based uncertainty estimation is, hence, developed for a flood frequency analysis of peak flow and volume. The analysis reveals that even with the available - in a hydrological context - quite long data series (e.g. 88 years in the Mekong Delta), large uncertainties are associated with the bivariate quantiles (Q, V), even for rather frequent events. The main uncertainty source is the sampling uncertainty, a direct consequence of the limited length of the data series. However, we still advocate applying the proposed bivariate frequency method for flood frequency estimation in the Mekong Delta because a) it reflects the essential aspects of floods in this region, b) the uncertainties are inherent for every multivariate
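
      A conceptual sketch of such a copula-based bivariate analysis with bootstrap uncertainty is given below; the Gumbel copula, the synthetic 88-year record, and the design levels are all assumptions for illustration, not the study's fitted model.

      ```python
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n = 88                               # years of record, as in the Mekong case
      # Synthetic peak discharge Q (m3/s) and a correlated flood-volume proxy V:
      q = stats.gumbel_r.rvs(loc=20000, scale=4000, size=n, random_state=rng)
      v = 0.8 * q + stats.norm.rvs(scale=3000, size=n, random_state=rng)

      def joint_and_return_period(q_sample, v_sample, q_lvl, v_lvl):
          """T_AND = 1 / P(Q > q_lvl and V > v_lvl) under a Gumbel copula
          fitted by inverting Kendall's tau, with empirical marginals."""
          tau = stats.kendalltau(q_sample, v_sample)[0]
          theta = 1.0 / (1.0 - tau)                    # Gumbel copula parameter
          u = (q_sample <= q_lvl).mean()               # plotting-position marginals
          w = (v_sample <= v_lvl).mean()
          u, w = np.clip([u, w], 1e-6, 1 - 1e-6)
          c = np.exp(-(((-np.log(u)) ** theta
                        + (-np.log(w)) ** theta) ** (1 / theta)))
          return 1.0 / (1.0 - u - w + c)

      # Bootstrap the joint return period of a fixed (q, v) design pair:
      boot = [joint_and_return_period(*rng.choice(np.c_[q, v], n).T, 28000, 26000)
              for _ in range(500)]
      print(f"T_AND ~ {np.median(boot):.0f} yr, 90% CI "
            f"({np.percentile(boot, 5):.0f}, {np.percentile(boot, 95):.0f}) yr")
      ```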

    1. Hazardous materials car placement in a train consist. Volume 1. Review and analysis. Final report, 18 Feb-17 Nov 88

      SciTech Connect

      Thompson, R.E.; Zamejc, E.R.; Ahlbeck, D.R.

      1992-06-01

      In response to major derailments involving hazardous materials cars, the Federal Railroad Administration (FRA) initiated a review of the consequences of hazardous materials car placement in a train consist. The review and analysis consisted of six task items: (1) review of accident trends and regulations, (2) an analysis of hazardous materials compatibility, (3) railroad operational constraints, (4) a cost/benefit analysis, (5) recommendations, and (6) preparation of a final report. A review of the 1982-1985 Railroad Accident/Incident Reporting System (RAIRS) data showed the rear quarter to be statistically the 'safest' location in a mainline freight train. Also, the top 101 hazardous commodities (by volume movement) plus fuming nitric acid were analyzed for chemical incompatibility, a total of 5,151 binary combinations. Consequence-based and risk-based rankings were established. Calculations established a post-derailment separation distance of 40 meters minimum to prevent mixing of incompatible chemicals. It was noted that mixing of hazardous materials was not cited in any NTSB accident report as a specific problem.

    2. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach

      PubMed Central

      Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark

      2016-01-01

      Objective: To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. Methods: A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. Findings: The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. Conclusion: The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease. PMID:27274594

    3. Endangerment assessment guidance. Final report

      SciTech Connect

      Not Available

      1985-10-21

      The directive clarifies the requirement that an endangerment assessment be developed to support all administrative and judicial enforcement actions under Section 106 of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Section 7003 of the Resource Conservation and Recovery Act (RCRA). Before taking enforcement action under these provisions to abate the hazards or potential hazards at a site, the Environmental Protection Agency (EPA) must be able to properly document and justify its assertion that an imminent and substantial endangerment to public health or welfare or the environment may exist. The endangerment assessment provides this documentation and justification; it is not necessary to support Section 104 actions. The directive also provides guidance on the content, timing, level of detail, format, and resources required for the preparation of endangerment assessments.

    4. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

      SciTech Connect

      Grivas, D.A.; Schultz, B.C.; O'Neil, G.; Rizkalla, M.; McGuffey, V.C.

      1995-12-31

      The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30--40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well-studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.
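
      As a minimal illustration of the probabilistic treatment described (not the authors' method), a Monte Carlo infinite-slope calculation with assumed parameter distributions yields a probability of slope failure:

      ```python
      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000
      g_w = 9.81                                  # unit weight of water (kN/m3)

      beta = np.radians(rng.normal(25, 2, n))     # slope angle
      z = rng.uniform(1.0, 3.0, n)                # failure-plane depth (m)
      c = rng.lognormal(np.log(5.0), 0.4, n)      # effective cohesion (kPa)
      phi = np.radians(rng.normal(30, 3, n))      # friction angle
      gamma = rng.normal(19.0, 1.0, n)            # soil unit weight (kN/m3)
      m = rng.uniform(0.0, 1.0, n)                # water table ratio (0=dry, 1=full)

      # Infinite-slope factor of safety with seepage parallel to the slope:
      fs = (c + (gamma - m * g_w) * z * np.cos(beta) ** 2 * np.tan(phi)) \
           / (gamma * z * np.sin(beta) * np.cos(beta))

      print(f"P(FS < 1) = {(fs < 1.0).mean():.3f}")
      ```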

    5. Portable total reflection x-ray fluorescence analysis in the identification of unknown laboratory hazards

      SciTech Connect

      Liu, Ying; Imashuku, Susumu; Sasaki, Nobuharu; Ze, Long; Kawai, Jun; Takano, Shotaro; Sohrin, Yoshiki; Seki, Hiroko; Miyauchi, Hiroya

      2014-05-15

      In this study, a portable total reflection x-ray fluorescence (TXRF) spectrometer was used to analyze unknown laboratory hazards that had precipitated on the exterior surfaces of cooling pipes and fume hood pipes in chemical laboratories. With the aim of examining the accuracy of TXRF analysis for the determination of elemental composition, the analytical results were compared with those of wavelength-dispersive x-ray fluorescence spectrometry, scanning electron microscopy with energy-dispersive x-ray spectrometry, energy-dispersive x-ray fluorescence spectrometry, inductively coupled plasma atomic emission spectrometry, x-ray diffraction spectrometry (XRD), and x-ray photoelectron spectroscopy (XPS). Detailed comparison of the data confirmed that the TXRF method by itself was not sufficient to determine all the elements (Z > 11) contained in the samples. In addition, the results suggest that XRD should be combined with XPS in order to accurately determine compound composition. This study demonstrates that at least two analytical methods should be used in order to analyze the composition of unknown real samples.

    6. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

      PubMed

      Gardner, I A

      1997-12-01

      On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved. PMID:9436129
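
      The sampling logic in the abstract can be made concrete: with imperfect test sensitivity, the probability of detecting at least one positive depends on prevalence and sample size, so a negative herd test never proves freedom from the agent. The numbers below are illustrative.

      ```python
      import math

      def p_detect(n, prevalence, sensitivity):
          """P(at least one test-positive animal among n sampled)."""
          return 1.0 - (1.0 - prevalence * sensitivity) ** n

      def n_required(prevalence, sensitivity, confidence=0.95):
          """Sample size for the desired confidence of detecting the agent."""
          return math.ceil(math.log(1.0 - confidence)
                           / math.log(1.0 - prevalence * sensitivity))

      print(p_detect(n=30, prevalence=0.05, sensitivity=0.90))   # ~0.75 only
      print(n_required(prevalence=0.05, sensitivity=0.90))       # 66 animals
      ```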

    7. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

      NASA Technical Reports Server (NTRS)

      Coulbert, C. D.

      1978-01-01

      A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
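
      A toy version of the RERC plot: each constraint caps the heat release rate over time, and the fire development envelope is their pointwise minimum. The five curves below are hypothetical shapes, not calibrated fire data.

      ```python
      import numpy as np

      t = np.linspace(1, 600, 600)                    # time after ignition (s)

      # Five relative energy release criteria as heat-release caps (kW):
      constraints = {
          "flame spread":    0.005 * t ** 2,          # growing t-squared fire
          "fuel surface":    np.full_like(t, 900.0),  # cap from burnable area
          "ventilation":     np.full_like(t, 700.0),  # cap from available air
          "enclosure vol.":  np.where(t < 300, 1500.0, 400.0),  # oxygen depletion
          "total fuel load": np.where(t < 450, 2000.0, 0.0),    # burnout
      }

      # The envelope is the pointwise minimum; the argmin tells which
      # constraint controls during each time period.
      envelope = np.minimum.reduce(list(constraints.values()))
      controlling = np.array(list(constraints))[
          np.argmin(np.stack(list(constraints.values())), axis=0)]
      print("controlling constraint at t~200 s:", controlling[199])
      ```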

    8. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

      NASA Astrophysics Data System (ADS)

      Liu, Aiwen; Chen, Kun; Wu, Jian

      2010-06-01

      The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of the pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceeding probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Different from a city’s water pipeline network, the long-distance oil and gas pipeline system is a spatially linearly distributed system. For uniform confidence of seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with the supplemented earthquake data in East Asia. The estimation of fault displacement for a refined oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.
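
      Fitting such an empirical log-linear relation is a one-line least-squares problem; in the sketch below the synthetic data are generated with Wells-and-Coppersmith-style coefficients as placeholders, not the paper's new East Asia relation.

      ```python
      import numpy as np

      rng = np.random.default_rng(8)
      ms = rng.uniform(6.0, 8.2, 60)                       # surface-wave magnitudes
      # Synthetic log10 of maximum fault displacement D (m), placeholder coefficients:
      log_d = -5.46 + 0.82 * ms + rng.normal(0, 0.3, 60)

      b, a = np.polyfit(ms, log_d, 1)                      # slope, intercept
      print(f"log10(D) = {a:.2f} + {b:.2f} * Ms")
      print(f"predicted max displacement at Ms 8.0: {10 ** (a + b * 8.0):.1f} m")
      ```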

    9. Probabilistic tsunami hazard analysis (PTHA) of Taiwan region by stochastic model

      NASA Astrophysics Data System (ADS)

      Sun, Y. S.; Chen, P. F.; Chen, C. C.

      2014-12-01

      We conduct a probabilistic tsunami hazard analysis (PTHA) of the Taiwan region for earthquake sources in the Ryukyu trench. The PTHA estimates the probabilities of a site being hit by tsunamis exceeding certain amplitude thresholds. The probabilities were integrated over earthquakes of various magnitudes from potential fault zones in the Ryukyu trench. The annual frequencies of earthquakes in a fault zone are determined or extrapolated from the magnitude-frequency distribution of earthquakes (Gutenberg-Richter law) of the zone. Given the moment (or magnitude) of an earthquake, we first synthesize patterns of differently complex and heterogeneous slip distributions on the fault using a stochastic model, assuming that the slip and stress drop distributions are fractional Brownian motion processes described by a Hurst exponent. Following the ω^-2 model of earthquakes and a Fourier transform, slip distributions are determined by randomizing the phase spectrum at wave numbers greater than the corner wave number kc. Finally, the vertical seafloor displacements induced by each slip distribution are used by COMCOT to simulate tsunamis and assess the impacts on various coasts of Taiwan.
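
      A 1-D stand-in for the stochastic slip construction: keep a smooth spectrum below the corner wavenumber, impose a k^-2 amplitude fall-off and random phases above it, and transform back. The corner wavenumber and the scaling are assumed values, and the real construction works on the 2-D fault plane.

      ```python
      import numpy as np

      rng = np.random.default_rng(5)
      nx, dx = 256, 1.0                       # points along strike, spacing (km)
      k = np.fft.rfftfreq(nx, d=dx)           # wavenumbers (1/km)
      kc = 0.05                                # corner wavenumber (assumed)

      # k^-2 amplitude spectrum beyond the corner, flat below it:
      amp = np.where(k <= kc, 1.0, (kc / np.maximum(k, 1e-9)) ** 2)
      # Random phases only above the corner wavenumber:
      phase = np.where(k <= kc, 0.0, rng.uniform(0, 2 * np.pi, k.size))

      slip = np.fft.irfft(amp * np.exp(1j * phase), n=nx)
      slip = slip - slip.min()                 # shift to non-negative slip
      slip *= 5.0 / slip.mean()                # scale to a 5 m mean slip
      print(f"max slip = {slip.max():.1f} m over {nx * dx:.0f} km")
      ```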

    10. Descriptive characteristics and cluster analysis of male veteran hazardous drinkers in an alcohol moderation intervention.

      PubMed

      Walker, Robrina; Hunt, Yvonne M; Olivier, Jake; Grothe, Karen B; Dubbert, Patricia M; Burke, Randy S; Cushman, William C

      2012-01-01

      Current efforts underway to develop the fifth edition of the Diagnostic and Statistical Manual (DSM-5) have reignited discussions about how to classify the substance use disorders. This study's aim was to contribute to the understanding of abusive alcohol use and its validity as a diagnosis. Cluster analysis was used to identify relatively homogeneous groups of hazardous, nondependent drinkers, using data collected from the Prevention and Treatment of Hypertension Study (PATHS), a multisite trial that examined the ability of a cognitive-behavioral alcohol reduction intervention, compared to a control condition, to reduce alcohol use. Participants for this study (N = 511) were male military veterans. Variables theoretically associated with alcohol use (eg, demographic, tobacco use, and mental health) were used to create the clusters, and a priori, empirically based external criteria were used to assess discriminant validity. Bivariate correlations among cluster variables were generally consistent with previous findings in the literature. Analyses of internal and discriminant validity of the identified clusters were largely nonsignificant, suggesting that meaningful differences between clusters could not be identified. Although the typology literature has contributed supportive validity for the alcohol dependence diagnosis, this study's results do not lend supportive validity to the construct of alcohol abuse. PMID:22691012

    11. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

      NASA Technical Reports Server (NTRS)

      Mitchell, Michael S.; Winner, David R.

      2010-01-01

      This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI), specific components, sub-assemblies, and/or piece parts, of the NASA ARES I Upper Stage (US) project. SCIs will be designed, developed and/or procured by Boeing as an end item or an off-the-shelf item. Objectives include explaining the methodology, tools, stakeholders and products involved in development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the ARES I Upper Stage project.

    12. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

      NASA Technical Reports Server (NTRS)

      1972-01-01

      Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking includes docking for assembly of space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for orbiter with crew and passenger, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

    13. Subsystem Hazard Analysis Methodology for the ARES I Upper Stage Source Controlled Items

      NASA Astrophysics Data System (ADS)

      Mitchell, Michael S.; Winner, David R.

      2010-09-01

      This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI), specific components, sub-assemblies, and/or piece parts, of the NASA ARES I Upper Stage (US) project. SCIs will be designed, developed and/or procured by Boeing as an end item or an off-the-shelf item. Objectives include explaining the methodology, tools, stakeholders and products involved in development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the ARES I Upper Stage project.

    14. Elementary School Guidance Bibliography.

      ERIC Educational Resources Information Center

      Biasco, Frank

      This bibliography lists articles and books dealing with elementary school guidance. The bibliography is arranged alphabetically by author. No annotations are given. Approximately 70 titles are given, covering most phases of elementary guidance. (KJ)

    15. Médecins Sans Frontières' Clinical Guidance mobile application: analysis of a new electronic health tool

      PubMed Central

      Wright, V.; Dalwai, M.; Smith, R. Vincent

      2015-01-01

      Many health care workers lack access to clinical support tools in rural and resource-limited settings. To address this gap, the Médecins Sans Frontières (MSF) Clinical Guidelines manual was converted into a static mobile health reference application (app) entitled MSF Guidance. The app's utility and growth were examined, and within 6 months of its launch the app had been downloaded in 150 countries, with demonstrated retention among new and existing users. With over 3500 downloads and 36 000 sessions amounting to 250 000 screen views, MSF Guidance is a new mobile health platform with widely demonstrated utility, including potential use as an epidemiological tool: clinical conditions investigated by app users were found to correlate with geographical outbreaks. These findings show that mobile apps can be used to disseminate health information effectively. PMID:26767171

    16. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

      NASA Astrophysics Data System (ADS)

      Dunbar, P. K.; McCullough, H. L.

      2011-12-01

      The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

    17. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

      NASA Astrophysics Data System (ADS)

      Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

      2016-04-01

      mass movements are analyzed in order to reconstruct the complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situations and dynamics of the slope movements. Therefore, geomorphological mapping, sediment characterization as well as geophysical methods are applied. On the one hand, a detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and on movement processes in slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models, which were generated before the onset of slope movements, are integrated in the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, field data will be used as basic information for further monitoring plans. Resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

    18. Unified powered flight guidance

      NASA Technical Reports Server (NTRS)

      Brand, T. J.; Brown, D. W.; Higgins, J. P.

      1973-01-01

      A complete revision of the orbiter powered flight guidance scheme is presented. A unified approach to powered flight guidance was taken to accommodate all phases of exo-atmospheric orbiter powered flight, from ascent through deorbit. The guidance scheme was changed from the previous modified version of the Lambert Aim Point Maneuver Mode used in Apollo to one that employs linear tangent guidance concepts. This document replaces the previous ascent phase equation document.

    19. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

      NASA Astrophysics Data System (ADS)

      Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

      2005-12-01

      The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy

    20. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

      NASA Astrophysics Data System (ADS)

      Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

      2016-04-01

      In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards and their comparison with reinstatement costs are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and from object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled, with material intrusion processes explicitly considered. Dynamic impacts are recorded quantitatively and in a spatially distributed way by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, also including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

    1. Regional-scale analysis of high-mountain multi-hazard and risk in the Pamir (Tajikistan) with GRASS GIS

      NASA Astrophysics Data System (ADS)

      Gruber, F. E.; Mergili, M.

      2013-04-01

      We present a model framework for the regional-scale analysis of high-mountain multi-hazard and -risk, implemented with the Open Source software package GRASS GIS. This framework is applied to a 98 300 km2 study area centred in the Pamir (Tajikistan). It includes (i) rock slides, (ii) ice avalanches, (iii) periglacial debris flows, and (iv) lake outburst floods. First, a hazard indication score is assigned to each relevant object (steep rock face, glacier or periglacial slope, lake). This score depends on the susceptibility and on the expected event magnitude. Second, the possible travel distances, impact areas and, consequently, impact hazard indication scores for all types of processes are computed using empirical relationships. These scores are finally superimposed with an exposure score derived from the type of land use, resulting in a raster map of risk indication scores finally discretized at the community level. The analysis results are presented and discussed at different spatial scales. The major outcome of the study, a set of comprehensive regional-scale hazard and risk indication maps, shall represent an objective basis for the prioritization of target communities for further research and risk mitigation measures.
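
      The scoring logic (actually implemented in GRASS GIS) can be sketched with plain rasters: per-process impact hazard indication scores are combined and then superimposed with a land-use exposure score. Taking the maximum across processes and multiplying by exposure is an assumed stand-in for the paper's exact combination rules.

      ```python
      import numpy as np

      rng = np.random.default_rng(6)
      shape = (100, 100)                        # toy raster

      # Hypothetical impact hazard indication scores (0-6) for four processes:
      rock_slide    = rng.integers(0, 7, shape)
      ice_avalanche = rng.integers(0, 7, shape)
      debris_flow   = rng.integers(0, 7, shape)
      lake_outburst = rng.integers(0, 7, shape)

      hazard = np.maximum.reduce([rock_slide, ice_avalanche,
                                  debris_flow, lake_outburst])

      # Exposure score from land use (0 = barren ... 3 = settlement):
      exposure = rng.integers(0, 4, shape)

      risk = hazard * exposure                  # risk indication score raster
      print("cells in the highest risk classes:", (risk >= 15).sum())
      ```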

    2. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

      USGS Publications Warehouse

      Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

      2001-01-01

      During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides, as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP, which only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is the preferred model of the two to evaluate slope stability. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane, and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).

    3. Analysis of pregnancy-related calls to an occupational hazard hot line.

      PubMed Central

      Wright, C; Quint, J

      1989-01-01

      Since 1980 the Hazard Evaluation System and Information Service has responded to over 11,000 inquiries regarding workplace health hazards. Of 2,424 inquiries in 1986, 593 (24%) concerned hazards to pregnancy. This represents a 17-fold increase since 1981. Most pregnancy-related inquiries were from employees (70%) and health care providers (23%). Referral sources for the employees' calls were almost exclusively individual health care providers or institutions that provide health care or health counseling, or both. These data suggest that pregnant employees seek information on their own or from health care providers instead of from employers. Of the inquiries, 80% were for general pregnancy hazard information; 20% involved symptomatic pregnant employees. Most inquiries concerned employment in the services (58%) and manufacturing (26%) sectors. Organic solvents, pesticides, acrylic nail-grooming products, lead, and video display terminals were among the agents about which callers inquired most frequently. PMID:2618048

    4. Analysis of the level of comprehension of chemical hazard labels: a case for Zambia.

      PubMed

      Banda, Samuel F; Sichilongo, Kwenga

      2006-06-15

      We have surveyed the impact of chemical hazard label elements on four target sectors, i.e. the agricultural, industrial, transport and consumer (general public) sectors, in order to assess the type of reaction respondents have to a given chemical label element such as a symbol, hazard phrase, color, or hazard signal word. The survey revealed that the level of education, gender and/or age did not influence the respondents' perception of the extent of hazard; rather, familiarity or frequency of use of the chemicals and acquaintance with chemical label elements were significant in the assessment of the extent of perceived hazard posed by a given chemical. Symbols such as the St Andrew's Cross--though common--are not understood by more than 80% of the respondents in all the sectors. We noted that respondents appreciate symbols they can relate to, such as flame-like or ghost-like symbols that exert an immediate impact on respondents. Color codes have found use in the agriculture sector because they are easily recalled, especially by the majority of illiterate farm workers. The survey revealed that red in agricultural circles is well associated with high toxicity, while other colors such as yellow and blue cannot clearly be associated with hazard. The word "toxic" is not used in the industry and transport sectors, where the most common hazard signal word is "danger". The Globally Harmonized System (GHS) classification adopted "danger" and "warning" for use as signal words. The survey revealed that effective chemical hazard symbols must not be too abstract to the client but should contain features that are known or easily comprehended. PMID:16426665

    5. Development and implementation of hazard analysis and critical control point plans by several U.S. feed manufacturers.

      PubMed

      Herrman, Timothy J; Langemeier, Michael R; Frederking, Matt

      2007-12-01

      The commitment to consumer food safety, global trade, and proposed new regulations by the U.S. Food and Drug Administration Center for Veterinary Medicine has led to increased adoption of hazard analysis and critical control point (HACCP) by the U.S. feed industry. A project supported by the U.S. Department of Agriculture Integrated Food Safety Initiative titled "Development and Implementation of a Voluntary HACCP Program for the US Feed Industry" enabled faculty from three land grant universities to assist individuals from 14 feed companies that collectively manufacture 15 million metric tons of feed in 100 facilities to develop HACCP plans. The process flow in these plans averaged 20 steps, and the most detailed plan included 60 process steps. Chemical hazards were more commonly identified in HACCP plans (average of four hazards per plan) than were biological hazards (average of one per plan). The most prevalent chemical hazards were cross-contamination of type A medicated articles and type B medicated feeds, aflatoxin, and wrong ingredient inclusion in feed. The most common biological hazard was mammalian protein contamination of feed ingredients and finished feed for cattle. An assessment of time and costs associated with developing HACCP plans revealed that approximately 29% of the companies needed additional personnel or additional equipment to implement a HACCP plan, and on average 268 additional person-hours were needed to develop and implement a HACCP plan. Plan design, compliance monitoring, and record keeping were the three most time-consuming activities needed for developing and implementing a HACCP plan. The average cost of additional equipment needed to implement a HACCP plan was $250. PMID:18095436

    6. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

      PubMed

      Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

      2015-12-15

      As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. In view of the importance and deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. PMID:26327640
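
      For reference, a two-dimensional Frank copula and the resulting joint ("OR" and "AND") return periods can be computed directly; theta and the marginal probabilities below are illustrative, not the fitted values from the study.

      ```python
      import numpy as np

      def frank_copula(u, v, theta):
          """Frank copula C(u, v); theta != 0."""
          num = np.expm1(-theta * u) * np.expm1(-theta * v)
          return -np.log1p(num / np.expm1(-theta)) / theta

      def joint_return_periods(u, v, theta, mu=1.0):
          """mu = mean inter-arrival time of events (years)."""
          c = frank_copula(u, v, theta)
          t_or = mu / (1.0 - c)                  # either factor exceeds its level
          t_and = mu / (1.0 - u - v + c)         # both factors exceed their levels
          return t_or, t_and

      # Non-exceedance probabilities of two hazard factors at design levels:
      t_or, t_and = joint_return_periods(u=0.95, v=0.90, theta=5.0)
      print(f"T_OR = {t_or:.1f} yr, T_AND = {t_and:.1f} yr")
      ```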

    7. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

      SciTech Connect

      Kimball, J.K.; Chander, H.

      1997-02-01

      The Department of Energy (DOE) requires that all nuclear or non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach which depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that probabilistic seismic hazard analysis is used consistently at each DOE site. The criteria included in the DOE Standards are described, and compared to those criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, important application issues are summarized, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

    8. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, Northwest Italy

      NASA Astrophysics Data System (ADS)

      Turconi, L.; Tropeano, D.; Savio, G.; De, S. K.; Mason, P. J.

      2015-09-01

      The study area (600 km2), consisting of the Orco and Soana valleys in the Western Italian Alps, has experienced different types of natural hazards, typical of the whole Alpine environment. Some of the authors were asked to draw up a civil protection plan for this mountainous region. This offered the special opportunity (1) to draw on a large amount of unpublished historical data, dating back several centuries, mostly concerning natural hazard processes and related damage, (2) to develop original detailed geomorphological studies in a region still poorly known, (3) to prepare detailed thematic maps illustrating landscape components related to natural conditions and hazards, (4) to thoroughly check present-day situations in the area against the effects of past events and (5) to find adequate natural hazard scenarios for all sites exposed to risk. The method of work has been essentially to compare archival findings with field evidence in order to assess natural hazard processes, their occurrence and magnitude, and to arrange all such elements in a database for GIS-supported thematic maps. Several types of natural hazards, such as landslides, rockfalls, debris flows, stream floods and snow avalanches, cause huge damage to lives and property (housing, roads, tourist sites). We aim to obtain new knowledge of this large, still poorly understood area as well as to develop easy-to-interpret products such as natural risk maps.

    9. 16 CFR 1500.230 - Guidance for lead (Pb) in consumer products.

      Code of Federal Regulations, 2012 CFR

      2012-01-01

      ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Guidance for lead (Pb) in consumer products. 1500.230 Section 1500.230 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT REGULATIONS HAZARDOUS SUBSTANCES AND ARTICLES; ADMINISTRATION AND ENFORCEMENT REGULATIONS § 1500.230 Guidance for lead (Pb)...

    10. 16 CFR 1500.230 - Guidance for lead (Pb) in consumer products.

      Code of Federal Regulations, 2013 CFR

      2013-01-01

      ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Guidance for lead (Pb) in consumer products. 1500.230 Section 1500.230 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT REGULATIONS HAZARDOUS SUBSTANCES AND ARTICLES; ADMINISTRATION AND ENFORCEMENT REGULATIONS § 1500.230 Guidance for lead (Pb)...

    11. 78 FR 24439 - Compliance With Information Request, Flooding Hazard Reevaluation

      Federal Register 2010, 2011, 2012, 2013, 2014

      2013-04-25

      ...The U.S. Nuclear Regulatory Commission (NRC) is issuing draft Japan Lessons-Learned Project Directorate Interim Staff Guidance (JLD- ISG), JLD-ISG-2013-01, ``Guidance for Estimating Flooding Hazards due to Dam Failure.'' This draft JLD-ISG provides guidance acceptable to the NRC staff for reevaluating flooding hazards due to dam failure for the purpose of responding to enclosure 2 of a March......

    12. A guide to CERCLA site assessment. Environmental Guidance

      SciTech Connect

      Not Available

      1994-02-01

      This Pocket Guide is a condensed version of the information provided in three EPA documents: Guidance for Performing Preliminary Assessments Under CERCLA, Guidance for Performing Site Inspections Under CERCLA, and the Hazard Ranking System Guidance Manual. Additionally, the guide provides a DOE perspective on site assessment issues and information on the Federal Agency Hazardous Waste Compliance Docket as well as data sources for DOE site assessments. The guide is intended to present this information in a simple, portable, and direct manner that will allow the user to effectively focus on those aspects of the site assessment process of interest. The guide is not intended as a substitute for the three EPA guidance documents mentioned previously. DOE investigators should be thoroughly familiar with the EPA guidance before conducting site assessments. Use this pocket guide as an overview of procedures and requirements and as a field guide.

    13. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

      NASA Astrophysics Data System (ADS)

      Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

      2015-04-01

      Landslide and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium- or large-scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium-scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impacts exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well-documented areas with known past debris flow events.

    14. Analysis of Landslide Hazard Impact Using the Landslide Database for Germany

      NASA Astrophysics Data System (ADS)

      Klose, M.; Damm, B.

      2014-12-01

      The Federal Republic of Germany has long been among the few European countries that lack a national landslide database. Systematic collection and inventorying of landslide data has a comprehensive research history in Germany, but one focused on the development of databases with only local or regional coverage. This has changed in recent years with the launch of a database initiative aimed at closing the data gap existing at the national level. The present contribution reports on this project, which is based on a landslide database that has evolved over the last 15 years into a database covering large parts of Germany. A strategy of systematic retrieval, extraction, and fusion of landslide data is at the heart of the methodology, providing the basis for a database with broad potential for application. The database offers a data pool of more than 4,200 landslide data sets with over 13,000 single data files and dates back to the 12th century. All types of landslides are covered by the database, which stores not only core attributes but also various complementary data, including data on landslide causes, impacts, and mitigation. The current database migration to PostgreSQL/PostGIS is focused on unlocking the full scientific potential of the database, while enabling data sharing and knowledge transfer via a web GIS platform. In this contribution, the goals and the research strategy of the database project are highlighted first, with a summary of best practices in database development providing perspective. Next, the focus is on key aspects of the methodology, which is followed by the results of different case studies in the German Central Uplands. The case study results exemplify database application in the analysis of vulnerability to landslides, impact statistics, and hazard or cost modeling.

    15. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

      NASA Astrophysics Data System (ADS)

      Wright, Heather; Pallister, John; Newhall, Chris

      2015-04-01

      Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. It allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these inputs depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimates may be large. Our 'hybrid' approach of combining local and global monitoring data with expert judgment facilitates discussion and constructive debate between disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology, and technology/engineering, where differences of opinion among response team members contribute to defining the uncertainty in the probability estimates. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public.
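
      As a concrete illustration of how consensus probabilities combine along one branch of such a tree, the Python sketch below multiplies elicited conditional probabilities with low/high bounds. All numbers are invented for illustration and describe no real crisis, and multiplying the bounds gives only a crude uncertainty envelope, not a formal interval.

        # One branch of a volcano event tree: consensus conditional
        # probabilities with elicited low/high bounds (values invented).
        branch = [
            ("unrest is magmatic",      0.70, 0.50, 0.85),
            ("eruption given magmatic", 0.40, 0.20, 0.60),
            ("VEI >= 3 given eruption", 0.25, 0.10, 0.45),
        ]

        best = low = high = 1.0
        for name, b, lo, hi in branch:
            best *= b   # product of conditionals down the branch
            low *= lo
            high *= hi

        print(f"P(VEI >= 3 eruption in window): {best:.3f} "
              f"[crude range {low:.3f}-{high:.3f}]")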

    16. Childhood cancer incidence rates and hazardous air pollutants in California: an exploratory analysis.

      PubMed Central

      Reynolds, Peggy; Von Behren, Julie; Gunier, Robert B; Goldberg, Debbie E; Hertz, Andrew; Smith, Daniel F

      2003-01-01

      Hazardous air pollutants (HAPs) are compounds shown to cause cancer or other adverse health effects. We analyzed population-based childhood cancer incidence rates in California (USA) from 1988 to 1994, by HAP exposure scores, for all California census tracts. For each census tract, we calculated exposure scores by combining cancer potency factors with outdoor HAP concentrations modeled by the U.S. Environmental Protection Agency. We evaluated the relationship between childhood cancer rates and exposure scores for 25 potentially carcinogenic HAPs emitted from mobile, area, and point sources and from all sources combined. Our study period saw 7,143 newly diagnosed cancer cases in California; of these, 6,989 (97.8%) could be assigned to census tracts and included in our analysis. Using Poisson regression, we estimated rate ratios (RRs) adjusted for age, race/ethnicity, and sex. We found little evidence for elevated cancer RRs for all sites or for gliomas among children living in high-ranking combined-source exposure areas. We found elevated RRs and a significant trend with increasing exposure level for childhood leukemia in tracts ranked highest for exposure to the combined group of 25 HAPs (RR = 1.21; 95% confidence interval, 1.03, 1.42) and in tracts ranked highest for point-source HAP exposure (RR = 1.32; 95% confidence interval, 1.11, 1.57). Our findings suggest an association between increased childhood leukemia rates and high HAP exposure, but studies involving more comprehensive exposure assessment and individual-level exposure data will be important for elucidating this relationship. PMID:12676632
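
      The analysis step named in the abstract, Poisson regression of tract-level case counts against exposure quartiles with person-time as an offset, can be sketched as follows; the data here are synthetic, and the model omits the age, race/ethnicity, and sex adjustment used in the study.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Synthetic tract-level data (illustrative only).
        rng = np.random.default_rng(0)
        n = 400
        df = pd.DataFrame({
            "quartile": rng.integers(1, 5, n),      # HAP exposure-score quartile
            "person_years": rng.uniform(2e3, 2e4, n),
        })
        base_rate = 45 / 1e5                        # background incidence rate
        df["cases"] = rng.poisson(base_rate * df["person_years"]
                                  * np.where(df["quartile"] == 4, 1.2, 1.0))

        # Indicator coding with quartile 1 as referent; exponentiated
        # coefficients are the rate ratios (RRs).
        X = sm.add_constant(pd.get_dummies(df["quartile"], prefix="q",
                                           drop_first=True).astype(float))
        fit = sm.GLM(df["cases"], X, family=sm.families.Poisson(),
                     offset=np.log(df["person_years"])).fit()
        print(np.exp(fit.params))       # RRs vs. lowest quartile
        print(np.exp(fit.conf_int()))   # 95% confidence intervals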

    17. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

      SciTech Connect

      MULKEY, C.H.

      1999-07-06

      This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis; the federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices, and similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations, because the regulations change constantly and this document may not reflect the latest requirements; regulatory requirements are also expected to change as various permits are issued. The data needs call for sampling of both radionuclide and nonradionuclide analytes in air emissions from tanks and stored waste containers. Data collection supports environmental permitting and compliance, not health and safety issues; this document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

    18. Reading Guidance: Death and Grief.

      ERIC Educational Resources Information Center

      Smith, Alice Gullen

      1989-01-01

      Gives guidelines for the librarian using reading guidance (similar to bibliotherapy). Provides a nine-item annotated bibliography of novels for children and adolescents on the subject of death and grief. Appends an embryo list of categories suitable for content analysis of any file librarians might wish to keep on books suitable for use in this…

    19. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

      PubMed

      Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

      2001-01-01

      To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs. PMID:11464766
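
      A minimal quantitative reading of the HACCP step: compare the cumulative log10 removal credited to the treatment barriers against the reduction required by the quality limit, and flag the barriers carrying most of the credit as CCPs. All removal credits, concentrations, and the 30% flagging rule in the Python sketch below are invented for illustration, not values from the study.

        import math

        # Assumed log10 removal credits per barrier for enteric viruses.
        barriers = {
            "microfiltration": 1.0,
            "reverse osmosis": 4.0,
            "UV disinfection": 3.0,
            "dune passage":    2.0,
        }

        raw_conc = 1e2       # assumed viruses/litre in treated effluent
        target_conc = 1e-5   # limit consistent with the risk assessment

        required = math.log10(raw_conc / target_conc)
        achieved = sum(barriers.values())
        print(f"required {required:.1f} log10 removal, achieved {achieved:.1f}")

        for step, credit in barriers.items():
            # Barriers carrying a large share of the removal are natural
            # CCPs; the rest are points of attention (POAs).
            flag = "CCP" if credit / achieved >= 0.3 else "POA"
            print(f"  {step:16s} {credit:.1f} logs -> {flag}")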

    20. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

      DOE PAGES Beta

      Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

      2015-06-04

      The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
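
      The hazard ranking and clustering step can be illustrated with a short sketch: hierarchical (Ward) clustering of per-material EZ Metric response profiles, using synthetic stand-in scores rather than the published 68-material data set.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        # Stand-in data: rows are nanomaterials, columns are EZ Metric
        # scores across exposure concentrations (synthetic, illustrative).
        rng = np.random.default_rng(1)
        low_tox = rng.normal(0.2, 0.05, (10, 5))
        high_tox = rng.normal(0.7, 0.05, (10, 5))
        scores = np.vstack([low_tox, high_tox])

        # Ward linkage on Euclidean distances between response profiles.
        Z = linkage(scores, method="ward")
        labels = fcluster(Z, t=2, criterion="maxclust")
        print(labels)  # materials with similar profiles share a cluster label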