Science.gov

Sample records for hazard analysis guidance

  1. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    SciTech Connect

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state-of-the-art and improve on the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project was carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the landmark Lawrence Livermore National Laboratory and EPRI PSHA studies of the 1980s, and examined ways to improve on the present state-of-the-art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.

  2. K Basin Hazard Analysis

    SciTech Connect

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  3. K Basins Hazard Analysis

    SciTech Connect

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev. 4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  4. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  5. Automated Hazard Analysis

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  6. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations, and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  7. Hazard Analysis Database Report

    SciTech Connect

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database: data from the results of the hazard evaluations; and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  8. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  9. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  10. Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Not Available

    1988-01-01

    The purpose of Probabilistic Seismic Hazard Analysis (PSHA) is to evaluate the hazard of seismic ground motion at a site by considering all possible earthquakes in the area, estimating the associated shaking at the site, and calculating the probabilities of these occurrences. The Panel on Seismic Hazard Analysis is charged with assessment of the capabilities, limitations, and future trends of PSHA in the context of alternatives. The report identifies and discusses key issues of PSHA and is addressed to decision makers with a modest scientific and technical background and to the scientific and technical community. 37 refs., 19 figs.
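
    For orientation across the several PSHA records in this listing, the computation just described is conventionally expressed as the Cornell-McGuire hazard integral. The form below is the standard textbook statement rather than anything quoted from this report; nu_i is the activity rate of seismic source i, and f_{M_i} and f_{R_i} are that source's magnitude and distance densities:

      \lambda(A > a) = \sum_i \nu_i \iint P[\,A > a \mid m, r\,] \, f_{M_i}(m) \, f_{R_i}(r) \, dm \, dr

    Evaluating \lambda(A > a) over a range of levels a yields the hazard curve; under the usual Poisson assumption, the probability of exceedance in t years is 1 - \exp[-\lambda(A > a)\, t].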

  11. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages in implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full waveform tsunami computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
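
    The subfault summation described above is linear superposition: once a unit-slip waveform has been precomputed for each subfault, the tsunami waveform for any candidate rupture is a slip-weighted sum. The Python sketch below illustrates only that step; the array shapes and the stand-in data are assumptions for illustration, not the authors' code or values.

      import numpy as np

      K, J, T = 40, 25, 720   # subfaults, coastal points, time samples (illustrative)
      rng = np.random.default_rng(0)

      # Stand-in for the precomputed unit-slip tsunami waveforms ("Green's functions"):
      # green[k, j, t] = wave height at coastal point j, time sample t, per 1 m of slip
      # on subfault k. In practice these come from numerical propagation runs.
      green = rng.normal(size=(K, J, T))

      def synthesize(slip, green):
          """Slip-weighted superposition of subfault waveforms -> (J, T) waveforms."""
          return np.tensordot(slip, green, axes=(0, 0))

      # Example: a candidate rupture with uniform 2 m slip on every subfault.
      waves = synthesize(2.0 * np.ones(K), green)
      peak_heights = np.abs(waves).max(axis=1)   # peak height at each coastal point

    Because the synthesis is a single tensor contraction, sweeping over many slip distributions for probabilistic calculations is cheap once the Green's functions are stored.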

  12. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the U.S. Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  13. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive of a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However
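
    The dice figure quoted above is easy to check: the probability of seeing no six in twelve monthly rolls is (5/6)^12, approximately 0.112. A minimal verification in Python, with a Monte Carlo version of the same counterfactual experiment:

      import random

      # Exact probability of zero "disasters" (sixes) in 12 monthly rolls.
      print(f"analytic:  {(5 / 6) ** 12:.3f}")   # ~0.112

      # Monte Carlo version: simulate many counterfactual years.
      trials = 100_000
      quiet_years = sum(
          all(random.randint(1, 6) != 6 for _ in range(12))
          for _ in range(trials)
      )
      print(f"simulated: {quiet_years / trials:.3f}")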

  14. Guidance augmentation for reducing uncertainty in vision-based hazard mapping during lunar landing

    NASA Astrophysics Data System (ADS)

    Crane, E. S.; Rock, S. M.

    A new guidance augmentation scheme, which generates information-seeking trajectory adjustments, is shown to produce improvements in hazard mapping during autonomous lunar landing. This approach utilizes previously developed techniques for detecting hazard objects from images and an Extended Kalman Filter recursive estimation framework in order to create an occupancy grid representation of the hazards in the landing area. The guidance augmentation is driven by a model-predictive scheme which uses predictions of map entropy and fuel usage in order to generate information-seeking acceleration commands, which are combined with the targeting capability of Modified Apollo Guidance. An overview of the algorithmic steps required to adjust the trajectory and predict mapping performance and fuel costs is presented. Hazard maps generated using the online information-seeking trajectory adjustments show significant improvement over unadjusted trajectories, where the hazard image data collected is only incidental.

  15. Track 2 sites: Guidance for assessing low probability hazard sites at the INEL. Revision 6

    SciTech Connect

    Not Available

    1994-01-01

    This document presents guidance for assessment of Track 2 low probability hazard sites (LPHS) at the Idaho National Engineering Laboratory (INEL). The Track 2 classification was developed specifically for the INEL to streamline the implementation of the Comprehensive Environmental Response, Compensation, and Liability Act. Track 2 LPHSs are described as sites where insufficient data are available to make a decision concerning the risk level or to select or design a remedy. As such, these types of sites are not described in the National Contingency Plan or existing regulatory guidance. The goal of the Track 2 process is to evaluate LPHSs using existing qualitative and quantitative data to minimize the collection of new environmental data. To this end, this document presents a structured format consisting of a series of questions and tables. A qualitative risk assessment is used. The process is iterative, and addresses an LPHS from multiple perspectives (i.e., historical, empirical, process) in an effort to generate a reproducible and defensible method. This rigorous approach follows the data quality objective process and establishes a well-organized, logical approach to consolidate and assess existing data, and set decision criteria. If necessary, the process allows for the design of a sampling and analysis strategy to obtain new environmental data of appropriate quality to support decisions for each LPHS. Finally, the guidance expedites consensus between regulatory parties by emphasizing a team approach to Track 2 investigations.

  16. CHARACTERIZATION OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...

  17. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness for software-intensive systems, a category that describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of fault tree analysis (FTA), but STPA includes a broader set of potential scenarios, including those in which no failures occur but the problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis

  18. Preparedness of hazardous materials emergencies in railyards: Guidance for railroads and adjacent communities

    SciTech Connect

    Not Available

    1991-09-01

    Railroads are a key part of the distribution system for hazardous materials and, thus, much hazardous material passes through railyards en route to intermediate or final consumers. While the vast majority of these materials are shipped without incident, both the number of shipments and the nature of the materials themselves dictate that railyards and surrounding communities be prepared to respond quickly and effectively to emergencies. This report contains information on 11 emergency preparedness functions and 150 guidance recommendations.

  19. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  20. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  1. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
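
    For the region-wide case, the Monte Carlo approach mentioned above amounts to sampling many years of synthetic seismicity, mapping each sampled event to a runup at the site, and counting exceedances. The sketch below shows the bookkeeping only; the Gutenberg-Richter parameters and the toy source-to-runup model are placeholder assumptions, not values from this study.

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_magnitudes(rate, b, m_min, m_max, years, rng):
          """Poisson event count; truncated Gutenberg-Richter magnitudes via inverse CDF."""
          n = rng.poisson(rate * years)
          beta = b * np.log(10)
          c = 1.0 - np.exp(-beta * (m_max - m_min))
          return m_min - np.log(1.0 - rng.uniform(size=n) * c) / beta

      def runup(m, rng):
          """Placeholder source-to-runup model: log-linear scaling with lognormal scatter."""
          return 10 ** (0.5 * m - 4.0) * rng.lognormal(0.0, 0.5, size=m.shape)

      years = 100_000
      mags = sample_magnitudes(rate=0.05, b=1.0, m_min=7.0, m_max=9.5, years=years, rng=rng)
      r = runup(mags, rng)

      # Empirical hazard curve: annual rate of exceeding each runup level.
      for h in (0.5, 1.0, 2.0, 5.0):
          print(f"runup > {h:>3.1f} m : {(r > h).sum() / years:.2e} / yr")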

  2. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    SciTech Connect

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  3. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing of a VTOL aircraft onboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is utilized for assessing the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions, called lulls and swells. A real-time algorithm for lull/swell classification based upon ship motion pattern features is developed. The classification algorithm is used to command a go/no-go signal to indicate the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go pattern-based letdown guidance strategy improves touchdown performance.

  4. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  5. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  6. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  7. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  8. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  9. CHARACTERISTICS OF ORGANIC EMISSIONS FROM HAZARDOUS WASTE INCINERATION PROCESSES UNDER THE NEW EPA DRAFT RISK BURN GUIDANCE: MEASUREMENT ISSUES

    EPA Science Inventory

    EPA's recently published draft Risk Burn Guidance recommends that hazardous waste combustion facilities complete a mass balance of the total organics (TOs) that may be emitted from the combustor. TOs, consisting of three distinct fractions (volatile, semivolatile, and nonvolatile...

  10. 75 FR 16202 - Office of New Reactors; Interim Staff Guidance on Ensuring Hazard-Consistent Seismic Input for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... COMMISSION Office of New Reactors; Interim Staff Guidance on Ensuring Hazard-Consistent Seismic Input for...). ACTION: Notice of availability. SUMMARY: The NRC staff is issuing its Final Interim Staff Guidance (ISG... Interaction Analyses,'' (Agencywide Documents Access and Management System (ADAMS) Accession No....

  11. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  12. 78 FR 18273 - Draft Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning; Availability AGENCY: Food and...: Purchasing Reef Fish Species Associated With the Hazard of Ciguatera Fish Poisoning.'' The draft guidance, when finalized, will advise primary seafood processors who purchase reef fish how to minimize the...

  13. 78 FR 41853 - Safety Advisory Guidance: Heating Rail Tank Cars To Prepare Hazardous Material for Unloading or...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ... Rail Tank Cars To Prepare Hazardous Material for Unloading or Transloading AGENCY: Pipeline and... transloading \\1\\ hazardous materials from rail tank cars, specifically those persons heating a rail tank car to.... Guidance for Heating of Rail Tank Cars for Unloading or Transloading I. Background PHMSA's mission is...

  14. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  15. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  16. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  17. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  18. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Requirements to Obtain an Experimental Permit Operational Safety Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with §...

  19. An analysis of physical guidance as reinforcement for noncompliance.

    PubMed

    Kern, Lee; Delaney, Beth A; Hilt, Alexandra; Bailin, Deborah E; Elliot, Christina

    2002-09-01

    Physical guidance is a strategy commonly used to address noncompliance. In two experiments, we evaluated the reinforcing effects of physical guidance. Experiment 1 included three individuals with developmental disabilities who were noncompliant with tasks. Anecdotal observations indicated physical contact was highly reinforcing, and a functional analysis identified attention as a reinforcer for problem behavior. Two conditions compared physical guidance following noncompliance and no physical guidance following noncompliance. Results showed noncompliance increased for all three participants when physical guidance followed noncompliance. The second experiment empirically evaluated the function of noncompliance prior to examining the effect of physical guidance on noncompliance. A functional analysis, conducted with two participants, indicated noncompliance served an escape function for one and an attention function for the other. Comparisons of physical guidance and no physical guidance indicated physical guidance resulted in decreased noncompliance for the participant with an escape function but increased noncompliance for the participant with an attention function. PMID:12205825

  20. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    SciTech Connect

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  1. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    EPA Science Inventory

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  2. Developing guidance for budget impact analysis.

    PubMed

    Trueman, P; Drummond, M; Hutton, J

    2001-01-01

    The role of economic evaluation in the efficient allocation of healthcare resources has been widely debated. Whilst economic evidence is undoubtedly useful to purchasers, it does not address the issue of affordability, which is an increasing concern. Healthcare purchasers are concerned not just with maximising efficiency but also with the simpler goal of remaining within their annual budgets. These two objectives are not necessarily consistent. This paper examines the issue of affordability and the relationship between affordability and efficiency, and builds the case for why there is a growing need for budget impact models to complement economic evaluation. Guidance currently available for such models is also examined, and it is concluded that this guidance is currently insufficient. Some of these insufficiencies are addressed and some thoughts on what constitutes best practice in budget impact modelling are suggested. These suggestions include consideration of transparency, clarity of perspective, reliability of data sources, the relationship between intermediate and final end-points, and rates of adoption of new therapies. They also include the impact of intervention by population subgroups or indications, reporting of results, probability of re-deploying resources, the time horizon, exploring uncertainty and sensitivity analysis, and decision-maker access to the model. Due to the nature of budget impact models, the paper does not deliver stringent methodological guidance on modelling. The intention was to provide some suggestions of best practice in addition to some foundations upon which future research can build. PMID:11456210

  3. Quality Guidance: A Sectoral Analysis. NICEC Project Report.

    ERIC Educational Resources Information Center

    Watts, A. G.; Sadler, Jackie

    This report reviews the structure of the guidance field and provides a sector-by-sector analysis of current quality assurance arrangements in the United Kingdom. Part 1 presents an outline of the guidance sector, including some key concepts, structures, and roles. It defines guidance and other terms; discusses the three main categories…

  4. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios on the nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on the sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with a reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to that of the 2000 IBC for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers the consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. Minimum
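
    As a small illustration of the last step described above (checking a design load against a hazard curve), one can log-log interpolate the curve's annual exceedance rates at the design PGA and invert to a return period. The curve points below are invented for illustration and are not DSOD values.

      import numpy as np

      # Hypothetical hazard curve: PGA (g) vs. annual frequency of exceedance (AFE).
      pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
      afe = np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5])

      def return_period(design_pga, pga, afe):
          """Log-log interpolation of the hazard curve, inverted to a return period (yr)."""
          log_rate = np.interp(np.log(design_pga), np.log(pga), np.log(afe))
          return 1.0 / np.exp(log_rate)

      print(f"{return_period(0.35, pga, afe):.0f} years")   # ~1800 yr for this toy curve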

  5. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis and Hazard Analysis...

  6. Hazard screening application guide. Safety Analysis Report Update Program

    SciTech Connect

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  7. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
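
    The logic-tree machinery described above (three source zonings crossed with three ground-motion models) reduces, for the mean hazard, to a weighted average of the nine branch hazard curves. A minimal sketch follows; the stand-in curves and the uniform branch weights are assumptions for illustration, since the paper's actual weights are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(1)
      pga_grid = np.linspace(0.05, 1.0, 20)

      # Stand-in branch curves: branch_curves[i, j, :] = annual exceedance rates for
      # source zoning i and ground-motion model j on a common PGA grid.
      base = 1e-2 * np.exp(-5.0 * pga_grid)                  # decaying toy curve shape
      branch_curves = base * rng.lognormal(0.0, 0.3, size=(3, 3, pga_grid.size))

      w_zoning = np.full(3, 1 / 3)   # assumed uniform weights (not from the paper)
      w_gmm = np.full(3, 1 / 3)

      weights = np.outer(w_zoning, w_gmm)                    # joint weights, sum to 1
      mean_curve = np.einsum("ij,ijk->k", weights, branch_curves)

      # Branch-to-branch spread at each level, analogous to the paper's COV maps.
      cov = branch_curves.std(axis=(0, 1)) / mean_curve

    The same weighted-sum structure extends to any number of branches; sensitivity to a given input (e.g., the attenuation relationship) can be read off by comparing curves along that axis of the tree.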

  8. Canister storage building hazard analysis report

    SciTech Connect

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  9. Cold Vacuum Drying Facility hazard analysis report

    SciTech Connect

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  10. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique, model checking, that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  11. Hazard Analysis for In Tank Spray Leaks

    SciTech Connect

    GRAMS, W.H.

    2000-06-13

    The River Protection Project (RPP) Authorization Basis (AB) contains controls that address spray leaks in tanks. However, there are no hazardous conditions in the Hazards Database that specifically identify in-tank spray leak scenarios. The purpose of this Hazards Evaluation is to develop hazardous conditions related to in-tank spray leaks for the Hazards Database and to provide more complete coverage of Tank Farm facilities. Currently, the in-tank spray leak is part of the "Spray Leak in Structures or From Waste Transfer Lines" accidents in Section 3.4.2.9 of the Final Safety Analysis Report (FSAR) (CHG, 2000a). The accident analysis for the "Spray Leak in Structures or From Waste Transfer Lines" states the following regarding the location of a possible spray leak: Inside ventilated waste storage tanks (DSTs, DCRTs, and some SSTs). Aerosols could be generated inside a storage tank during a transfer because of a leak from the portion of the transfer pipe inside the tank. The tank ventilation system could help disperse the aerosols to the atmosphere should the vent system HEPA filters fail. This Hazards Evaluation also evaluates the controls currently assigned to the spray leak in structure accident and determines the applicability of the controls to the new hazardous conditions. This comparison reviews both the analysis in the FSAR and the controls found in the Technical Safety Requirements (TSRs) (CHG, 2000h). If the new hazardous conditions do not match the analyzed accident conditions and controls, then additional analysis may be required. This document is not intended to authorize the activity or determine the adequacy of controls; it is only intended to provide information about the hazardous conditions associated with this activity. The Control decision process as defined in the AB will be used to determine the adequacy of controls and whether the proposed activity is within the AB. This hazard evaluation does not constitute an accident analysis.

  12. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  13. Integrating waste management with Job Hazard analysis

    SciTech Connect

    2007-07-01

    The web-based Automated Job Hazard Analysis (AJHA) system is a tool designed to help capture and communicate the results of the hazard review and mitigation process for specific work activities. In Fluor Hanford's day-to-day work planning and execution process, AJHA has become the focal point for integrating Integrated Safety Management (ISM) through industrial health and safety principles; environmental safety measures; and involvement by workers, subject-matter experts and management. This paper illustrates how AJHA has become a key element in involving waste-management and environmental-control professionals in planning and executing work. To support implementing requirements for waste management and environmental compliance within the core functions and guiding principles of an integrated safety management system (ISMS), Fluor Hanford has developed a computer-based application called the Automated Job Hazard Analysis (AJHA) and integrated it into the work management process. This web-based software tool helps integrate the knowledge of site workers and subject-matter experts with the safety principles and requirements established in standards and regulations. AJHA facilitates a process of work site review, hazard identification, analysis, and the determination of specific work controls. The AJHA application provides a well-organized job hazard analysis report including training and staffing requirements, prerequisite actions, notifications, and specific work controls listed for each sub-task determined for the job. AJHA lists common hazards addressed in U.S. Occupational Safety and Health Administration (OSHA) federal codes and state regulations such as those of the Washington Industrial Safety and Health Administration (WISHA). AJHA also lists extraordinary hazards that are unique to a particular industry sector, such as radiological hazards and waste management. The work-planning team evaluates the scope of work and reviews the work site to identify potential hazards. Hazards

  14. 76 FR 23823 - Guidance for Industry on Fish and Fishery Products Hazards and Controls, Fourth Edition...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... control recommendations are listed for the natural toxin action level for diarrhetic shellfish poisoning... the third edition to include ciguatera fish poisoning guidance for northern Gulf of Mexico...

  15. Initial guidelines for probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.

    1994-10-01

    In the late 1980s, the methodology for performing probabilistic seismic hazard analysis (PSHA) was exercised extensively for eastern-U.S. nuclear power plant sites by the Electric Power Research Institute (EPRI) and Lawrence Livermore National Laboratory (LLNL) under NRC sponsorship. Unfortunately, the seismic-hazard-curve results of these two studies differed substantially for many of the eastern reactor sites, which has motivated all concerned to revisit the approaches taken. This project is that revisitation.

  16. Analysis of Regulatory Guidance for Health Monitoring

    NASA Technical Reports Server (NTRS)

    Munns, Thomas E.; Beard, Richard E.; Culp, Aubrey M.; Murphy, Dennis A.; Kent, Renee M.; Cooper, Eric G. (Technical Monitor)

    2000-01-01

    The purpose of this study was to assess the connection between current FAA regulations and the incorporation of Health Management (HM) systems into commercial aircraft. To address the overall objectives, ARINC: (1) investigated FAA regulatory guidance, (2) investigated airline maintenance practices, (3) systematically identified regulations and practices that would be affected by, or could act as barriers to, the introduction of HM technology, and (4) assessed regulatory and operational tradeoffs that should be considered for implementation. The assessment procedure was validated on a postulated structural HM capability for the B757 horizontal stabilizer.

  17. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  18. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  19. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... analysis. Every processor shall conduct, or have conducted for it, a hazard analysis to determine whether... product processed by that processor and to identify the preventive measures that the processor can apply.... A food safety hazard that is reasonably likely to occur is one for which a prudent processor...

  20. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property, are presented. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.

  1. RCRA (Resource Conservation and Recovery Act) personnel training guidance manual for owners or operators of hazardous-waste-management facilities. Draft report

    SciTech Connect

    Not Available

    1980-09-01

    The manual provides guidance to an audience consisting of owners or operators of hazardous waste-management facilities and also regulatory personnel responsible for facility permitting. Information is provided concerning general training strategies and program design, types of training, training modules pertinent to hazardous-waste management, suggested training module elements, and sources of information.

  2. Repository Subsurface Preliminary Fire Hazard Analysis

    SciTech Connect

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during the development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  3. WIPP waste characterization program sampling and analysis guidance manual

    SciTech Connect

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  4. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and has long been a center of the region's political and cultural life; earthquake risk assessment for Cairo is therefore of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The contour maps show that the highest peak ground acceleration values are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).
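
    Assuming Poissonian earthquake occurrence, a return period T maps to an exceedance probability over an exposure time t via P = 1 - exp(-t/T). The sketch below restates the four return periods quoted above for a 50-year exposure window; the window choice is our assumption, not taken from the paper.

    ```python
    import math

    def exceedance_probability(return_period_yr: float, exposure_yr: float = 50.0) -> float:
        """Probability of at least one exceedance in `exposure_yr` years (Poisson model)."""
        return 1.0 - math.exp(-exposure_yr / return_period_yr)

    for T in (224, 615, 1230, 4745):
        print(f"T = {T:>4} yr  ->  P(exceedance in 50 yr) = {exceedance_probability(T):.1%}")
    ```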

  5. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  6. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  7. Analysis of the combined maglev levitation, propulsion, and guidance system

    SciTech Connect

    He, J.L.; Coffey, H.T.; Rote, D.M.

    1995-03-01

    An analysis of a Japanese maglev system that uses only one set of coils in the guideway for combined levitation, propulsion, and guidance functions is presented. This preliminary study, using the dynamic circuit approach, indicates that the system is very promising.

  8. Seismic hazard analysis at Rocky Flats Plant

    SciTech Connect

    McGuire, R.K.

    1993-10-01

    A probabilistic seismic hazard analysis is being conducted for the DOE Rocky Flats Plant, Jefferson County, Colorado, as part of the overall review of the seismic exposure of facilities being conducted by DOE. The study has four major elements. (1) The historical seismicity in Colorado is being reviewed and synthesized to estimate historical rates of earthquake activity in the region of the site. (2) The geologic and tectonic evidence in Colorado and along the Front Range is being reviewed to determine appropriate seismic zones, potentially active faults, and constraints on fault slip rates. (3) Earthquake ground motion equations are being derived based on seismological knowledge of the earth's crust. Site-specific soil amplification factors are also being developed using on-site shear wave velocity measurements. (4) The probability of exceedance of various seismic ground motion levels is being calculated based on the inputs developed on tectonic sources, faults, ground motion, and soil amplification. Deterministic ground motion estimates are also being made. This study is a state-of-the-art analysis of seismic hazard. It incorporates uncertainties in the major aspects governing seismic hazard, and has a documented basis founded on solid data interpretations for the ranges of inputs used. The results will be a valid basis on which to evaluate plant structures, equipment, and components for seismic effects.
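
    Element (4) above is, in essence, the classical hazard integral: the annual rate of exceeding a ground-motion level equals the source activity rate times the probability of exceedance, integrated over the magnitude and distance distributions. A minimal numerical sketch follows, with a toy attenuation relation and invented source parameters rather than Rocky Flats inputs.

    ```python
    import numpy as np
    from scipy.stats import norm

    nu = 0.05                 # rate of M >= 5 earthquakes per year (assumed)
    beta = 2.0                # Gutenberg-Richter b-value * ln(10) (assumed)
    m_min, m_max = 5.0, 7.5
    mags = np.linspace(m_min, m_max, 60)
    dists = np.linspace(5.0, 100.0, 60)   # km, uniform over a toy areal source

    # doubly truncated exponential magnitude pdf
    f_m = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
    f_r = np.full_like(dists, 1.0 / (dists[-1] - dists[0]))

    def log_median_pga(m, r):
        """Toy attenuation relation for ln(PGA in g); a real study would use
        region-specific equations plus the site amplification factors."""
        return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

    sigma = 0.6   # aleatory variability of ln(PGA) (assumed)
    dm, dr = mags[1] - mags[0], dists[1] - dists[0]

    def annual_rate_of_exceedance(a: float) -> float:
        M, R = np.meshgrid(mags, dists, indexing="ij")
        p_exceed = 1.0 - norm.cdf(np.log(a), loc=log_median_pga(M, R), scale=sigma)
        return nu * float(np.sum(p_exceed * f_m[:, None] * f_r[None, :])) * dm * dr

    for a in (0.05, 0.1, 0.2, 0.4):   # PGA levels in g
        print(f"PGA > {a:.2f} g : {annual_rate_of_exceedance(a):.2e} / yr")
    ```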

  9. Decision analysis for INEL hazardous waste storage

    SciTech Connect

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft{sup 2} of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  10. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
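
    One plausible reading of the combined metric is a risk score discounted by modeling difficulty, so that risk-significant scenarios that are still tractable to model rank first. The sketch below implements that reading; the scoring scales, weighting, and scenario list are assumptions, not the authors' actual scheme.

    ```python
    scenarios = [
        # (name, severity 1-5, likelihood 1-5, modeling difficulty 1-5) -- all assumed
        ("Wake encounter on parallel approach", 4, 3, 2),
        ("Simultaneous missed approaches",      5, 2, 4),
        ("Blunder across runway centerlines",   5, 1, 5),
    ]

    def priority(sev: int, lik: int, diff: int) -> float:
        """Higher risk raises priority; higher modeling difficulty lowers it."""
        return (sev * lik) / diff

    for name, s, lk, d in sorted(scenarios, key=lambda x: -priority(*x[1:])):
        print(f"{priority(s, lk, d):5.2f}  {name}")
    ```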

  11. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
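
    For the simplest combination case, the moments of a total failure rate built from independent failure modes follow from additivity: means add, and under independence variances add as well. A brief sketch with invented numbers; the report's formulas cover more general combinations.

    ```python
    # (mean rate, variance) per failure mode -- illustrative values only
    modes = {
        "fail to start":   (1.2e-3, 4.0e-7),
        "fail to run":     (3.0e-4, 9.0e-8),
        "maintenance out": (5.0e-4, 2.5e-7),
    }

    mean_total = sum(m for m, _ in modes.values())   # means add
    var_total = sum(v for _, v in modes.values())    # variances add (independence)
    print(f"combined mean = {mean_total:.2e}, combined std dev = {var_total ** 0.5:.2e}")
    ```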

  12. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry on Documenting Statistical Analysis...: The Food and Drug Administration (FDA) is announcing the availability of a guidance for industry 197 entitled ``Documenting Statistical Analysis Programs and Data Files.'' This guidance is provided to...

  13. Characterizing adoption of precautionary risk management guidance for nanomaterials, an emerging occupational hazard.

    PubMed

    Schubauer-Berigan, Mary K; Dahm, Matthew M; Schulte, Paul A; Hodson, Laura; Geraci, Charles L

    2015-01-01

    Exposure to engineered nanomaterials (substances with at least one dimension of 1-100 nm) has been of increased interest, with the recent growth in production and use of nanomaterials worldwide. Various organizations have recommended methods to minimize exposure to engineered nanomaterials. The purpose of this study was to evaluate available data to examine the extent to which studied U.S. companies (which represent a small fraction of all companies using certain forms of engineered nanomaterials) follow the guidelines for reducing occupational exposures to engineered nanomaterials that have been issued by the National Institute for Occupational Safety and Health (NIOSH) and other organizations. Survey data, field reports, and field notes for all NIOSH nanomaterial exposure assessments conducted between 2006 and 2011 were collected and reviewed to: (1) determine the level of adoption of precautionary guidance on engineering controls and personal protective equipment (PPE), and (2) evaluate the reliability of companies' self-reported use of engineering controls and PPE. Use of PPE was observed among 89% [95% confidence interval (CI): 76%-96%] of 46 visited companies, and use of containment-based engineering controls for at least some processes was observed among 83% (95% CI: 76%-96%). In on-site evaluations, more than 90% of the 16 engineered carbonaceous nanomaterial companies that responded to an industrywide survey were observed to be using engineering controls and PPE as reported or more stringently than reported. Since PPE use was slightly more prevalent than engineering controls, better communication may be necessary to reinforce the importance of the hierarchy of controls. These findings may also be useful in conducting exposure assessment and epidemiologic research among U.S. workers handling nanomaterials.
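
    The interval estimates quoted above are consistent with a standard exact (Clopper-Pearson) binomial confidence interval. The sketch below reproduces the 89% point estimate with a (76%, 96%) interval, assuming the 41-of-46 split implied by the rounded percentage; that split is our inference, not stated in the abstract.

    ```python
    from scipy.stats import beta

    def clopper_pearson(successes: int, n: int, conf: float = 0.95):
        """Exact two-sided binomial confidence interval via beta quantiles."""
        alpha = 1.0 - conf
        lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
        return lo, hi

    lo, hi = clopper_pearson(41, 46)
    print(f"point estimate {41/46:.0%}, 95% CI ({lo:.0%}, {hi:.0%})")
    ```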

  14. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  15. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

    Estimating extreme return levels (down to a 10-4 annual probability of exceedance) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. This technique was introduced in the 1960s and has been widely used in various domains, including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the methodological guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority. The aim of this presentation is to introduce the main concepts of RFA and to illustrate the latest innovations in its application delivered by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. [1] ASN, Guide pour la Protection des installations nucléaires de base contre les inondations externes, ASN, 2013, p. 44.
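
    A core ingredient of RFA is index-flood style pooling: site maxima are normalized by a local index, pooled across a homogeneous region to fit a single regional growth curve, and then rescaled to the target site. A minimal sketch with synthetic skew-surge data and an assumed Gumbel growth curve; EDF's actual methodology differs in detail.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(0)
    # annual skew-surge maxima (metres) at three sites of one region (synthetic)
    sites = {s: gumbel_r.rvs(loc=mu, scale=0.2 * mu, size=40, random_state=rng)
             for s, mu in {"A": 0.8, "B": 1.1, "C": 0.95}.items()}

    indices = {s: x.mean() for s, x in sites.items()}          # local index variable
    pooled = np.concatenate([x / indices[s] for s, x in sites.items()])

    loc, scale = gumbel_r.fit(pooled)                          # regional growth curve
    T = 1000.0                                                 # target return period, years
    growth = gumbel_r.ppf(1 - 1 / T, loc, scale)
    print(f"1000-yr skew surge at site A ~ {indices['A'] * growth:.2f} m")
    ```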

  16. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.

  17. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  18. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  19. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  20. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore areas of Ventura, Los Angeles, and Orange counties, for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and for the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL) in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities and for 69 nuclear plant sites in the Eastern United States for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternative spatiotemporal models of the occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997) and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area, but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community was sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), and members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California).

  1. Fire hazard analysis for the fuel supply shutdown storage buildings

    SciTech Connect

    REMAIZE, J.A.

    2000-09-27

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility.

  2. Fire hazards analysis of transuranic waste storage and assay facility

    SciTech Connect

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  3. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described through analysis of a possible multiprocessor computer configuration. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the lunar landing phase of the Apollo program. At that point the computer performs about 24 concurrent functions, with iteration rates ranging from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. Based on a comparison of simulation and Markov results, it was concluded that Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is critical to the efficiency of the multiprocessor; research into automatic job scheduling is therefore recommended.

  4. 78 FR 69992 - Guidance for Industry on Purchasing Reef Fish Species Associated With the Hazard of Ciguatera...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... reducing the risk of CFP. DATES: Submit either electronic or written comments on FDA guidances at any time. ADDRESSES: Submit written requests for single copies of the guidance to Division of Seafood Safety/Office of... written comments on the guidance to the Division of Dockets Management (HFA-305), Food and...

  5. Fire hazards analysis for solid waste burial grounds

    SciTech Connect

    McDonald, K.M.

    1995-09-28

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes fire potential, and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation.

  6. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    SciTech Connect

    PATTERSON, P.R.

    1999-10-20

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF Final Safety Analysis Report, and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports," and implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports."

  7. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off nadir, producing an illuminating spot geometry that varies from circular (at nadir) to elliptical (off nadir). The JP Innovations crystal Cr:LiSAF laser parameters are presented in Section II. The illuminating laser spot size is variable and can be adjusted by changing the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level: the laser spot is set to a particular size at a particular distance (elevation) from the laser by adjusting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operating elevation.
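
    Analyses of this kind ultimately compare beam exposure at range against the ANSI Z136.1 maximum permissible exposure (MPE), often summarized as a nominal ocular hazard distance (NOHD). The sketch below uses the common closed form for a diverging circular beam with placeholder inputs, not the actual APOS parameters, and it ignores the extended-source geometry that the holographic diffuser creates.

    ```python
    import math

    def nohd_cm(energy_j: float, mpe_j_per_cm2: float,
                divergence_rad: float, aperture_cm: float) -> float:
        """NOHD = (sqrt(4*Q / (pi*MPE)) - a) / phi for a single pulse of energy Q."""
        return (math.sqrt(4.0 * energy_j / (math.pi * mpe_j_per_cm2))
                - aperture_cm) / divergence_rad

    # placeholder values: 50 mJ pulse, MPE 5e-7 J/cm^2, 2 mrad divergence, 0.5 cm aperture
    d = nohd_cm(0.05, 5e-7, 2e-3, 0.5)
    print(f"NOHD ~ {d / 100:.0f} m")
    ```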

  8. Optical radiation hazards analysis of ultraviolet headlamps

    NASA Astrophysics Data System (ADS)

    Sliney, David H.; Fast, Peder; Ricksand, Anders

    1995-08-01

    The potential hazards to the eye and skin associated with the use of UV-emitting automotive headlamps are considered. Wide-scale use of high-beam, near-UV headlamps to permit viewing of fluorescence in clothes and fluorescent road markers at great distances can increase automotive safety; however, the potential hazards from exposure of persons to UV radiation must be evaluated. From the study we conclude that such headlamps can be safely designed to preclude human exposure to potentially hazardous levels of UV radiation.

  9. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  10. Zurich's hazard analysis process: A systematic team approach

    SciTech Connect

    Frei, H.

    1997-06-01

    The Zurich method of hazard analysis (ZHA) is a process designed to facilitate the systematic identification, assessment, and reduction or elimination of hazard and risk in almost any product, system, or process. It has been particularly successful as a front-end screening tool in the petrochemical, chemical, and pharmaceutical industries. The complexity and the regulation of these industries and the requirement for management of change have created a demand for highly efficient, yet thorough, hazard analysis techniques capable of capturing and managing the total risk perspective while clearly illuminating the risk priorities. Only when these priorities have been segregated and economically addressed has an organization fully leveraged the power of any hazard analysis tool. This paper will outline the Zurich Hazard Analysis process and will highlight the elements and strategies central to its success as an efficient, yet thorough, methodology.

  11. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  12. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to...

  13. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to...

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify...

  15. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify...

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  17. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  18. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  19. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary reason…

  20. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, covering everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis and the allocation of engineering resources, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, along with a summary and physics discussion of the EFT-1 driving factors that the tool found.
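
    One simple way to surface such driving factors is to bin each dispersed input and examine how the conditional success rate varies across bins. The sketch below does this for a toy model; the input names, the miss-distance model, and the pass/fail limit are all invented, not Orion values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20_000
    inputs = {
        "mass_kg":         rng.normal(9000, 150, n),
        "thrust_disp_pct": rng.normal(0, 1.5, n),
        "launch_doy":      rng.uniform(0, 365, n),   # deliberately non-driving here
    }
    # toy requirement: touchdown miss distance under a limit
    miss = (2.0 + 0.004 * np.abs(inputs["mass_kg"] - 9000)
            + 0.8 * np.abs(inputs["thrust_disp_pct"])
            + rng.normal(0, 0.5, n))
    success = miss < 5.0

    for name, x in inputs.items():
        edges = np.quantile(x, np.linspace(0, 1, 11))      # 10 equal-count bins
        idx = np.digitize(x, edges[1:-1])                  # bin index 0..9 per run
        rates = np.array([success[idx == k].mean() for k in range(10)])
        print(f"{name:>16}: success-rate spread across bins = {rates.max() - rates.min():.3f}")
    ```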

  1. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring further concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards, and it is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers: it can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.
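
    As a toy illustration of the first concern, a screening check against hydrogen's wide flammability range (commonly cited as roughly 4-75 vol% in air) might look as follows; a real analysis would follow the White Sands Test Facility method and also weigh ignition energy, materials compatibility, and cryogenic effects.

    ```python
    LFL, UFL = 4.0, 75.0   # approximate flammability limits of H2 in air, vol%

    def flammable(h2_vol_pct: float) -> bool:
        """True if the mixture concentration falls inside the flammable range."""
        return LFL <= h2_vol_pct <= UFL

    for c in (1.0, 10.0, 80.0):
        print(f"{c:5.1f} vol% H2: {'FLAMMABLE' if flammable(c) else 'not flammable'}")
    ```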

  2. Hazardous Substance Release Reporting Under CERCLA, EPCRA §304 and DOE Emergency Management System (EMS) and DOE Occurrence Reporting Requirements. Environmental Guidance

    SciTech Connect

    Traceski, T.T.

    1994-06-01

    Releases of various substances from DOE facilities may be subject to reporting requirements under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Emergency Planning and Community Right-to-Know Act (EPCRA), as well as DOE's internal "Occurrence Reporting and Processing of Operations Information" and the "Emergency Management System" (EMS). CERCLA and EPCRA are Federal laws that require immediate reporting of a release of a Hazardous Substance (HS) and an Extremely Hazardous Substance (EHS), respectively, in a Reportable Quantity (RQ) or more within a 24-hour period. This guidance uses a flowchart, supplemental information, and tables to provide an overview of the process to be followed, and more detailed explanations of the actions that must be performed, when chemical releases of HSs, EHSs, pollutants, or contaminants occur at DOE facilities. This guidance should be used in conjunction with, rather than in lieu of, applicable laws, regulations, and DOE Orders; relevant laws, regulations, and DOE Orders are referenced throughout this guidance.

  3. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  4. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  5. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  6. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    SciTech Connect

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-07-01

    For tsunami assessment, it is meaningful to evaluate phenomena beyond the design basis, just as in seismic design: even once a design-basis tsunami height is set, the actual tsunami height may still exceed it because of uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84-, and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis; however, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
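
    The logic-tree aggregation can be sketched compactly: each branch contributes a hazard curve and a weight, and the weighted branches define the mean and fractile curves. The branch models and weights below are illustrative assumptions, not values from the Japanese study.

    ```python
    import numpy as np

    heights = np.linspace(1.0, 15.0, 50)              # tsunami height, m

    def branch_curve(scale: float) -> np.ndarray:
        """Toy annual exceedance-probability model for one logic-tree branch."""
        return 1e-2 * np.exp(-heights / scale)

    weights = np.array([0.2, 0.5, 0.3])               # branch weights, sum to 1 (assumed)
    curves = np.vstack([branch_curve(s) for s in (1.5, 2.0, 2.8)])

    def weighted_percentile(vals: np.ndarray, w: np.ndarray, q: float) -> float:
        """Smallest value whose cumulative branch weight reaches q (0-1)."""
        order = np.argsort(vals)
        return float(vals[order][np.searchsorted(np.cumsum(w[order]), q)])

    j = np.searchsorted(heights, 5.0)                 # report at h = 5 m
    print(f"mean P(h > 5 m) = {weights @ curves[:, j]:.2e}")
    for pct in (5, 16, 50, 84, 95):
        p = weighted_percentile(curves[:, j], weights, pct / 100)
        print(f"{pct:>2}th percentile P(h > 5 m) = {p:.2e}")
    ```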

  7. Fixed-trim re-entry guidance analysis

    NASA Technical Reports Server (NTRS)

    Gracey, C.; Cliff, E. M.; Lutze, F. H.; Kelley, H. J.

    1981-01-01

    The terminal guidance problem for a fixed-trim re-entry body is formulated with the objective of synthesizing a closed-loop steering law. A transformation of variables and subsequent linearization of the motion, with the sight-line to the target as a reference, reduces the order of the state system for the guidance problem. The reduced order system, although nonlinear and time-varying, is simple enough to lend itself to synthesis of a class of guidance laws. A generalization of the feedforward device of classical control theory is successfully employed for compensation of roll autopilot lags. The proposed steering law exhibits superior miss-distance performance in a computational comparison with existing fixed-trim guidance laws.

  8. Mechanistic insights from a quantitative analysis of pollen tube guidance

    PubMed Central

    2010-01-01

    Background Plant biologists have long speculated about the mechanisms that guide pollen tubes to ovules. Although there is now evidence that ovules emit a diffusible attractant, little is known about how this attractant mediates interactions between the pollen tube and the ovules. Results We employ a semi-in vitro assay, in which ovules dissected from Arabidopsis thaliana are arranged around a cut style on artificial medium, to elucidate how ovules release the attractant and how pollen tubes respond to it. Analysis of microscopy images of the semi-in vitro system shows that pollen tubes are more attracted to ovules that are incubated on the medium for longer times before pollen tubes emerge from the cut style. The responses of tubes are consistent with their sensing a gradient of an attractant at 100-150 μm, farther than previously reported. Our microscopy images also show that pollen tubes slow their growth near the micropyles of functional ovules with a spatial range that depends on ovule incubation time. Conclusions We propose a stochastic model that captures these dynamics. In the model, a pollen tube senses a difference in the fraction of receptors bound to an attractant and changes its direction of growth in response; the attractant is continuously released from ovules and spreads isotropically on the medium. The model suggests that the observed slowing greatly enhances the ability of pollen tubes to successfully target ovules. The relation of the results to guidance in vivo is discussed. PMID:20170550
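
    The proposed model lends itself to a compact simulation: the tube compares receptor occupancy c/(c+K) on either side of its tip, turns toward the higher side with some noise, and slows as occupancy rises near the ovule. The sketch below is a loose reading of that scheme; every parameter value is an assumption made for illustration.

    ```python
    import math
    import random

    random.seed(2)
    ovule = (0.0, 0.0)
    K = 1.0                                # receptor half-saturation (assumed)

    def conc(x: float, y: float) -> float:
        """Attractant spreading isotropically from a continuous source."""
        return 50.0 / (1.0 + math.hypot(x - ovule[0], y - ovule[1]))

    def occupancy(c: float) -> float:
        return c / (c + K)

    x, y, heading = 120.0, 40.0, math.pi   # start > 100 um away (units: um)
    for step in range(600):
        # sample receptor occupancy on either side of the tip (2 um sensors)
        left = occupancy(conc(x + 2 * math.cos(heading + 0.5), y + 2 * math.sin(heading + 0.5)))
        right = occupancy(conc(x + 2 * math.cos(heading - 0.5), y + 2 * math.sin(heading - 0.5)))
        heading += 6.0 * (left - right) + random.gauss(0, 0.03)   # biased turn + noise
        occ = occupancy(conc(x, y))
        speed = 1.0 / (1.0 + 2.0 * occ)                           # slow near the ovule
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
        if math.hypot(x - ovule[0], y - ovule[1]) < 5.0:
            print(f"reached micropyle region in {step} steps")
            break
    else:
        print(f"ended {math.hypot(x, y):.0f} um from ovule")
    ```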

  9. Fire Hazards Analysis for the 200 Area Interim Storage Area

    SciTech Connect

    JOHNSON, D.M.

    2000-01-06

    This documents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480 7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480 7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public and to ensure property damage potential from fire is within acceptable limits. This FHA will be in the form of a graded approach commensurate with the complexity of the structure or area and the associated fire hazards.

  10. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project; since the last worldwide Global Seismic Hazard Assessment Program (GSHAP) there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this particular type of region, seismic hazard assessment is a very hard task owing to the limited data available on the seismic sources, the earthquake catalogue, and ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in southeast Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. A couple of events with magnitude greater than 3 have been recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help improve different parts of the process. Analysis of the S-wave Fourier spectra can help to better resolve source, path, and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance for building a catalogue that is homogeneous in terms of moment magnitude; at the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help.
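
    The moment-magnitude step can be made concrete: the low-frequency plateau of the S-wave displacement spectrum gives the seismic moment (here via the Brune point-source relation, with assumed density, shear velocity, distance, and radiation-pattern constants), and Mw follows from the standard relation Mw = (2/3)(log10 M0 - 9.1) with M0 in N·m.

    ```python
    import math

    def seismic_moment(omega0_m_s: float, distance_m: float,
                       rho: float = 2700.0, beta: float = 3500.0,
                       radiation: float = 0.55, free_surface: float = 2.0) -> float:
        """Brune relation: M0 = 4*pi*rho*beta^3*R*Omega0 / (radiation * free_surface).
        Default constants are typical crustal values, assumed for illustration."""
        return 4 * math.pi * rho * beta**3 * distance_m * omega0_m_s / (radiation * free_surface)

    def moment_magnitude(m0_nm: float) -> float:
        """Standard moment-magnitude relation, M0 in N*m."""
        return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

    # hypothetical plateau of 2e-7 m*s observed at 150 km
    M0 = seismic_moment(omega0_m_s=2e-7, distance_m=150e3)
    print(f"M0 = {M0:.2e} N*m, Mw = {moment_magnitude(M0):.1f}")
    ```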

  11. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  12. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
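
    The four relationship types imply different combination rules for the occurrence probabilities. A minimal sketch for two hazards A and B over one year, with all probabilities invented for illustration:

    ```python
    p_a, p_b = 0.10, 0.05        # annual occurrence probabilities (assumed)

    # independent: unrelated triggers, so the product rule applies
    print("independent P(A and B):", p_a * p_b)

    # mutex: the trigger conditions exclude each other, so joint occurrence is impossible
    print("mutex       P(A and B):", 0.0)

    # series: hazard A can trigger hazard B (assumed conditional probability 0.3)
    print("series      P(A and B):", p_a * 0.3)

    # parallel: one shared trigger can set off both hazards at once
    p_trigger, p_a_t, p_b_t = 0.08, 0.6, 0.4   # all assumed
    print("parallel    P(A and B):", p_trigger * p_a_t * p_b_t)
    ```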

  13. Seismic hazard analysis for Jayapura city, Papua

    SciTech Connect

    Robiana, R. Cipta, A.

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived from geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Sites in Jayapura city can be grouped into classes B, C, D, and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years, for dominant periods of 0.0, 0.2, and 1.0 seconds.

  14. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    SciTech Connect

    Adelman, D.D.; Stansbury, J.

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which has the capability to simulate double bottom liner systems as called for in hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.
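
    The 54-run factorial design can be reproduced schematically; the factor levels below are invented placeholders chosen only so that the full product is 54, since the abstract does not list the exact levels.

    ```python
    from itertools import product

    # Placeholder factor levels (assumptions, not the paper's values).
    climates = ["humid_A", "humid_B", "humid_C", "arid_A", "arid_B", "arid_C"]
    curve_numbers = [70, 80, 90]        # assumed SCS runoff curve numbers
    leaf_area_index = [0.0, 2.0, 4.5]   # assumed LAI levels

    runs = list(product(climates, curve_numbers, leaf_area_index))
    assert len(runs) == 54              # one HELP3 simulation per tuple
    ```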

  15. Frequency Analysis of Aircraft Hazards for License Application

    SciTech Connect

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
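
    Analyses of this kind commonly use a point-target "four-factor" formula (e.g., as in DOE-STD-3014); the sketch below assumes that form with invented numbers and may differ in detail from the referenced analysis.

    ```python
    # Assumed four-factor form: F = N * P * f(x, y) * A, summed over categories.
    def crash_frequency(n_ops, crash_rate, location_pdf, effective_area):
        """n_ops: annual flight operations; crash_rate: crashes per operation;
        location_pdf: crash-location density at the site (1/mi^2);
        effective_area: effective facility target area (mi^2)."""
        return n_ops * crash_rate * location_pdf * effective_area

    # Purely illustrative inputs for two aircraft categories:
    total = (crash_frequency(50_000, 4e-8, 1e-3, 0.02)    # general aviation
             + crash_frequency(5_000, 1e-7, 5e-4, 0.02))  # military
    print(f"{total:.1e} expected crashes per year")
    ```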

  16. FMEA, the alternative process hazard method. [Failure Mode Effects Analysis]

    SciTech Connect

    Goyal, R.K.

    1993-05-01

    Failure mode effects analysis (FMEA) is an old reliability/assurance tool finding its way into the hydrocarbon processing industry (HPI). Not yet popular, this hazard technique has some viable applications that can improve hazard assessment data. Notably, FMEA studies can identify possible areas for improvement that may not have been discovered using other methods. Also, FMEA is not as labor intensive and costly as other process hazard analysis (PHA) methods. OSHA 1910.119 set in place an informational structure whose main purpose is the reduction of potential accidents and the minimization of risks in the event of an accident. Consequently, HPI operators must evaluate their process systems and identify potential major hazards, such as fires, explosions and accidental releases of toxic/hazardous chemicals, and protect their facilities, employees, the public and the environment. But which PHA method(s) apply to a particular plant or process still remains a difficult question. This paper describes what FMEA is; types of FMEA; how to conduct an FMEA study; comparison with HAZOP (hazard and operability study); computer software; applicability of FMEA; and examples of its use.

  17. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years, many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples and its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area, well known from geological investigations and capable of shaking the city but not associated with any earthquake, has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
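
    The Bayesian combination step can be illustrated with a discrete weight update over the five occurrence models; the model names and numerical values below are placeholders, not the study's actual models or likelihoods.

    ```python
    def update_weights(prior, likelihood):
        """Discrete Bayes: posterior weight of each hazard model given the
        probability it assigns to the shaking observed in past earthquakes."""
        posterior = {m: prior[m] * likelihood[m] for m in prior}
        total = sum(posterior.values())
        return {m: w / total for m, w in posterior.items()}

    prior = {f"model_{i}": 0.2 for i in range(1, 6)}   # five models, uniform
    likelihood = {"model_1": 0.8, "model_2": 0.3, "model_3": 0.5,
                  "model_4": 0.1, "model_5": 0.6}       # invented values
    print(update_weights(prior, likelihood))
    ```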

  18. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ..., Office of Policy, National Center for Environmental Economics, Mail code 1809T, Environmental Protection... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis...

  19. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    SciTech Connect

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Since that analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as Revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations provided in the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for the safety classification of the thermal stabilization equipment.

  20. Pedestrian Evacuation Analysis for Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
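
    A stripped-down sketch of the least-cost travel-time idea (isotropic Dijkstra on a grid; the actual USGS tool is anisotropic and slope-aware, which this omits, and the walking speed is an assumed value):

    ```python
    import heapq

    def travel_times(cost, sources, cell_size=10.0, speed=1.52):
        """Least-cost travel time (s) to safety from every grid cell (Dijkstra).
        cost: 2-D list of terrain multipliers (>= 1); sources: safe (row, col)
        cells; cell_size in metres; speed is an assumed walking speed in m/s."""
        rows, cols = len(cost), len(cost[0])
        time = [[float("inf")] * cols for _ in range(rows)]
        heap = [(0.0, r, c) for r, c in sources]
        for _, r, c in heap:
            time[r][c] = 0.0
        heapq.heapify(heap)
        while heap:
            t, r, c = heapq.heappop(heap)
            if t > time[r][c]:
                continue  # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    step = cell_size * cost[nr][nc] / speed  # seconds per cell
                    if t + step < time[nr][nc]:
                        time[nr][nc] = t + step
                        heapq.heappush(heap, (t + step, nr, nc))
        return time
    ```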

  1. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
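
    For reference, the three variogram model families compared in the paper have these standard closed forms (conventional geostatistics parameterization with a practical-range scaling, not the authors' code):

    ```python
    import numpy as np

    def spherical(h, nugget, sill, rng):
        """Spherical variogram: rises to the sill at lag h = range."""
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
        return np.where(h < rng, g, sill)

    def exponential(h, nugget, sill, rng):
        """Exponential variogram; reaches ~95% of the sill at the range."""
        h = np.asarray(h, dtype=float)
        return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

    def gaussian(h, nugget, sill, rng):
        """Gaussian variogram: parabolic near the origin (very smooth fields)."""
        h = np.asarray(h, dtype=float)
        return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / rng) ** 2))
    ```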

  2. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. It is imperative to be able to identify the current information, as well as past archived information, as the design progresses, and to show how the project is designing in safety through modifications based on hazard analysis results. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of the removal and disposition of sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and how it was used to assist the project in ensuring that current design data were reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  3. Hazardous-waste analysis plan for LLNL operations

    SciTech Connect

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan addresses all hazardous waste.

  4. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments using the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  5. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  6. Analytical laboratory quality assurance guidance in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document introduces QA guidance pertaining to the design and implementation of laboratory procedures and processes for collecting DOE Environmental Restoration and Waste Management (EM) environmental sampling and analysis activities (ESAA) data. It addresses several goals: identifying key laboratory issues and program elements for EM HQ and field office managers; providing non-prescriptive guidance; and introducing environmental data collection program elements for EM-263 assessment documents and programs. The guidance describes the implementation of laboratory QA elements within a functional QA program (development of the QA program and data quality objectives are not covered here).

  7. Guidance in Kagome-like photonic crystal fibres I: analysis of an ideal fibre structure.

    PubMed

    Chen, Lei; Pearce, Greg J; Birks, Timothy A; Bird, David M

    2011-03-28

    Propagation of light in a square-lattice hollow-core photonic crystal fibre is analysed as a model of guidance in a class of photonic crystal fibres that exhibit broad-band guidance without photonic bandgaps. A scalar governing equation is used and analytic solutions based on transfer matrices are developed for the full set of modes. It is found that an exponentially localised fundamental mode exists for a wide range of frequencies. These analytic solutions of an idealised structure will form the basis for analysis of guidance in a realistic structure in a following paper.

  8. Phase 2 fire hazard analysis for the canister storage building

    SciTech Connect

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis (FHA) assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the Canister Storage Building (CSB). Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  9. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  10. DEM processing for the analysis of hydraulic hazards

    NASA Astrophysics Data System (ADS)

    Dresen, M.

    2003-04-01

    The digital analysis of hydrological processes and hydraulic hazards requires high data accuracy, especially for topographic data, which cannot be ensured by standard digital elevation models (DEMs). For this reason, terrain analysis and the analysis of topographical factors are highly significant for the modelling of hydrological processes. Most common GIS packages do not fulfil these requirements and do not allow detailed process-oriented analysis. As a result, the estimation of hazard potential, as well as the derivation of possible effects of catastrophic events, is not possible. Because of this, the improvement of DEM creation and the expansion of suitable methods and functionalities have high priority in hydraulic hazard assessment. We can demonstrate that the quality of DEMs can be clearly improved with the help of different extensions and adaptations. The comparison of different flood events in Europe reveals the better accuracy of topographical factors and the derived hydrological parameters. In this way the simulation of hydrological processes and hydraulic hazards can be improved.

  11. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    SciTech Connect

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  12. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  13. PO*WW*ER mobile treatment unit process hazards analysis

    SciTech Connect

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous constituents into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  14. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
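
    The arithmetic behind such a limit is simple exponential growth from a measured generation (doubling) time; the numbers below are illustrative, not the paper's tabulated values.

    ```python
    def cells_after(n0, minutes, generation_time_min):
        """Exponential growth: the count doubles every generation time."""
        return n0 * 2.0 ** (minutes / generation_time_min)

    # Starting at the 100/g limit, 2 h at an assumed 20-min generation time:
    print(cells_after(100, 120, 20))  # 6400.0 cells per gram
    ```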

  15. Environmental hazard analysis and effective remediation of highway seepage.

    PubMed

    Yuan, Renmao; Yang, Y S; Qiu, X; Ma, F S

    2007-04-01

    Risk assessment and minimisation of environmental hazards are critical issues to consider in geotechnical engineering projects. A case of highway pavement seepage induced by groundwater, at a locality along the section of the Hua-Qing Highway of Guangdong Province, China, is presented for environmental hazard analysis and effective remediation. The environmental hazard analyses were based on in situ hydrogeologic investigation, rock-soil testing and integrated environmental understanding. The analyses indicate that the highway seepage was caused by elevation of groundwater hydraulic pressure in low-permeability strata near the highway pavement, which was controlled by landform, hydrology, weather and road structure. The risk source of groundwater 'flooding' was the groundwater and surface water in the ring-like valley around Fenshui Village. A blind-ditch system for effective remediation of the pavement seepage hazard was proposed and successfully implemented by lowering the groundwater table near the highway, based on a comprehensive assessment of the various conditions. This geotechnical accident shows that groundwater is an essential factor to consider in geotechnical and environmental engineering studies, and that multidisciplinary effort for risk assessment of environmental hazards is important under current global climate change conditions.

  16. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are to a large extent driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * the ambiguous solution of PSHA logic trees; * the inadequate mathematical treatment of the results of expert elicitations, based on the assumption of bias-free expert estimates; * the problems associated with the "think model" of the separation of epistemic and aleatory uncertainties; * the consequences of the ergodic assumption used to justify the transfer of attenuation equations from other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  17. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  18. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  19. Evaluation and Application of Probabilistic Tsunami Hazard Analysis in California

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Wilson, R. I.; Miller, K.

    2014-12-01

    The California Geological Survey (CGS) and URS Corporation are in the process of generating tsunami hazard map products for land-use planning and construction through the California Seismic Hazard Mapping Act (Public Resources Code, sec 2690 et seq.). Similar to seismic hazard zonation, these products for land-use and development decision-making are typically based on a probabilistic analysis, and require information on the frequency of occurrence through a probabilistic tsunami hazard analysis (PTHA). In Phase 1 of CGS's work, the California PTHA Work Group was established to evaluate the results of PTHA demonstration projects in Crescent City and Huntington Beach. The results of this Phase 1 review of the two independent analyses indicate that PTHAs can be developed with recommended improvements in source characterization, PTHA methods, and numerical model use. An immediate significant improvement is to align the characterization of the Cascadia Subduction Zone PTHA with the seismic characterization of the National Seismic Hazard Map Program of the USGS. In addition to applying PTHA to land-use planning and the two demonstration projects, CGS and the CA-PTHA Work Group identified other potential applications for various PTHA risk levels (ARP = Average Return Period), including flood insurance (100- and 500-year ARP), building codes (2,500-year ARP), and emergency response planning (1,000-year ARP or larger). CGS is working with URS Corp., the California Office of Emergency Services, and FEMA on a Phase 2 plan to produce a single set of reliable and consistent PTHA maps for multiple risk levels and to work with various end-users to determine how to use the maps. The California PTHA and the results of the Work Group review are also proposed to be used by the U.S. National Tsunami Hazard Mitigation Program to develop guidelines for production in other coastal states.

  20. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, processes and their characteristics, modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be newly developed; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.

  1. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine the uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for the b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (∼0.10 g) for PGA and in the Friuli and Central Apennines regions (∼0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (∼0.15 g) and PGA (∼0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
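
    The Monte Carlo logic-tree sampling described here can be sketched generically; the branch values, weights, and toy hazard model below are placeholders, not the study's inputs.

    ```python
    import random
    import statistics

    def toy_hazard_model(b_value, m_max, gmpe):
        # Placeholder: a real PSHA integrates over all sources and distances.
        base = {"gmpe_A": 0.20, "gmpe_B": 0.24}[gmpe]
        return base * (m_max / 7.0) / b_value

    b_branches = [(0.9, 0.3), (1.0, 0.4), (1.1, 0.3)]   # (value, weight)
    mmax_branches = [(6.5, 0.5), (7.0, 0.5)]
    gmpe_branches = [("gmpe_A", 0.5), ("gmpe_B", 0.5)]

    def draw(branches):
        values, weights = zip(*branches)
        return random.choices(values, weights=weights)[0]

    samples = [toy_hazard_model(draw(b_branches), draw(mmax_branches),
                                draw(gmpe_branches)) for _ in range(10_000)]
    cov = statistics.stdev(samples) / statistics.mean(samples)
    print(f"PGA COV from logic-tree sampling: {cov:.2f}")
    ```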

  2. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, reflecting the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. These scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, the seismic hazard of most of the Algarve's parishes is dominated by the seismicity located
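
    Extracting the modal (controlling) scenario from a binned disaggregation reduces to an argmax over the contribution array; a schematic with random placeholder data, not the study's results:

    ```python
    import numpy as np

    # contributions[i, j, k]: fractional hazard contribution of the bin for
    # (magnitude i, source cell j, epsilon k) -- random placeholder data here.
    rng = np.random.default_rng(0)
    contributions = rng.random((10, 84, 5))
    contributions /= contributions.sum()

    i, j, k = np.unravel_index(np.argmax(contributions), contributions.shape)
    print(f"modal scenario: magnitude bin {i}, source cell {j}, epsilon bin {k}")
    ```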

  3. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
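
    The per-car-mile and per-ton-mile normalization is a simple division by traffic exposure; all figures below are invented for illustration only.

    ```python
    # Invented figures, for the normalization arithmetic only.
    annual_risk_usd = 2.4e6      # expected annual clean-up cost, one chemical
    annual_car_miles = 1.8e8     # loaded tank-car miles per year
    tons_per_car = 90.0          # assumed payload per car

    risk_per_car_mile = annual_risk_usd / annual_car_miles
    risk_per_ton_mile = risk_per_car_mile / tons_per_car
    print(f"${risk_per_car_mile:.4f}/car-mile, ${risk_per_ton_mile:.2e}/ton-mile")
    ```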

  4. Guidance and Control Algorithms for the Mars Entry, Descent and Landing Systems Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Dwyer Cianciolo, Alicia M.; Powell, Richard W.; Shidner, Jeremy D.; Garcia-Llama, Eduardo

    2010-01-01

    The purpose of the Mars Entry, Descent and Landing Systems Analysis (EDL-SA) study was to identify feasible technologies that will enable human exploration of Mars, specifically to deliver large payloads to the Martian surface. This paper focuses on the methods used to guide and control two of the contending technologies, a mid-lift-to-drag (L/D) rigid aeroshell and a hypersonic inflatable aerodynamic decelerator (HIAD), through the entry portion of the trajectory. The Program to Optimize Simulated Trajectories II (POST2) is used to simulate and analyze the trajectories of the contending technologies and guidance and control algorithms. Three guidance algorithms are discussed in this paper: EDL theoretical guidance, Numerical Predictor-Corrector (NPC) guidance and Analytical Predictor-Corrector (APC) guidance. EDL-SA also considered two forms of control: bank angle control, similar to that used by Apollo and the Space Shuttle, and center-of-gravity (CG) offset control. This paper presents the performance comparison of these guidance algorithms and summarizes the results as they impact the technology recommendations for future study.

  5. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the huge number of citizens and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for the Lanzarote and Tenerife islands, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well reported historical eruptions, but also the lack of data such as geochronological, geochemical or structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are really useful, they need to be fed with a huge amount of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long term volcanic hazard analysis. HADA will permit

  6. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    SciTech Connect

    R.E. Flye

    2000-10-24

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  7. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H.; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
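
    The first (earthquake-based) approach amounts to the PSHA rate integral with a displacement attenuation term in place of the ground-motion one; a discretized sketch with schematic placeholder inputs, not the authors' parameterization:

    ```python
    def rate_displacement_exceeds(d, sources):
        """Annual rate of fault displacement > d at the site.
        sources: iterable of dicts with
          nu            -- annual rate of earthquakes on the source
          magnitude_pmf -- [(m, P(m)), ...] discretized magnitude distribution
          p_exceed      -- function P(D > d | m), folding in the probability of
                           surface rupture at the site and the displacement
                           attenuation (schematic placeholder)."""
        total = 0.0
        for src in sources:
            for m, p_m in src["magnitude_pmf"]:
                total += src["nu"] * p_m * src["p_exceed"](d, m)
        return total

    # Toy single-source example (all numbers invented):
    fault = {"nu": 0.01,
             "magnitude_pmf": [(6.0, 0.7), (6.5, 0.3)],
             "p_exceed": lambda d, m: max(0.0, min(1.0, (m - 5.5) - d))}
    print(rate_displacement_exceeds(0.2, [fault]))  # events/yr with D > 0.2 m
    ```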

  8. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.

  9. A Hazard Analysis for a Generic Insulin Infusion Pump

    PubMed Central

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Devices and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  10. Fire hazard analysis of the radioactive mixed waste trenches

    SciTech Connect

    McDonald, K.M.

    1995-04-27

    This Fire Hazards Analysis (FHA) is intended to assess comprehensively the risk from fire associated with the disposal of low level radioactive mixed waste in trenches within the lined landfills, provided by Project W-025, designated Trench 31 and 34 of the Burial Ground 218-W-5. Elements within the FHA make recommendations for minimizing risk to workers, the public, and the environment from fire during the course of the operation's activity. Transient flammables and combustibles present that support the operation's activity are considered and included in the analysis. The graded FHA contains the following elements: description of construction, protection of essential safety class equipment, fire protection features, description of fire hazards, life safety considerations, critical process equipment, high value property, damage potential--maximum credible fire loss (MCFL) and maximum possible fire loss (MPFL), fire department/brigade response, recovery potential, potential for a toxic, biological and/or radiation incident due to a fire, emergency planning, security considerations related to fire protection, natural hazards (earthquake, flood, wind) impact on fire safety, and exposure fire potential, including the potential for fire spread between fire areas. Recommendations for limiting risk are made in the text of this report and printed in bold type. All recommendations are repeated in a list in Section 18.0.

  11. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    SciTech Connect

    Sevougian, S. David

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  12. Hazards Analysis Report Addendum, Building 518/518A Industrial Gases & Chemtrack Receiving & Barcoding Facility

    SciTech Connect

    Hickman, R D

    2000-02-04

    This report documents the Hazards Analysis Report (HAR) Addendum for Buildings 518 and 518A. In summary, the description of the facility and the operations given in the 1995 PHA remains accurate in the year 2000. The hazards description also remains the same. The hazards analysis in this HAR Addendum is different in that it needs to be compared to operations routinely performed by the public. The HAR Addendum characterizes the level of intrinsic potential hazards associated with a facility and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required and the DOE order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis. This facility does not contain any safety class systems or systems important to safety as defined in Department of Energy standard DOE-STD-3009-94. The hazards of primary concern associated with B518 and B518A are chemical in nature. The hazard classification is determined by comparing facility inventories of chemicals with threshold values for the various hazard classification levels. In this way, the hazard level of the facility can be ascertained. The most significant hazards that could affect people in the local area of B518 and B518A, elsewhere on the LLNL site, and off site are associated with hazardous and toxic materials. These hazards are the focus of this report and are the basis for the facility hazard classification.

  13. Use of technical expert panels: Applications to probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.; Apostolakis, G.; Boore, D.M.; Cluff, L.S.; Coppersmith, K.J.; Cornell, C.A.; Morris, P.A.

    1998-08-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motions will be exceeded at a given location in a given future time period. Due to large uncertainties in all of the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreements among the experts, which in the past has led to disagreement on the selection of a ground motion for design at a given site. This paper reports on a project, co-sponsored by the US Nuclear Regulatory Commission, the US Department of Energy, and the Electric Power Research Institute, that was undertaken to review the state-of-the-art and improve on the overall stability of the PSHA process, by providing methodological guidance on how to perform a PSHA. The project reviewed past studies and examined ways to improve on the present state-of-the-art. In analyzing past PSHA studies, the most important conclusion is that differences in PSHA results are commonly due to process rather than technical differences. Thus, the project concentrated heavily on developing process recommendations, especially on the use of multiple experts, and this paper reports on those process recommendations. The problem of facilitating and integrating the judgments of a diverse group of experts is analyzed in detail. The authors believe that the concepts and process principles apply just as well to non-earthquake fields such as volcanic hazard, flood risk, nuclear-plant safety, and climate change.

  14. Performance analysis of the combined EDS maglev propulsion, levitation, and guidance system

    SciTech Connect

    He, J.L.; Coffey, H.T.; Rote, D.M.

    1993-10-01

    An analysis of the Japanese maglev system which uses only one set of coils in the guideway for combined levitation, propulsion, and guidance functions is presented in this paper. This preliminary study, using the dynamic circuit approach, indicates that the system is very promising.

  15. Guidance analysis of the aeroglide plane change maneuver as a turning point problem

    NASA Technical Reports Server (NTRS)

    Gracey, Christopher

    1989-01-01

    The development of guidance approximations for the atmospheric (aeroglide) portion of the minimum-fuel, orbital plane change, trajectory optimization problem is described. Asymptotic methods are used to reduce the two-point boundary value optimization problem to a turning point problem for the bank angle control. The turning point problem solution, which yields an approximate optimal control policy, is given in terms of parabolic cylinder functions, which are tabulated, and integral expressions, which must be numerically computed. Comparisons of the former, over their region of validity, with optimal control solutions show good qualitative agreement. Additional work and analysis are needed to complete the guidance approximation.
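
    Where the original work relied on printed tables, the parabolic cylinder functions in the turning point solution can now be evaluated directly; a minimal sketch using SciPy (the order and evaluation grid are chosen arbitrarily for illustration):

        # Evaluate parabolic cylinder functions D_v(x) and their derivatives
        # numerically instead of from tables.
        import numpy as np
        from scipy.special import pbdv

        v = 0.5                         # order (illustrative)
        for xi in np.linspace(-3.0, 3.0, 7):
            d, dp = pbdv(v, xi)         # function value and derivative
            print(f"x={xi:+.1f}  D_v={d:.6f}  D_v'={dp:.6f}")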

  16. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
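
    The two modelling assumptions named above, a truncated exponential Gutenberg-Richter magnitude-frequency distribution and Poisson occurrence in time, reduce to a few lines of arithmetic. A sketch with illustrative parameter values (not Adjaria data):

        # Truncated exponential Gutenberg-Richter MFD plus Poisson occurrence.
        import numpy as np

        def gr_rate(m, rate_mmin, b, m_min, m_max):
            """Annual rate of events with magnitude >= m (truncated exponential)."""
            beta = b * np.log(10.0)
            num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
            den = 1.0 - np.exp(-beta * (m_max - m_min))
            return rate_mmin * num / den

        def poisson_poe(annual_rate, t_years):
            """Probability of at least one occurrence in t_years."""
            return 1.0 - np.exp(-annual_rate * t_years)

        lam = gr_rate(6.5, rate_mmin=0.2, b=1.0, m_min=4.5, m_max=7.5)
        print(f"annual rate of M>=6.5: {lam:.3e}")
        print(f"POE in 50 years: {poisson_poe(lam, 50.0):.3f}")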

  17. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as suitable for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g at 2% and 10% POE, respectively, in the next 50 years was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and effective mitigation plans for these areas should therefore be made. Although Bangkok was assigned a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
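
    Figures such as "2% POE in 50 years" map directly to return periods under the usual Poisson assumption; a two-line sketch of the conversion:

        # Return period equivalent to a probability of exceedance (POE)
        # over an exposure time, assuming Poisson occurrence.
        import math

        def return_period(poe, t_years):
            return -t_years / math.log(1.0 - poe)

        print(round(return_period(0.10, 50)))  # about 475 years
        print(round(return_period(0.02, 50)))  # about 2475 years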

  18. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance...

  19. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and...-0254 in the subject line of your comments. For additional instructions on submitting comments...

  20. Hazard analysis of a computer based medical diagnostic system.

    PubMed

    Chudleigh, M F

    1994-07-01

    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry. PMID:7988111
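
    The HAZOP bookkeeping itself, crossing a set of guide words with process parameters to enumerate candidate deviations for team review, is easy to sketch. The guide words below are the classic petrochemical set; the parameters are hypothetical ones for a slide-handling subsystem and are not taken from the cited study.

        # Enumerate HAZOP deviations as guide word x parameter pairs.
        from itertools import product

        GUIDE_WORDS = ["NO", "MORE", "LESS", "AS WELL AS",
                       "PART OF", "REVERSE", "OTHER THAN"]
        PARAMETERS = ["slide feed rate", "stage position",
                      "image focus", "result routing"]   # hypothetical

        for word, param in product(GUIDE_WORDS, PARAMETERS):
            print(f"{word:>12} / {param}: record causes, consequences, safeguards")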

  1. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB), and in particular central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region lies adjacent to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards to large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large-magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as
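
    In very schematic form, site-based stochastic simulation shapes windowed random noise toward a target spectrum. The sketch below is only an illustration of that idea: the envelope coefficients, filter band, and PGA target are placeholders, and real applications fit the spectrum and envelope to earthquake and site parameters.

        # Simplified stochastic accelerogram: enveloped Gaussian noise passed
        # through a band-pass filter standing in for the target spectrum.
        import numpy as np
        from scipy.signal import butter, lfilter

        fs = 100.0                            # sampling rate (Hz)
        t = np.arange(0.0, 40.0, 1.0 / fs)    # 40 s record

        env = (t / 5.0) ** 2 * np.exp(-t / 5.0)   # placeholder time envelope
        env /= env.max()

        rng = np.random.default_rng(0)
        noise = rng.standard_normal(t.size) * env

        b, a = butter(4, [0.2, 10.0], btype="band", fs=fs)
        accel = lfilter(b, a, noise)
        accel *= 0.3 * 9.81 / np.abs(accel).max()  # scale to 0.3 g PGA target

        print(f"PGA: {np.abs(accel).max() / 9.81:.2f} g")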

  2. Fire hazards analysis for W030 tank farm ventilation upgrade

    SciTech Connect

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection (2-17-93). The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative, comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  3. Hazards assessment for the INEL Landfill Complex

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-02-01

    This report documents the hazards assessment for the INEL Landfill Complex (LC) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and the DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes the hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding the LC, the buildings and structures at the LC, and the processes that are used at the LC are described in this report. All hazardous materials, both radiological and nonradiological, at the LC were identified and screened against threshold quantities according to DOE Order 5500.3A guidance. Asbestos at the Asbestos Pit was the only hazardous material that exceeded its specified threshold quantity. However, the type of asbestos received and the packaging practices used are believed to limit the potential for an airborne release of asbestos fibers. Therefore, in accordance with DOE Order 5500.3A guidance, no further hazardous material characterization or analysis was required for this hazards assessment.

  4. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous area should be excluded; on the other, it is important not to extend the surface subject to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays allows this task to be faced with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore is not intended in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches, and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e., the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed
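
    The DTM+DBM combination is a simple raster overlay: terrain elevations are overridden wherever the building model is defined, so that buildings, but not vegetation, act as obstacles to flow. A toy sketch with placeholder grids:

        # Combine a terrain model and a building model into the reference
        # elevation model; NaN in the building grid means "no building".
        import numpy as np

        dtm = np.array([[101.0, 101.2], [101.1, 101.3]])      # bare earth (m)
        dbm = np.array([[np.nan, 104.8], [np.nan, np.nan]])   # roofs (m)

        ref = np.where(np.isnan(dbm), dtm, dbm)               # DTM + DBM
        print(ref)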

  5. Six-degree-of-freedom guidance and control-entry analysis of the HL-20

    NASA Astrophysics Data System (ADS)

    Powell, Richard W.

    1993-10-01

    The ability of the HL-20 lifting body to fly has been evaluated for an automated entry from atmospheric interface to landing. This evaluation was required to demonstrate not only that successful touchdown conditions would be possible for this low lift-to-drag-ratio vehicle, but also that the vehicle would not exceed its design dynamic pressure limit of 400 psf during entry. This dynamic pressure constraint, coupled with the limited pitch-control authority available at low supersonic speeds, restricts the maneuvering capability available to the HL-20 to acquire the runway. One result of this analysis was that this restricted maneuvering capability does not allow the use of a model-following atmospheric entry-guidance algorithm, such as that used by the Space Shuttle, but instead requires a more adaptable guidance algorithm. Therefore, for this analysis, a predictor-corrector guidance algorithm was developed that would provide successful touchdown conditions while not violating the dynamic pressure constraint. A flight-control system was designed and incorporated, along with the predictor-corrector guidance algorithm, into a six-DOF simulation, which showed that the HL-20 remained controllable and could reach the landing site and execute a successful landing under all off-nominal conditions simulated.
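
    The predictor-corrector idea is to predict the terminal state by integrating the dynamics under a candidate control setting, then correct the setting and repeat until the predicted miss is nulled. The toy below captures only that loop, using a 1-D point mass with drag and a bisection corrector; it is not HL-20 aerodynamics or the flight algorithm itself, and all numbers are placeholders.

        # Toy predictor-corrector: choose a brake setting so the predicted
        # stopping range matches a target.
        def predict_range(brake, v0=300.0, dt=0.1):
            """Predictor: integrate the toy dynamics until (nearly) stopped."""
            x, v = 0.0, v0
            while v > 1.0:
                a = -brake * 0.002 * v * v - 0.5   # drag + constant decel
                v += a * dt
                x += v * dt
            return x

        target = 20000.0
        lo, hi = 0.01, 1.0       # bracket: range(lo) > target > range(hi)
        for _ in range(40):      # corrector: bisect on the control parameter
            mid = 0.5 * (lo + hi)
            if predict_range(mid) > target:
                lo = mid         # overshoot predicted: brake harder
            else:
                hi = mid
        brake = 0.5 * (lo + hi)
        print(f"brake {brake:.4f}, miss {predict_range(brake) - target:+.1f} m")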

  6. Estimating and Applying Uncertainties in Probabilistic Tsunami Hazard Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.

    2013-12-01

    An integral part of a probabilistic analysis is the formal inclusion of uncertainties, both those due to a limited understanding of the physical processes (epistemic) and those due to natural variability (aleatory). Because of the strong non-linearity of the tsunami inundation process, it is important to understand not only the extent of the uncertainties but also how and where to apply them. We can divide the uncertainties into several stages: the source, ocean propagation, and nearshore/inundation. On the source side, many of the uncertainties are identical to those used in probabilistic seismic hazard analysis (PSHA). However, the details of slip distributions are very significant in tsunami excitation, especially for near-field tsunamis. We will show several ways of including slip variability, both stochastic and non-stochastic, by developing a probabilistic set of source scenarios. The uncertainties in ocean propagation are less significant, since modern algorithms are very successful in modeling open-ocean tsunami propagation. However, in the near-shore regime and the inundation, the situation is much more complex. Here, errors in the local elevation models, variability in bottom friction, and the omission of the built environment can lead to significant errors. Details of the implementation of the tsunami algorithms can yield different results. We will discuss the most significant sources of uncertainty and the alternative ways to implement them, using examples from the probabilistic tsunami hazard mapping that we are currently carrying out for the state of California and other regions.

  7. Fire hazard analysis for Plutonium Finishing Plant complex

    SciTech Connect

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  8. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine if their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO), or if not, to use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA for development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial, propellant use of oxygen, hydrogen, and hypergols. This paper reports on

  9. Probabilistic seismic hazard analysis for the city of Quetta, Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Shafiq; Lindholm, Conrad; Ahmed, Najeeb; Rafi, Zahid

    2014-08-01

    Seismic hazard assessment for Quetta is carried out using the probabilistic seismic hazard analysis technique, based on area sources and augmented by line sources used for the first time in Pakistan. Seismic data have been collected and analyzed in the spatial and temporal domains. Five seismic zones have been modeled in line with the tectonics of the region, with a b-value of 1.14 obtained by regression. The b-value is slightly high, which is attributed to the fact that aftershocks were not removed, since removing them distorted the dataset. Five fault sources are modeled, three as reverse and two as strike-slip, with a maximum magnitude of 7.8. The Mach structure is included in the tectonics for the first time. The attenuation relation used in the present study is one recommended by various researchers. The expected peak ground acceleration for a 500-year return period is 4.79 m/s2 for rock outcrop, characterized as very high. Furthermore, variation in spectral acceleration within Quetta city is observed, for which spectral curves are developed for four different places.
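
    The b-value regression mentioned above amounts to a least-squares fit of log10 N = a - bM to the cumulative magnitude counts. A sketch with synthetic counts (not the Quetta catalogue):

        # Least-squares Gutenberg-Richter fit: log10 N(>=M) = a - b*M.
        import numpy as np

        mags = np.array([4.0, 4.5, 5.0, 5.5, 6.0, 6.5])
        counts = np.array([820, 260, 80, 26, 8, 3])   # synthetic N(>=M)

        A = np.vstack([np.ones_like(mags), -mags]).T
        a_val, b_val = np.linalg.lstsq(A, np.log10(counts), rcond=None)[0]
        print(f"a = {a_val:.2f}, b = {b_val:.2f}")    # b near 1 here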

  10. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large-scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper describes systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations are addressed for specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large-scale distributed systems, the paper also describes how to analyze the effects of secondary functions on other secondary functions through the use of channelization.
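
    Analyzing functions in terms of the systems they rely on is, at its core, a dependency-graph exercise: a resource loss is propagated transitively to every function that depends on it. A toy sketch (the graph is hypothetical, not from any NASA vehicle):

        # Propagate the loss of a resource through a function dependency graph.
        DEPENDS_ON = {
            "active thermal control": ["electrical power", "command and data handling"],
            "command and data handling": ["electrical power"],
            "fault management": ["command and data handling"],
        }

        def affected_functions(lost_resource):
            hit = set()
            changed = True
            while changed:          # fixed point: chase transitive dependencies
                changed = False
                for func, deps in DEPENDS_ON.items():
                    if func not in hit and any(d == lost_resource or d in hit
                                               for d in deps):
                        hit.add(func)
                        changed = True
            return hit

        print(affected_functions("electrical power"))   # all three functions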

  11. The European ASAMPSA_E project : towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    NASA Astrophysics Data System (ADS)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or safety authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such a PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10^-4 per year. Today, even lower probabilities, down to 10^-8, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect an NPP, and assigning them a meaningful probability of occurrence, appears to be difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US, and Japan, and aims at identifying meaningful practices to extend the scope and the quality of the existing probabilistic safety analyses developed for nuclear power plants. It offers a framework to discuss, at a technical level, how an "extended PSA" can be developed efficiently and be used to verify whether the robustness of nuclear power plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project, some first lessons, and introduce the type of guidance being developed. It will explain the need for expertise from the geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological

  12. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002), and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3, and 0.1 are recommended for EPRI (2004), USGS (2002), and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The
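
    The weighted combination the report recommends is a straightforward weighted sum of annual-frequency-of-exceedance curves. The sketch below uses the report's 0.6/0.3/0.1 weights but entirely synthetic curves, not the SRS results:

        # Combine hazard curves from alternate GMAMs with fixed weights.
        import numpy as np

        pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])    # g
        curves = {                                    # synthetic AFE curves
            "EPRI (2004)":  np.array([3e-3, 1e-3, 2e-4, 3e-5, 2e-6]),
            "USGS (2002)":  np.array([4e-3, 1.5e-3, 3e-4, 5e-5, 4e-6]),
            "Silva (2004)": np.array([2e-3, 8e-4, 1.5e-4, 2e-5, 1e-6]),
        }
        weights = {"EPRI (2004)": 0.6, "USGS (2002)": 0.3, "Silva (2004)": 0.1}

        combined = sum(weights[k] * curves[k] for k in curves)
        for g, afe in zip(pga, combined):
            print(f"PGA {g:4.2f} g : {afe:.2e} per year")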

  13. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10^-7 to 10^-10 (mean 1.6 x 10^-8). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to

  14. Precise terrestrial time: A means for improved ballistic missile guidance analysis

    NASA Technical Reports Server (NTRS)

    Ehrsam, E. E.; Cresswell, S. A.; Mckelvey, G. R.; Matthews, F. L.

    1978-01-01

    An approach developed to improve ground instrumentation time-tagging accuracy, adapted to support the Minuteman ICBM program, is described. The Timing Insertion Unit (TIU) technique produces a telemetry data time-tagging resolution of one tenth of a microsecond, with high relative intersite accuracy after corrections. Metric data (range, azimuth, elevation, and range rate), also used in missile guidance system analysis, can be correlated to within ten microseconds of the telemetry guidance data. This requires precise timing synchronization between the metric and telemetry instrumentation sites. The timing synchronization can be achieved by using radar automatic phasing system time correlation methods. Other time correlation techniques, such as Television (TV) Line-10 and the Geostationary Operational Environmental Satellites (GOES) terrestrial timing receivers, are also considered.

  15. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008, three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts. PMID:22329456

  17. NFPA 70E: Performing the electrical flash hazard analysis.

    PubMed

    Wallace, W Jon

    2005-08-01

    Arc flash hazards pose a serious risk to employee safety; proper safe work practices must be utilized. Electrical equipment operating at 50 volts or more must be deenergized and locked out/tagged out prior to servicing and maintenance unless doing so would increase hazards or is infeasible. Remember, convenience is not an acceptable reason for keeping equipment energized during servicing and maintenance. If electrical equipment must remain energized during servicing and maintenance, NFPA 70E should be consulted to determine flash hazard boundaries as well as the required PPE. Finally, circuit breakers and electrical disconnects must be marked to warn qualified employees of potential arc flash hazards. PMID:16212025

  18. Independent Orbiter Assessment (IOA): Analysis of the guidance, navigation, and control subsystem

    NASA Technical Reports Server (NTRS)

    Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Hiott, J. M.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Guidance, Navigation, and Control (GNC) Subsystem hardware are documented. The function of the GNC hardware is to respond to guidance, navigation, and control software commands to effect vehicle control and to provide sensor and controller data to GNC software. Some of the GNC hardware for which failure modes analysis was performed includes: hand controllers; Rudder Pedal Transducer Assembly (RPTA); Speed Brake Thrust Controller (SBTC); Inertial Measurement Unit (IMU); Star Tracker (ST); Crew Optical Alignment Site (COAS); Air Data Transducer Assembly (ADTA); Rate Gyro Assemblies; Accelerometer Assembly (AA); Aerosurface Servo Amplifier (ASA); and Ascent Thrust Vector Control (ATVC). The IOA analysis process utilized available GNC hardware drawings, workbooks, specifications, schematics, and systems briefs for defining hardware assemblies, components, and circuits. Each hardware item was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.
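
    The bookkeeping behind an FMEA/CIL pass is simple to sketch: each hardware item carries failure modes with effects and an assigned criticality, and candidate critical items are filtered out for the CIL. The entries below are illustrative, not from the Orbiter analysis.

        # Minimal FMEA/CIL bookkeeping sketch.
        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            item: str
            mode: str
            effect: str
            criticality: str   # e.g. "1" = loss of life/vehicle, "1R" = redundant

        modes = [
            FailureMode("Rate Gyro Assembly", "null output",
                        "loss of rate data", "1R"),
            FailureMode("Aerosurface Servo Amplifier", "hardover command",
                        "loss of control", "1"),
        ]

        for m in (m for m in modes if m.criticality == "1"):
            print(f"CIL candidate: {m.item} - {m.mode} ({m.effect})")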

  19. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  20. Analysis of the Proportional Hazards Model with Sparse Longitudinal Covariates

    PubMed Central

    Cao, Hongyuan; Churpek, Mathew M.; Zeng, Donglin; Fine, Jason P.

    2014-01-01

    Regression analysis of censored failure observations via the proportional hazards model permits time-varying covariates which are observed at death times. In practice, such longitudinal covariates are typically sparse and only measured at infrequent and irregularly spaced follow-up times. Full likelihood analyses of joint models for longitudinal and survival data impose stringent modelling assumptions which are difficult to verify in practice and which are complicated both inferentially and computationally. In this article, a simple kernel weighted score function is proposed with minimal assumptions. Two scenarios are considered: half kernel estimation, in which observation ceases at the time of the event, and full kernel estimation for data where observation may continue after the event, as with recurrent events data. It is established that these estimators are consistent and asymptotically normal. However, they converge at rates which are slower than the parametric rates that may be achieved with fully observed covariates, with the full kernel method achieving an optimal convergence rate which is superior to that of the half kernel method. Simulation results demonstrate that the large sample approximations are adequate for practical use and may yield improved performance relative to the last-value-carried-forward approach and the joint modelling method. The analysis of data from a cardiac arrest study demonstrates the utility of the proposed methods. PMID:26576066
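
    The core device, in simplified form, is a kernel-weighted average of the sparse measurements, using only those at or before the evaluation time in the half kernel case; the smoothed value then stands in for the unobserved covariate. A minimal sketch with synthetic data and an arbitrary bandwidth:

        # Half-kernel estimate of a sparse longitudinal covariate at time t.
        import numpy as np

        def half_kernel_estimate(t, obs_times, obs_values, h=1.0):
            """Kernel-weighted average of measurements taken at or before t."""
            obs_times = np.asarray(obs_times)
            obs_values = np.asarray(obs_values)
            mask = obs_times <= t                       # observation ceases at t
            u = (t - obs_times[mask]) / h
            w = np.maximum(0.75 * (1.0 - u ** 2), 0.0)  # Epanechnikov kernel
            if w.sum() == 0.0:
                return np.nan
            return (w * obs_values[mask]).sum() / w.sum()

        times, values = [0.2, 1.1, 2.7], [1.3, 0.9, 1.8]   # one subject
        print(half_kernel_estimate(3.0, times, values, h=1.5))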

  1. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis, with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
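
    The lognormal-to-Weibull result is easy to reproduce in miniature by Monte Carlo: simulate annual floods whose log-mean drifts upward, record the first year the design level is exceeded, and fit a two-parameter Weibull to those times. All parameter values below are illustrative.

        # Monte Carlo time-to-first-exceedance under a nonstationary
        # two-parameter lognormal flood model, with a Weibull fit to T.
        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(1)
        design = np.exp(2.0)                 # design flood level
        n_sims, horizon = 5000, 500

        T = np.empty(n_sims)
        for i in range(n_sims):
            mu = 0.004 * np.arange(horizon)  # trending log-mean
            x = rng.lognormal(mean=mu, sigma=0.5)
            hits = np.nonzero(x > design)[0]
            T[i] = hits[0] + 1 if hits.size else horizon  # crude censoring

        shape, loc, scale = weibull_min.fit(T, floc=0)    # two-parameter fit
        print(f"Weibull shape {shape:.2f}, scale {scale:.1f}")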

  2. Recent Advances in Launch Vehicle Toxic Hazard and Risk Analysis

    NASA Astrophysics Data System (ADS)

    Nyman, R. L.

    2012-01-01

    A number of widely used rocket propellants produce toxic combustion byproducts or are themselves toxic in their un-reacted state. In this paper we focus on the methodology used to evaluate early flight catastrophic failures and nominal launch emissions that release large amounts of propellant or combustion products into the planetary boundary layer and pose a potential risk to launch area personnel, spectators, or the general public. The United States has traditionally used the Rocket Exhaust Effluent Diffusion Model (REEDM) [1] to assess the hazard zones associated with such releases. REEDM is a 1970s-vintage Gaussian atmospheric dispersion model that is limited in its ability to accurately simulate certain aspects of the initial source geometry and dynamics of a vehicle breakup and propellant fragment dispersion. The Launch Area Toxic Risk Analysis 3-Dimensional (LATRA3D) [2] computer program has been developed to address many of REEDM's deficiencies. LATRA3D is a probabilistic risk analysis tool that simulates both nominal vehicle flight and in-flight failure emissions.
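
    The Gaussian kernel at the heart of REEDM-class dispersion models is compact enough to sketch. The dispersion-coefficient curves below are crude power-law placeholders; operational models select them from stability-class fits and add far more source physics.

        # Steady-state Gaussian plume: concentration downwind of a continuous
        # point release, with ground reflection.
        import numpy as np

        def gaussian_plume(x, y, z, Q, u, H):
            """Q: release rate (kg/s), u: wind speed (m/s), H: release height (m),
            x: downwind, y: crosswind, z: height (m)."""
            sig_y = 0.08 * x ** 0.9           # placeholder dispersion curves
            sig_z = 0.06 * x ** 0.85
            lateral = np.exp(-y ** 2 / (2 * sig_y ** 2))
            vertical = (np.exp(-(z - H) ** 2 / (2 * sig_z ** 2))
                        + np.exp(-(z + H) ** 2 / (2 * sig_z ** 2)))
            return Q / (2 * np.pi * u * sig_y * sig_z) * lateral * vertical

        c = gaussian_plume(x=500.0, y=0.0, z=2.0, Q=1.0, u=4.0, H=30.0)
        print(f"{c:.3e} kg/m^3")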

  3. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify the... food safety hazards that can occur before, during, and after entry into the establishment. A...

  4. Development, analysis, and testing of robust nonlinear guidance algorithms for space applications

    NASA Astrophysics Data System (ADS)

    Wibben, Daniel R.

    This work focuses on the analysis and application of various nonlinear, autonomous guidance algorithms that utilize sliding mode control to guarantee system stability and robustness. While the basis for the algorithms has previously been proposed, past efforts barely scratched the surface of the theoretical details and implications of these algorithms. Of the three algorithms that are the subject of this research, two are directly derived from optimal control theory and augmented using sliding mode control. Analysis of the derivation of these algorithms has shown that they are two different representations of the same result: one uses a simple error-state model (Δr/Δv) and the other uses definitions of the zero-effort miss and zero-effort velocity (ZEM/ZEV) values. By investigating the dynamics of the defined sliding surfaces and their impact on the overall system, many implications have been deduced regarding the behavior of these systems, which are noted to feature time-varying sliding modes. A formal finite-time stability analysis has also been performed to theoretically demonstrate that the algorithms globally stabilize the system in finite time in the presence of perturbations and unmodeled dynamics. The third algorithm subject to analysis is derived from a direct application of higher-order sliding mode control and Lyapunov stability analysis, without consideration of optimal control theory, and has been named the Multiple Sliding Surface Guidance (MSSG). Via reinforcement learning methods, an optimal set of gains has been found that makes the guidance perform similarly to an open-loop optimal solution. Careful side-by-side inspection of the MSSG and Optimal Sliding Guidance (OSG) algorithms has shown some striking similarities. A detailed comparison of the algorithms has demonstrated that though they are nearly indistinguishable at first glance, there are some key differences between the two algorithms and they are indeed
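
    For reference, the ZEM/ZEV quantities named above drive the widely cited energy-optimal feedback law a = 6 ZEM/t_go^2 - 2 ZEV/t_go. The toy below simulates only that baseline law in a 2-D constant-gravity descent, omitting the sliding-mode augmentation the dissertation analyzes; all numbers are placeholders.

        # Baseline ZEM/ZEV feedback guidance, 2-D toy descent.
        import numpy as np

        g = np.array([0.0, -1.62])                   # lunar-like gravity (m/s^2)
        r = np.array([2000.0, 1500.0]); v = np.array([-30.0, -40.0])
        r_f = np.array([0.0, 0.0]);     v_f = np.array([0.0, -1.0])
        t_go, dt = 120.0, 0.1

        while t_go > dt:
            zem = r_f - (r + v * t_go + 0.5 * g * t_go ** 2)  # zero-effort miss
            zev = v_f - (v + g * t_go)                        # zero-effort velocity
            a_cmd = 6.0 * zem / t_go ** 2 - 2.0 * zev / t_go
            v += (a_cmd + g) * dt
            r += v * dt
            t_go -= dt

        print(f"final position {r}, final velocity {v}")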

  5. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect

    Thomas, John

    2012-05-01

    Systems-Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques, such as Fault Tree Analysis (FTA), that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, STPA has had no formal structure defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.
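
    One STPA step that lends itself to the kind of automation the report pursues is the enumeration of candidate unsafe control actions (UCAs): every control action is crossed with system contexts and with the four standard UCA types, then reviewed by an analyst. The control actions and contexts below are hypothetical.

        # Enumerate candidate unsafe control actions for analyst review.
        from itertools import product

        CONTROL_ACTIONS = ["open valve", "close valve"]          # hypothetical
        CONTEXTS = ["tank pressurized", "tank empty", "maintenance mode"]
        UCA_TYPES = ["not provided", "provided",
                     "too early / too late",
                     "stopped too soon / applied too long"]

        for action, ctx, kind in product(CONTROL_ACTIONS, CONTEXTS, UCA_TYPES):
            print(f"UCA candidate: '{action}' {kind}, when {ctx}")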

  6. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more...

  7. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politics on potential consequences, and are the basis for appropriate risk management strategies. Thereby, results (i) with an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides this analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  8. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi-profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  9. Analysis of hazardous biological material by MALDI mass spectrometry

    SciTech Connect

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community about this potential field of application is evident from the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and the selection of optimal experimental parameters such as solvent and matrix.

  10. Hazardous Gas Leak Analysis in the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1991-01-01

    Helium tests of the main propulsion system in the Space Shuttle, and their bearing on hydrogen leaks, are examined. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes that sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine whether the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were visually observed with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per second by spectral analysis of MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.
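
    Detecting a pulsed concentration wave by spectral analysis is a standard FFT exercise; the sketch below shows the idea on a synthetic time series pulsing at 25 Hz (the real MS channels, sampling rates, and noise levels differ).

        # Spectral detection of a pulsed signal in a sensor time series.
        import numpy as np

        fs = 200.0                                  # samples per second
        t = np.arange(0.0, 4.0, 1.0 / fs)
        rng = np.random.default_rng(2)
        sig = 1.0 + 0.2 * np.sin(2 * np.pi * 12.5 * t) ** 2   # 25 Hz pulsing
        sig += 0.05 * rng.standard_normal(t.size)

        spec = np.abs(np.fft.rfft(sig - sig.mean()))
        freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
        print(f"peak at {freqs[np.argmax(spec)]:.1f} Hz")     # expect ~25 Hz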

  11. Preparation of waste oil for analysis to determine hazardous metals

    SciTech Connect

    Essling, A.M.; Huff, D.R.; Huff, E.A.; Fox, I.M.; Graczyk, D.G.

    1995-07-01

    Two methods for preparing waste-oil samples to permit measurement of their metals content were evaluated. For this evaluation, metals-in-oil standard reference materials were prepared by each method and the resulting solutions were analyzed for 20 metals, including those (As, Ba, Cd, Cr, Pb, Hg, Se, and Ag) regulated as hazardous under the Resource Conservation and Recovery Act. One preparation method involved combustion of the waste oil under oxygen at 25 atm pressure, as described in the American Society for Testing and Materials test method E926-88. As we applied it, this method gave recoveries well under 90% for most of the metals that we examined and, hence, proved unsatisfactory for routine application to waste-oil analysis. With the other method, nitric acid decomposition in a sealed vessel heated with microwave energy (analogous to US Environmental Protection Agency Method 3051), recoveries of all 20 metal contaminants were within 90 to 110% of the certified values. This microwave digestion procedure was also more efficient since it allowed six samples to be prepared together, whereas the oxygen combustion approach allowed processing of only one sample at a time.

  12. Scout: orbit analysis and hazard assessment for NEOCP objects

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers, or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially

  13. Hazards assessment for the Hazardous Waste Storage Facility

    SciTech Connect

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency.

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD... the intended use or consumers of the finished product shall be identified. (3) Food safety...

  15. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD... the intended use or consumers of the finished product shall be identified. (3) Food safety...

  16. IDENTIFICATION OF AIRCRAFT HAZARDS

    SciTech Connect

    K.L. Ashley

    2005-03-23

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in the ''Monitored Geological Repository External Events Hazards Screening Analysis'' (BSC 2004, Section 6.4.1). That determination was conservatively based on limited knowledge of flight data in the area of concern and on crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a Monitored Geologic Repository (MGR) at Yucca Mountain using NUREG-0800, ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987, Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. NUREG-0800 is being used here as a reference because some of the same considerations apply. The intended use of this report is to provide inputs for further screening and analysis of the identified aircraft hazards based on the criteria that apply to Category 1 and 2 event sequence analyses as defined in 10 CFR 63.2 (see Section 4). The scope of this technical report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the MGR at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (see Section 7).

  17. Identification of Aircraft Hazards

    SciTech Connect

    K. Ashley

    2006-12-08

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in ''Monitored Geological Repository External Events Hazards Screening Analysis'' (BSC 2005 [DIRS 174235], Section 6.4.1). That determination was conservatively based upon limited knowledge of flight data in the area of concern and upon crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a monitored geologic repository (MGR) at Yucca Mountain, using NUREG-0800, ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987 [DIRS 103124], Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. The intended use of this report is to provide inputs for further screening and analysis of identified aircraft hazards based upon the criteria that apply to Category 1 and Category 2 event sequence analyses as defined in 10 CFR 63.2 [DIRS 176544] (Section 4). The scope of this report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the repository at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (Section 7).

  18. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  19. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    SciTech Connect

    Dionne, B.J.; Morris, S. III; Baum, J.W.

    1998-03-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique.

  20. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    SciTech Connect

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What-If' hazards analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  1. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23% maximum O2, One-g) operations of the Mark III and the associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test-specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test-specific configurations may warrant additional hazard analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  2. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    SciTech Connect

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e., building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g., flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs
    • Approximately 75% of SAFs analyzed address three or fewer hazards
    • Lack of economic measures within SAFs could impact resilience and sustainability
    • Resilience measures for flood hazards are not consistently included in SAFs.

  3. The importance of source area mapping for rockfall hazard analysis

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Crosta, Giovanni B.

    2013-04-01

    A problem in the characterization of the area affected by rockfall is the correct definition of the source areas. Different positions or sizes of the source areas along a cliff result in different possibilities of propagation and diverse interaction with the passive countermeasures present in the area. Through the use of Hy-Stone (Crosta et al., 2004), a code able to perform 3D numerical modeling of rockfall processes, different types of source areas were tested on a case study slope along the western flank of Mt. de La Saxe (Courmayeur, AO), developing between 1200 and 2055 m a.s.l. The first set of source areas consists of unstable rock masses identified on the basis of field survey and Terrestrial Laser Scanning (IMAGEO, 2011). A second set of source areas has been identified by using different thresholds of slope gradient; we tested slope thresholds between 50° and 75° at 5° intervals. The third source area dataset has been generated by performing a kinematic stability analysis. For this analysis, we mapped the joint sets along the rocky cliff by means of the software COLTOP 3D (Jaboyedoff, 2004), and then we identified the portions of the rocky cliff where planar/wedge and toppling failures are possible, assuming an average friction angle of 35°. Through the outputs of the Hy-Stone models we extracted and analyzed the kinetic energy, flight height, and velocity of the blocks falling along the rocky cliff in order to compare the controls of the different source areas. We observed strong variations of kinetic energy and flight height among the different models, especially when using unstable masses identified through Terrestrial Laser Scanning. This is mainly related to the size of the blocks identified as susceptible to failure. On the contrary, the slope gradient thresholds do not have a strong impact on rockfall propagation. This contribution highlights the importance of a careful and appropriate mapping of rockfall source area for rockfall hazard analysis and the
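
    The slope-threshold source-area definition tested above amounts to thresholding a DEM-derived slope grid. A minimal sketch, with a synthetic elevation grid standing in for a real DEM:

        import numpy as np

        rng = np.random.default_rng(0)
        dem = np.cumsum(rng.normal(2.0, 8.0, size=(50, 50)), axis=0)  # toy DEM, m
        cell = 10.0                                                   # spacing, m

        gy, gx = np.gradient(dem, cell)
        slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))

        # Count candidate source cells for the 50-75 degree thresholds tested above.
        for threshold in range(50, 80, 5):
            n_src = int((slope_deg >= threshold).sum())
            print(f"threshold {threshold} deg: {n_src} candidate source cells")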

  4. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; Perram, Glen; Gross, Kevin; Smith, Greg; Feltz, Wayne; Taylor, Joe; Olson, Erik

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements that were conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  5. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383
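
    The rough-analysis stage described above, with its five factors rated on an ordinal scale, maps naturally onto a small record per workplace. A sketch, where the factor names follow the abstract but the 0-3 scale and the cutoff are assumptions:

        from dataclasses import dataclass

        FACTORS = ("chemical_hazards", "physical_hazards", "physical_load",
                   "mental_stress", "accident_risk")

        @dataclass
        class WorkplaceScreening:
            workplace: str
            chemical_hazards: int = 0   # ordinal 0 (none) .. 3 (severe), assumed
            physical_hazards: int = 0
            physical_load: int = 0
            mental_stress: int = 0
            accident_risk: int = 0

            def needs_specialised_methods(self, cutoff: int = 2) -> list:
                """Factors rated at or above the cutoff trigger the specialised
                identification methods mentioned in the abstract."""
                return [f for f in FACTORS if getattr(self, f) >= cutoff]

        site = WorkplaceScreening("welding bay", chemical_hazards=2, accident_risk=3)
        print(site.needs_specialised_methods())  # ['chemical_hazards', 'accident_risk']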

  6. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    SciTech Connect

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the transuranic drum trenches in the low-level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address the major hazards inherent in the facility.

  7. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight...

  8. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND...

  9. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in the chip industry are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance, these toxic and harmful substances escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, potentially resulting in occupational diseases. From the perspective of prevention, the spread and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was identified. Analyzing occupational exposure through ambient air simulation in this way offers a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance.

  10. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and with the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  11. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  12. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
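
    The safety-factor/reliability relationship examined above can be illustrated with a generic Monte Carlo loop. In this sketch the influent distribution, removal model, and permit limit are all invented; it shows the method, not the paper's process models:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000
        limit = 10.0                               # permit limit, mg/L (assumed)

        for safety_factor in (1.0, 1.5, 2.0, 3.0):
            influent = rng.lognormal(np.log(200.0), 0.4, size=N)       # mg/L
            removal = rng.normal(0.97, 0.01, size=N).clip(0.80, 0.999)
            # Crudely model extra design capacity as a proportional cut in
            # effluent concentration.
            effluent = influent * (1.0 - removal) / safety_factor
            reliability = (effluent <= limit).mean()
            print(f"safety factor {safety_factor:.1f}: reliability ~ {reliability:.3f}")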

  13. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politics on potential consequences and are the basis for appropriate risk management strategies. Thereby, results based on (i) an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focus on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides this analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  14. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... COMMISSION Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment AGENCY: Nuclear Regulatory Commission (NRC.../ COL-ISG-020 titled ``Implementation of a Seismic Margin Analysis for New Reactors Based...

  15. National Emission Standards for Hazardous Air Pollutants for Major Sources. Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Papar, Riyaz; Wright, Anthony; Cox, Daryl

    2012-07-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., Subpart DDDDD of CFR Part 63).

  16. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  17. Hazards assessment for the Waste Experimental Reduction Facility

    SciTech Connect

    Calley, M.B.; Jones, J.L. Jr.

    1994-09-19

    This report documents the hazards assessment for the Waste Experimental Reduction Facility (WERF) located at the Idaho National Engineering Laboratory, which is operated by EG&G Idaho, Inc., for the US Department of Energy (DOE). The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. DOE Order 5500.3A requires that a facility-specific hazards assessment be performed to provide the technical basis for facility emergency planning efforts. This hazards assessment was conducted in accordance with DOE Headquarters and DOE Idaho Operations Office (DOE-ID) guidance to comply with DOE Order 5500.3A. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. This hazards assessment describes the WERF, the area surrounding WERF, associated buildings and structures at WERF, and the processes performed at WERF. All radiological and nonradiological hazardous materials stored, used, or produced at WERF were identified and screened. Even though the screening process indicated that the hazardous materials could be screened from further analysis, because the inventories of radiological and nonradiological hazardous materials were below the screening thresholds specified by DOE and DOE-ID guidance for DOE Order 5500.3A, the nonradiological hazardous materials were analyzed further because the nonradiological screening thresholds were judged to be too high.

  18. A cross-hazard analysis of terse message retransmission on Twitter.

    PubMed

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes--local network properties, message content, and message style--that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates.

  19. A cross-hazard analysis of terse message retransmission on Twitter

    PubMed Central

    Sutton, Jeannette; Gibson, C. Ben; Phillips, Nolan Edward; Spiro, Emma S.; League, Cedar; Johnson, Britta; Fitzhugh, Sean M.; Butts, Carter T.

    2015-01-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes—local network properties, message content, and message style—that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  20. A cross-hazard analysis of terse message retransmission on Twitter.

    PubMed

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes--local network properties, message content, and message style--that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  1. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives currently used in the healthcare marketplace: synthetic rubber, polyolefin, and linoleum. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures. PMID:21165873

  2. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    NASA Astrophysics Data System (ADS)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed to capture the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g., fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to find strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar to PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source, with Crete and Eastern Sicily as near-field and far-field target coasts, respectively. Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  3. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
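
    The risk definition above, expected loss as the product of hazard, vulnerability, and exposure, can be put in worked form. All numbers below are hypothetical:

        # Worked form of: expected loss = hazard x vulnerability x exposure.
        hazard = {"PGA 0.2 g": 0.10, "PGA 0.4 g": 0.02}       # annual occurrence prob. (assumed)
        damage_ratio = {"PGA 0.2 g": 0.05, "PGA 0.4 g": 0.30} # vulnerability (assumed)
        exposure_usd = 250_000_000                            # replacement value (assumed)

        annual_loss = sum(p * damage_ratio[level] * exposure_usd
                          for level, p in hazard.items())
        print(f"annualized expected loss: ${annual_loss:,.0f}")  # -> $2,750,000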

  4. An Exploratory Statistical Analysis of a Planet Approach-Phase Guidance Scheme Using Angular Measurements with Significant Error

    NASA Technical Reports Server (NTRS)

    Friedlander, Alan L.; Harry, David P., III

    1960-01-01

    An exploratory analysis of vehicle guidance during the approach to a target planet is presented. The objective of the guidance maneuver is to guide the vehicle to a specific perigee distance with a high degree of accuracy and minimum corrective velocity expenditure. The guidance maneuver is simulated by considering the random sampling of real measurements with significant error and reducing this information to prescribe appropriate corrective action. The instrumentation system assumed includes optical and/or infrared devices to indicate range and a reference angle in the trajectory plane. Statistical results are obtained by Monte Carlo techniques and are shown as the expectation of guidance accuracy and velocity-increment requirements. Results are nondimensional and applicable to any planet within limits of two-body assumptions. The problem of determining how many corrections to make and when to make them is a consequence of the conflicting requirements of accurate trajectory determination and propulsion. Optimum values were found for a vehicle approaching a planet along a parabolic trajectory with an initial perigee distance of 5 radii and a target perigee of 1.02 radii. In this example, measurement errors were less than 1 minute of arc. Results indicate that four corrections applied in the vicinity of 50, 16, 15, and 1.5 radii, respectively, yield minimum velocity-increment requirements. Thrust devices capable of producing a large variation of velocity-increment size are required. For a vehicle approaching the earth, miss distances within 32 miles are obtained with 90-percent probability. Total velocity increments used in guidance are less than 3300 feet per second with 90-percent probability. It is noted that the above representative results are valid only for the particular guidance scheme hypothesized in this analysis. A parametric study is presented which indicates the effects of measurement error size, initial perigee, and initial energy on the guidance
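
    The statistical machinery of such a study reduces to sampling measurement errors and propagating them to miss distance. A heavily simplified sketch, with an invented linear error-to-miss sensitivity standing in for the real trajectory dynamics:

        import numpy as np

        rng = np.random.default_rng(7)
        n_trials = 20_000
        arcmin = np.radians(1.0 / 60.0)                       # one minute of arc, rad

        angle_err = rng.normal(0.0, 0.5 * arcmin, n_trials)   # sampled errors
        miss_per_rad = 2.0e5                                  # miles per radian (assumed)
        miss = np.abs(angle_err * miss_per_rad)

        print("mean miss, miles:", round(miss.mean(), 1))
        print("90th-percentile miss, miles:", round(np.quantile(miss, 0.9), 1))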

  5. Solar Glaze Hazard Analysis Tool v. 3.0

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  6. Solar Glaze Hazard Analysis Tool v. 3.0

    SciTech Connect

    Ho, Clifford K.; Sims, Cianan A.

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  7. Meta-Analysis of Inquiry-Based Learning: Effects of Guidance

    ERIC Educational Resources Information Center

    Lazonder, Ard W.; Harmsen, Ruth

    2016-01-01

    Research has consistently shown that inquiry-based learning can be more effective than other, more expository instructional approaches as long as students are supported adequately. But what type of guidance is adequate, and for whom? These questions are difficult to answer as most previous research has only focused on one type of guidance and one…

  8. Guidance, navigation, and control systems performance analysis: Apollo 13 mission report

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The conclusions of the analyses of the inflight performance of the Apollo 13 spacecraft guidance, navigation, and control equipment are presented. The subjects discussed are: (1) the command module systems, (2) the lunar module inertial measurement unit, (3) the lunar module digital autopilot, (4) the lunar module abort guidance system, (5) lunar module optical alignment checks, and (6) spacecraft component separation procedures.

  9. Monte Carlo analysis of the Titan III/Transfer Orbit Stage guidance system for the Mars Observer mission

    NASA Technical Reports Server (NTRS)

    Bell, Stephen C.; Ginsburg, Marc A.; Rao, Prabhakara P.

    1993-01-01

    An important part of space launch vehicle mission planning for a planetary mission is the integrated analysis of guidance and performance dispersions for both booster and upper stage vehicles. For the Mars Observer mission, an integrated trajectory analysis was used to maximize the scientific payload and to minimize injection errors by optimizing the energy management of both vehicles. This was accomplished by designing the Titan III booster vehicle to inject into a hyperbolic departure plane, and the Transfer Orbit Stage (TOS) to correct any booster dispersions. An integrated Monte Carlo analysis of the performance and guidance dispersions of both vehicles provided sensitivities, an evaluation of their guidance schemes, and an injection error covariance matrix. The polynomial guidance schemes used for the Titan III variable flight azimuth computations and the TOS solid rocket motor ignition time and burn direction derivations accounted for a wide variation of launch times, performance dispersions, and target conditions. The Mars Observer spacecraft was launched on 25 September 1992 on the Titan III/TOS vehicle. The postflight analysis indicated that a near-perfect park orbit injection was achieved, followed by a trans-Mars injection with less than 2-sigma errors.

  10. 76 FR 70768 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft... November 2, 2011 (76 FR 67764). This action is necessary to correct an erroneous date for submission...

  11. Analysis of costs for compliance with Federal Radiation Protection Guidance for Occupational Exposure. Volume 2: case study analysis of the impacts of proposed radiation protection guidance for workers

    SciTech Connect

    Not Available

    1983-11-01

    This report contains the writeups of case studies conducted in support of an effort to estimate costs and economic impacts of proposed Federal Radiation Protection Guidance for Occupational Exposures. The purpose of the case studies was to develop background information on representative organizations necessary to determine the impact of the proposed guidelines on selected industries. This information was used, together with other data, to estimate the aggregate costs of compliance with the proposed guidelines. The cost estimates are contained in a companion report.

  12. A six-degree-of-freedom guidance and control analysis of Mars aerocapture

    NASA Astrophysics Data System (ADS)

    Powell, Richard W.; Braun, Robert D.

    1992-01-01

    A six-degree-of-freedom (6DOF) simulation is developed to investigate the control and guidance issues of a Mars aerobraking vehicle. The guidance algorithm used is a predictor-corrector guidance formulation designed to control the exit orbital apoapsis and wedge angle using bank-angle modulation. Major features of this predictor-corrector guidance algorithm include: (1) integration of the 3DOF equations of motion within an inner-loop simulation; (2) load-relief logic; (3) finite roll rates; and (4) an aerodynamic feedback multiplier. The algorithm is capable of successfully guiding the vehicle through combinations of atmospheric density dispersions, aerodynamic mispredictions, and off-nominal atmospheric interface conditions. This study demonstrated that the addition of vehicle dynamics to the Mars aerobraking simulation does not significantly impact mission feasibility. That is, a robust control system design coupled with an adaptive guidance algorithm can assure mission success in the presence of numerous off-nominal conditions.
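
    The predictor-corrector structure described above can be shown as a skeleton: an inner-loop model predicts the exit apoapsis for the current bank command, and a secant update with a finite roll-rate limit nudges the command toward the target. The one-line "dynamics" below is an invented stand-in, not a Mars aerocapture model:

        def predict_apoapsis(bank_deg):
            # Invented monotone response: more lift-up (shallower bank) raises
            # the predicted exit apoapsis. A real predictor integrates the 3DOF
            # equations of motion, as the abstract describes.
            return 400.0 + 8.0 * (90.0 - bank_deg)          # km

        def corrector_step(target_km, bank, bank_prev, max_step_deg=15.0):
            f = predict_apoapsis(bank) - target_km
            f_prev = predict_apoapsis(bank_prev) - target_km
            if f == f_prev:
                return bank                                  # no information: hold
            new_bank = bank - f * (bank - bank_prev) / (f - f_prev)  # secant
            step = max(-max_step_deg, min(max_step_deg, new_bank - bank))
            return bank + step                               # finite roll rate

        bank, bank_prev = 60.0, 70.0
        for _ in range(5):
            bank, bank_prev = corrector_step(500.0, bank, bank_prev), bank
            print(f"commanded bank angle: {bank:6.2f} deg")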

  13. Defining geologic Hazards for natural resources management using tree-ring analysis

    USGS Publications Warehouse

    DeGraff, J.V.; Agard, S.S.

    1984-01-01

    Landslides, avalanches, floods, and other geologic hazards impair natural resources management by jeopardizing public safety, damaging or restricting resource utilization, and necessitating expenditures for corrective measures. The negative impact of geologic hazard events can be reduced by tailoring resources management to the hazard potential of an area. This requires assessment of where and how frequently the events occur. National forests and other managed wildlands often lack monitoring or historical records to compute the frequency of hazard occurrence. Tree-ring analysis, based on internal growth response to external events such as tilting and abrasion, can provide frequency data. Two examples of the use of tree-ring analysis to date landslide activity illustrate the advantages and limitations of the technique. An example from the Fishlake National Forest in central Utah illustrates assessment for planning purposes. An example from the Sierra National Forest in east-central California shows assessment applied to project design. Many geologic hazards in addition to landslides are suited to tree-ring analysis to establish frequency of occurrence. Hazard reduction efforts in natural resources management could be enhanced by careful application of tree-ring analysis. © 1984 Springer-Verlag New York Inc.

  14. Fire hazard analysis for Project W-320 Tank 241-C-106 waste retrieval

    SciTech Connect

    Conner, J.C.

    1995-09-12

    This Fire Hazards Analysis (FHA) for Project W-320, 'Tank 241-C-106 Waste Retrieval', addresses fire hazards and fire-related concerns, in accordance with DOE 5480.7A (DOE 1998), resulting from or related to the processes and equipment to be installed or modified under Project W-320. Its objectives are to ensure that there are no undue fire hazards to site personnel and the public; that the potential for the occurrence of a fire is minimized; that process control and safety systems are not damaged by fire or related perils; and that property damage from fire and related perils does not exceed an acceptable level.

  15. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have the greatest impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg, and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., the 16th and 84th percentiles). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values, and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these
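
    The one-at-a-time multi-parameter idea can be shown in toy form: vary each logic-tree input in turn and record the spread it induces in the hazard estimate. The hazard function and branch values below are invented, not taken from the paper:

        # Toy one-parameter-at-a-time sensitivity screen over logic-tree inputs.
        branches = {
            "b_value":  (0.8, 0.9, 1.0),
            "m_max":    (6.5, 7.0, 7.5),
            "attn_rel": (0.9, 1.0, 1.15),   # attenuation-relation scale factor
        }

        def toy_hazard(b_value, m_max, attn_rel):
            # Invented stand-in for PGA with 10% exceedance in 50 yr;
            # not a real hazard integral.
            return attn_rel * 0.25 * (m_max / 7.0) / b_value

        baseline = {k: v[1] for k, v in branches.items()}
        for name, values in branches.items():
            pga = [toy_hazard(**{**baseline, name: v}) for v in values]
            print(f"{name:9s} PGA range: {min(pga):.3f}-{max(pga):.3f} g")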

  16. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Emission Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2012-07-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting emission credits proposed in the Implementation Plan for functional areas 2) through 4). This document does not include efficiency improvements related to the Boiler tune-ups.

  17. National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers; Guidance for Calculating Efficiency Credits Resulting from Implementation of Energy Conservation Measures

    SciTech Connect

    Cox, Daryl; Papar, Riyaz; Wright, Dr. Anthony

    2013-02-01

    The purpose of this document is to provide guidance for developing a consistent approach to documenting efficiency credits generated from energy conservation measures in the Implementation Plan for boilers covered by the Boiler MACT rule (i.e., subpart DDDDD of CFR part 63). This document divides Boiler System conservation opportunities into four functional areas: 1) the boiler itself, 2) the condensate recovery system, 3) the distribution system, and 4) the end uses of the steam. This document provides technical information for documenting efficiency credits proposed in the Implementation Plan for functional areas 2) through 4). This document does not include efficiency improvements related to the Boiler tune-ups.

  18. North Portal Fuel Storage System Fire Hazard Analysis-ESF Surface Design Package ID

    SciTech Connect

    N.M. Ruonavaara

    1995-01-18

    The purpose of the fire hazard analysis is to comprehensively assess the risk from fire within the individual fire areas. This document assesses the fire hazard only within the Exploratory Studies Facility (ESF) Design Package ID, which includes the fuel storage system area of the North Portal facility, and evaluates whether the following objectives are met:
    1.1.1 This analysis, performed in accordance with the requirements of this document, will satisfy the requirements for a fire hazard analysis in accordance with U.S. Department of Energy (DOE) Order 5480.7A.
    1.1.2 Ensure that property damage from fire and related perils does not exceed an acceptable level.
    1.1.3 Provide input to the ESF Basis For Design (BFD) Document.
    1.1.4 Provide input to the facility Safety Analysis Report (SAR) (Paragraph 3.8).

  19. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970's and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  20. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. The 500 m by 500 m grid-based building data, which provide detailed information about the location, value, and vulnerability classification of the exposed elements, were selected for the evaluation. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.

  1. Sampling quality assurance guidance in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document introduces quality assurance guidance pertaining to the design and implementation of sampling procedures and processes for collecting environmental data for DOE's Office of EM (Environmental Restoration and Waste Management).

  2. Orion Crew Exploration Vehicle Launch Abort System Guidance and Control Analysis Overview

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Kim, Sungwan; Raney, David L.; Aubuchon, Vanessa V.; Sparks, Dean W.; Busan, Ronald C.; Proud, Ryan W.; Merritt, Deborah S.

    2008-01-01

    Aborts during the critical ascent flight phase require the design and operation of Orion Crew Exploration Vehicle (CEV) systems to escape from the Crew Launch Vehicle (CLV) and return the crew safely to the Earth. To accomplish this requirement of continuous abort coverage, CEV ascent abort modes are being designed and analyzed to accommodate the velocity, altitude, atmospheric, and vehicle configuration changes that occur during ascent. Aborts from the launch pad to early in the flight of the CLV second stage are performed using the Launch Abort System (LAS). During this type of abort, the LAS Abort Motor is used to pull the Crew Module (CM) safely away from the CLV and Service Module (SM). LAS abort guidance and control studies and design trades are being conducted so that more informed decisions can be made regarding the vehicle abort requirements, design, and operation. This paper presents an overview of the Orion CEV, an overview of the LAS ascent abort mode, and a summary of key LAS abort analysis methods and results.

  3. Guidance and control analysis of the entry of a lifting body personnel launch vehicle

    NASA Astrophysics Data System (ADS)

    Powell, Richard W.; Cruz, Christopher I.

    1991-01-01

    NASA is currently involved in definition studies of a Personnel Launch System (PLS) that could be used to transport people to and from low-earth orbit. This vehicle would serve both to complement the Space Shuttle and to provide alternative access to space in the event the Space Shuttle fleet were unavailable for a prolonged period. The PLS would consist of a manned spacecraft launched by an expendable vehicle, e.g., Titan 4. One promising candidate for the manned component of the PLS is the NASA Langley Research Center HL-20 lifting body. Many studies are currently underway to assess this vehicle, and one of the main areas of study is the development of the capability to successfully enter, glide to the landing site, and land. To provide this capability, guidance and control algorithms have been developed, incorporated into a six-degree-of-freedom simulation, and evaluated in the presence of off-nominal atmospheric conditions, consisting of both density variations and steady-state winds. In addition, the impact of atmospheric turbulence was examined for the portion of flight from Mach 3.5 to touchdown. This analysis showed that the vehicle remained controllable and could successfully land even in the presence of off-nominal atmospheric conditions.

  4. An analysis of the Hubble Space Telescope fine guidance sensor fine lock mode

    NASA Technical Reports Server (NTRS)

    Taff, L. G.

    1991-01-01

    There are two guiding modes of the Hubble Space Telescope (HST) used for the acquisition of astronomical data by one of its six scientific instruments. The more precise one is called Fine Lock. Command and control problems in the onboard electronics have limited Fine Lock to brighter stars, V less than 13.0 mag, instead of fulfilling its goal of V = 14.5 mag. Consequently, the less precise guiding mode of Coarse Track (approximately 40 milli-arc seconds) has to be used fairly frequently. Indeed, almost half of the scientific observations made with the HST will be compromised. The only realistic and extensive simulations of the Fine Lock guidance mode are reported here. The theoretical analysis underlying the Monte Carlo experiments and the numerical computations clearly show both that the control electronics are severely under-engineered and how to adjust the various control parameters to successfully extend Fine Lock guiding performance back to V = 14.0 mag and sometimes beyond.

  5. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
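
    The hazard-curve construction behind numbers like the 0.17 g and 0.33 g values can be illustrated with the standard single-source PSHA integral: the annual exceedance rate of a ground motion level is the activity rate times the probability of exceedance, integrated over the magnitude distribution. Everything below is a toy model; the Gutenberg-Richter parameters, the source-to-site distance, and the attenuation function are invented stand-ins, not the Dubai Creek source model.

      import numpy as np
      from scipy import stats

      mmin, mmax, b, rate = 5.0, 7.5, 1.0, 0.05   # annual rate of M >= mmin (assumed)
      mags = np.linspace(mmin, mmax, 50)
      beta = b * np.log(10.0)
      # Truncated exponential (Gutenberg-Richter) magnitude pdf, renormalized.
      pdf_m = beta * np.exp(-beta * (mags - mmin)) / (1 - np.exp(-beta * (mmax - mmin)))
      pdf_m /= np.trapz(pdf_m, mags)

      dist_km = 40.0   # single source-to-site distance, for simplicity

      def gmpe_ln_pga(m, r):
          # Toy attenuation model returning ln PGA (g); sigma held fixed below.
          return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0)

      sigma = 0.6
      pga_grid = np.linspace(0.05, 0.6, 40)
      haz = []
      for a in pga_grid:
          # P(PGA > a | m, r), integrated over the magnitude distribution.
          p_exceed = 1 - stats.norm.cdf(np.log(a), gmpe_ln_pga(mags, dist_km), sigma)
          haz.append(rate * np.trapz(p_exceed * pdf_m, mags))
      haz = np.array(haz)

      # PGA with a 10% chance of exceedance in 50 yr (475-yr return period):
      target = -np.log(1 - 0.10) / 50.0          # ~1/475 per year
      print("475-yr PGA ~", np.interp(target, haz[::-1], pga_grid[::-1]))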

  6. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    PubMed

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, segmentation of particulate matter with sizes below 2.5 microns was performed in this paper based on formation mechanisms and on the hazard level to human beings and the environment. Meanwhile, taking into account the mass concentration, number concentration, enrichment factor of Hg, and content of Hg element in different coal ashes, a comprehensive model for evaluating the hazard of PM2.5 emitted by coal-fired boilers was established. Finally, using field experimental data from the previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.
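
    The abstract names the ingredients of the comprehensive model (mass concentration, number concentration, Hg enrichment factor, Hg content) but not its functional form. The sketch below is therefore only one plausible reading, a normalized weighted index with invented weights and measurements, used to express a hazard reduction coefficient across a dust remover; it is not the authors' actual model.

      # Hypothetical composite hazard index for PM2.5 emissions.
      WEIGHTS = {"mass": 0.4, "number": 0.2, "hg_enrichment": 0.2, "hg_content": 0.2}

      def hazard_index(sample, reference):
          # Weighted sum of each factor, normalized by a reference measurement.
          return sum(w * sample[k] / reference[k] for k, w in WEIGHTS.items())

      # Illustrative measurements upstream/downstream of a dust remover.
      inlet  = {"mass": 120.0, "number": 4.0e7, "hg_enrichment": 2.5, "hg_content": 0.8}
      outlet = {"mass": 15.0,  "number": 1.2e7, "hg_enrichment": 3.1, "hg_content": 0.9}

      h_in, h_out = hazard_index(inlet, inlet), hazard_index(outlet, inlet)
      # One reading of the paper's "hazard reduction coefficient":
      print(f"hazard reduction coefficient: {1 - h_out / h_in:.2f}")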

  7. Preliminary fire hazards analysis for W-211, Initial Tank Retrieval Systems

    SciTech Connect

    Huckfeldt, R.A.

    1995-03-16

    A fire hazards analysis (FHA) was performed for Project W-211, Initial Tank Retrieval System (ITRS), at the Department of Energy (DOE) Hanford site. The objectives of this FHA were to determine (1) the fire hazards that expose the Initial Tank Retrieval System or are inherent in the process, (2) the adequacy of the fire-safety features planned, and (3) the degree of compliance of the project with specific fire safety provisions in DOE orders and related engineering codes and standards. The scope included the construction, the process hazards, building fire protection, and site-wide fire protection. The results are presented in terms of the fire hazards present, the potential extent of fire damage, and the impact on employee and public safety. This study evaluated the ITRS with respect to its use at Tank 241-SY-101 only.

  8. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA and 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
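
    The variability analysis can be mimicked with a miniature logic tree: draw one branch per epistemic node, evaluate the hazard for that combination, and compute the COV over many draws. The branches, weights, and the stand-in hazard function below are invented for illustration; the study samples the full consensus logic tree with real hazard calculations.

      import numpy as np

      rng = np.random.default_rng(0)

      # Miniature logic tree: (branch values, weights). All invented.
      locations   = (["north", "central", "south"], [0.3, 0.4, 0.3])
      magnitudes  = ([7.3, 7.7, 8.1],               [0.25, 0.5, 0.25])
      attenuation = (["gmpe_A", "gmpe_B"],          [0.5, 0.5])

      def hazard_pga(loc, mag, gmpe):
          # Stand-in for a full hazard calculation; returns 2%-in-50-yr PGA (g).
          base = {"north": 0.55, "central": 0.75, "south": 0.60}[loc]
          return base * (1 + 0.15 * (mag - 7.7)) * (1.1 if gmpe == "gmpe_B" else 0.9)

      samples = [
          hazard_pga(rng.choice(locations[0], p=locations[1]),
                     rng.choice(magnitudes[0], p=magnitudes[1]),
                     rng.choice(attenuation[0], p=attenuation[1]))
          for _ in range(10_000)
      ]
      cov = np.std(samples) / np.mean(samples)
      print(f"knowledge-based COV of hazard: {cov:.2f}")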

  9. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, precipitation being hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether there is a change in magnitude or frequency in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  10. Hazard baseline documentation

    SciTech Connect

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  11. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors. PMID:11840707

  13. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have also recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rainstorms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition the available gauge data around Can Tho
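
    A bivariate frequency analysis of Q-V pairs is often sketched with a copula, which joins the two fitted marginals and yields joint exceedance probabilities. The Gumbel-Hougaard copula and its parameter below are assumptions for illustration only; the study fits the marginals and dependence to the Kratie record, and the abstract does not name the copula family.

      import numpy as np

      def gumbel_copula(u, v, theta=2.0):
          # C(u, v) for the Gumbel-Hougaard copula (theta >= 1).
          return np.exp(-(((-np.log(u)) ** theta
                           + (-np.log(v)) ** theta) ** (1 / theta)))

      def joint_exceedance(u, v, theta=2.0):
          # P(U > u and V > v), via inclusion-exclusion on the copula.
          return 1 - u - v + gumbel_copula(u, v, theta)

      # Non-exceedance probabilities of a candidate design pair (Q*, V*),
      # here both taken at their marginal 100-yr level:
      u = v = 1 - 1 / 100.0
      p_and = joint_exceedance(u, v)
      print(f"annual P(Q > Q* and V > V*): {p_and:.5f}"
            f"  (joint return period ~{1 / p_and:.0f} yr)")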

  14. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA. PMID:25676999

  15. Process hazards analysis (PrHA) program, bridging accident analyses and operational safety

    SciTech Connect

    Richardson, J. A.; McKernan, S. A.; Vigil, M. J.

    2003-01-01

    Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel along with safety analysts work as a team to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards, including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker

  16. Vertical guidance performance analysis of the L1-L5 dual-frequency GPS/WAAS user avionics sensor.

    PubMed

    Jan, Shau-Shiun

    2010-01-01

    This paper investigates the potential vertical guidance performance of global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. This paper will first investigate the airborne receiver code noise and multipath (CNMP) confidence (σair). The σair will be the dominant factor in the availability analysis of an L1-L5 dual-frequency GPS/WAAS user avionics sensor. This paper uses the MATLAB Algorithm Availability Simulation Tool (MAAST) to determine the required values for the σair, so that an L1-L5 dual-frequency GPS/WAAS user avionics sensor can meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under what user configurations WAAS could be an APV II system or a CAT I system over CONUS. Furthermore, this paper examines the combinations of possible improvements in signal models and the addition of Galileo to determine if GPS/WAAS user avionics sensor could achieve 10 m Vertical Alert Limit (VAL) within the service volume. Finally, this paper presents the future vertical guidance performance of GPS user avionics sensor for the United States' WAAS, Japanese MTSAT-based satellite augmentation system (MSAS) and European geostationary navigation overlay service (EGNOS).
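
    The availability logic behind such studies boils down to comparing a vertical protection level (VPL) against the alert limit at each user location and epoch. The sketch below condenses this to a single, assumed geometry projection factor and an illustrative error budget, so only the role of the σair term is visible; the real computation sums per-satellite projections over the whole constellation, and all numbers here are placeholders.

      import math

      K_V = 5.33  # fault-free vertical missed-detection multiplier used by WAAS

      def vpl(sigma_flt, sigma_iono, sigma_tropo, sigma_air, s_vert=1.9):
          # VPL for one assumed vertical geometry projection factor s_vert.
          sigma2 = sigma_flt**2 + sigma_iono**2 + sigma_tropo**2 + sigma_air**2
          return K_V * s_vert * math.sqrt(sigma2)

      # Candidate airborne CNMP confidences (metres), e.g. single- vs dual-frequency:
      for sigma_air in (1.2, 0.6, 0.3):
          v = vpl(0.4, 0.3, 0.12, sigma_air)   # other budget terms are illustrative
          verdict = "meets 10 m VAL" if v < 10.0 else "exceeds 10 m VAL"
          print(f"sigma_air={sigma_air:.2f} m -> VPL={v:.1f} m  ({verdict})")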

  17. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time, or, if applied at small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two
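
    Once every hazard at a roadway section is expressed as an annual probability of occurrence, a section-level figure can be obtained by combining them; assuming independence between the processes, this is the standard complement-product rule. The probabilities below are invented for illustration.

      # Aggregating independent hazards at one roadway section into a single
      # annual probability of at least one impact. Values are illustrative.
      p_rockfall    = 0.020   # annual probability a rockfall reaches the road
      p_debris_flow = 0.008
      p_wall_fail   = 0.002   # e.g. from the indicator-based hazard index

      p_any = 1.0 - (1 - p_rockfall) * (1 - p_debris_flow) * (1 - p_wall_fail)
      print(f"combined annual probability: {p_any:.4f}")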

  18. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  19. FIRE HAZARDS ANALYSIS FOR THE FUEL SUPPLY SYSTEM - ESF PACKAGE 1E

    SciTech Connect

    N.M. Ruonavaara

    1995-04-12

    The purpose of the fire hazards analysis is to comprehensively assess the risk from fire within individual fire areas in accordance with U.S. Department of Energy (DOE) Order 5480.7A (Reference 4.4.7.4). This document will assess the fire hazard risk within the Exploratory Studies Facility (ESF) fuel supply system, Package 1E, and evaluate whether the following objectives are met: (1) Ensure that property damage from fire and related perils does not exceed an acceptable level. (2) Provide input to the facility Safety Analysis Report (SAR).

  20. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-01

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes closed and eyes open conditions. We found that participants in the HAC group presented higher beta AP at centroparietal region, as well as lower beta MF at frontal and centroparietal regions in the eyes closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although are not completely identical, suggesting a possible difference in the underlying neuronal mechanism behind alcohol dependence and hazardous alcohol consumption. Similarities could be explained considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition. PMID:26035281
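
    The three measures analyzed (AP, RP, MF) can all be derived from a power spectral density estimate of an EEG channel. The sketch below uses Welch's method on a synthetic signal; the sampling rate, band limits, and the signal itself are illustrative stand-ins, not the study's recording protocol.

      import numpy as np
      from scipy.signal import welch

      fs = 256.0                                   # sampling rate (assumed)
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(1)
      # Synthetic "EEG": a 20 Hz (beta-band) oscillation plus noise.
      eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)

      freqs, psd = welch(eeg, fs=fs, nperseg=1024)

      def band_measures(freqs, psd, lo, hi):
          m = (freqs >= lo) & (freqs <= hi)
          ap = np.trapz(psd[m], freqs[m])                  # Absolute Power
          rp = ap / np.trapz(psd, freqs)                   # Relative Power
          mf = np.trapz(freqs[m] * psd[m], freqs[m]) / ap  # Mean Frequency
          return ap, rp, mf

      for name, (lo, hi) in {"theta": (4, 8), "beta": (13, 30)}.items():
          ap, rp, mf = band_measures(freqs, psd, lo, hi)
          print(f"{name}: AP={ap:.3f}  RP={rp:.2%}  MF={mf:.1f} Hz")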

  2. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical
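
    The paper's central comparison, Monte Carlo dispersions versus a single covariance propagation of the linearized model, can be seen on a toy linear system: for linear dynamics the two must agree to within sampling error, which is exactly what makes linear covariance so much faster. The dynamics, noise levels, and step count below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      F = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition (pos, vel)
      Q = np.diag([0.0, 1e-4])                  # process noise per step
      P0 = np.diag([1e-2, 1e-3])                # initial dispersion covariance
      steps = 100

      # Linear covariance: one deterministic propagation of P.
      P = P0.copy()
      for _ in range(steps):
          P = F @ P @ F.T + Q

      # Monte Carlo: many random trajectories of the same linear model.
      x = rng.multivariate_normal([0, 0], P0, size=5000)
      for _ in range(steps):
          x = x @ F.T + rng.multivariate_normal([0, 0], Q, size=x.shape[0])
      P_mc = np.cov(x.T)

      print("lincov position sigma:", np.sqrt(P[0, 0]))
      print("MC position sigma    :", np.sqrt(P_mc[0, 0]))  # agrees within a few %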

  3. Guidance on risk analysis and safety implications of a large liquefied natural gas (LNG) spill over water.

    SciTech Connect

    Wellman, Gerald William; Melof, Brian Matthew; Luketa-Hanlin, Anay Josephine; Hightower, Marion Michael; Covan, John Morgan; Gritzo, Louis Alan; Irwin, Michael James; Kaneshige, Michael Jiro; Morrow, Charles W.

    2004-12-01

    While recognized standards exist for the systematic safety analysis of potential spills or releases from LNG (Liquefied Natural Gas) storage terminals and facilities on land, no equivalent set of standards or guidance exists for the evaluation of the safety or consequences from LNG spills over water. Heightened security awareness and energy surety issues have increased industry's and the public's attention to these activities. The report reviews several existing studies of LNG spills with respect to their assumptions, inputs, models, and experimental data. Based on this review and further analysis, the report provides guidance on the appropriateness of models, assumptions, and risk management to address public safety and property relative to a potential LNG spill over water.

  4. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges and the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.

  5. Fire hazards analysis for the uranium oxide (UO3) facility

    SciTech Connect

    Wyatt, D.M.

    1994-12-06

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO3 complex fire hazards, fire protection and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection, and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4, are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in decommissioning and demolition facilities.

  6. Responses to gestational weight management guidance: a thematic analysis of comments made by women in online parenting forums

    PubMed Central

    2014-01-01

    Background The National Institute for Health and Clinical Excellence (NICE) published guidance on weight management in pregnancy in July 2010 (NICE public health guidance 27: 2010), and this received considerable press coverage across a range of media. This offered an opportunity to examine how gestational weight management guidance was received by UK women. Methods A thematic analysis was conducted of 400 posts made in UK-based parenting internet forums in the week following the publication of the NICE guidance. This allowed us to examine the naturally occurring comments from 202 women who posted about the guidance on public forums. Results Three main themes were identified and explored: i) Perceived control/responsibility ii) Risk perception iii) Confused messages. Conclusions Women differed in their perceptions of the level of control that they had over being overweight with some feeling responsible and motivated to maintain a healthy lifestyle. Others felt there were multiple factors influencing their weight issues beyond their control. There were reports of feeling guilty about the impact of weight on the growing baby and experiencing significant obesity stigma from the public and health professionals. Information about the risks of overweight and obesity in pregnancy were difficult messages for women to hear, and for health professionals to deliver. Women reported being confused by the messages that they received. Health messages need to be delivered sensitively to women, and health professionals need support and training to do this. Risk information should always be accompanied with clear advice and support to help women to manage their weight in pregnancy. PMID:24981024

  7. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).
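
    The layered scoring can be pictured as a per-cell aggregation of event frequency, technology resilience, and adaptive capacity. The aggregation formula and every value below are assumptions for illustration only; the abstract does not specify the actual scoring scheme used in the ArcGIS workflow.

      # Hypothetical cell-level vulnerability aggregation (all values invented).
      hazard_freq = {"cyclone": 0.8, "drought": 0.5, "flood": 1.2}      # events/decade
      tech_resilience = {"cyclone": 0.6, "drought": 0.3, "flood": 0.5}  # 0..1, expert-assessed
      adaptive_capacity = 0.45                                          # country "readiness"

      # Higher frequency and lower resilience/capacity -> higher vulnerability.
      vulnerability = sum(
          f * (1 - tech_resilience[h]) for h, f in hazard_freq.items()
      ) * (1 - adaptive_capacity)
      print(f"aggregate vulnerability score: {vulnerability:.2f}")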

  8. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  9. Quality assurance guidance for field sampling and measurement assessment plans in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C, Quality Assurance (QA), defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well-established and management-supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  10. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
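
    A unified treatment becomes concrete once damage functions for different hazards share a parametric form; an S-shaped (logistic) curve is assumed here purely for illustration. The sketch propagates uncertainty both in the hazard magnitude and in the function's own parameters, echoing the paper's distinction between those uncertainty sources; all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(3)

      def damage(magnitude, loc, scale):
          # Generic S-shaped damage function mapping hazard magnitude to a
          # damage fraction in [0, 1]; the logistic form is our assumption.
          return 1.0 / (1.0 + np.exp(-(magnitude - loc) / scale))

      # Sensitivity at a low and a high hazard level (e.g. flood depth in m):
      for mag_mean in (1.0, 2.5):
          mags = rng.normal(mag_mean, 0.2, 20_000)    # hazard-magnitude uncertainty
          locs = rng.normal(1.5, 0.1, 20_000)         # intrinsic parameter uncertainty
          scales = np.abs(rng.normal(0.4, 0.05, 20_000))
          d = damage(mags, locs, scales)
          print(f"magnitude {mag_mean}: mean damage {d.mean():.2f}, std {d.std():.2f}")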

  11. Guidance on health effects of toxic chemicals. Safety Analysis Report Update Program

    SciTech Connect

    Foust, C.B.; Griffin, G.D.; Munro, N.B.; Socolof, M.L.

    1994-02-01

    Martin Marietta Energy Systems, Inc. (MMES), and Martin Marietta Utility Services, Inc. (MMUS), are engaged in phased programs to update the safety documentation for the existing US Department of Energy (DOE)-owned facilities. The safety analysis of potential toxic hazards requires a methodology for evaluating human health effects of predicted toxic exposures. This report provides a consistent set of health effects and documents toxicity estimates corresponding to these health effects for some of the more important chemicals found within MMES and MMUS. The estimates are based on published toxicity information and apply to acute exposures for an "average" individual. The health effects (toxicological endpoints) used in this report are (1) the detection threshold; (2) the no-observed-adverse-effect level; (3) the onset of irritation/reversible effects; (4) the onset of irreversible effects; and (5) a lethal exposure, defined to be the 50% lethal level. An irreversible effect is defined as a significant effect on a person's quality of life, e.g., serious injury. Predicted consequences are evaluated on the basis of concentration and exposure time.

  12. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    NASA Astrophysics Data System (ADS)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    Gulf of Jakarta is an area of active sedimentation, and a wide sediment deposition area exists on the north coast of Jakarta. Generally, these sediments have not been consolidated, so their condition is an important factor in determining liquefaction in this area. Liquefaction may occur during an earthquake that causes loss of strength and stiffness in soils. The analysis of liquefaction potential is based on SPT data taken in the Gulf of Jakarta and includes the susceptibility rating and the triggering factors. Liquefaction analysis methods were compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction analysis at the surface uses the susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater. Each factor has parameters that determine the value of SRF. From the analysis, the Gulf of Jakarta has an SRF value of 12-35, which shows that the Gulf of Jakarta is dominated by areas with susceptibility ratings from medium to high. High susceptibility ratings for liquefaction are concentrated in the coastal area.
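
    SPT-based liquefaction assessments of this kind typically compute the factor of safety as the ratio of cyclic resistance to cyclic stress. The sketch below uses the Seed-Idriss simplified expression for the cyclic stress ratio and a deliberately crude placeholder for the CRR correlation; the soil parameters are invented, and the SRF scoring itself is a separate rating scheme not reproduced here.

      def csr(a_max_g, sigma_v, sigma_v_eff, depth_m):
          # Cyclic stress ratio (Seed & Idriss simplified equation).
          rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
          return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

      def crr_from_n160(n160):
          # Toy monotonic CRR correlation with corrected blow count (assumed);
          # real analyses use the full Seed-Idriss/NCEER relations.
          return 0.05 + 0.0035 * n160 + 0.0001 * n160**2

      # Illustrative layer: depth (m), total and effective stress (kPa),
      # corrected SPT blow count, and design PGA (g).
      depth, sigma_v, sigma_v_eff = 6.0, 110.0, 65.0
      n160, a_max = 12, 0.3

      fs = crr_from_n160(n160) / csr(a_max, sigma_v, sigma_v_eff, depth)
      print(f"factor of safety: {fs:.2f} ->",
            "liquefiable" if fs < 1 else "not liquefiable")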

  13. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  14. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  15. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...) Each location where juice is processed by that processor; and (2) Each type of juice processed by the processor. The plan may group types of juice products together, or group types of production...

  16. Example process hazard analysis of a Department of Energy water chlorination process

    SciTech Connect

    Not Available

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).
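
    The HAZOP method works by systematically pairing process parameters with guide words to generate candidate deviations for team review. The parameter list below is merely typical of a water chlorination process, not taken from the study's worksheets.

      from itertools import product

      # Standard HAZOP guide words crossed with illustrative process parameters.
      guide_words = ["NO", "MORE", "LESS", "REVERSE", "AS WELL AS", "OTHER THAN"]
      parameters = ["flow", "pressure", "chlorine concentration", "level"]

      for param, word in product(parameters, guide_words):
          # Each pairing is a candidate deviation, e.g. "NO flow", "MORE pressure".
          # A real worksheet records causes, consequences, and safeguards
          # for every credible deviation the team identifies.
          print(f"{word} {param}")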

  17. Guidance for preparation of DOE 5480.22 (TSR) and DOE 5480.23 (SAR) implementation plans

    SciTech Connect

    Not Available

    1994-11-01

    DOE 5480.22 (Technical Safety Requirements) and DOE 5480.23 (Nuclear Safety Analysis Reports) impose requirements for submittal of implementation plans (IPs). This standard provides guidance on the contents of IPs, including plans for managing the upgrade process; the submittal, review, approval, and revision of IPs; and the basis for interim operation during the upgrade process. Guidance is also provided on techniques of preliminary hazards analysis and on the development of site-wide and activity-specific safety documentation.

  18. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    NASA Astrophysics Data System (ADS)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA describes stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation the origin, goal, steps and practical tools available at each step of CRIDA will be explained. In two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al., and "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.), the application of CRIDA to cases is explained.

  19. A Reaction to an Analysis of Nine Computer-Assisted Career Guidance Systems.

    ERIC Educational Resources Information Center

    Garcia, Gonzalo, Jr.; Plansker, Christine M.

    1990-01-01

    Critiques the assessment of computer-assisted career guidance systems by Sampson et al. (CE 521 972), stating that the conceptual framework was sound and well developed and criteria were based on recognized experts. Limitations of the study are the assumption of equivalent effectiveness and the use of a checklist for comparing the systems. (SK)

  20. System design impact of guidance and navigation analysis for a SEPS 1979 Encke flyby

    NASA Technical Reports Server (NTRS)

    Hong, P.

    1975-01-01

    The baseline guidance and navigation strategy for an ecliptic projection of the Encke flyby mission considers solar electric propulsion stage parameters in generating optimized and targeted trajectory control. Results show the Encke-relative approach error to be dominated by ephemeris uncertainties, particularly the velocity components.

  1. School Coping Style Analysis in the Elementary School. Georgia Comprehensive Guidance Series.

    ERIC Educational Resources Information Center

    Moore, Earl J.

    This workshop manual is one of four volumes in a staff development series designed for use with the Georgia comprehensive guidance and counseling units. The manual presents three separate models of students' school coping styles to aid classroom teachers' understanding of student behavior. Part 1 summarizes four basic models of student coping,…

  2. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has in recent times increased as a result of climate change and urbanization, with the effects felt more in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because existing substations are located in flood-prone areas. By understanding the impact of floods on its substations, TNB has provided non-structural mitigation through the integration of the Flood Hazard Map with its substations. Hydrological analysis is an important part of this work, providing runoff as the input for the hydraulic modelling.

  3. Tank farms hazards assessment

    SciTech Connect

    Broz, R.E.

    1994-09-30

    Hanford contractors are writing new facility-specific emergency procedures in response to new and revised US Department of Energy (DOE) Orders on emergency preparedness. Emergency procedures are required for each Hanford facility that has the potential to exceed the criteria for the lowest level emergency, an Alert. The set includes: (1) a facility-specific procedure on Recognition and Classification of Emergencies, (2) area procedures on Initial Emergency Response, and (3) an area procedure on Protective Action Guidance. The first steps in developing these procedures are to identify the hazards at each facility, identify the conditions that could release the hazardous material, and calculate the consequences of the releases. These steps are called a Hazards Assessment. The final product is a document that is similar in some respects to a Safety Analysis Report (SAR). The document could be produced in a month for a simple facility but could take much longer for a complex facility. Hanford has both types of facilities. A strategy has been adopted to permit completion of the first version of the new emergency procedures before all the facility Hazards Assessments are complete. The procedures will initially be based on input from a task group for each facility. This strategy will put improved emergency procedures in place sooner and therefore enhance Hanford emergency preparedness. The purpose of this document is to summarize the applicable information contained within the Waste Tank Facility "Interim Safety Basis Document, WHC-SD-WM-ISB-001" as a resource, since the SARs covering Waste Tank Operations are not current in all cases. This hazards assessment serves to collect, organize, document and present the information utilized during the determination process.

  4. An Analysis of Documents from State Departments of Education Relating to Comprehensive Career Guidance Programs and of FY 1974 and 1975 Part C Project Documents Relating to Guidance. Research and Development Series No. 133.

    ERIC Educational Resources Information Center

    Vetter, Louise; Ransom, Robert M.

    Fifty states and Washington, D.C., were invited to submit documents for an analysis relating to comprehensive career guidance programs (CCGP). These documents were prepared by state departments of education and directors of FY 1974 and 1975 projects funded through the Vocational Education Research Program under Section 131 of Part C of the…

  5. Risk analysis of environmental hazards at the High Flux Beam Reactor

    SciTech Connect

    Boccio, J.L.; Ho, V.S.; Johnson, D.H.

    1994-01-01

    In the late 1980s, a Level 1 internal event probabilistic risk assessment (PRA) was performed for the High-Flux Beam Reactor (HFBR), a US Department of Energy research reactor located at Brookhaven National Laboratory. Prior to the completion of that study, a Level 1 PRA for external events was initiated, including environmental hazards such as fire, internal flooding, etc. Although this paper provides a brief summary of the risks from environmental hazards, emphasis is placed on the methodology employed in utilizing industrial event databases for event frequency determination for the HFBR complex. Since the equipment in the HFBR differs from that of, say, a commercial nuclear power plant, the current approach is to categorize the industrial events according to the hazard initiators instead of categorizing by initiator location. The paper begins, however, with a general overview of the analysis.

  6. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), the standard deviation (sigma) of the total variability is used in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and is unrelated to either the temporal or the spatial variation of ground motions. It therefore should not be included in the integration; instead, epistemic variability may be treated in the logic tree. This study uses Taiwan data to test a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is treated in the logic tree while aleatory variability is used in the integration, the hazard level is similar to that obtained using total variability, only slightly smaller at long return periods. How to reduce the hazard level to a reasonable value remains to be studied.
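
    The effect described above can be reproduced with a minimal hazard-curve sketch: the annual exceedance rate is integrated over a small scenario set using either the total sigma or an aleatory-only sigma of a lognormal ground motion model. The scenario rates, medians and sigma values below are illustrative assumptions, not values from the study.

        # Exceedance rate of a ground-motion level over a small scenario set,
        # comparing total sigma with an aleatory-only sigma. Scenario rates,
        # medians, and sigma values are illustrative.
        import numpy as np
        from scipy.stats import norm

        def exceedance_rate(ln_level, scenarios, sigma):
            """Annual rate that ln ground motion exceeds ln_level."""
            return sum(nu * norm.sf(ln_level, loc=ln_med, scale=sigma)
                       for nu, ln_med in scenarios)

        # (annual rate of scenario, ln of median PGA in g)
        scenarios = [(0.05, np.log(0.08)), (0.01, np.log(0.20)),
                     (0.002, np.log(0.45))]

        for sigma, label in [(0.7, "total sigma"), (0.5, "aleatory-only sigma")]:
            lam = exceedance_rate(np.log(0.3), scenarios, sigma)
            print(f"{label}: rate of exceeding 0.3 g = {lam:.2e} /yr")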

  7. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Gao, J. C.; Lee, C. T.

    2014-12-01

    In modern probabilistic seismic hazard analysis (PSHA), the standard deviation (sigma) of the total variability is used in the integration for the seismic exceedance rate, and this leads to increased seismic hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and is unrelated to either the temporal or the spatial variation of ground motions. It therefore should not be included in the integration; instead, epistemic variability may be treated in the logic tree. This study uses Taiwan data to test a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is treated in the logic tree while aleatory variability is used in the integration, the hazard level is similar to that obtained using total variability, only slightly smaller at long return periods. How to reduce the hazard level to a reasonable value remains to be studied.

  8. Ocean thermal conversion (OTEC) project bottom cable protection study: environmental characteristics and hazards analysis

    SciTech Connect

    Chern, C.; Tudor, W.

    1981-10-01

    Seafloor cable-protection criteria and technology as applied to the four proposed OTEC plant sites and cable routes at Hawaii, Puerto Rico, Guam and Florida were examined. Study of environmental characteristics for each site covered: (A) natural factors of location, tide and currents, wind and wave, bottom soil type and seafloor movement; and (B) man-made factors such as ship traffic, fishing activities, ocean mining, government regulations. These characteristics were studied to determine the hazards which are potential sources of damage to a cable system. Hazards include: chafe and corrosion, hydrodynamic forces due to wave and current action, mudslides, earthquakes, trawler and/or dredge action and ship anchors. An analysis of the history of submarine-cable failures was conducted. Included are the probabilities of damage related to water depth. Probabilities become minimal for all hazards in water depths of 1500 feet and more. Chafe and corrosion had the highest probability of causing damage to a seafloor cable compared to the other hazards. Because of the hazards present at all sites, cable burial is recommended as the best means of protection.

  9. Description of the Northwest hazardous waste site data base and preliminary analysis of site characteristics

    SciTech Connect

    Woodruff, D.L.; Hartz, K.E.; Triplett, M.B.

    1988-08-01

    The Northwest Hazardous Waste RD and D Center (the Center) conducts research, development, and demonstration (RD and D) activities for hazardous and radioactive mixed-waste technologies applicable to remediating sites in the states of Idaho, Montana, Oregon, and Washington. To properly set priorities for these RD and D activities and to target development efforts it is necessary to understand the nature of the sites requiring remediation. A data base of hazardous waste site characteristics has been constructed to facilitate this analysis. The data base used data from EPA's Region X Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) and from Preliminary Assessment/Site Investigation (PA/SI) forms for sites in Montana. The Center's data base focuses on two sets of sites--those on the National Priorities List (NPL) and other sites that are denoted as ''active'' CERCLIS sites. Active CERCLIS sites are those sites that are undergoing active investigation and analysis. The data base contains information for each site covering site identification and location, type of industry associated with the site, waste categories present (e.g., heavy metals, pesticides, etc.), methods of disposal (e.g., tanks, drums, land, etc.), waste forms (e.g., liquid, solid, etc.), and hazard targets (e.g., surface water, groundwater, etc.). As part of this analysis, the Northwest region was divided into three geographic subregions to identify differences in disposal site characteristics within the Northwest. 2 refs., 18 figs., 5 tabs.

  10. A hazard and probabilistic safety analysis of a high-level waste transfer process

    SciTech Connect

    Bott, T.F.; Sasser, M.K.

    1996-09-01

    This paper describes a safety analysis of a transfer process for high-level radioactive and toxic waste. The analysis began with a hazard assessment that used elements of What If, Checklist, Failure Modes and Effects Analysis, and Hazards and Operability Study (HAZOP) techniques to identify and rough-in accident sequences. Based on this preliminary analysis, the most significant accident sequences were developed further using event trees. Quantitative frequency estimates for the accident sequences were based on operational data taken from the historical record of the site where the process is performed. Several modeling challenges were encountered in the course of the study. These included linked initiating and accident progression events, fire propagation modeling, accounting for administrative control violations, and handling mission-phase effects.

  11. Additional guidance for including nuclear safety equivalency in the Canister Storage Building and Cold Vacuum Drying Facility final safety analysis report

    SciTech Connect

    Garvin, L.J.

    1997-05-20

    This document provides guidance for the production of safety analysis reports that must meet both DOE Order 5480.23 and STD 3009, and be in compliance with the DOE regulatory policy that imposes certain NRC requirements.

  12. Flight Guidance System Requirements Specification

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Tribble, Alan C.; Carlson, Timothy M.; Danielson, Eric J.

    2003-01-01

    This report describes a requirements specification written in the RSML-e language for the mode logic of a Flight Guidance System of a typical regional jet aircraft. This model was created as one of the first steps in a five-year project sponsored by the NASA Langley Research Center, Rockwell Collins Inc., and the Critical Systems Research Group of the University of Minnesota to develop new methods and tools to improve the safety of avionics designs. This model will be used to demonstrate the application of a variety of methods and techniques, including safety analysis of system and subsystem requirements, verification of key properties using theorem provers and model checkers, identification of potential sources of mode confusion in system designs, partitioning of applications based on the criticality of system hazards, and autogeneration of avionics quality code. While this model is representative of the mode logic of a typical regional jet aircraft, it does not describe an actual or planned product. Several aspects of a full Flight Guidance System, such as recovery from failed sensors, have been omitted, and no claims are made regarding the accuracy or completeness of this specification.

  13. In silico analysis of nanomaterials hazard and risk.

    PubMed

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  14. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  15. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  16. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitatively characterised vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented in order to illustrate the platform's potential.

  17. Uncertain Characterization of Flood Hazard Using Bivariate Analysis Based on Copulas

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2015-04-01

    This study presents a methodology to derive probabilistic flood hazard maps in flood-prone areas, taking into account uncertainties in the definition of design hydrographs. In particular, we present an innovative approach to obtain probabilistic inundation and flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from a bivariate statistical analysis through the use of copulas. The study also quantifies the contribution of boundary-condition uncertainty in order to explore its impact on probabilistic flood hazard mapping. The uncertainty of extreme flood events is considered in terms of the different possible combinations of peak discharge and flood volume given by the copula. Further, we analyse the role of a multivariate probabilistic hydrological analysis on inundation and flood hazard maps, highlighting the differences between deterministic and probabilistic approaches. The methodology has been applied to a study area located in Sicily that was subject to several flooding events in the past.
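
    A minimal sketch of the bivariate sampling step described above, assuming a Gaussian copula with Gumbel marginals; the study's fitted copula family and marginal parameters may differ, so all numbers here are illustrative.

        # Draw correlated (peak, volume) pairs from a Gaussian copula with
        # Gumbel marginals; each pair parameterizes one synthetic design
        # hydrograph for the 2D hydraulic model.
        import numpy as np
        from scipy.stats import multivariate_normal, norm, gumbel_r

        rho = 0.6    # assumed peak-volume dependence
        n = 10000
        z = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).rvs(n)
        u = norm.cdf(z)  # uniform marginals via the probability integral transform

        peak = gumbel_r(loc=250.0, scale=80.0).ppf(u[:, 0])   # m^3/s (assumed)
        volume = gumbel_r(loc=12.0, scale=4.0).ppf(u[:, 1])   # 10^6 m^3 (assumed)
        print(peak[:3], volume[:3])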

  18. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  19. [Preparation of flavonoid reference standards from Scutellariae Radix under the guidance of high performance liquid chromatography-mass spectrometry analysis].

    PubMed

    Guo, Henan; Yang, Xuedong; Liu, Jun; Zheng, Wenfeng

    2012-07-01

    Flavonoid reference standards were targeted-prepared from Scutellariae Radix under the guidance of high performance liquid chromatography-mass spectrometry (HPLC-MS) analysis. With HPLC-MS analysis of Scutellariae Radix, 19 flavonoid components were identified by analyzing and comparing their retention times, ultraviolet spectra, and mass spectrometry data with the literature. The separation and purification protocols of all targeted flavonoid reference standards were optimally designed according to the results of HPLC-MS analysis and related literature. The ethanol extract of Scutellariae Radix was suspended in water and extracted with petroleum ether, ethyl acetate, and n-butanol successively. The ethyl acetate extract and n-butanol extract were separately subjected to primary separation by low pressure reverse phase preparative chromatography. Then the fractions containing targeted compounds were further purified by low pressure reverse and normal phase preparative chromatography. Finally, baicalin and wogonoside reference standards were obtained from the n-butanol extract; baicalein, wogonin, and oroxylin A reference standards were obtained from the ethyl acetate extract. The structures of the 5 reference standards were identified by mass spectrometry (MS) and 1H nuclear magnetic resonance (1H NMR) spectroscopy. The HPLC analytical results showed that the purities of the 5 reference standards were all above 98%. It is demonstrated that the rapid targeted-preparation method under the guidance of HPLC-MS analysis is applicable for the isolation and preparation of chemical components in traditional Chinese medicines.

  20. Mission analysis and guidance, navigation, and control design for rendezvous and docking phase of advanced reentry vehicle mission

    NASA Astrophysics Data System (ADS)

    Strippoli, L.; Colmenarejo, P.; Strauch, H.

    2013-12-01

    The Advanced Reentry Vehicle (ARV) belongs to the family of vehicles designed to perform rendezvous and docking (RvD) with the International Space Station (ISS) [1]. Unlike its predecessor, the ATV (Automated Transfer Vehicle), the ARV will transport a reentry capsule, equipped with a heatshield and able to bring back cargo, experiments or, as a possible future development, even crew, the latter scenario being very attractive in view of the Space Shuttle retirement. GMV, as subcontractor of EADS-Astrium Germany, is in charge of the RvD and departure mission analysis and GNC (Guidance, Navigation, and Control) design of the ARV mission. This paper will present the main outcomes of the study.

  1. SEPS guidance and navigation autonomy selection via mission analysis. [Solar Electric Propulsion Stage for Space Transportation System

    NASA Technical Reports Server (NTRS)

    Causey, W.; Sohoni, V.; Shenfish, K. L.; Wallace, C. T.

    1975-01-01

    A systematic rationale for selecting a cost-effective guidance and navigation (G & N) autonomy level for the solar electric propulsion stage (SEPS) vehicle is developed. After a definition of autonomy levels, a mission analysis is performed for representative SEPS missions using realistic G & N sensor hardware. Cost data for fabricating, integrating and refurbishing onboard avionics hardware and the ground costs corresponding to each autonomy level are generated. Results are presented that indicate performance of various G & N sensor hardware sets and the dominating factors which influence G & N autonomy level selection.

  2. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides and avalanches. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time, which poses important decision problems for public-sector decision makers and calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System coupled to a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  3. Laser Safety and Hazard Analysis for the Trailer (B70) Based AURA Laser System

    SciTech Connect

    AUGUSTONI, ARNOLD L.

    2003-01-01

    A laser safety and hazard analysis was performed for the AURA laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for ''Safe Use of Lasers'' and the 2000 version of the ANSI Standard Z136.6, for ''Safe Use of Lasers Outdoors''. The trailer based AURA laser system is a mobile platform, which is used to perform laser interaction experiments and tests at various national test sites. The trailer (B70) based AURA laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances from the exit telescope. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser emission exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band), calculate the appropriate minimum Optical Density (OD{sub min}) of the laser safety eyewear used by authorized personnel, and determine the Nominal Ocular Hazard Distance (NOHD) to protect unauthorized personnel who may violate the boundaries of the control area and enter the laser's NHZ.
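
    For reference, the NOHD computation such an analysis performs follows the standard ANSI Z136.1 small-source relation NOHD = (sqrt(4*P/(pi*MPE)) - a)/phi. The sketch below applies it with illustrative beam parameters, not those of the AURA laser.

        # Small-source NOHD per ANSI Z136.1: NOHD = (sqrt(4*P/(pi*MPE)) - a)/phi.
        # Beam parameters are illustrative, not those of the AURA laser.
        import math

        P = 1.0        # average power at the exit aperture, W (assumed)
        MPE = 2.55e-3  # maximum permissible exposure, W/cm^2 (wavelength dependent)
        a = 1.0        # emergent beam diameter, cm (assumed)
        phi = 1e-3     # beam divergence, rad (assumed)

        nohd_cm = (math.sqrt(4.0 * P / (math.pi * MPE)) - a) / phi
        print(f"NOHD ~ {nohd_cm / 100.0:.0f} m")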

  4. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.
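
    A minimal sketch of how a generic attenuation relationship of the form ln(PGA) = c0 + c1*M - c2*ln(R + c3) is evaluated for candidate scenarios; the coefficients are illustrative placeholders, not any of the published relationships the study used.

        # Generic attenuation form ln(PGA) = c0 + c1*M - c2*ln(R + c3),
        # evaluated for an M7 event at several distances. Coefficients are
        # placeholders, not from any published relationship.
        import numpy as np

        def ln_pga(M, R_km, c=(-3.5, 0.9, 1.2, 10.0)):
            c0, c1, c2, c3 = c
            return c0 + c1 * M - c2 * np.log(R_km + c3)

        for R in (10.0, 30.0, 60.0):
            print(f"M7 at {R:.0f} km: PGA ~ {np.exp(ln_pga(7.0, R)):.2f} g")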

  5. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111
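
    A minimal sketch of the fixed-effects combination the abstract describes: patient-specific log cumulative hazard estimates from several studies are pooled with inverse-variance weights and converted to an event risk via S(t) = exp(-Lambda(t)). The estimates and standard errors are illustrative.

        # Pool patient-specific log cumulative hazard estimates across studies
        # with inverse-variance (fixed-effects) weights, then convert to an
        # event risk via S(t) = exp(-Lambda(t)). Numbers are illustrative.
        import numpy as np

        log_cum_hazard = np.array([-2.10, -1.85, -2.40])  # one estimate per study
        se = np.array([0.30, 0.25, 0.40])                 # their standard errors

        w = 1.0 / se**2
        pooled = np.sum(w * log_cum_hazard) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))

        risk = 1.0 - np.exp(-np.exp(pooled))  # P(event by t) = 1 - S(t)
        print(f"pooled log cumulative hazard: {pooled:.2f} +/- {pooled_se:.2f}")
        print(f"estimated event risk: {risk:.1%}")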

  6. Hospital food hygiene: the application of Hazard Analysis Critical Control Points to conventional hospital catering.

    PubMed

    Richards, J; Parr, E; Riseborough, P

    1993-08-01

    The Hazard Analysis Critical Control Point (HACCP) concept is a preventive approach to quality control. It is based on a logical, structured exploration of potential hazard points in a food operation and the introduction of control and monitoring measures. HACCP studies have been extensively applied to manufacturing systems. It has been difficult to extend this detailed systematic analysis to conventional catering, partly due to the wide range of foods being processed. We describe here the application of HACCP methods to a hospital department that uses conventional catering methods. Basic principles, based on the work flow and the range of products, are established. The guidelines produced can be applied to any kitchen using similar catering methods. Examples of log charts used for monitoring are provided.

  7. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    SciTech Connect

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of the temperature stabilization of the BSLT laser, its operating parameters had changed, requiring a hazard analysis based on the new operating conditions. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.

  8. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation

    PubMed Central

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-01-01

    The paper analyses selected optical parameters of protective optical filters used to protect the eyes against hazardous radiation within the visible (VIS) and near-infrared (NIR) spectrum range. The indexes characterizing transmission and reflection of optical radiation incident on the filter are compared. The completed analysis shows that the newly developed interference filters block infrared radiation more effectively than the currently used protective filters. PMID:26327153

  9. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation.

    PubMed

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-09-01

    The paper analyses selected optical parameters of protective optical filters used to protect the eyes against hazardous radiation within the visible (VIS) and near-infrared (NIR) spectrum range. The indexes characterizing transmission and reflection of optical radiation incident on the filter are compared. The completed analysis shows that the newly developed interference filters block infrared radiation more effectively than the currently used protective filters.

  10. Fire hazards analysis of the Radioactive Waste Management Complex Air Support Buildings

    SciTech Connect

    Davis, M.L.; Satterwhite, D.G.

    1989-09-01

    This report describes the methods, analyses, results, and conclusions of a fire hazards risk analysis performed for the RWMC Air Support Buildings. An evaluation of the impact of adding a sprinkler system is also presented. Event and fault trees were used to model and analyze the waste storage process. Tables are presented indicating the fire initiators with the highest potential for release of radioactive materials into the environment. Engineering insights drawn from the data are also provided.

  11. Analysis of Precipitation (Rain and Snow) Levels and Straight-line Wind Speeds in Support of the 10-year Natural Phenomena Hazards Review for Los Alamos National Laboratory

    SciTech Connect

    Kelly, Elizabeth J.; Dewart, Jean Marie; Deola, Regina

    2015-12-10

    This report provides site-specific return level analyses for rain, snow, and straight-line wind extreme events. These analyses are in support of the 10-year review plan for the assessment of meteorological natural phenomena hazards at Los Alamos National Laboratory (LANL). These analyses follow guidance from Department of Energy, DOE Standard, Natural Phenomena Hazards Analysis and Design Criteria for DOE Facilities (DOE-STD-1020-2012), Nuclear Regulatory Commission Standard Review Plan (NUREG-0800, 2007) and ANSI/ ANS-2.3-2011, Estimating Tornado, Hurricane, and Extreme Straight-Line Wind Characteristics at Nuclear Facility Sites. LANL precipitation and snow level data have been collected since 1910, although not all years are complete. In this report the results from the more recent data (1990–2014) are compared to those of past analyses and a 2004 National Oceanographic and Atmospheric Administration report. Given the many differences in the data sets used in these different analyses, the lack of statistically significant differences in return level estimates increases confidence in the data and in the modeling and analysis approach.
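
    A minimal sketch of a return-level analysis of the kind described above, assuming a generalized extreme value (GEV) fit to annual maxima; the synthetic data stand in for the LANL precipitation, snow and wind records.

        # Fit a GEV distribution to annual maxima and read off T-year return
        # levels. Synthetic data stand in for the LANL records.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)
        annual_max = genextreme(c=-0.1, loc=40.0, scale=10.0).rvs(
            100, random_state=rng)  # e.g. annual max daily precipitation, mm

        shape, loc, scale = genextreme.fit(annual_max)
        for T in (10, 100, 500):  # return periods, years
            level = genextreme(shape, loc, scale).ppf(1.0 - 1.0 / T)
            print(f"{T}-year return level: {level:.1f} mm")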

  12. Post-Flight Analysis of the Guidance, Navigation, and Control Performance During Orion Exploration Flight Test 1

    NASA Technical Reports Server (NTRS)

    Barth, Andrew; Mamich, Harvey; Hoelscher, Brian

    2015-01-01

    The first test flight of the Orion Multi-Purpose Crew Vehicle presented additional challenges for guidance, navigation and control as compared to a typical re-entry from the International Space Station or other low Earth orbit. An elevated re-entry velocity and steeper flight path angle were chosen to achieve aero-thermal flight test objectives. New IMUs, a GPS receiver, and baro altimeters were flight qualified to provide the redundant navigation needed for human space flight. The guidance and control systems must manage the vehicle lift vector in order to deliver the vehicle to a precision coastal water landing, while operating within aerodynamic load, reaction control system, and propellant constraints. Extensive pre-flight six-degree-of-freedom analysis was performed that showed mission success for the nominal mission as well as in the presence of sensor and effector failures. Post-flight reconstruction analysis of the test flight is presented in this paper to show whether all performance metrics were met and to establish how well the pre-flight analysis predicted the in-flight performance.

  13. Apollo 16 mission Report. Supplement 1: Apollo 16 guidance, navigation, and control system performance analysis report

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results are reported of additional studies which were conducted to supplement conclusions drawn in the MSC Mission Report and analyses which were not completed in time to meet the Mission Report deadline. A detailed evaluation of the Abort Guidance System sensor assembly and results from the investigation of the X gyro loop anomaly are included. Further evidence is presented substantiating the excellent LM IMU performance indicated by preliminary results. A detailed study is presented of the procedural changes implemented on Apollo 16 to diminish the number and duration of interruptions to the CSM DAP attitude maneuver during P20 Option 5 operations.

  14. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  15. WHC-SD-W252-FHA-001, Rev. 0: Preliminary fire hazard analysis for Phase II Liquid Effluent Treatment and Disposal Facility, Project W-252

    SciTech Connect

    Barilo, N.F.

    1995-05-11

    A Fire Hazards Analysis was performed to assess the risk from fire and other related perils and the capability of the facility to withstand these hazards. This analysis will be used to support design of the facility.

  16. Endangerment assessment guidance

    SciTech Connect

    Not Available

    1985-11-22

    The directive clarifies the requirement that an endangerment assessment be developed to support all administrative and judicial enforcement actions under Section 106 of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Section 7003 of the Resource Conservation and Recovery Act (RCRA). Before taking enforcement action under these provisions to abate the hazards or potential hazards at a site, the Environmental Protection Agency (EPA) must be able to properly document and justify its assertion that an imminent and substantial endangerment to public health or welfare or the environment may exist. The endangerment assessment provides this documentation and justification. The endangerment assessment is not necessary to support Section 104 actions. It also provides guidance on the content, timing, level of detail, format, and resources required for the preparation of endangerment assessments.

  17. Hazard screening and proposals for prevention by occupational health service: an experiment with job load and hazard analysis at a Finnish construction company.

    PubMed

    Mattila, M; Kivi, P

    1991-01-01

    In this study a systematic method for workplace investigation was developed and then tested as part of the regular occupational health care procedures in the building trade. Workplace investigation is a concept which entails the analysis of hazards inherent in the work as well as assessment of their effects on workers' well-being. The aim of this paper is to evaluate the effectiveness of the workplace investigation method. The newly developed method, called Job Load and Hazard Analysis, has the following characteristics: a job analytic approach; the application of group problem-solving; and cooperation between occupational health professionals, occupational safety personnel, and line management. The method comprises the identification of health hazards, their assessment, and conclusions and proposals as to their prevention and follow-up. The method was tested as part of one constructor's actual occupational health care programme, over a 2.5-year period. The method worked well as a central component of preventive occupational health care. It yielded concrete data that could be applied to make the occupational health care programme better suited to preventing the hazards inherent in the building trade. The contents of the occupational health care programme were clearly enhanced, the number of preventive measures increased, and the organizational climate improved; the workers praised the increased emphasis on safety. More research is needed, eg in other production settings and to determine the most effective utilization of the data gathered by the method. PMID:2011001

  18. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in table 4-2 of the HEAF SAR, ref 1. Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events), or failures of engineered controls (hardware, software or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful because it can display the minimum number of events needed to cause an accident. Event trees can also address statistical dependency of events, such as a sequence of human error events conducted by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
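
    The quantification rule stated in the last sentence can be shown in a few lines: the end-state frequency is the initiating frequency multiplied along the branch probabilities. The operation frequency and branch probabilities below are illustrative, not values from the HEAF analysis.

        # End-state frequency = initiating frequency x branch probabilities
        # along the path. All values are illustrative.
        operations_per_year = 200.0        # initiating event frequency
        p_admin_control_fails = 1e-2       # procedural / human error branch
        p_engineered_control_fails = 1e-3  # hardware / software branch

        accident_frequency = (operations_per_year
                              * p_admin_control_fails
                              * p_engineered_control_fails)
        print(f"accident end-state frequency: {accident_frequency:.1e} per year")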

  19. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The need for tsunami hazard assessment of Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011, and it is particularly emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis, which has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault-source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulation cases were performed and the wave parameters were estimated. To reduce the sensitivity introduced by the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. Tsunami hazards were calculated for the sampling groups, and fractile curves showing the uncertainties of the input parameters were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should be considered
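
    A minimal sketch of the hazard-curve construction described above: source recurrence rates are combined with an assumed lognormal wave-height distribution per source, and the annual exceedance probability follows from the total rate under a Poisson assumption. Source rates and wave parameters are illustrative.

        # Combine source recurrence rates with an assumed lognormal wave-height
        # distribution per source; the annual exceedance probability follows
        # from the total rate under a Poisson assumption. Values illustrative.
        import numpy as np
        from scipy.stats import lognorm

        # (annual rate of source event, median wave height m, log-std)
        sources = [(1.0 / 500.0, 1.2, 0.5), (1.0 / 2000.0, 3.0, 0.5)]

        heights = np.linspace(0.1, 8.0, 80)
        rate = sum(nu * lognorm(s=sig, scale=med).sf(heights)
                   for nu, med, sig in sources)
        annual_exceedance_prob = 1.0 - np.exp(-rate)
        print(annual_exceedance_prob[::20])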

  20. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background: Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods: The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results: SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions: Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

  1. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (eg. Emmer and Vilimek 2014; Wang, et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics
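
    A minimal sketch of the weighted-parameter scoring the abstract describes, with hypothetical parameter names and weights; the study's own equation and calibrated weights are not reproduced here.

        # Combine normalized lake parameters into a hazard score with assumed
        # weights. Parameter names and weights are hypothetical.
        weights = {"lake_area_growth": 0.3, "dam_type": 0.2,
                   "slope_above_lake": 0.3, "glacier_proximity": 0.2}

        def hazard_score(params):
            """params: parameter values already normalized to [0, 1]."""
            return sum(weights[k] * params[k] for k in weights)

        lake = {"lake_area_growth": 0.8, "dam_type": 0.5,
                "slope_above_lake": 0.9, "glacier_proximity": 0.7}
        print(f"hazard score: {hazard_score(lake):.2f}")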

  2. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

    PubMed

    Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

    2007-01-01

    Robotic assistance can help clinicians to improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described by the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed for the principles of HISIC, such as hazard identification, safety insurance control, safety critical limits, and monitoring and control. FMEA combined with UML can also be implemented to ensure the reliability of human operation. On the basis of a safety control index and fuzzy mathematics, a safety effective value is outlined to assess the validity of safety insurance control for the robotic system. The above principles and methods are feasible and effective for hazard analysis during the development of the robotic system.
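
    A minimal sketch of the FMEA scoring step used in a HISIC-style analysis: each failure mode receives severity, occurrence and detection ratings whose product is the risk priority number (RPN). The failure modes and ratings are illustrative, not from the EUCLIDIAN study.

        # FMEA scoring: risk priority number (RPN) = severity x occurrence x
        # detection. Failure modes and ratings are illustrative.
        failure_modes = [
            # (description, severity 1-10, occurrence 1-10, detection 1-10)
            ("needle driver overruns target depth", 9, 3, 4),
            ("seed cartridge jam", 5, 4, 2),
            ("encoder signal loss", 8, 2, 5),
        ]

        for desc, sev, occ, det in failure_modes:
            print(f"{desc}: RPN = {sev * occ * det}")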

  3. Ergonomics hazards analysis of linemen's power line fixing work in China.

    PubMed

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

    This study used qualitative and quantitative methods, such as OWAS (Ovako Working Posture Analysis System) and behavior observation, to analyze musculoskeletal disorder (MSD) risk factors in power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities. The key subtasks showed the ergonomic characteristics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time); bar-installing was the second longest (26% of total working time). It was evident that bar-installing and insulator-fixing carried the greatest hazard: the action categories of these two subtasks were higher than those of the others, and the subtasks were also time-consuming, difficult, and likely to induce MSDs. Assistant linemen faced more hazardous factors than chief linemen.

  4. Comparing proportional hazards and accelerated failure time models for survival analysis.

    PubMed

    Orbe, Jesus; Ferreira, Eva; Núñez-Antón, Vicente

    2002-11-30

    This paper describes a method proposed for a censored linear regression model that can be used in the context of survival analysis. The method has the important characteristic of allowing estimation and inference without knowing the distribution of the duration variable. Moreover, it does not need the assumption of proportional hazards. Therefore, it can be an interesting alternative to the Cox proportional hazards models when this assumption does not hold. In addition, implementation and interpretation of the results is simple. In order to analyse the performance of this methodology, we apply it to two real examples and we carry out a simulation study. We present its results together with those obtained with the traditional Cox model and AFT parametric models. The new proposal seems to lead to more precise results.
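
    The paper's own estimator is a censored linear regression fitted without distributional assumptions; the sketch below only reproduces the two benchmark models it is compared against, a Cox proportional hazards fit and a Weibull accelerated failure time fit, using the lifelines package and its bundled rossi dataset.

        # Benchmark fits only: a Cox PH model and a Weibull AFT model on the
        # lifelines rossi dataset; the paper's own censored linear regression
        # estimator is not reproduced here.
        from lifelines import CoxPHFitter, WeibullAFTFitter
        from lifelines.datasets import load_rossi

        df = load_rossi()  # 'week' = duration, 'arrest' = event indicator

        cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
        aft = WeibullAFTFitter().fit(df, duration_col="week", event_col="arrest")

        # Compare the two specifications, e.g. via their log-likelihoods.
        print(cph.log_likelihood_, aft.log_likelihood_)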

  5. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    NASA Astrophysics Data System (ADS)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability to reach a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.
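
    A minimal sketch of the probability chain described above: for each block size, the impact rate at a location combines the block's occurrence, its probability of failure, and its probability of reaching the location. All numbers are illustrative.

        # Impact rate at a location, per block size: occurrence rate x
        # P(failure) x P(reaching the location). Numbers are illustrative.
        block_sizes = [
            # (volume m^3, occurrence per yr, P(failure), P(reach location))
            (0.1, 5.0, 0.20, 0.30),
            (1.0, 1.0, 0.10, 0.15),
            (5.0, 0.2, 0.05, 0.05),
        ]

        for vol, occ, p_fail, p_reach in block_sizes:
            print(f"{vol:>4} m^3 blocks: {occ * p_fail * p_reach:.4f} impacts/yr")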

  6. Analysis of Hazards Associated with a Process Involving Uranium Metal and Uranium Hydride Powders

    SciTech Connect

    Bullock, J.S.

    2000-05-01

    An analysis of the reaction chemistry and operational factors associated with processing uranium and uranium hydride powders is presented, focusing on a specific operation in the Development Division which was subjected to the Job Hazard Analysis (JHA) process. Primary emphasis is on the thermodynamic factors leading to pyrophoricity in common atmospheres. The discussion covers feed powders, cold-pressed and hot-pressed materials, and stray material resulting from the operations. The sensitivity of the various forms of material to pyrophoricity in common atmospheres is discussed. Operational recommendations for performing the work described are given.

  7. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  8. Release Guidance Analysis of a Deputy Satellite Using Charged Formation Flying for On-Orbit Servicing

    NASA Astrophysics Data System (ADS)

    Iliffe, Paul; Matunaga, Saburo

    On-orbit servicing is of interest for long-duration missions due to potential benefits such as increased mission lifetime. Coulomb formation satellite systems lend themselves as one possible approach to realizing on-orbit servicing. This paper discusses such a system. A method is proposed for deploying a small deputy satellite from a docked condition on a main satellite to a Clohessy-Wiltshire bounded solution. A series of elliptical guidance paths are used for this purpose. Additionally, a method of changing the relative plane of motion of the deputy satellite is presented; this method requires the reorientation of the main satellite. Numerical simulations indicate that such maneuvers are possible within the given assumptions. It is observed that even in the worst-case scenario, positional errors can be kept within several centimeters.
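
    A minimal sketch of the Clohessy-Wiltshire dynamics behind the bounded release condition mentioned above: the secular along-track drift vanishes when ydot0 = -2*n*x0. The closed-form in-plane solution below uses illustrative orbit and state values.

        # Clohessy-Wiltshire in-plane relative motion; the along-track drift
        # vanishes when ydot0 = -2*n*x0. Orbit and state values illustrative.
        import numpy as np

        mu = 3.986e14           # Earth's gravitational parameter, m^3/s^2
        a = 7.0e6               # chief orbit radius, m (assumed)
        n = np.sqrt(mu / a**3)  # mean motion, rad/s

        x0 = 10.0               # initial radial offset, m (assumed)
        ydot0 = -2.0 * n * x0   # drift-free along-track rate

        def cw_state(t, x0, y0, xdot0, ydot0, n):
            """Closed-form CW solution for in-plane relative motion."""
            s, c = np.sin(n * t), np.cos(n * t)
            x = (4 - 3 * c) * x0 + s / n * xdot0 + 2 * (1 - c) / n * ydot0
            y = (6 * (s - n * t) * x0 + y0 - 2 * (1 - c) / n * xdot0
                 + (4 * s - 3 * n * t) / n * ydot0)
            return x, y

        print(cw_state(5400.0, x0, 0.0, 0.0, ydot0, n))  # roughly one orbit later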

  9. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2015-04-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes accurate a priori assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a 'maximum earthquake' predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° size bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been performed at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source, using the shallow-water finite-difference SWAN code (Mader, 2004), for the magnitude range of 6.5 - Mwmax defined for that bin with a Mw increment of 0.1. Results show that not only the earthquakes resembling the
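
    The scenario enumeration described (one run per 0.1 magnitude step from Mw 6.5 up to each bin's Mwmax) reduces to a simple double loop. A minimal sketch follows, with hypothetical source bins and a placeholder in place of the actual SWAN-type solver:

    ```python
    import numpy as np

    # Hypothetical source bins: (lat, lon, strike, dip, rake, depth_km, mw_max).
    # Values are invented; the study assigns them per 0.5 deg x 0.5 deg bin.
    source_bins = [
        (36.0, 27.5, 290.0, 35.0, 90.0, 15.0, 8.0),
        (34.5, 26.0, 245.0, 30.0, 90.0, 20.0, 7.4),
    ]

    def run_tsunami(lat, lon, strike, dip, rake, depth_km, mw):
        """Placeholder for a 6 h nested shallow-water simulation (SWAN-type)."""
        return {"source": (lat, lon), "mw": mw, "max_coastal_height_m": None}

    scenarios = []
    for (lat, lon, strike, dip, rake, depth_km, mw_max) in source_bins:
        # One scenario per 0.1 magnitude step from Mw 6.5 up to the bin's Mwmax
        for mw in np.arange(6.5, mw_max + 1e-9, 0.1):
            scenarios.append(run_tsunami(lat, lon, strike, dip, rake,
                                         depth_km, round(mw, 1)))

    print(f"{len(scenarios)} scenarios enumerated")
    ```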

  10. Ethical guidance in the era of managed care: an analysis of the American College of Healthcare Executives' Code of Ethics.

    PubMed

    Higgins, W

    2000-01-01

    Market competition and the rise of managed care are transforming the healthcare system from a physician-dominated cottage industry into a manager-dominated corporate enterprise. The managed care revolution is also undermining the safeguards offered by medical ethics and raising serious public concerns. These trends highlight the growing importance of ethical standards for managers. The most comprehensive ethical guidance for health service managers is contained in the American College of Healthcare Executives' (ACHE) Code of Ethics. An analysis of the ACHE Code suggests that it does not adequately address several ethical concerns associated with managed care. The ACHE may wish to develop a supplemental statement regarding ethical issues in managed care. A supplemental statement that provides more specific guidance in the areas of financial incentives to reduce utilization, social mission, consumer/patient information, and the health service manager's responsibility to patients could be extremely valuable in today's complex and rapidly changing environment. More specific ethical guidelines would not ensure individual or organizational compliance. However, they would provide professional standards that could guide decision making and help managers evaluate performance in managed care settings. PMID:11066951

  11. Transportation of Hazardous Evidentiary Material.

    SciTech Connect

    Osborn, Douglas.

    2005-06-01

    This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included. Short- and long-term recommendations are also provided. The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazardous Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these Hazardous Classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis. Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, will not react chemically with the substance being

  12. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2016-09-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard mapping of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  13. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    SciTech Connect

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
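
    For a single-pulse exposure and a circular beam, the NOHD and minimum eyewear optical density follow from standard ANSI-style formulas. The sketch below uses generic illustrative values, not parameters of the AURA system:

    ```python
    import math

    def nohd_m(energy_per_pulse_j, mpe_j_per_m2, beam_dia_m, divergence_rad):
        """Nominal Ocular Hazard Distance: range where exposure falls to the MPE."""
        return (math.sqrt(4.0 * energy_per_pulse_j / (math.pi * mpe_j_per_m2))
                - beam_dia_m) / divergence_rad

    def od_min(worst_case_exposure, mpe):
        """Minimum optical density so that transmitted exposure <= MPE."""
        return math.log10(worst_case_exposure / mpe)

    # Hypothetical single-pulse example: 50 mJ pulse, MPE of 5e-3 J/m^2,
    # 10 mm exit beam, 0.5 mrad divergence, 50 J/m^2 worst-case exposure.
    print(f"NOHD ~ {nohd_m(0.05, 5e-3, 0.01, 0.5e-3) / 1000:.1f} km")
    print(f"OD_min ~ {od_min(50.0, 5e-3):.1f}")
    ```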

  14. Hazardous waste crime: a contextual analysis of the offense and the offender

    SciTech Connect

    Rebovich, D.J.

    1986-01-01

    The goal of this study is to analyze hazardous waste offense and offender characteristics. Criminal case data were collected from four sample states (Maine, Maryland, New Jersey, and Pennsylvania). Files of disposed criminal cases charged between 1977 and 1984 were content-analyzed, and interviews were conducted with prominent hazardous waste crime enforcement personnel from the sample states. Areas of analysis include methods of crime commission, skills required for crime commission, patterns of criminal network relationships, and degree of syndicate crime influence. While there has been some previous speculation that hazardous waste criminal behavior is directed through centralized racketeering, the present study of known offenders found little evidence of syndicate crime family infiltration. Crimes occurred within small, informal networks of waste generators, waste transporters, the employees of treatment/storage/disposal (TSD) facilities and certain non-industry peripheral actors. The study concludes that, while attempts have been made by syndicate crime operatives to infiltrate, these attempts have failed largely due to features of criminal commission methods and to the inherent fragmentation of hauling and TSD firm interests.

  15. Additive interaction in survival analysis: use of the additive hazards model.

    PubMed

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise; Marott, Jacob Louis; Diderichsen, Finn

    2012-09-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of additive interaction derived from multiplicative models, an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures and models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed implementation guide of the additive hazards model is provided in the appendix.
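
    The paper's appendix provides its own implementation guide; purely as an illustration of the idea, the sketch below fits an additive hazards (Aalen) model with a product term on simulated data using the lifelines Python package, where the slope of the product term's cumulative coefficient estimates the deviation from additivity:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import AalenAdditiveFitter

    # Simulated cohort with two binary exposures and a true additive interaction
    rng = np.random.default_rng(1)
    n = 2000
    smoking  = rng.integers(0, 2, n)
    low_educ = rng.integers(0, 2, n)
    base = 0.01 + 0.02 * smoking + 0.01 * low_educ + 0.015 * smoking * low_educ
    time = rng.exponential(1.0 / base)
    event = time < 30                      # administrative censoring at 30 years
    time = np.minimum(time, 30)

    df = pd.DataFrame({"T": time, "E": event.astype(int),
                       "smoking": smoking, "low_educ": low_educ,
                       "interaction": smoking * low_educ})

    aaf = AalenAdditiveFitter(coef_penalizer=0.5)
    aaf.fit(df, duration_col="T", event_col="E")
    # The 'interaction' cumulative coefficient tracks the additive-interaction
    # hazard (deviation from additivity) over time.
    print(aaf.cumulative_hazards_.tail())
    ```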

  16. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

    NASA Astrophysics Data System (ADS)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

    2015-04-01

    A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy consisting of the Orco and Soana valleys. It is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to find out the details of the events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment, and has therefore been selected for multi-hazard research in which several natural processes have been investigated concerning their damaging effects over the land. In 36 different severe hazardous events, at least 250 deaths have been recorded in the area since the 18th Century.

  17. The hazard analysis and critical control point system in food safety.

    PubMed

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series. PMID:15156035

  18. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (e.g., Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the areas and time windows hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002-present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
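
    The core of an area-based test is a comparison between the fraction of sites observed to exceed their mapped value and the fraction expected from the map's exceedance probability. A minimal sketch with synthetic numbers, assuming time independence as the abstract does:

    ```python
    import numpy as np

    # For a map with probability p of exceedance in T years, the expected
    # fraction of sites exceeding their mapped value over t years is roughly
    # 1 - (1 - p)**(t / T) under time independence.
    p_map, T_map, t_obs = 0.02, 50.0, 14.0        # e.g. 2% in 50 yr, ~2002-2016
    expected_fraction = 1.0 - (1.0 - p_map) ** (t_obs / T_map)

    # Synthetic stand-ins for observed (ShakeMap-style) and mapped PGA values
    rng = np.random.default_rng(0)
    observed_pga = rng.lognormal(mean=-2.5, sigma=0.8, size=10_000)
    map_pga      = np.full(10_000, 0.35)
    observed_fraction = np.mean(observed_pga > map_pga)

    print(f"expected: {expected_fraction:.4f}, observed: {observed_fraction:.4f}")
    ```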

  19. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Venturato, A. J.; Geist, E. L.

    2006-12-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. Geological Survey Open-file Report 2006

  20. Critical load analysis in hazard assessment of metals using a Unit World Model.

    PubMed

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  1. A deterministic analysis of tsunami hazard and risk for the southwest coast of Sri Lanka

    NASA Astrophysics Data System (ADS)

    Wijetunge, J. J.

    2014-05-01

    This paper describes a multi-scenario, deterministic analysis carried out as a pilot study to evaluate the tsunami hazard and risk distribution on the southwest coast of Sri Lanka. The hazard and risk assessment procedure adopted was also assessed against available field records of the impact of the Indian Ocean tsunami in 2004. An evaluation of numerically simulated nearshore tsunami amplitudes corresponding to 'maximum-credible' scenarios from different subduction segments in the Indian Ocean surrounding Sri Lanka suggests that a seismic event similar to the one that generated the 2004 tsunami can still be considered the 'worst-case' scenario for the southwest coast. Furthermore, it appears that the formation of edge waves trapped by the primary waves diffracting around the southwest significantly influences the nearshore tsunami wave field and is largely responsible for relatively higher tsunami amplitudes along certain stretches of the coastline under study. The extent of inundation from numerical simulations corresponding to the worst-case scenario shows good overall agreement with the points of maximum penetration of inundation from field measurements in the aftermath of the 2004 tsunami. It can also be seen that the inundation distribution is strongly influenced by onshore topography. The present study indicates that the mean depth of inundation could be utilised as a primary parameter to quantify the spatial distribution of the tsunami hazard. The spatial distribution of the risk of the tsunami hazard to the population and residential buildings, computed by employing the standard risk formula, shows satisfactory correlation with published statistics of the affected population and the damage to residential property during the tsunami in 2004.

  3. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and, like the joint estimate of flood risk, each is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, definition of the hydrological input, channel and floodplain topography representation, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
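
    As an illustration of the bivariate step, the sketch below samples correlated peak-volume pairs through a Gaussian copula with Gumbel marginals; the copula family and all parameters here are assumptions for the sketch, not values from the study:

    ```python
    import numpy as np
    from scipy import stats

    rho, n = 0.8, 5000
    rng = np.random.default_rng(42)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    u = stats.norm.cdf(z)               # dependent uniform marginals (the copula)

    # Marginal distributions fitted (hypothetically) to peak and volume records
    peaks   = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=90.0)   # m^3/s
    volumes = stats.gumbel_r.ppf(u[:, 1], loc=25.0, scale=8.0)     # 10^6 m^3

    # Joint design events, e.g. pairs exceeding both 100-yr univariate quantiles
    q_peak = stats.gumbel_r.ppf(0.99, loc=300.0, scale=90.0)
    q_vol  = stats.gumbel_r.ppf(0.99, loc=25.0, scale=8.0)
    joint_exceed = np.mean((peaks > q_peak) & (volumes > q_vol))
    print(f"P(peak and volume both exceed 100-yr values) ~ {joint_exceed:.4f}")
    ```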

  4. Remedial Investigation/Feasibility Study (RI/FS) process, elements and techniques guidance

    SciTech Connect

    Not Available

    1993-12-01

    This manual provides detailed guidance on Remedial Investigation/Feasibility Studies (RI/FSs) conducted pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) at Department of Energy (DOE) facilities. The purpose of the RI/FS (to assess the risk posed by a hazardous waste site and to determine the best way to reduce that risk) and its structure (site characterization, risk assessment, screening and detailed analysis of alternatives, etc.) are defined in the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) and further explained in the Environmental Protection Agency's (EPA's) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA (Interim Final), 540/G-89/004, OSWER Directive 9355.3-01, October 1988. Though issued in 1988, the EPA guidance remains an excellent source of information on the conduct and structure of an RI/FS. This document makes use of supplemental RI/FS-related guidance that EPA has developed since its initial document was issued in 1988, incorporates practical lessons learned in more than 12 years of experience in CERCLA hazardous site remediation, and, drawing on those lessons, introduces the Streamlined Approach For Environmental Restoration (SAFER), developed by DOE as a way to proceed quickly and efficiently through the RI/FS process at DOE facilities. Thus, as its title implies, this guidance is intended to describe in detail the process and component elements of an RI/FS, as well as techniques to manage the RI/FS effectively.

  5. A DOE-STD-3009 hazard and accident analysis methodology for non-reactor nuclear facilities

    SciTech Connect

    MAHN,JEFFREY A.; WALKER,SHARON ANN

    2000-03-23

    This paper demonstrates the use of appropriate consequence evaluation criteria in conjunction with generic likelihood-of-occurrence data to produce consistent hazard analysis results for nonreactor nuclear facility Safety Analysis Reports (SAR). An additional objective is to demonstrate the use of generic likelihood-of-occurrence data as a means of deriving defendable accident sequence frequencies, thereby enabling the screening of potentially incredible events (<10^-6 per year) from the design basis accident envelope. Generic likelihood-of-occurrence data have been used successfully in performing SAR hazard and accident analyses for two nonreactor nuclear facilities at Sandia National Laboratories. DOE-STD-3009-94 addresses and even encourages use of a qualitative binning technique for deriving and ranking nonreactor nuclear facility risks. However, qualitative techniques invariably lead to reviewer requests for more details associated with consequence or likelihood-of-occurrence bin assignments in the text of the SAR. Hazard analysis data displayed in simple worksheet format generally elicit questions about not only the assumptions behind the data, but also the quantitative bases for the assumptions themselves (engineering judgment may not be considered sufficient by some reviewers). This is especially true where the criteria for qualitative binning of likelihood of occurrence involve numerical ranges. Oftentimes reviewers want to see calculations, or at least a discussion of event frequencies or failure probabilities, to support likelihood-of-occurrence bin assignments. This may become a significant point of contention for events that have been binned as incredible. This paper shows how the use of readily available generic data can avoid many of the reviewer questions that will inevitably arise from strictly qualitative analyses, while not significantly increasing the overall burden on the analyst.
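
    The screening logic reduces to multiplying an initiating event frequency by the failure probabilities of the preventive features and comparing the product to the 10^-6 per year threshold. A minimal sketch with hypothetical generic data:

    ```python
    SCREENING_THRESHOLD = 1.0e-6   # per year

    sequences = {
        # name: (initiating event frequency /yr, [barrier failure probabilities])
        "drum breach + fire":        (1.0e-2, [1.0e-2, 5.0e-2]),
        "spill + HEPA failure":      (1.0e-3, [1.0e-3]),
        "crane drop + confinement":  (5.0e-4, [1.0e-2, 1.0e-3]),
    }

    for name, (f0, barrier_probs) in sequences.items():
        freq = f0
        for p in barrier_probs:
            freq *= p   # sequence frequency = initiator x barrier failures
        verdict = "screen out" if freq < SCREENING_THRESHOLD else "retain"
        print(f"{name:28s} {freq:9.2e}/yr -> {verdict}")
    ```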

  6. Hubble Space Telescope (HST) Fine Guidance Sensor Performance Analysis with Respect to Guide Stars

    NASA Astrophysics Data System (ADS)

    Armstrong, Amber; Taylor, Denise; Reinhart, Merle

    2016-06-01

    The Fine Guidance Sensors (FGS) onboard HST include one of Hubble's first-generation instruments still in operation today. After several servicing missions and even some refurbishment of the FGSs, the interferometers, although aged, perform at a level exceeding expectations. Since launch in April 1990, FGS3 has performed within operational standards, while FGS1 & 2 underwent replacements or refurbishments during SM2, SM3A and SM4. FGS3 was used for astrometry science until 1999, when FGS1R took over that role. Currently FGS1R is the only FGS used as a primary science instrument. While very few observing programs request the FGS as their prime instrument, nearly all GO observations executed onboard HST use the interferometers to lock onto guide stars. Most observations execute successfully, but 1.93% of all orbits executed onboard HST have an issue that requires repeating the observations. Of all failed visits, roughly 65% are due to some variation of suboptimal guide star options that tests the boundaries of FGS capabilities. In this poster we present the limitations of the FGS interferometers regarding guide stars, their surprising capabilities, and specific guidelines for astronomers who are faced with “shared risk” opportunities for their observations. We hope to share this knowledge with the HST community to improve guide star execution onboard Hubble and to make comparisons to the future James Webb Space Telescope FGS instrument, where astronomers will take part in guide star selection.

  7. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
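
    As a rough Python stand-in for the SAS LIFEREG workflow described (the original uses PHREG and LIFEREG), the sketch below runs a product-of-coefficients mediation with a Weibull AFT outcome model on simulated data, using the lifelines and statsmodels packages; all values and names are illustrative:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(7)
    n = 1500
    tx = rng.integers(0, 2, n)                   # randomized treatment
    m  = 0.5 * tx + rng.normal(size=n)           # mediator model: a-path
    log_t = 1.0 + 0.3 * m + 0.2 * tx + 0.5 * rng.gumbel(size=n)
    t = np.exp(log_t)
    c = rng.exponential(np.exp(2.0), size=n)     # random censoring times
    df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int),
                       "tx": tx, "m": m})

    # a-path: treatment -> mediator (ordinary least squares)
    a_path = sm.OLS(df["m"], sm.add_constant(df["tx"])).fit().params["tx"]

    # b-path: mediator -> log survival time, adjusting for treatment
    aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E",
                                 formula="tx + m")
    b_path = aft.params_.loc[("lambda_", "m")]

    print(f"indirect effect (a*b) on log-time scale: {a_path * b_path:.3f}")
    ```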

  8. ANALYSIS OF MULTIVARIATE FAILURE TIME DATA USING MARGINAL PROPORTIONAL HAZARDS MODEL.

    PubMed

    Chen, Ying; Chen, Kani; Ying, Zhiliang

    2010-01-01

    The marginal proportional hazards model is an important tool in the analysis of multivariate failure time data in the presence of censoring. We propose a method of estimation via the linear combinations of martingale residuals. The estimation and inference procedures are easy to implement numerically. The estimation is generally more accurate than the existing pseudo-likelihood approach: the size of efficiency gain can be considerable in some cases, and the maximum relative efficiency in theory is infinite. Consistency and asymptotic normality are established. Empirical evidence in support of the theoretical claims is shown in simulation studies. PMID:24307815

  9. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    PubMed

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Plant Protection Convention (IPPC), among others, contributes to ensuring food safety along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation; GMPs are legislated in most countries. Since 1997, Colombia has set rules and legislation for application of the HACCP system in agreement with international standards. This paper discusses the potential of the legislation and the difficulties of enforcing it, and suggests some policy implications for food safety.

  10. Frequency agile laser safety & hazard analysis for the Sandia Remote Sensing System LIDAR.

    SciTech Connect

    Augustoni, Arnold L.

    2009-05-01

    A laser safety and hazard analysis was performed for the Raytheon Frequency Agile Laser (FAL) to be used with the Sandia Remote Sensing System (SRSS) B-70 Trailer, based on the 2007 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The B-70 SRSS LIDAR system is a portable platform which is used to perform laser interaction experiments and tests at various national test sites.

  11. School food safety program based on hazard analysis and critical control point principles. Final rule.

    PubMed

    2009-12-15

    This final rule implements a legislative provision which requires school food authorities participating in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) to develop a school food safety program for the preparation and service of school meals served to children. The school food safety program must be based on the hazard analysis and critical control point (HACCP) system established by the Secretary of Agriculture. The food safety program will enable schools to take systematic action to prevent or minimize the risk of foodborne illness among children participating in the NSLP and SBP. PMID:20169679

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is an essential step also in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting out the limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. Confirmatory factor analysis and reliability of the Chinese version of the Maslach Burnout Inventory among guidance teachers in Hong Kong.

    PubMed

    Yuen, Mantak; Lau, Patrick S Y; Shek, Daniel T L; Lam, Man-Ping

    2002-12-01

    In 1995 Chan and Hui examined the responses of a sample of Chinese teachers on the Maslach Burnout Inventory and recommended a possible 2-factor rather than the original 3-factor model for the assessment of burnout among Chinese teachers. In the present study, the factor structure of responses to the Chinese version of the Maslach Burnout Inventory in a sample of 1,398 Chinese secondary school guidance teachers was examined using the EQS approach to confirmatory factor analysis. The results showed that a 3-factor model (Emotional Exhaustion, Depersonalization, and Personal Accomplishment) provided the best fit, with the first two factors highly correlated (r = .80). Internal consistencies for the subscales ranged from .80 to .88.

  14. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
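
    While the study builds joint scenarios with full hydrodynamic models, the probability bookkeeping for combining two hazard sources on a grid can be sketched simply. The example below assumes independence of the fluvial and pluvial events, an illustration-only simplification:

    ```python
    import numpy as np

    p_annual_fluvial = 0.01     # e.g. 100-yr fluvial event (hypothetical)
    p_annual_pluvial = 0.02     # e.g. 50-yr pluvial event (hypothetical)

    # Maximum inundation depth grids (metres) for each single-hazard scenario
    depth_fluvial = np.array([[0.0, 0.4], [1.2, 0.8]])
    depth_pluvial = np.array([[0.3, 0.5], [0.2, 0.9]])

    # Probability a cell is flooded by either mechanism in a given year,
    # assuming the two event types occur independently
    p_either = 1.0 - (1.0 - p_annual_fluvial) * (1.0 - p_annual_pluvial)

    # A conservative combined-depth map: cellwise maximum of the two scenarios
    depth_combined = np.maximum(depth_fluvial, depth_pluvial)
    print(f"annual probability of combined flooding: {p_either:.4f}")
    print(depth_combined)
    ```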

  15. A Simple Model for Probabilistic Seismic Hazard Analysis of Induced Seismicity Associated With Deep Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Schlittenhardt, Joerg; Spies, Thomas; Kopera, Juergen; Morales Aviles, Wilhelm

    2014-05-01

    In the research project MAGS (Microseismic activity of geothermal systems), funded by the German Federal Ministry of Environment (BMU), a simple model was developed to determine seismic hazard as the probability of the exceedance of ground motion of a certain size. Such estimates of the annual frequency of exceedance of prescriptive limits on e.g. seismic intensities or ground motions are needed for the planning and licensing, but likewise for the development and operation, of deep geothermal systems. For the development of the proposed model, well-established probabilistic seismic hazard analysis (PSHA) methods for the estimation of hazard from natural seismicity were adapted to the case of induced seismicity. Important differences between induced and natural seismicity had to be considered. These include significantly smaller magnitudes, depths and source-to-site distances of the seismic events and, hence, different ground motion prediction equations (GMPEs) that had to be incorporated to account for seismic amplitude attenuation with distance, as well as differences in the stationarity of the underlying tectonic and induced processes. Appropriate GMPEs in terms of PGV (peak ground velocity) were tested and selected from the literature. The proposed model and its application to the case of induced seismicity observed during the circulation period (operation phase of the plant) at geothermal sites in Germany will be presented. Using GMPEs for PGV has the advantage of estimating hazard in terms of velocities of ground motion, which can be linked to engineering regulations (e.g. German DIN 4150) that give prescriptive standards for the effects of vibrations on buildings and people. It is thus possible to specify the probability of exceedance of such prescriptive standard values and to decide whether they can be accepted or not. On the other hand, hazard curves for induced and natural seismicity can be compared to study the impact at a site. Preliminary
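
    The hazard integral for this setting reduces to a sum over magnitude bins of event rate times the probability, under a lognormal GMPE, that PGV exceeds a threshold. The sketch below uses an invented GMPE and rates purely for illustration:

    ```python
    import numpy as np
    from scipy import stats

    def log_pgv_median(m, r_km):
        """Hypothetical GMPE returning median ln(PGV in cm/s); not from the study."""
        return -4.0 + 1.2 * m - 1.3 * np.log(r_km + 5.0)

    sigma_ln = 0.7                               # assumed aleatory variability
    mags  = np.arange(1.0, 4.1, 0.5)             # induced-seismicity magnitude range
    rates = 10.0 ** (2.0 - 1.1 * mags)           # Gutenberg-Richter style rates /yr
    r_km  = 3.0                                  # site above the reservoir

    for z in [0.1, 0.5, 1.0, 5.0]:               # PGV thresholds, cm/s
        # P(PGV > z | m, r) under the lognormal GMPE, summed over magnitude bins
        p_exceed = 1.0 - stats.norm.cdf(
            (np.log(z) - log_pgv_median(mags, r_km)) / sigma_ln)
        annual_rate = np.sum(rates * p_exceed)
        print(f"PGV > {z:4.1f} cm/s : {annual_rate:.3e} /yr")
    ```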

  16. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    USGS Publications Warehouse

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    -motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.

  17. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    SciTech Connect

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000-year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System were input to the Hydrologic Engineering Center's River Analysis System hydrodynamic flood routing model.

  18. Strong Ground-Motion Prediction in Seismic Hazard Analysis: PEGASOS and Beyond

    NASA Astrophysics Data System (ADS)

    Scherbaum, F.; Bommer, J. J.; Cotton, F.; Bungum, H.; Sabetta, F.

    2005-12-01

    The SSHAC Level 4 approach to probabilistic seismic hazard analysis (PSHA), which could be considered to define the state of the art in PSHA using multiple expert opinions, has been fully applied only twice, firstly in the multi-year Yucca Mountain study and subsequently (2002-2004) in the PEGASOS project. The authors of this paper participated as ground-motion experts in this latter project, the objective of which was comprehensive seismic hazard analysis for four nuclear power plant sites in Switzerland, considering annual exceedance frequencies down to 1/10,000,000. Following the SSHAC procedure, particular emphasis was put on capturing both the aleatory and epistemic uncertainties. As a consequence, ground motion prediction was performed by combining several empirical ground motion models within a logic tree framework, with the weights on each logic tree branch expressing the personal degree of belief of each ground-motion expert. In the present paper, we critically review the current state of ground motion prediction methodology in PSHA, in particular for regions of low seismicity. One of the toughest lessons from PEGASOS was that in systematically and rigorously applying the laws of uncertainty propagation to all of the required conversions and adjustments of ground motion models, a huge price has to be paid in an ever-growing aleatory variability. Once this path has been followed, these large sigma values will drive the hazard, particularly for low annual frequencies of exceedance. Therefore, from a post-PEGASOS perspective, the key issues in the context of ground-motion prediction for PSHA for the near future are to better understand the aleatory variability of ground motion and to develop suites of ground-motion prediction equations that employ the same parameter definitions. The latter is a global rather than a regional challenge which might be a desirable long-term goal for projects similar to the PEER NGA (Pacific Earthquake Engineering Research Center, Next

  19. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  20. Site specific seismic hazard analysis at the DOE Kansas City Plant

    SciTech Connect

    Lynch, D.T.; Drury, M.A.; Meis, R.C.; Bieniawski, A.; Savy, J.B.; Llopis, J.L.; Constantino, C.; Hashimoto, P.S.; Campbell, K.W.

    1995-10-01

    A site-specific seismic hazard analysis is being conducted for the Kansas City Plant to support an ongoing structural evaluation of existing buildings. This project is part of the overall review of facilities being conducted by DOE. The seismic hazard was probabilistically defined at the theoretical rock outcrop by Lawrence Livermore National Laboratory. The U.S. Army Engineer Waterways Experiment Station conducted a subsurface site investigation to characterize in situ S-wave velocities and other subsurface physical properties related to the geology in the vicinity of the Main Manufacturing Building (MMB) at the Bannister Federal Complex. The test program consisted of crosshole S-wave testing, seismic cone penetrometer testing, and laboratory soil analyses. The information acquired from this investigation was used in a site response analysis by City College of New York to determine the earthquake motion at grade. Ground response spectra appropriate for design and evaluation of Performance Category 1 and 2 structures, systems, and components were recommended. Effects of seismic loadings on the buildings will be used to aid in designing any structural modifications.

  1. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    NASA Astrophysics Data System (ADS)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short

  2. Fire Hazard Analysis for the Cold Vacuum Drying Facility (CVDF)

    SciTech Connect

    SINGH, G.

    2000-09-06

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuel (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel and transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cited, as applicable. This FHA comprehensively assesses the risk of fire at the CVDF to ascertain whether the specific objectives of DOE 5480.7A are met. These specific fire protection objectives are: (1) Minimize the potential for the occurrence of a fire. (2) Ensure that fire does not cause an onsite or offsite release of radiological and other hazardous material that will threaten the public health and safety or the environment. (3) Establish requirements that will provide an acceptable degree of life safety to DOE and contractor personnel and ensure that there are no undue hazards to the public from fire and its effects in DOE facilities. (4) Ensure that vital DOE programs will not suffer unacceptable delays as a result of fire and related perils. (5) Ensure that property damage from fire and related perils does not exceed an acceptable level. (6) Ensure that process control and safety systems are not damaged by fire or related perils. This FHA is based on the

  3. Towards inclusion of dynamic slip features in stochastic models for probabilistic (tsunami) hazard analysis.

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Nielsen, S. B.; Festa, G.; Trasatti, E.; Tonini, R.; Molinari, I.; Romano, F.

    2015-12-01

    Stochastic slip modelling based on general scaling features with uniform slip probability over the fault plane is commonly employed in tsunami and seismic hazard assessment. However, dynamic rupture effects driven by specific fault geometry and frictional conditions can potentially control the slip probability. Unfortunately, dynamic simulations can be computationally intensive, preventing their extensive use for hazard analysis. The aim of this study is to produce a stochastic model that incorporates slip features observed in dynamic simulations. Taking a Tohoku-like fault as a case study, numerous 2D spectral-element dynamic simulations are performed using a variety of pre-stress distributions. Comparing the slip distributions generated from these simulations to traditional stochastic slip models, we find that the stochastic models generally under-represent slip near the free surface. This is an important feature for tsunami hazard, as very large slip at shallow depth was observed in the 2011 Tohoku earthquake. To incorporate dynamic features in the stochastic modeling, we generate a depth-dependent "transfer function" based on comparisons between the dynamic and stochastic models. Assuming that the differences between stochastic and dynamic slip distributions are predominantly depth dependent and not along strike, the transfer function is then applied to stochastic source models over a 3D geometry of the Tohoku fault. Comparing maximum tsunami wave height along the Japanese coast using a traditional stochastic model and one modified by the transfer function, we find that the inclusion of the transfer function leads to the occurrence of more extreme events. Applying this function to the traditional stochastic slip distribution as a depth-dependent PDF for the slip may allow for an approximated but efficient incorporation of regionally specific dynamic features in a modified source model, to be used specifically when a significant number of slip scenarios need to be produced, e
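
    A minimal sketch of how such a depth-dependent correction might be applied to a stochastic slip profile is given below; the transfer function shape, its parameters and the slip samples are hypothetical stand-ins for the study's calibrated function.

        import numpy as np

        depth_km = np.linspace(0.0, 50.0, 101)   # down-dip depth samples
        rng = np.random.default_rng(0)
        stochastic_slip = rng.lognormal(1.0, 0.5, depth_km.size)  # metres

        def transfer(depth, surface_boost=2.0, decay_km=10.0):
            # Illustrative transfer function: amplify slip near the free
            # surface, tending to 1 (no correction) at depth.
            return 1.0 + (surface_boost - 1.0) * np.exp(-depth / decay_km)

        corrected_slip = stochastic_slip * transfer(depth_km)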

  4. Earthquake Cluster Analysis for Turkey and its Application for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake clusters are an important element in general seismology and also for applications in seismic hazard assessment. In probabilistic seismic hazard assessment, the occurrence of earthquakes is often linked to an independent Monte Carlo process following a stationary Poisson model. But earthquakes are dependent and constrained, especially in terms of earthquake swarms, fore- and aftershocks, or even larger sequences such as the Landers sequence in California or the Darfield-Christchurch sequence in New Zealand. For earthquake catalogues, declustering is an important step in capturing earthquake frequencies without a bias towards small magnitudes due to aftershocks. On the other hand, declustered catalogues for independent probabilistic seismic activity will underestimate the total number of earthquakes by neglecting dependent seismicity. In this study, the effect of clusters on probabilistic seismic hazard assessment is investigated in detail. To capture the features of earthquake clusters, a uniform framework for earthquake cluster analysis is introduced using methodologies from geostatistics and machine learning. These features represent important cluster characteristics such as cluster b-values, temporal decay, rupture orientations and more. Cluster parameters are mapped in space using kriging. Furthermore, a detailed data analysis is undertaken to provide magnitude-dependent relations for various cluster parameters. The acquired features are used to introduce dependent seismicity within stochastic earthquake catalogues. In addition, the development of smooth seismicity maps based on historic databases is in general biased towards the more complete recent decades. A filling methodology is introduced which adds dependent seismicity to catalogues where none has been recorded, to avoid the above-mentioned bias. As a case study, Turkey has been chosen due to its inherent seismic activity and well-recorded data coverage. Clustering
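
    One of the named cluster characteristics, the b-value, is commonly estimated with the Aki/Utsu maximum-likelihood formula; a short sketch follows, with the catalogue, completeness magnitude and bin width invented for illustration.

        import numpy as np

        def b_value_mle(mags, m_c, dm=0.1):
            # Aki/Utsu maximum-likelihood b-value for magnitudes >= m_c,
            # with the usual half-bin correction for binned magnitudes.
            m = np.asarray(mags, dtype=float)
            m = m[m >= m_c]
            return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

        # e.g. a cluster-specific b-value from an invented mini-catalogue
        print(b_value_mle([2.1, 2.3, 2.2, 3.0, 2.6, 4.1], m_c=2.0))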

  5. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the integration and interactive visualization of spatial information through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application integrates the different layers necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as 30-, 100- and 300-year return periods, based on Swiss guidelines), exposed objects (such as buildings), and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify the input information and parameters to create different risk scenarios. Based on the resulting risk scenarios, users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective
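
    The core calculation behind such a cost-benefit comparison can be reduced to an annualized-risk integral over scenario return periods; the sketch below shows one common form (trapezoidal integration over annual exceedance frequencies) with placeholder damage figures, not EconoMe's actual parameters.

        # Return period (years) -> expected damage (CHF), placeholders
        scenarios = {30: 200_000.0, 100: 750_000.0, 300: 1_500_000.0}

        def annualized_risk(scenarios):
            # Trapezoidal integration of damage over annual exceedance
            # frequencies (1 / return period).
            pts = sorted(((1.0 / t, d) for t, d in scenarios.items()),
                         reverse=True)
            risk = 0.0
            for (f1, d1), (f2, d2) in zip(pts, pts[1:]):
                risk += 0.5 * (d1 + d2) * (f1 - f2)
            return risk

        # Benefit of a measure = annualized_risk(before) - annualized_risk(after),
        # to be compared against the measure's annual cost.
        print(annualized_risk(scenarios))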

  6. Regional analysis assessment of landslide hazard and zoning map for transmission line route selection using GIS

    NASA Astrophysics Data System (ADS)

    Baharuddin, I. N. Z.; Omar, R. C.; Usman, F.; Mejan, M. A.; Abd Halim, M. K.; Zainol, M. A.; Zulkarnain, M. S.

    2013-06-01

    The stability of the ground as a foundation for infrastructure development is closely tied to geological and geomorphological conditions. Failure to analyze these conditions carefully may lead to ground instability such as subsidence and landslides, which can in turn be catastrophic for the infrastructure, e.g. destabilizing a transmission tower. In some cases, as in the study area, building on such terrain is unavoidable. A GIS-based route analysis was therefore used to select an optimal route by incorporating multiple influencing factors, chief among them a Landslide Hazard Map (LHM) produced in ArcGIS from slope, aspect, land use and geological maps using the weighted overlay method. Based on the LHM, it is safe to conclude that the proposed route for the Ulu Jelai-Neggiri-Lebir-LILO transmission line has very low risk in terms of landslides.
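
    The weighted overlay step itself is straightforward raster algebra; a toy sketch follows, with factor rasters and weights invented for illustration rather than taken from the study.

        import numpy as np

        # Reclassified factor rasters (1 = low ... 5 = high), 2x2 toy grids
        slope    = np.array([[1, 3], [5, 2]], dtype=float)
        aspect   = np.array([[2, 2], [4, 1]], dtype=float)
        land_use = np.array([[1, 4], [3, 3]], dtype=float)
        geology  = np.array([[2, 5], [5, 2]], dtype=float)

        # Illustrative weights summing to 1, not the study's values
        lhm = 0.4 * slope + 0.1 * aspect + 0.2 * land_use + 0.3 * geology
        print(lhm)   # landslide hazard map values per cell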

  7. Biological monitoring systems for hazardous waste sites (production and analysis of analytical reference materials)

    SciTech Connect

    Bohman, V.R.; Blincoe, C.R.; Miller, G.C.; Scholl, R.L.; Sutton, W.W.

    1989-02-01

    EPA programs in pesticides, toxics, and hazardous waste require analytical reference materials. This project emphasized the collection and analysis of urine, fat, and blood for ultimate use as reference samples, and the practicality of using certain metabolites to indicate previous exposure to chlorinated hydrocarbons. The reference samples can, with verified compound concentrations, be used as qualifying samples when evaluating a technique for a particular analysis. However, the reference materials may be of greatest benefit when used by laboratories to determine analytical accuracy for samples of human urine, blood, etc. This is because the standards, like the unknown samples, contain pollutant compounds and associated metabolites (all incorporated in vivo). Dairy animals were used during this study.

  8. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly for Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating different new GMPEs. The epsilon parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilon, which accounts for the spectral shape of the ground motion time history, is also presented.
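
    In the usual PSHA convention, epsilon is the number of logarithmic standard deviations by which a target ground motion departs from a GMPE's median prediction; a minimal sketch of that computation, with made-up spectral accelerations and logic-tree weights, is given below.

        import numpy as np

        def epsilon(sa_target, median_sa, sigma_ln):
            # Departure of the target ground motion from the GMPE median,
            # in units of the GMPE's logarithmic standard deviation.
            return (np.log(sa_target) - np.log(median_sa)) / sigma_ln

        # Hypothetical two-branch GMPE logic tree for one site and period
        eps = [epsilon(0.35, 0.20, 0.6), epsilon(0.35, 0.25, 0.7)]
        weights = [0.5, 0.5]
        print(float(np.dot(weights, eps)))   # weighted epsilon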

  9. Incorporating Climate Change Projections into a Hydrologic Hazard Analysis for Friant Dam

    NASA Astrophysics Data System (ADS)

    Holman, K. D.; Novembre, N.; Sankovich-Bahls, V.; England, J. F.

    2015-12-01

    The Bureau of Reclamation's Dam Safety Office has initiated a series of pilot studies focused on exploring potential impacts of climate change on hydrologic hazards at specific dam locations across the Western US. Friant Dam, located in Fresno, California, was chosen for study because the site had recently undergone a high-level hydrologic hazard analysis using the Stochastic Event Flood Model (SEFM). SEFM is a deterministic flood-event model that treats input parameters as variables, rather than fixed values. Monte Carlo sampling allows the hydrometeorological input parameters to vary according to observed relationships. In this study, we explore the potential impacts of climate change on the hydrologic hazard at Friant Dam using historical and climate-adjusted hydrometeorological inputs to the SEFM. Historical magnitude-frequency relationships of peak inflow and reservoir elevation were developed at Friant Dam for the baseline study using observed temperature and precipitation data between 1966 and 2011. Historical air temperatures, antecedent precipitation, mean annual precipitation, and the precipitation-frequency curve were adjusted for the climate change study using the delta method to create climate-adjusted hydrometeorological inputs. Historical and future climate projections are based on the Bias-Corrected Spatially-Disaggregated CMIP5 dataset (BCSD-CMIP5). The SEFM model was run thousands of times to produce magnitude-frequency relationships of peak reservoir inflow, inflow volume, and reservoir elevation, based on historical and climate-adjusted inputs. Results suggest that peak reservoir inflow and peak reservoir elevation increase (decrease) for all return periods under mean increases (decreases) in precipitation, independently of changes in surface air temperature.
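
    The delta-method adjustment described above amounts to scaling or shifting the historical inputs by climatological change factors; a minimal sketch with placeholder change factors (not actual BCSD-CMIP5 values) follows.

        import numpy as np

        # Historical storm inputs (placeholder values)
        hist_precip = np.array([12.0, 30.5, 8.2, 55.1])   # storm totals, mm
        hist_temp = np.array([4.0, 6.5, 3.2, 9.8])        # air temperature, deg C

        # Change factors of the kind a BCSD-CMIP5 projection would supply
        precip_ratio = 1.08   # future mean / historical mean
        temp_delta = 1.7      # future minus historical, deg C

        adj_precip = hist_precip * precip_ratio   # scaled precipitation
        adj_temp = hist_temp + temp_delta         # shifted temperature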

  10. Application of a Data Mining Model and Its Cross-Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; land use, from Landsat satellite images; soil, from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value, from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using the artificial neural network model trained not only on the data for each area itself but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
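
    To make the index-calculation step concrete, the toy sketch below passes factor values for each grid cell through a one-hidden-layer network to produce a hazard index in (0, 1); the layer sizes, weights and data are placeholders, since the study's weights come from back-propagation training against mapped landslide locations.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(100, 10))      # 100 grid cells x 10 factors
        w1 = rng.normal(size=(10, 6))        # input-to-hidden weights
        w2 = rng.normal(size=(6, 1))         # hidden-to-output weights

        def hazard_index(x):
            # Forward pass only; real weights would come from training.
            h = np.tanh(x @ w1)
            return 1.0 / (1.0 + np.exp(-(h @ w2)))   # index in (0, 1)

        lhm_values = hazard_index(X)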

  11. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone, landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, on 17 February 2006, a landslide buried an entire village on the Philippine island of Leyte, with at least 1,800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g., rainfall climatology, antecedent rainfall accumulation, and the intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful for assessing the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while they receive heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
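
    Empirical intensity-duration thresholds of the kind alluded to at the end of the abstract are typically power laws; the sketch below uses the widely cited global threshold of Caine (1980), I = 14.82 * D^-0.39, purely as an illustration of the triggering check.

        def caine_threshold(duration_h):
            # Global intensity-duration threshold of Caine (1980):
            # I = 14.82 * D**-0.39, with I in mm/h and D in hours.
            return 14.82 * duration_h ** -0.39

        def may_trigger_landslide(intensity_mmh, duration_h):
            return intensity_mmh >= caine_threshold(duration_h)

        print(may_trigger_landslide(10.0, 6.0))   # 10 mm/h for 6 h -> True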

  12. A New Insight into Probabilistic Seismic Hazard Analysis for Central India

    NASA Astrophysics Data System (ADS)

    Mandal, H. S.; Shukla, A. K.; Khan, P. K.; Mishra, O. P.

    2013-12-01

    The Son-Narmada-Tapti lineament and its surroundings in Central India (CI) form the second most important tectonic regime after the converging margin along the Himalayas-Myanmar-Andaman belt of the Indian subcontinent, and the region has attracted several geoscientists to assess its seismic hazard potential. Our study area, a part of CI, is bounded between latitudes 18°-26°N and longitudes 73°-83°E, representing a stable part of Peninsular India. Past damaging moderate-magnitude earthquakes as well as continuing microseismicity in the area have provided enough data for seismological study. Our estimates based on the regional Gutenberg-Richter relationship showed b values (between 0.68 and 0.76) lower than the average for the study area. The probabilistic seismic hazard analysis carried out over the area, within a radius of ~300 km encircling Bhopal, yielded a conspicuous relationship between earthquake return period (T) and peak ground acceleration (PGA). The analysis shows that the PGA at bedrock varies from 0.08 to 0.15 g for 10% (T = 475 years) and 2% (T = 2,475 years) probabilities of exceedance in 50 years, respectively. We establish empirical relationships between zero-period acceleration (ZPA) and the shear wave velocity averaged over the upper 30 m [Vs(30)] for the two different return periods. These demonstrate that ZPA values decrease with increasing shear wave velocity, suggesting a diagnostic indicator for designing structures at a specific site of interest. The predictive design response spectra generated at a site for periods up to 4.0 s at 10 and 2% probability of exceedance of ground motion in 50 years can be used for designing duration-dependent structures of variable vertical dimension. We infer that this concept of assimilating uniform hazard response spectra and predictive design at 10 and 2% probability of exceedance in 50 years at 5% damping at bedrocks of different categories may offer potential inputs for designing earthquake resistant
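
    The 475- and 2,475-year return periods quoted above follow directly from the Poisson relation T = -t / ln(1 - p); the short sketch below reproduces both numbers.

        import math

        def return_period(p_exceed, t_years=50.0):
            # Poisson relation between return period and the probability
            # of at least one exceedance in t_years: T = -t / ln(1 - p).
            return -t_years / math.log(1.0 - p_exceed)

        print(round(return_period(0.10)))   # -> 475 years
        print(round(return_period(0.02)))   # -> 2475 years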

  13. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating probabilistic seismic hazard curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that use empirically based attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in seismic hazard analysis will rely on the use of more physics-based waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies, including high performance computing and grid-based scientific workflows, in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERFs), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200 km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one or more Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are
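
    Whatever the IMR, the hazard-curve arithmetic is the same: conditional exceedance probabilities from each rupture are weighted by rupture rates and summed. The sketch below shows that aggregation with a lognormal placeholder in place of a waveform-based IMR, and invented rupture rates; it is not a CyberShake component.

        import numpy as np
        from scipy.stats import norm

        im_levels = np.linspace(0.05, 1.0, 20)   # e.g. SA levels in g

        # Invented rupture rates and lognormal ground-motion parameters
        ruptures = [
            {"rate": 0.010, "median": 0.15, "sigma_ln": 0.6},
            {"rate": 0.002, "median": 0.40, "sigma_ln": 0.6},
        ]

        def annual_exceedance_rate(im, ruptures):
            # Standard PSHA summation: rate * P(IM > im | rupture)
            rate = np.zeros_like(im)
            for r in ruptures:
                rate += r["rate"] * norm.sf(np.log(im),
                                            np.log(r["median"]),
                                            r["sigma_ln"])
            return rate

        hazard_curve = 1.0 - np.exp(-annual_exceedance_rate(im_levels, ruptures))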

  14. General guidance on exploratory and confirmatory subgroup analysis in late-stage clinical trials.

    PubMed

    Dmitrienko, Alex; Muysers, Christoph; Fritsch, Arno; Lipkovich, Ilya

    2016-01-01

    This article focuses on a broad class of statistical and clinical considerations related to the assessment of treatment effects across patient subgroups in late-stage clinical trials. This article begins with a comprehensive review of clinical trial literature and regulatory guidelines to help define scientifically sound approaches to evaluating subgroup effects in clinical trials. All commonly used types of subgroup analysis are considered in the article, including different variations of prospectively defined and post-hoc subgroup investigations. In the context of confirmatory subgroup analysis, key design and analysis options are presented, which includes conventional and innovative trial designs that support multi-population tailoring approaches. A detailed summary of exploratory subgroup analysis (with the purpose of either consistency assessment or subgroup identification) is also provided. The article promotes a more disciplined approach to post-hoc subgroup identification and formulates key principles that support reliable evaluation of subgroup effects in this setting.

  15. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    PubMed

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and life-quality concerns. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process was evaluated with varying process parameters, such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the aerosol concentration increased with increasing peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 μs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emission and fire risk of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  16. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  17. Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Catelli, J.; Nong, S.

    2014-12-01

    Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coasts are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations was performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package such as R to fit cell values to extreme value distributions and return values for specified recurrence intervals. While this is not a new process, the value of this work lies in the ability to keep the process in a single geospatial environment and to easily replicate it for other natural hazard applications and extreme event modeling.
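
    The per-cell statistical step can equally be done in Python itself; the sketch below fits a generalized extreme value distribution to one cell's stacked event maxima with SciPy and reads off the 1% and 0.2% annual-chance surge elevations (the input maxima are synthetic placeholders).

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        cell_maxima = rng.gumbel(2.0, 0.5, size=500)   # placeholder maxima, m

        shape, loc, scale = genextreme.fit(cell_maxima)
        surge_100 = genextreme.isf(0.01, shape, loc=loc, scale=scale)
        surge_500 = genextreme.isf(0.002, shape, loc=loc, scale=scale)
        print(surge_100, surge_500)   # 1% and 0.2% annual-chance levels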

  18. Grand Junction Projects Office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    SciTech Connect

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  19. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at

  1. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    NASA Astrophysics Data System (ADS)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address the specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  2. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based, nationwide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenges of data accuracy, scale and uncertainty. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5%) and snow avalanches (0.4%); exposure was defined here as location within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5% of the population (360,000 out of 8.5 million inhabitants), counting those compulsorily listed in the population register, are located in these areas. The analysis by building category identified 2.05 million residential buildings in Austria (85%), 93,000 of which (4.5%) are exposed to these hazards. In contrast, 37,300 buildings (1.6%) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15%). Of the 140,500 commercial buildings, 8,000 (5%) are exposed. Considerable spatial variation was detectable within the communities and Federal States. In general, above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings

  3. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives lead to risk reduction under different future scenarios. The SDSS was developed based on open source software and following open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  4. Application of Hazard Analysis Critical Control Point (HACCP) to the handling of expressed breast milk on a neonatal unit.

    PubMed

    Hunter, P R

    1991-02-01

    Hazard Analysis Critical Control Point (HACCP) is a powerful procedure for ensuring quality and safety and has gained widespread use in industry. This paper describes the basic method, which is a four-stage process comprising process analysis, hazard identification, identification of critical control points, and devising effective control options. Suggestions are made as to how the method can be adapted to the control of hospital infection. A HACCP analysis of the supply of expressed human breast milk to babies on a Special Care Baby Unit is presented. Although only the mother's own milk was given to babies on the unit, several potential hazards were identified and suggestions have been made for their control.

  5. Teacher Guidance of Algebraic Formula Building: Functional Grammatical Analysis of a Whole-Class Conversation

    ERIC Educational Resources Information Center

    Zolkower, Betina; Shreyar, Sam; Pérez, Silvia

    2015-01-01

    How does teacher-guided whole-class interaction contribute to expanding students' potential for making and exchanging mathematical meanings? We address this question through an interpretative analysis of a whole-group conversation in a sixth grade class taught by an experienced teacher in a school in Southern Argentina. The extended interaction…

  6. 76 FR 23824 - Guidance for Industry: “Computer Crossmatch” (Computerized Analysis of the Compatibility Between...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    .... In the Federal Register of August 6, 2001 (66 FR 40886), FDA issued a final rule that revised 21 CFR... manufacturing change to FDA under 21 CFR 601.12. In the Federal Register of June 21, 2007 (72 FR 34259), FDA... Analysis of the Compatibility Between the Donor's Cell Type and the Recipient's Serum or Plasma...

  7. Analysis, biomedicine, collaboration, and determinism challenges and guidance: wish list for biopharmaceuticals on the interface of computing and statistics.

    PubMed

    Goodman, Arnold F

    2011-11-01

    I have personally witnessed processing advance from desk calculators and mainframes, through timesharing and PCs, to supercomputers and cloud computing. I have also witnessed resources grow from too little data into almost too much data, and from theory dominating data into data beginning to dominate theory while needing new theory. Finally, I have witnessed problems advance from simple in a lone discipline into becoming almost too complex in multiple disciplines, as well as approaches evolve from analysis driving solutions into solutions by data mining beginning to drive the analysis itself. How we do all of this has transitioned from competition overcoming collaboration into collaboration starting to overcome competition, as well as what is done being more important than how it is done has transitioned into how it is done becoming as important as what is done. In addition, what or how we do it being more important than what or how we should actually do it has shifted into what or how we should do it becoming just as important as what or how we do it, if not more so. Although we have come a long way in both our methodology and technology, are they sufficient for our current or future complex and multidisciplinary problems with their massive databases? Since the apparent answer is not a resounding yes, we are presented with tremendous challenges and opportunities. This personal perspective adapts my background and experience to be appropriate for biopharmaceuticals. In these times of exploding change, informed perspectives on what challenges should be explored with accompanying guidance may be even more valuable than the far more typical literature reviews in conferences and journals of what has already been accomplished without challenges or guidance. Would we believe that an architect who designs a skyscraper determines the skyscraper's exact exterior, interior and furnishings or only general characteristics? Why not increase dependability of conclusions in

  8. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

    2014-12-01

    We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large-scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. The redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust for the effort. However, as work started we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

  9. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  10. Human body impedance for electromagnetic hazard analysis in the VLF to MF band

    SciTech Connect

    Kanai, H.; Chatterjee, I.; Gandhi, O.P.

    1984-08-01

    A knowledge of the average electrical impedance of the human body is essential for the analysis of electromagnetic hazards in the VLF to MF band. The purpose of the measurements was to determine the average body impedance of several human subjects as a function of frequency. Measurements were carried out with the subjects standing barefoot on a ground plane and touching various metal electrodes with the hand or index finger. The measured impedance includes the electrode polarization and skin impedances, spread impedance near the electrode, body impedance, stray capacitance between the body surface and ground, and inductance due to the body and grounding strap. These components are separated and simplified equivalent circuits are presented for body impedance of humans exposed to free-space electromagnetic waves as well as in contact with large ungrounded metallic objects therein.
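
    A simplified version of such an equivalent circuit can be evaluated numerically across the band of interest; in the sketch below, the component values are placeholders rather than the paper's measured values, with the body-plus-skin branch shunted by a body-to-ground stray capacitance.

        import numpy as np

        f = np.logspace(4, 6, 5)        # 10 kHz to 1 MHz
        w = 2.0 * np.pi * f
        R_body = 500.0                  # ohms (placeholder)
        C_skin = 3e-9                   # farads (placeholder)
        C_stray = 100e-12               # body-to-ground stray capacitance

        z_series = R_body + 1.0 / (1j * w * C_skin)   # body + skin branch
        z_stray = 1.0 / (1j * w * C_stray)            # stray branch
        z_total = z_series * z_stray / (z_series + z_stray)
        print(np.abs(z_total))          # impedance magnitude vs frequency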

  11. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  12. Use of bioassays in assessing health hazards from complex mixtures: A RASH analysis

    SciTech Connect

    Jones, T.D.

    1993-10-14

    The Finney harmonic mean model for the joint toxicity of ingredients in mixtures can be used to estimate the toxicity of the neat compound if one component can be substituted, in potency-adjusted doses, for each of the other components. Chemical analysis data and relative potency values (computed according to the Rapid Screening of Hazard (RASH) method) were used to compare the toxicities predicted from the ingredients of cigarette smoke, PAHs in diesel exhaust, asphalt, coal tar, pitch, and creosote with the measured toxicities of the neat mixtures. Accuracy for cigarette smoke condensate, coal tar, pitch, and creosote was within a factor of three, and for asphalt within a factor of 18; the PAC content of diesel particulate, however, was inadequate to accurately describe the toxicity of diesel emissions.
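
    A minimal sketch of the dose-addition arithmetic behind the harmonic mean model is given below; the mass fractions and component ED50s are invented for illustration.

        def mixture_ed50(fractions, ed50s):
            # Dose-addition (harmonic mean) estimate of a mixture's ED50:
            # ED50_mix = 1 / sum(f_i / ED50_i).
            return 1.0 / sum(f / e for f, e in zip(fractions, ed50s))

        # Two-component mixture, 70/30 by mass, invented ED50s
        print(mixture_ed50([0.7, 0.3], [120.0, 15.0]))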

  13. Expressed breast milk on a neonatal unit: a hazard analysis and critical control points approach.

    PubMed

    Cossey, Veerle; Jeurissen, Axel; Thelissen, Marie-José; Vanhole, Chris; Schuermans, Annette

    2011-12-01

    With the increasing use of human milk and growing evidence of the benefits of mother's milk for preterm and ill newborns, guidelines to ensure its quality and safety are an important part of daily practice in neonatal intensive care units. Operating procedures based on hazard analysis and critical control points can standardize the handling of mother's expressed milk, thereby improving nutrition and minimizing the risk of breast milk-induced infection in susceptible newborns. Because breast milk is not sterile, microorganisms can multiply when the milk is not handled properly. Additional exogenous contamination should be prevented. Strict hygiene and careful temperature and time control are important during the expression, collection, transport, storage, and feeding of maternal milk. In contrast to formula milk, no legal standards exist for the use of expressed maternal milk. The need for additional measures, such as bacteriological screening or heat treatment, remains unresolved.

  14. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Paces, James B.

    2014-08-31

    This product is a USGS Administrative Report that discusses the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed on several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  15. 230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis

    USGS Publications Warehouse

    Paces, James B.

    2014-01-01

    This product is a USGS Administrative Report that discusses the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed on several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  16. Marine natural hazards in coastal zone: observations, analysis and modelling (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Didenkulova, Ira

    2010-05-01

    Giant surface waves approaching the coast frequently cause extensive coastal flooding, destruction of coastal constructions and loss of lives. Such waves can be generated by various phenomena: strong storms and cyclones, underwater earthquakes, high-speed ferries, and aerial and submarine landslides. The most famous examples of such events are the catastrophic tsunami in the Indian Ocean of 26 December 2004 and hurricane Katrina (28 August 2005) in the Atlantic Ocean. The huge storm in the Baltic Sea on 9 January 2005, which produced unexpectedly long waves in many areas of the Baltic Sea, and the unusually high surge created by long waves from high-speed ferries, should also be mentioned as examples of regional marine natural hazards connected with extensive runup of certain types of waves. The processes of wave shoaling and runup for all these different marine natural hazards (tsunami, coastal freak waves, ship waves) are studied based on rigorous solutions of nonlinear shallow-water theory. The key and novel results presented here are: i) parameterization of basic formulas for extreme runup characteristics for bell-shaped waves, showing that they depend only weakly on the initial wave shape, which is usually unknown in real sea conditions; ii) runup analysis of periodic asymmetric waves with a steep front, as such waves penetrate inland over larger distances and with larger velocities than symmetric waves; iii) statistical analysis of irregular wave runup, demonstrating that wave nonlinearity nearshore does not influence the probability distribution of the velocity of the moving shoreline and its moments, but does influence the vertical displacement of the moving shoreline (runup). Wave runup on convex beaches and in narrow bays, which allow abnormal wave amplification, is also discussed. The analytical results described are used to explain observed extreme runup of tsunami, freak (sneaker) waves and ship waves on different coasts

  17. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructure networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can also trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effects of natural disasters on society are likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a branch of decision analysis which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements, as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.

  18. Mission planning, mission analysis and software formulation. Level C requirements for the shuttle mission control center orbital guidance software

    NASA Technical Reports Server (NTRS)

    Langston, L. J.

    1976-01-01

    The formulation of Level C requirements for guidance software was reported. Requirements for a PEG supervisor which controls all input/output interfaces with other processors and determines which PEG mode is to be utilized were studied in detail. A description of the two guidance modes for which Level C requirements have been formulated was presented. Functions required for proper execution of the guidance software were defined. The requirements for a navigation function that is used in the prediction logic of PEG mode 4 were discussed. It is concluded that this function is extracted from the current navigation FSSR.

  19. An OSHA based approach to safety analysis for nonradiological hazardous materials

    SciTech Connect

    Yurconic, M.

    1992-08-01

    The PNL method for chemical hazard classification defines major hazards by means of a list of hazardous substances (or chemical groups) with associated trigger quantities. In addition, the functional characteristics of the facility being classified are also factored into the classification. In this way, installations defined as major hazards will only be those which have the potential for causing very serious incidents both on and off site. Because of the diversity of operations involving chemicals, it may not be possible to restrict major hazard facilities to certain types of operations. However, this hazard classification method recognizes that in the industrial sector major hazards are most commonly associated with activities involving very large quantities of chemicals and inherently energetic processes. These include operations like petrochemical plants, chemical production, LPG storage, explosives manufacturing, and facilities which use chlorine, ammonia, or other highly toxic gases in bulk quantities. The basis for this methodology is derived from concepts used by OSHA in its proposed chemical process safety standard, the Dow Fire and Explosion Index Hazard Classification Guide, and the International Labor Office's program on chemical safety. For the purpose of identifying major hazard facilities, this method uses two sorting criteria, (1) facility function and processes and (2) quantity of substances, to identify facilities requiring classification. Then, a measure of chemical energy potential (material factor) is used to identify high hazard class facilities.

  1. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran

    SciTech Connect

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which can be used by GIS as an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of the GIS was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate or analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and based on the numerical evaluation provided by the MCDA the most suitable sites were determined.
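
    A sketch of the two-stage logic: exclusionary criteria mask out unsuitable land, after which a weighted suitability index ranks the remaining cells. The layers, weights and thresholds below are placeholders, not the study's actual 21 exclusionary and 14 non-exclusionary criteria:

      import numpy as np

      # Hypothetical standardized non-exclusionary criteria (1 = most suitable)
      criteria = [np.random.rand(200, 200) for _ in range(14)]
      weights = np.random.dirichlet(np.ones(14))  # stand-in for expert weights

      # Exclusionary criteria as boolean masks (True = must be excluded)
      near_river = np.random.rand(200, 200) < 0.10
      steep_slope = np.random.rand(200, 200) < 0.05
      excluded = near_river | steep_slope  # union of all masked layers

      # Suitability index: weighted sum, with excluded cells removed entirely
      index = sum(w * c for w, c in zip(weights, criteria))
      index[excluded] = np.nan
      print("best candidate cell:",
            np.unravel_index(np.nanargmax(index), index.shape))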

  2. Guidance for performing multivariate data analysis of bioprocessing data: pitfalls and recommendations.

    PubMed

    Rathore, Anurag S; Mittal, Shachi; Pathak, Mili; Arora, Arushi

    2014-01-01

    Biotech unit operations are often characterized by a large number of inputs (operational parameters) and outputs (performance parameters) along with complex correlations among them. A typical biotech process starts with the vial of the cell bank, ends with the final product, and has anywhere from 15 to 30 such unit operations in series. Besides the above-mentioned operational parameters, raw material attributes can also impact process performance and product quality, as well as interact with each other. Multivariate data analysis (MVDA) offers an effective approach to gather process understanding from such complex datasets. A review of the literature suggests that the use of MVDA is rapidly increasing, fuelled by the gradual acceptance of quality by design (QbD) and process analytical technology (PAT) among the regulators and the biotech industry. Implementation of QbD and PAT requires enhanced process and product understanding. In this article, we first discuss the most critical issues that a practitioner needs to be aware of while performing MVDA of bioprocessing data. Next, we present a step-by-step procedure for performing such analysis. Industrial case studies are used to elucidate the various underlying concepts. With the increasing usage of MVDA, we hope that this article will be a useful resource for present and future practitioners of MVDA. PMID:24778085
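
    The specific MVDA methods are not named in this record, but principal component analysis is one of the workhorse techniques for such batch datasets. A minimal, hypothetical sketch using scikit-learn, highlighting the scaling step that the pitfalls discussed above often revolve around:

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA

      # Hypothetical dataset: 40 batches x 12 process parameters
      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 12))

      # Scale first: otherwise variables with large ranges dominate the PCs
      X_scaled = StandardScaler().fit_transform(X)

      pca = PCA(n_components=3)
      scores = pca.fit_transform(X_scaled)  # batch-wise scores
      print("explained variance ratio:", pca.explained_variance_ratio_)
      # Loadings indicate which parameters drive each component
      print("PC1 loadings:", pca.components_[0].round(2))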

  3. An analysis of approach navigation accuracy and guidance requirements for the grand tour mission to the outer planets

    NASA Technical Reports Server (NTRS)

    Jones, D. W.

    1971-01-01

    The navigation and guidance process for the Jupiter, Saturn and Uranus planetary encounter phases of the 1977 Grand Tour interior mission was simulated. Reference approach navigation accuracies were defined and the relative information content of the various observation types were evaluated. Reference encounter guidance requirements were defined, sensitivities to assumed simulation model parameters were determined and the adequacy of the linear estimation theory was assessed. A linear sequential estimator was used to provide an estimate of the augmented state vector, consisting of the six state variables of position and velocity plus the three components of a planet position bias. The guidance process was simulated using a nonspherical model of the execution errors. Computation algorithms which simulate the navigation and guidance process were derived from theory and implemented into two research-oriented computer programs, written in FORTRAN.
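
    At each observation, a linear sequential estimator of this kind reduces to the standard measurement-update equations. A minimal sketch assuming the nine-element augmented state described above (position, velocity, planet-position bias) and a hypothetical two-component position observation:

      import numpy as np

      def sequential_update(x, P, z, H, R):
          """One measurement update of a linear sequential estimator:
          x is the state estimate, P its covariance, z the observation,
          H the observation matrix, R the measurement noise covariance."""
          S = H @ P @ H.T + R               # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)    # gain
          x = x + K @ (z - H @ x)           # corrected state
          P = (np.eye(len(x)) - K @ H) @ P  # corrected covariance
          return x, P

      x = np.zeros(9)                 # position, velocity, planet-position bias
      P = np.eye(9) * 1e6             # large initial uncertainty
      H = np.zeros((2, 9)); H[0, 0] = H[1, 1] = 1.0  # observe two position terms
      z = np.array([1200.0, -300.0])  # hypothetical observation
      R = np.eye(2) * 25.0
      x, P = sequential_update(x, P, z, H, R)
      print(x[:3])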

  4. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
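
    To make the Poisson-GP link concrete: with PDS exceedances arriving at rate λ per year and magnitudes following a Generalized Pareto (GP) distribution, the average return period of level x is 1/(λ S(x)), where S is the GP survival function. A sketch with placeholder parameters, letting the GP scale drift upward to mimic a nonstationary hazard:

      import math

      def gpd_survival(x, scale, shape):
          """Survival function of a 2-parameter Generalized Pareto distribution."""
          if abs(shape) < 1e-9:
              return math.exp(-x / scale)
          return (1.0 + shape * x / scale) ** (-1.0 / shape)

      def return_period(x, rate, scale, shape):
          """Average return period under a Poisson-GP model;
          `rate` is the mean number of PDS exceedances per year."""
          return 1.0 / (rate * gpd_survival(x, scale, shape))

      # Let the GP scale grow 1% per year: the return period of a fixed
      # design magnitude shrinks as the hazard intensifies
      for year in (0, 25, 50):
          T = return_period(80.0, rate=2.0, scale=20.0 * 1.01 ** year, shape=0.1)
          print(year, round(T, 1))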

  5. Development of guidance for states transitioning to new safety analysis tools

    NASA Astrophysics Data System (ADS)

    Alluri, Priyanka

    With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of the SAFETEA-LU (Safe Accountable Flexible Efficient Transportation Equity Act: a Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway Safety Plan (SHSP) and incorporate data-driven approaches to prioritize and evaluate program outcomes; failure to do so resulted in funding sanctions. In conjunction with the legislation, research efforts have also been progressing toward the development of new safety analysis tools such as IHSDM (Interactive Highway Safety Design Model), SafetyAnalyst, and the HSM (Highway Safety Manual). These software and analysis tools are comparatively more advanced in statistical theory and level of accuracy, and tend to be more data intensive. A review of the 2009 five-percent reports and excerpts from a nationwide survey revealed astonishing facts about the continuing use of traditional methods, including crash frequencies and rates, for site selection and prioritization. The intense data requirements and statistical complexity of advanced safety tools are considered a hindrance to their adoption. In this context, this research aims at identifying the data requirements and data availability for SafetyAnalyst and the HSM by working with both tools. This research sets the stage for working with the Empirical Bayes approach by highlighting some of the biases and issues associated with traditional methods of selecting projects, such as the greater emphasis on traffic volume and the regression-to-the-mean phenomenon. Further, the not-so-obvious issue with shorter segment lengths, which affect the results independently of the methods used, is also discussed. The more reliable and statistically acceptable Empirical Bayes methodology requires safety performance functions (SPFs), regression equations predicting the relation between crashes
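
    The Empirical Bayes approach referred to above combines the safety performance function (SPF) prediction with the site's observed crash count, weighted by the SPF's negative-binomial overdispersion parameter; this weighting is what damps the regression-to-the-mean bias of purely observed counts. A minimal sketch with hypothetical numbers:

      def eb_expected_crashes(observed, predicted, overdispersion):
          """Empirical Bayes estimate of long-term expected crashes at a site:
          a weighted average of the SPF prediction and the observed count,
          with weight w = 1 / (1 + k * predicted), k being the SPF's
          overdispersion parameter."""
          w = 1.0 / (1.0 + overdispersion * predicted)
          return w * predicted + (1.0 - w) * observed

      # A high observed count is pulled back toward the SPF prediction
      print(eb_expected_crashes(observed=12, predicted=4.0, overdispersion=0.5))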

  6. 3rd Tech DeltaSphere-3000 Laser 3D Scene Digitizer infrared laser scanner hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-02-01

    A laser hazard analysis and safety assessment was performed for the 3rd Tech model DeltaSphere-3000® Laser 3D Scene Digitizer infrared laser scanner, based on the 2000 version of the American National Standards Institute's Standard Z136.1, for the Safe Use of Lasers. The portable scanner system is used in the Robotic Manufacturing Science and Engineering Laboratory (RMSEL). This scanning system had been proposed as a demonstrator for a new application. The manufacturer lists the Nominal Ocular Hazard Distance (NOHD) as less than 2 meters. It was necessary that SNL validate this NOHD prior to its use as a demonstrator involving the general public. A formal laser hazard analysis is presented for the typical mode of operation in the current configuration, as well as a possible modified mode and alternative configuration.

  7. The value of integrating information from multiple hazards for flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas, integrating pluvial flooding, river flooding and failure of both small and large dams. The first part includes a review of basic concepts on flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national levels; however, a general methodology to perform a quantitative flood risk analysis including different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.

  8. Hazardous Waste

    MedlinePlus

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  9. The dilemma in prioritizing chemicals for environmental analysis: known versus unknown hazards.

    PubMed

    Anna, Sobek; Sofia, Bejgarn; Christina, Rudén; Magnus, Breitholtz

    2016-08-10

    A major challenge for society is to manage the risks posed by the many chemicals continuously emitted to the environment. All chemicals in production and use cannot be monitored, and science-based strategies for prioritization are essential. In this study we review available data to investigate which substances are included in environmental monitoring programs and published research studies reporting analyses of chemicals in Baltic Sea fish between 2000 and 2012. Our aim is to contribute to the discussion of priority setting in environmental chemical monitoring and research, which is closely linked to chemical management. In total, 105 different substances or substance groups were analyzed in Baltic Sea fish. Polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans (PCDD/Fs) and polychlorinated biphenyls (PCBs) were the most studied substances or substance groups. The majority, 87%, of all analyses comprised 20% of the substances or substance groups, whereas 46 substance groups (44%) were analyzed only once. Almost three quarters of all analyses regarded a POP substance (persistent organic pollutant). These results demonstrate that the majority of analyses on environmental contaminants in Baltic Sea fish concern a small number of already regulated chemicals. Legacy pollutants such as POPs pose a high risk to the Baltic Sea due to their hazardous properties. Yet, there may be a risk that prioritizations for chemical analyses are biased based on the knowns of the past. Such biases may lead to society failing to identify risks posed by as yet unknown hazardous chemicals. Alternative and complementary ways to identify priority chemicals are needed. More transparent communication between risk assessments performed as part of the risk assessment process within REACH and monitoring programs, and information on chemicals contained in consumer articles, would offer ways to identify chemicals for environmental analysis. PMID:27222376

  11. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

    2012-12-01

    ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters, and can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough to inform decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on the development of the prototype data system and demonstration data products, and example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki, the 2011 M6.3 Christchurch, and the 2011 M7.1 Van earthquakes, and several simulated

  12. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  13. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    SciTech Connect

    Augustoni, Arnold L.

    2005-09-01

    A laser safety and hazard analysis is presented for the Coherent® driven Acculite® laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter into the laser's NHZ during testing outside the trailer.
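
    The quantities named above follow from standard CW laser range equations: the minimum eyewear optical density is OD = log10(H/MPE), and the NOHD is the range at which the diverging beam's irradiance falls to the MPE. A sketch with illustrative values rather than the SRSS laser's actual parameters, neglecting the initial beam diameter and atmospheric attenuation:

      import math

      def od_min(exposure, mpe):
          """Minimum optical density for eyewear: OD = log10(H / MPE), with the
          worst-case exposure H and the MPE in the same units (e.g. W/cm^2)."""
          return math.log10(exposure / mpe)

      def nohd_m(power_w, divergence_rad, mpe_w_cm2):
          """Nominal Ocular Hazard Distance (m) for a CW beam: the range at
          which irradiance 4P / (pi (phi r)^2) drops to the MPE."""
          r_cm = (1.0 / divergence_rad) * math.sqrt(
              4.0 * power_w / (math.pi * mpe_w_cm2))
          return r_cm / 100.0

      # Illustrative: 1 W visible beam, 1 mrad divergence, MPE 2.55e-3 W/cm^2
      print(round(od_min(10.0, 2.55e-3), 1))          # eyewear OD
      print(round(nohd_m(1.0, 1e-3, 2.55e-3)), "m")   # hazard distance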

  14. Vertical Guidance Performance Analysis of the L1–L5 Dual-Frequency GPS/WAAS User Avionics Sensor

    PubMed Central

    Jan, Shau-Shiun

    2010-01-01

    This paper investigates the potential vertical guidance performance of the global positioning system (GPS)/wide area augmentation system (WAAS) user avionics sensor when the modernized GPS and Galileo are available. This paper will first investigate the airborne receiver code noise and multipath (CNMP) confidence (σair). The σair will be the dominant factor in the availability analysis of an L1–L5 dual-frequency GPS/WAAS user avionics sensor. This paper uses the MATLAB Algorithm Availability Simulation Tool (MAAST) to determine the required values for the σair, so that an L1–L5 dual-frequency GPS/WAAS user avionics sensor can meet the vertical guidance requirements of APproach with Vertical guidance (APV) II and CATegory (CAT) I over the conterminous United States (CONUS). A modified MAAST that includes the Galileo satellite constellation is used to determine under what user configurations WAAS could be an APV II system or a CAT I system over CONUS. Furthermore, this paper examines the combinations of possible improvements in signal models and the addition of Galileo to determine if the GPS/WAAS user avionics sensor could achieve a 10 m Vertical Alert Limit (VAL) within the service volume. Finally, this paper presents the future vertical guidance performance of the GPS user avionics sensor for the United States’ WAAS, the Japanese MTSAT-based satellite augmentation system (MSAS) and the European geostationary navigation overlay service (EGNOS). PMID:22319263

  15. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk-Based... period for the information collection related to the proposed rule on ``Current Good Manufacturing... information collection provisions that are subject to review by the Office of Management and Budget...

  16. Scalar and vector probabilistic seismic hazard analysis: application for Algiers City

    NASA Astrophysics Data System (ADS)

    Faouzi, Gherboudj; Nasser, Laouami

    2014-04-01

    This study deals with the application of probabilistic seismic hazard analysis (PSHA) for a rock site located in Algiers city. For this purpose, recent ground motion prediction equations developed worldwide for similar seismotectonic contexts are used through a logic tree in the PSHA framework; the obtained results clearly reflect the high seismicity of the considered region. Moreover, deaggregation analysis is conducted to obtain the mean scenario in terms of magnitude and distance. In addition to scalar PSHA, a new method named vector PSHA (V-PSHA), developed in recent years, is also applied in this study. Based on multivariate probability theory, the software used in the scalar approach is modified to allow the application of this approach to a real site in Algiers city with a vector of two and three intensity measure parameters. The results are presented in terms of the joint annual rate of exceeding several thresholds, such as PGA, PSA(T) at multiple vibration periods, peak ground velocity and Arias intensity, and a comparison between the results of scalar PSHA and V-PSHA is presented.
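
    At its core, scalar PSHA sums, over all discretized magnitude-distance scenarios of all sources, the scenario rate times the probability that a lognormal ground-motion model exceeds the target intensity. A toy sketch with placeholder GMPE coefficients and scenario rates, not those used for Algiers:

      import math

      # Toy GMPE for ln(PGA in g); coefficients are placeholders
      def gmpe_mean_ln(m, r_km):
          return -4.0 + 0.9 * m - 1.3 * math.log(r_km + 10.0)

      def p_exceed(im, m, r_km, sigma_ln=0.6):
          """P(IM > im | m, r) under a lognormal ground-motion model."""
          z = (math.log(im) - gmpe_mean_ln(m, r_km)) / sigma_ln
          return 0.5 * math.erfc(z / math.sqrt(2.0))

      def hazard_rate(im, scenarios):
          """Annual exceedance rate: sum of rate * P(IM > im | m, r)."""
          return sum(rate * p_exceed(im, m, r) for rate, m, r in scenarios)

      # Hypothetical (annual rate, magnitude, distance km) scenarios
      scenarios = [(0.05, 6.0, 20.0), (0.01, 7.0, 15.0), (0.002, 7.5, 10.0)]
      for im in (0.1, 0.2, 0.4):  # PGA thresholds in g
          print(im, hazard_rate(im, scenarios))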

  17. Bayesian analysis of uncertainty in predisposing and triggering factors for landslides hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Petropoulos, Y.; Chitu, Z.; Mihai, B.

    2012-04-01

    The landslide hazard analysis model takes into consideration both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology and land use. The latter is expressed not as land use classes, as for example CORINE, but as leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index was derived from Landsat time series images from 1984 up to 2011. All the images available for the Panatau administrative unit in Buzau County, Romania, have been downloaded from http://earthexplorer.usgs.gov, including images with cloud cover. The model is run at a monthly time step, and for each time step all parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created.
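
    The monthly updating scheme can be pictured as a repeated application of Bayes' rule: each month's observed factor state updates a cell's landslide probability, and the posterior becomes the next month's prior. A minimal sketch with hypothetical likelihood values:

      def bayes_update(prior, p_obs_given_slide, p_obs_given_stable):
          """Posterior landslide probability for one cell after observing a
          factor state (e.g. an LAI class or a rainfall threshold crossing)."""
          evidence = (p_obs_given_slide * prior
                      + p_obs_given_stable * (1.0 - prior))
          return p_obs_given_slide * prior / evidence

      p = 0.02  # hypothetical a priori landslide probability for a cell
      for month_likelihoods in [(0.7, 0.3), (0.8, 0.4), (0.9, 0.2)]:
          p = bayes_update(p, *month_likelihoods)
          print(round(p, 4))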

  18. A Laboratory Exercise for Compatibility Testing of Hazardous Wastes in an Environmental Analysis Course.

    ERIC Educational Resources Information Center

    Chang, J. C.; And Others

    1986-01-01

    Discusses a new program at the University of Michigan in hazardous waste management. Describes a laboratory demonstration that deals with the reactivity and potential violence of several reactions that may be encountered on a hazardous waste site. Provides criteria for selecting particular compatibility testing methods. (TW)

  19. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or a pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse the fluvial and the pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of
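
    Under the stated independence assumption, the annual probability of a combined event is the product of the two marginal annual probabilities; the seasonality treatment mentioned above then discounts for whether the two hazards can actually coincide in time. A minimal sketch of the independence step only:

      def joint_annual_probability(p_fluvial, p_pluvial):
          """Annual probability of a combined fluvial-pluvial event assuming
          independence; a further seasonal-coincidence factor, as used in the
          study, would reduce this value."""
          return p_fluvial * p_pluvial

      p = joint_annual_probability(0.1, 0.1)  # two 10-year hazards
      print("combined return period:", 1.0 / p, "years")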

  20. DOE Waste Treatability Group Guidance

    SciTech Connect

    Kirkpatrick, T.D.

    1995-01-01

    This guidance presents a method and definitions for aggregating U.S. Department of Energy (DOE) waste into streams and treatability groups based on characteristic parameters that influence waste management technology needs. Adaptable to all DOE waste types (i.e., radioactive waste, hazardous waste, mixed waste, sanitary waste), the guidance establishes categories and definitions that reflect variations within the radiological, matrix (e.g., bulk physical/chemical form), and regulated contaminant characteristics of DOE waste. Beginning at the waste container level, the guidance presents a logical approach to implementing the characteristic parameter categories as part of the basis for defining waste streams and as the sole basis for assigning streams to treatability groups. Implementation of this guidance at each DOE site will facilitate the development of technically defined, site-specific waste stream data sets to support waste management planning and reporting activities. Consistent implementation at all of the sites will enable aggregation of the site-specific waste stream data sets into comparable national data sets to support these activities at a DOE complex-wide level.

  1. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA): towards PTHA assessment for the coasts of Italy

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Tonini, Roberto; Molinari, Irene; Tiberti, Mara M.; Romano, Fabrizio; Grezio, Anita; Melini, Daniele; Piatanesi, Alessio; Basili, Roberto; Lorito, Stefano

    2016-04-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes. Unlike classical approaches, which commonly adopt the hazard integral and the logic tree, we use an event tree approach and ensemble modelling. The procedure was developed in the framework of the EC projects ASTARTE and STREST, of the Italian National Flagship project RITMARE, and of the agreement between the Italian Civil Protection and INGV. A total of about 2 × 10^7 different potential seismic sources covering the entire Mediterranean Sea, and more than 1 × 10^5 alternative model implementations, have been considered to quantify both the aleatory variability and the epistemic uncertainty. A set of hazard curves is obtained along the coasts of the entire Italian territory. They are the prototype of the first homogeneous Italian national SPTHA map.

  2. Hazard analysis of active tectonics through geomorphometric parameters to cultural heritage conservation: the case of Paphos in Cyprus

    NASA Astrophysics Data System (ADS)

    Argyriou, A. V.; Sarris, A.; Alexakis, D.; Agapiou, A.; Themistocleous, K.; Lysandrou, V.; Hadjimitsis, D.

    2014-08-01

    Natural hazards, such as earthquakes, can have a large destructive effect on the conservation of cultural heritage sites. This study aims to assess, from a geospatial perspective, the risk from natural hazards for archaeological sites and monuments, and to evaluate the potential impact of tectonic activity on cultural and historic heritage. Geomorphometric derivatives extracted from Digital Elevation Models (DEMs) provide information relevant to active tectonics. When this extracted tectonic information is combined through an analytical hierarchy process and a weighted linear combination approach, it offers a robust assessment tool. Ranking the derived information against weighted criteria highlights the interrelationships among neotectonic features. The outcomes of this methodological framework provide an assessment of the spatial distribution of neotectonic activity and can become a useful tool for assessing seismic hazard for disaster risk reduction. The risk assessment aspects of such a hazard are interlinked with the archaeological sites in order to highlight and examine those exposed to ongoing tectonic activity and seismic hazard. The Paphos area in Cyprus was used as the test bed for this analysis. The results show that a substantial number of archaeological sites are located within zones of a high degree of neotectonic activity.

  3. Non-Participation in Guidance: An Opportunity for Development?

    ERIC Educational Resources Information Center

    Thomsen, Rie

    2014-01-01

    This article discusses how new opportunities for guidance can emerge from an analysis of the interplay between the participation (or lack of participation) of the individuals in career guidance, and the career guidance practitioner's response. The article suggests critical psychology as a framework for career guidance research and presents…

  4. Hazardous materials car placement in a train consist. Volume 1. Review and analysis. Final report, 18 Feb-17 Nov 88

    SciTech Connect

    Thompson, R.E.; Zamejc, E.R.; Ahlbeck, D.R.

    1992-06-01

    In response to major derailments involving hazardous materials cars, the Federal Railroad Administration (FRA) initiated a review of the consequences of hazardous materials car placement in a train consist. The review and analysis consisted of six task items: (1) review of accident trends and regulations, (2) an analysis of hazardous materials compatibility, (3) railroad operational constraints, (4) a cost/benefit analysis, (5) recommendations, and (6) preparation of a final report. A review of the 1982-1985 Railroad Accident/Incident Reporting System (RAIRS) data showed the rear quarter to be statistically the 'safest' location in a mainline freight train. Also, the top 101 hazardous commodities (by volume of movement) plus fuming nitric acid were analyzed for chemical incompatibility, a total of 5,151 binary combinations. Consequence-based and risk-based rankings were established. Calculations established a post-derailment separation distance of 40 meters minimum to prevent mixing of incompatible chemicals. It was noted that mixing of hazardous materials was not cited in any NTSB accident report as a specific problem.

  5. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach

    PubMed Central

    Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark

    2016-01-01

    Abstract Objective To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. Methods A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. Findings The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. Conclusion The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease. PMID:27274594

  6. Guidance for performing site inspections under CERCLA

    SciTech Connect

    Not Available

    1992-09-01

    This guidance presents EPA's site inspection (SI) strategy. The strategy discusses procedural guidelines to investigate potential Superfund (CERCLA) sites for evaluation pursuant to the Hazard Ranking System (HRS), revised in accordance with the Superfund Amendments and Reauthorization Act of 1986. The HRS is the primary means by which EPA evaluates sites for Superfund's National Priorities List (NPL).

  7. Portable total reflection x-ray fluorescence analysis in the identification of unknown laboratory hazards

    SciTech Connect

    Liu, Ying; Imashuku, Susumu; Sasaki, Nobuharu; Ze, Long; Kawai, Jun; Takano, Shotaro; Sohrin, Yoshiki; Seki, Hiroko; Miyauchi, Hiroya

    2014-05-15

    In this study, a portable total reflection x-ray fluorescence (TXRF) spectrometer was used to analyze unknown laboratory hazards that precipitated on exterior surfaces of cooling pipes and fume hood pipes in chemical laboratories. With the aim to examine the accuracy of TXRF analysis for the determination of elemental composition, analytical results were compared with those of wavelength-dispersive x-ray fluorescence spectrometry, scanning electron microscope and energy-dispersive x-ray spectrometry, energy-dispersive x-ray fluorescence spectrometry, inductively coupled plasma atomic emission spectrometry, x-ray diffraction spectrometry (XRD), and x-ray photoelectron spectroscopy (XPS). Detailed comparison of data confirmed that the TXRF method itself was not sufficient to determine all the elements (Z > 11) contained in the samples. In addition, results suggest that XRD should be combined with XPS in order to accurately determine compound composition. This study demonstrates that at least two analytical methods should be used in order to analyze the composition of unknown real samples.

  8. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    SciTech Connect

    Grivas, D.A.; Schultz, B.C.; O'Neil, G.; Rizkalla, M.; McGuffey, V.C.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement, which may result in cumulative displacements over the pipeline design life (30-40 years) in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well studied, instrumented installations to cases where data are limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data are available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.
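
    Probabilistic slope stability of the kind described is often evaluated by Monte Carlo simulation of a limit-equilibrium model: sample the uncertain soil parameters, compute the factor of safety, and count the fraction of realizations below 1. A sketch using an infinite-slope model with hypothetical parameter distributions; the paper's models and data are site-specific:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # Hypothetical uncertain soil parameters
      cohesion = rng.normal(12.0, 3.0, n)          # kPa
      phi = np.radians(rng.normal(28.0, 4.0, n))   # friction angle (rad)
      gamma, depth, slope = 19.0, 3.0, np.radians(25.0)  # kN/m3, m, rad

      # Factor of safety of a dry infinite slope with a planar failure surface
      tau = gamma * depth * np.sin(slope) * np.cos(slope)
      sigma_n = gamma * depth * np.cos(slope) ** 2
      fs = (cohesion + sigma_n * np.tan(phi)) / tau

      print("P(failure) =", np.mean(fs < 1.0))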

  9. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.

  10. Descriptive characteristics and cluster analysis of male veteran hazardous drinkers in an alcohol moderation intervention.

    PubMed

    Walker, Robrina; Hunt, Yvonne M; Olivier, Jake; Grothe, Karen B; Dubbert, Patricia M; Burke, Randy S; Cushman, William C

    2012-01-01

    Current efforts underway to develop the fifth edition of the Diagnostic and Statistical Manual (DSM-5) have reignited discussions for classifying the substance use disorders. This study's aim was to contribute to the understanding of abusive alcohol use and its validity as a diagnosis. Cluster analysis was used to identify relatively homogeneous groups of hazardous, nondependent drinkers by using data collected from the Prevention and Treatment of Hypertension Study (PATHS), a multisite trial that examined the ability of a cognitive-behavioral-based alcohol reduction intervention, compared to a control condition, to reduce alcohol use. Participants for this study (N = 511) were male military veterans. Variables theoretically associated with alcohol use (e.g., demographic, tobacco use, and mental health) were used to create the clusters, and a priori, empirically based external criteria were used to assess discriminant validity. Bivariate correlations among cluster variables were generally consistent with previous findings in the literature. Analyses of internal and discriminant validity of the identified clusters were largely nonsignificant, suggesting meaningful differences between clusters could not be identified. Although the typology literature has contributed supportive validity for the alcohol dependence diagnosis, this study's results do not lend supportive validity for the construct of alcohol abuse. PMID:22691012

  11. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt a uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Unlike a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed from pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.
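
    Empirical relations of the kind described typically take the form log10(D) = a + b·M. The paper's East Asia coefficients are not given in this record, so the sketch below uses the well-known Wells and Coppersmith (1994) all-slip-type values purely as placeholders:

      def max_fault_displacement_m(m, a=-5.46, b=0.82):
          """Maximum fault displacement (m) from magnitude via
          log10(D) = a + b * M; default coefficients are the Wells and
          Coppersmith (1994) all-slip-type values, used as placeholders
          for the region-specific regression described in the paper."""
          return 10.0 ** (a + b * m)

      for m in (6.5, 7.0, 8.0):
          print(m, round(max_fault_displacement_m(m), 2), "m")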

  12. Validation of acid washes as critical control points in hazard analysis and critical control point systems.

    PubMed

    Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E

    2000-12-01

    A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and on ground beef made from acid-washed carcasses. Total mesophilic, psychrotrophic, coliforms, generic Escherichia coli, lactic acid bacteria, pseudomonads, and acid-tolerant microorganisms were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads that were present at very low numbers before acid washing. All other counts continued to stay significantly lower (P < 0.05) than those on pre-acid-washed carcasses throughout all processing steps. Total bacteria, coliforms, and generic E. coli enumerated on ground beef samples were more than 1 log cycle lower than those reported in the U.S. Department of Agriculture Baseline data. This study suggests that acid washes may be effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing. PMID:11131890

  13. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.; Winner, David R.

    2010-01-01

    This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI), specific components, sub-assemblies, and/or piece parts, of the NASA ARES I Upper Stage (US) project. SCIs will be designed, developed and /or procured by Boeing as an end item or an off-the-shelf item. Objectives include explaining the methodology, tools, stakeholders and products involved in development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the ARES I Upper Stage project.

  14. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking includes docking for assembly of space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for orbiter with crew and passenger, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

  15. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  16. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    PubMed

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a

  17. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    NASA Astrophysics Data System (ADS)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    Mass movements are analyzed in order to reconstruct the complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situation and dynamics of the slope movements; for this purpose, geomorphological mapping, sediment characterization, and geophysical methods were applied. On the one hand, detailed sediment characterization serves to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and on movement processes within the slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models generated before the onset of slope movements are integrated into the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of the endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, the field data will serve as basic information for further monitoring plans. The resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  18. Analysis of Earthquake Hazard and Perceptibility Study in Çanakkale, NW Turkey

    NASA Astrophysics Data System (ADS)

    Bekler, Tolga

    2010-05-01

    Bearing in mind the epicenter distribution of earthquakes in and around the city of Çanakkale and the associated faulting, a hazard study was performed for the region in 4 seismic sub-zones. Using the data for 4933 earthquakes with magnitudes M≥3 that took place between 1903 and 2009, the seismic hazard input parameters a and b were calculated from empirical relations. The probabilities of earthquake occurrence and the recurrence periods were obtained using the Poisson and Gumbel extreme-value statistical distribution models. In order to determine the regions where strong and destructive earthquakes may occur, distribution maps of the a, b and a/b values were prepared. For Çanakkale and its surroundings as a whole, the values of a and b were determined to be 5.85 and 0.80, respectively. According to the analysis of the seismic sub-regions, the region with the greatest risk of earthquake occurrence is the one encompassing Gelibolu-Tekirdag-West Marmara Sea. The probability of occurrence of a magnitude 7.3 earthquake within a 100-year period was determined to be 46% according to the Poisson distribution and 71% according to the Gumbel distribution, and the recurrence period was calculated at the 50% probability level. The region extending NW-SE between the north of Edremit Bay, the west of Saros Bay, and Yenice-Gönen is where the seismic hazard is high. When the a/b modal distribution maps are analyzed against the b-value tectonic parameter of the study region, comparatively high a/b values are observed at low b values. When the earthquakes with magnitudes M≥3.0 that have taken place within the last 100 years are analyzed, the regions around the Saroz-Gazikoy and Yenice-Gonen faults are those where the b value has decreased relatively and where the risk of earthquake occurrence has increased; the b values range between 1.0 and 1.15 in this region. High magnitude earthquakes have not been
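
    As a rough illustration of the Poisson part of such an analysis, the sketch below (Python) converts Gutenberg-Richter a and b values into an annual exceedance rate, a return period, and the probability of at least one event in a design window. The catalogue-length normalization and all inputs are illustrative assumptions; the paper's sub-zone computations use inputs not quoted in the abstract.

      import math

      def gr_annual_rate(a, b, m, catalogue_years):
          # Annual rate of events with magnitude >= m from log10(N) = a - b*m,
          # assuming N is the count over the whole catalogue period.
          return 10 ** (a - b * m) / catalogue_years

      def poisson_exceedance(rate, t_years):
          # Probability of at least one event in t_years for a Poisson process.
          return 1.0 - math.exp(-rate * t_years)

      rate = gr_annual_rate(a=5.85, b=0.80, m=7.3, catalogue_years=106.0)
      print(f"return period ~ {1.0 / rate:.0f} yr, "
            f"P(>=1 event in 100 yr) = {poisson_exceedance(rate, 100.0):.2f}")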

  19. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards remain considerable, despite the ongoing debate on whether the overall trend is increasing or decreasing. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard-process side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments; they bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined using empirical functions: relations between process intensities and the extent of losses, derived from the analysis of historic hazard events and from object-specific restoration values. This approach is not a physics-based, integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are modelled experimentally, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively and in a spatially distributed manner using a large set of force transducers. The experimental tests are carried out with artificial vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  20. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches, and earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system: using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME are stored in a digital library system, the Storage Resource Broker (SRB), which provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy
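
    For readers unfamiliar with the computation that PSHA codes such as OpenSHA perform, the following sketch shows the standard hazard-curve integrand: a rupture forecast supplies rupture rates, an attenuation relationship supplies a lognormal ground-motion distribution, and the annual exceedance rate is summed over ruptures. The two-rupture forecast and all numbers are hypothetical; this is not OpenSHA's API, only the underlying arithmetic.

      import math
      from dataclasses import dataclass

      @dataclass
      class Rupture:
          annual_rate: float   # mean annual rate from the rupture forecast
          median_im: float     # median ground motion at the site (g)
          sigma_ln: float      # lognormal sigma of the ground-motion model

      def p_exceed_given_rupture(r, x):
          # P(IM > x | rupture) under a lognormal ground-motion distribution.
          z = (math.log(x) - math.log(r.median_im)) / r.sigma_ln
          return 0.5 * math.erfc(z / math.sqrt(2.0))

      def hazard_curve_point(ruptures, x, t_years=50.0):
          # Annual exceedance rate, then the t-year Poisson probability.
          lam = sum(r.annual_rate * p_exceed_given_rupture(r, x) for r in ruptures)
          return lam, 1.0 - math.exp(-lam * t_years)

      erf = [Rupture(0.01, 0.25, 0.6), Rupture(0.002, 0.45, 0.6)]
      lam, p50 = hazard_curve_point(erf, x=0.3)
      print(f"lambda(PGA > 0.3 g) = {lam:.5f}/yr, P(50 yr) = {p50:.3f}")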

  1. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    USGS Publications Warehouse

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth, and it would be the preferred method of the three for evaluating landslide hazards on a regional scale in areas prone to rain-induced landslides, as it considers both the transient and the spatial response of pore pressure in its calculation of slope stability. The stability calculations used in SINMAP and LISA are similar and utilize probability distribution functions for certain parameters. Unlike SINMAP, which considers only soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is preferred over SINMAP for evaluating slope stability. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system was above the failure plane, and consequently root strength and tree surcharge had negligible effect on slope stability. The finding that the water table ultimately rose to near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
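
    The stability index in all three models derives from a limit-equilibrium factor of safety. A minimal infinite-slope version with slope-parallel seepage is sketched below; the function name and all parameter values are illustrative, chosen only to be in the range reported above.

      import math

      def infinite_slope_fs(c_kpa, phi_deg, slope_deg, z_m, zw_m,
                            gamma=19.0, gamma_w=9.81):
          # Infinite-slope factor of safety with slope-parallel seepage:
          # c_kpa    effective cohesion (kPa)
          # phi_deg  effective friction angle (deg)
          # z_m      depth of the failure plane (m)
          # zw_m     water-table height above the failure plane (0..z_m)
          # gamma    soil unit weight (kN/m^3)
          beta, phi = math.radians(slope_deg), math.radians(phi_deg)
          normal_eff = (gamma * z_m - gamma_w * zw_m) * math.cos(beta) ** 2
          driving = gamma * z_m * math.sin(beta) * math.cos(beta)
          return (c_kpa + normal_eff * math.tan(phi)) / driving

      # ~1 m of weak colluvium (c < 2 kPa) on a 30-degree slope:
      print(infinite_slope_fs(1.5, 35.0, 30.0, 1.0, zw_m=0.0))  # dry: FS ~ 1.4
      print(infinite_slope_fs(1.5, 35.0, 30.0, 1.0, zw_m=1.0))  # saturated: FS < 1

    The drop of the factor of safety below 1.0 when the water table reaches the surface mirrors the storm conditions inferred for the 1995 event.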

  2. A Critical Analysis of UNESCO's International Technical Guidance on School-Based Education for Puberty and Sexuality

    ERIC Educational Resources Information Center

    Goldman, Juliette D. G.

    2012-01-01

    Preparing children and adolescents for sexual safety and reproductive responsibility lies at the heart and purpose of puberty/sexuality education. The document of International Technical Guidance released by UNESCO in December 2009 aims to provide an evidence-based and rights-based platform offering children and adolescents vital knowledge about…

  3. 77 FR 9946 - Draft Guidance for Industry on Drug Interaction Studies-Study Design, Data Analysis, Implications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ... Federal Register of September 12, 2006 (71 FR 53696), FDA announced the availability of a draft guidance... in vivo studies of drug metabolism, drug transport, and drug-drug, or drug-therapeutic protein... metabolism and/or drug transport abruptly in individuals who previously had been receiving and tolerating...

  4. Analysis of targeting method for closed-loop guidance of a multi-stage space launch vehicle

    NASA Astrophysics Data System (ADS)

    Song, Eun-Jung; Cho, Sangbum; Roh, Woong-Rae

    2016-04-01

    This study considers the orbit insertion targeting problem for guidance of a multi-stage space launch vehicle. To design a guidance algorithm for a particular mission target, the orbit designation method should be determined first, as the target orbit of closed-loop guidance during upper-stage flight is usually specified at the end of the final stage. We investigate some advantages of this single-target approach over one with intermediate targets in addition to the final target by comparing them in various flight conditions using an explicit guidance algorithm. We note that in some cases application of single targeting can be rather tricky, and we therefore propose an extended version of the single-targeting algorithm that can be applied to problems where the target orbit plane has to be changed during upper-stage flight. We also demonstrate, by conducting numerical simulations for disturbed off-nominal conditions as well as for the nominal condition, that this pseudo-single-target method handles these kinds of problems smoothly, in contrast to conventional single-target algorithms.

  5. Development and implementation of hazard analysis and critical control point plans by several U.S. feed manufacturers.

    PubMed

    Herrman, Timothy J; Langemeier, Michael R; Frederking, Matt

    2007-12-01

    The commitment to consumer food safety, global trade, and proposed new regulations by the U.S. Food and Drug Administration Center for Veterinary Medicine has led to increased adoption of hazard analysis and critical control point (HACCP) systems by the U.S. feed industry. A project supported by the U.S. Department of Agriculture Integrated Food Safety Initiative, titled "Development and Implementation of a Voluntary HACCP Program for the US Feed Industry", enabled faculty from three land-grant universities to assist individuals from 14 feed companies that collectively manufacture 15 million metric tons of feed in 100 facilities to develop HACCP plans. The process flow in these plans averaged 20 steps, and the most detailed plan included 60 process steps. Chemical hazards were more commonly identified in HACCP plans (average of four hazards per plan) than were biological hazards (average of one per plan). The most prevalent chemical hazards were cross-contamination of type A medicated articles and type B medicated feeds, aflatoxin, and wrong-ingredient inclusion in feed. The most common biological hazard was mammalian protein contamination of feed ingredients and finished feed for cattle. An assessment of the time and costs associated with developing HACCP plans revealed that approximately 29% of the companies needed additional personnel or equipment to implement a HACCP plan, and on average 268 additional person-hours were needed to develop and implement a plan. Plan design, compliance monitoring, and record keeping were the three most time-consuming activities in developing and implementing a HACCP plan. The average cost of additional equipment needed to implement a HACCP plan was $250.

  6. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard analysis is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop these results, a summary of important application issues is given, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  7. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, Northwest Italy

    NASA Astrophysics Data System (ADS)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. K.; Mason, P. J.

    2015-09-01

    The study area (600 km2), consisting of the Orco and Soana valleys in the Western Italian Alps, has experienced different types of natural hazards typical of the whole Alpine environment. Some of the authors were asked to draw up a civil protection plan for this mountainous region. This offered a special opportunity (1) to draw on a wealth of unpublished historical data, dating back several centuries and mostly concerning natural hazard processes and related damage, (2) to develop original detailed geomorphological studies in a region still poorly known, (3) to prepare detailed thematic maps illustrating landscape components related to natural conditions and hazards, (4) to thoroughly check present-day situations in the area against the effects of past events and (5) to find adequate natural hazard scenarios for all sites exposed to risk. The method of work has essentially been to compare archival findings with field evidence in order to assess natural hazard processes, their occurrence and magnitude, and to arrange all such elements in a database for GIS-supported thematic maps. Several types of natural hazards, such as landslides, rockfalls, debris flows, stream floods and snow avalanches, cause huge damage to lives and property (housing, roads, tourist sites). We aim both to acquire new knowledge of this large, still poorly understood area and to develop easy-to-interpret products such as natural risk maps.

  8. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As random events, natural disasters have complex occurrence mechanisms, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance, and the current deficiencies, of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, while the three-dimensional Frank copula function fitted better at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme-value fitting. We therefore suggest that the multivariate analysis method be adopted in the forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters.
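
    To make the copula mechanics concrete, here is a minimal sketch of a bivariate Frank copula and the "AND-case" joint return period, in which both hazard factors exceed their respective quantiles. The dependence parameter theta, the quantile levels, and the mean interarrival time are placeholders, not values from the paper.

      import math

      def frank_copula(u, v, theta):
          # Bivariate Frank copula C(u, v; theta), theta != 0.
          num = math.expm1(-theta * u) * math.expm1(-theta * v)
          return -math.log1p(num / math.expm1(-theta)) / theta

      def joint_return_period_and(u, v, theta, mu_years):
          # Return period of BOTH factors exceeding their quantiles ('AND' case);
          # mu_years is the mean interarrival time between events.
          p_joint = 1.0 - u - v + frank_copula(u, v, theta)  # P(U > u, V > v)
          return mu_years / p_joint

      # Both factors at their 0.9 quantiles, moderate positive dependence,
      # about four severe events per year on average:
      print(joint_return_period_and(0.9, 0.9, theta=5.0, mu_years=0.25))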

  10. Endangerment assessment guidance. Final report

    SciTech Connect

    Not Available

    1985-10-21

    The directive clarifies the requirement that an endangerment assessment be developed to support all administrative and judicial enforcement actions under Section 106 of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Section 7003 of the Resource Conservation and Recovery Act (RCRA). Before taking enforcement action under these provisions to abate the hazards or potential hazards at a site, the Environmental Protection Agency (EPA) must be able to properly document and justify its assertion that an imminent and substantial endangerment to public health or welfare or the environment may exist. The endangerment assessment provides the documentation and justification. The endangerment assessment is not necessary to support Section 104 actions. It also provides guidance on the content, timing, level of detail, format, and resources required for the preparation of endangerment assessments.

  11. Automation of statistical analysis in the WIPP hazardous waste facility permit for analytical results from characterization

    SciTech Connect

    Shokes, T.; Einerson, J.

    2007-07-01

    One goal of characterizing, processing, and shipping waste to the Waste Isolation Pilot Plant (WIPP) is to make all activities as efficient as possible. Data management and repetitive calculations are a critical part of the process that can be automated, thereby increasing the accuracy and the rate at which work is completed and reducing costs. This paper presents the tools developed to automate the statistical analysis and other calculations required by the WIPP Hazardous Waste Facility Permit (HWFP). Statistical analyses are performed on analytical results from gas samples from the headspace of waste containers and from solid samples from the core of the waste container. The calculations include determining the number of samples, a test for the shape of the distribution of the analytical results, the mean, the standard deviation, the upper 90-percent confidence limit of the mean, and the minimum required Waste Acceptance Plan (WAP) sample size. The input data for these calculations come from the batch data reports for headspace-gas and solids analytical results, which must also be obtained and collated for proper use. The most challenging component of the statistical analysis, if performed manually, is the determination of the distribution shape; therefore, the distribution testing is typically performed using a certified software tool. All other calculations can be completed manually, with a spreadsheet, with custom-developed software, and/or with a certified software tool. Of the options available, performing the calculations manually or using a spreadsheet is the least desirable: these methods rely heavily on the availability of an expert, such as a statistician, and they are more open to human error such as transcription or 'cut and paste' errors. A SAS program is in the process of being developed to perform the calculations. Due to the potential size of the data input files and the need to archive the data in an accessible format, the SAS
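
    A minimal sketch of the repetitive WAP statistics described above (sample count, distribution-shape test, mean, standard deviation, and one-sided upper 90-percent confidence limit of the mean) might look as follows in Python with NumPy/SciPy. The input concentrations are hypothetical, and this is not the SAS program discussed in the paper.

      import numpy as np
      from scipy import stats

      def wap_summary(x, conf=0.90):
          # Mean, standard deviation, one-sided upper confidence limit of the
          # mean, and a normality (distribution-shape) test for one data set.
          x = np.asarray(x, dtype=float)
          n, mean, sd = x.size, x.mean(), x.std(ddof=1)
          ucl = mean + stats.t.ppf(conf, n - 1) * sd / np.sqrt(n)
          shape_p = stats.shapiro(x).pvalue   # low p -> not normally distributed
          return {"n": n, "mean": mean, "sd": sd, "ucl": ucl, "shape_p": shape_p}

      # Hypothetical headspace-gas results (ppmv) from one batch data report:
      print(wap_summary([102.0, 97.5, 110.2, 95.1, 104.8, 99.3]))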

  12. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    SciTech Connect

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis; the federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices, and similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations, because the regulations change constantly and this document may not reflect the latest regulatory requirements; requirements are also expected to change as various permits are issued. The data needs call for samples of both radionuclide and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not health and safety issues: this document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  13. Analysis of Landslide Hazard Impact Using the Landslide Database for Germany

    NASA Astrophysics Data System (ADS)

    Klose, M.; Damm, B.

    2014-12-01

    The Federal Republic of Germany has long been among the few European countries that lack a national landslide database. Systematic collection and inventorying of landslide data have a long research history in Germany, but one focused on the development of databases with only local or regional coverage. This has changed in recent years with the launch of a database initiative aimed at closing the data gap existing at the national level. The present contribution reports on this project, which is based on a landslide database that evolved over the last 15 years into a database covering large parts of Germany. A strategy of systematic retrieval, extraction, and fusion of landslide data is at the heart of the methodology, providing the basis for a database with broad application potential. The database offers a data pool of more than 4,200 landslide data sets with over 13,000 single data files, dating back to the 12th century. All types of landslides are covered by the database, which stores not only core attributes but also various complementary data, including data on landslide causes, impacts, and mitigation. The current database migration to PostgreSQL/PostGIS is focused on unlocking the full scientific potential of the database while enabling data sharing and knowledge transfer via a web GIS platform. In this contribution, the goals and the research strategy of the database project are highlighted first, with a summary of best practices in database development providing perspective. Next, the focus is on key aspects of the methodology, followed by the results of different case studies in the German Central Uplands. The case study results exemplify database application in the analysis of vulnerability to landslides, impact statistics, and hazard or cost modeling.

  14. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslide and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium- or large-scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium-scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height, and increase in mass by erosion, resulting in the estimation of various forms of impact exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading, and deposition processes of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well-documented areas with known past debris flow events.
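
    For a sense of how a Voellmy rheology controls such a flow, the sketch below gives two classical closed-form point-mass results: the steady velocity on a transit slope and the stopping distance on a flat deposition zone. This is a drastic simplification of a 2-D continuum model like AschFlow; the friction coefficient mu, turbulence coefficient xi, and flow depth are illustrative values only.

      import math

      def voellmy_terminal_velocity(slope_deg, mu, xi, h, g=9.81):
          # Steady velocity where gravity balances Coulomb friction (mu) plus
          # turbulent drag g*v^2/(xi*h); requires tan(slope) > mu.
          theta = math.radians(slope_deg)
          drive = math.sin(theta) - mu * math.cos(theta)
          return math.sqrt(xi * h * drive) if drive > 0 else 0.0

      def voellmy_stopping_distance(v0, mu, xi, h, g=9.81):
          # Closed-form run-out on a horizontal deposition zone.
          return (xi * h / (2.0 * g)) * math.log(1.0 + v0 ** 2 / (mu * xi * h))

      v = voellmy_terminal_velocity(20.0, mu=0.15, xi=500.0, h=1.0)  # ~10 m/s
      print(voellmy_stopping_distance(v, mu=0.15, xi=500.0, h=1.0))  # ~22 m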

  15. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. It allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these inputs depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest; because relevant global databases may be underpopulated or nonexistent, uncertainty in the probability estimates may be large. Our 'hybrid' approach of combining local and global monitoring data with expert judgment facilitates discussion and constructive debate between disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences in opinion between response team members contribute to defining the uncertainty in the probability estimates. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds; creation of an initial tree is often prompted by a change in monitoring data, such that a rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
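
    The arithmetic at the core of an event tree is simply the chaining of conditional probabilities along a branch. A toy sketch with hypothetical consensus values (the node structure and numbers are invented for illustration):

      # One branch of a simple eruption event tree; each node conditions on the
      # previous one, and the values are hypothetical consensus probabilities.
      p_magmatic = 0.7    # P(unrest is magmatic | unrest observed)
      p_eruption = 0.3    # P(eruption within the window | magmatic unrest)
      p_vei_ge_3 = 0.2    # P(VEI >= 3 | eruption)

      p_branch = p_magmatic * p_eruption * p_vei_ge_3
      print(f"P(VEI >= 3 eruption within the window) = {p_branch:.3f}")  # 0.042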

  16. Childhood cancer incidence rates and hazardous air pollutants in California: an exploratory analysis.

    PubMed Central

    Reynolds, Peggy; Von Behren, Julie; Gunier, Robert B; Goldberg, Debbie E; Hertz, Andrew; Smith, Daniel F

    2003-01-01

    Hazardous air pollutants (HAPs) are compounds shown to cause cancer or other adverse health effects. We analyzed population-based childhood cancer incidence rates in California (USA) from 1988 to 1994, by HAP exposure scores, for all California census tracts. For each census tract, we calculated exposure scores by combining cancer potency factors with outdoor HAP concentrations modeled by the U.S. Environmental Protection Agency. We evaluated the relationship between childhood cancer rates and exposure scores for 25 potentially carcinogenic HAPs emitted from mobile, area, and point sources and from all sources combined. Our study period saw 7,143 newly diagnosed cancer cases in California; of these, 6,989 (97.8%) could be assigned to census tracts and included in our analysis. Using Poisson regression, we estimated rate ratios (RRs) adjusted for age, race/ethnicity, and sex. We found little evidence for elevated cancer RRs for all sites or for gliomas among children living in high-ranking combined-source exposure areas. We found elevated RRs and a significant trend with increasing exposure level for childhood leukemia in tracts ranked highest for exposure to the combined group of 25 HAPs (RR = 1.21; 95% confidence interval, 1.03, 1.42) and in tracts ranked highest for point-source HAP exposure (RR = 1.32; 95% confidence interval, 1.11, 1.57). Our findings suggest an association between increased childhood leukemia rates and high HAP exposure, but studies involving more comprehensive exposure assessment and individual-level exposure data will be important for elucidating this relationship. PMID:12676632
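
    Rate ratios of the kind reported here are commonly estimated with a Poisson regression whose offset is the log of person-time. A minimal sketch using statsmodels, with hypothetical tract-level counts (a real analysis would add age, race/ethnicity, and sex terms as the authors did):

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical tract-level data: case counts, person-years at risk, and an
      # indicator for the highest exposure-score ranking.
      cases = np.array([12, 8, 15, 30, 22, 35])
      person_years = np.array([5e4, 4e4, 6e4, 9e4, 7e4, 1e5])
      high_exposure = np.array([0., 0., 0., 1., 1., 1.])

      X = sm.add_constant(high_exposure)
      fit = sm.GLM(cases, X, family=sm.families.Poisson(),
                   offset=np.log(person_years)).fit()
      rr = np.exp(fit.params[1])          # rate ratio, high vs. low exposure
      lo, hi = np.exp(fit.conf_int()[1])  # 95% confidence interval
      print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")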

  17. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  18. Topographic and Remote Sensing Analysis of Explosion Craters in the Nevada Test Site: Application to Mars Hazards

    NASA Astrophysics Data System (ADS)

    Hurtado, J. M.; Chee, Y.; Farr, T. G.

    2004-12-01

    High-resolution LIDAR (LIght Detection and Ranging) data were recently acquired for several explosion craters at the Nevada Test Site (NTS) for the purpose of generating high-resolution (1- to 2-m horizontal posting; 50-cm vertical accuracy) digital elevation models (DEMs). The LIDAR DEMs and ancillary datasets, including ASTER visible-IR images and SIR-C radar images, are to be used in the study of cratering processes. Among the questions we seek to answer is whether these explosion craters can be used as analogs for impact craters on Mars. To this end, we present an error analysis of the LIDAR DEMs and an assessment of their utility in hazard avoidance applications for landers and rovers. We show preliminary data integration results in which we geologically characterize the multisensor signatures of the craters and their ejecta. These results are interpreted in terms of hazard zones defined by different types of ejecta, rock populations, and topographic characteristics (e.g. slope, aspect, relief). From these, we aim to produce a quantitative hazard map for craters in the NTS that can serve as a model for the type of information LIDAR and imagery can provide to a hazard avoidance system. All data and derived products are being manipulated and cataloged in a geographic information system (GIS), which will facilitate the packaging and distribution of materials via the Planetary Data System or other community data repositories. Work performed under contract to NASA.

  19. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  20. 2002 Hyperspectral Analysis of Hazardous Waste Sites on the Savannah River Site

    SciTech Connect

    Gladden, J.B.

    2003-08-28

    Hazardous waste site inspection is a labor-intensive, time-consuming job, performed primarily on the ground using visual inspection and instrumentation. Continually monitoring hazardous waste and/or landfill sites to determine whether they are maintaining their integrity is an expensive process. In certain instances, it may be possible to monitor aspects of the hazardous waste sites and landfills remotely. The utilization of multispectral data has been suggested for the mapping of clays and iron oxides associated with contaminated groundwater, vegetation stress, and methane gas emissions (which require longer-wavelength detectors). The Savannah River Site (SRS) near Aiken, S.C. is a United States Department of Energy facility operated by the Westinghouse Savannah River Company. For decades the SRS was responsible for developing weapons-grade plutonium and other materials for the nation's nuclear defense, and hazardous waste was generated during this process. Waste storage site inspection is a particularly important issue at the SRS because there are over 100 hazardous waste sites scattered throughout the 300-square-mile complex, making it difficult to continually monitor all of the facilities. The goal is to use remote sensing technology to identify surface anomalies on the hazardous waste sites as early as possible so that remedial work can take place rapidly to maintain the integrity of the storage sites. The anomalous areas are then targeted for intensive in situ human examination and measurement. During the 1990s, many of the hazardous waste sites were capped with protective layers of polyethylene sheeting and soil, and planted with bahia grass and/or centipede grass. This research investigated hyperspectral remote sensing technology to determine if it can be used to measure accurately and monitor possible indicators of change on vegetated hazardous waste sites. Specifically, it evaluated the usefulness of hyperspectral remote sensing to assess the condition of vegetation on clay

  1. The RiskScape System - a tool for quantitative multi-risk analysis for natural hazards.

    NASA Astrophysics Data System (ADS)

    Schmidt, J.; Reese, S.; Matcham, I.; King, A.; Bell, R.

    2009-04-01

    This paper introduces a generic framework for multi-risk modelling developed in the project 'Regional RiskScape' at the Research Organization GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand. Our goal was to develop a generic technology for modelling risks from multiple natural hazards and for multiple risk elements. The framework is independent of the specific nature of the individual hazard and individual risk element. A software prototype has been developed which is capable of 'plugging in' various natural hazards and risk elements without reconfiguring or adapting the generic software framework. To achieve that goal we developed a set of standards for treating the fundamental components of a risk model: hazards, assets (risk elements), and vulnerability models (or fragility functions). The prototype system is thus able to understand any hazard, asset, or fragility model that is provided to it according to those standards. We tested the software prototype by modelling earthquake, volcanic, flood, wind, and tsunami risks for urban centres in New Zealand.
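
    The standards-based 'plug-in' design described above can be pictured as a set of minimal interfaces that any hazard, asset, or fragility model must satisfy, so that the loss calculation is written once. The sketch below is a loose illustration of that idea; all class and function names are invented here, not taken from RiskScape.

      from typing import Iterable, Protocol

      class Hazard(Protocol):
          def intensity_at(self, location: tuple) -> float: ...

      class FragilityModel(Protocol):
          def damage_ratio(self, intensity: float) -> float: ...

      class Asset:
          def __init__(self, location: tuple, value: float, fragility: FragilityModel):
              self.location, self.value, self.fragility = location, value, fragility

      def expected_loss(hazard: Hazard, assets: Iterable[Asset]) -> float:
          # Loss = sum over assets of (value x damage ratio at local intensity);
          # any hazard, asset, or fragility conforming to the interfaces plugs in.
          return sum(a.value * a.fragility.damage_ratio(hazard.intensity_at(a.location))
                     for a in assets)

      class UniformShaking:              # toy hazard: same intensity everywhere
          def __init__(self, pga: float): self.pga = pga
          def intensity_at(self, location: tuple) -> float: return self.pga

      class LinearFragility:             # toy fragility curve, capped at total loss
          def damage_ratio(self, intensity: float) -> float:
              return min(1.0, intensity / 1.5)

      portfolio = [Asset((0, 0), 2.0e6, LinearFragility()),
                   Asset((1, 1), 5.0e5, LinearFragility())]
      print(expected_loss(UniformShaking(pga=0.3), portfolio))  # toy loss estimate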

  2. Simulating Social and Political Influences on Hazard Analysis through a Classroom Role Playing Exercise

    NASA Astrophysics Data System (ADS)

    Hales, T. C.; Cashman, K. V.

    2006-12-01

    Geological hazard mitigation is a complicated process that involves both detailed scientific research and negotiations between community members with competing interests in the solution. Geological hazards classes based around traditional lecture methods have difficulty conveying the decision-making processes that go into these negotiations. To address this deficiency, we have spent five years developing and testing a role- playing exercise based on mitigation of a dam outburst hazard on Ruapehu volcano, New Zealand. In our exercise, students are asked to undertake one of five different roles and decide the best way to mitigate the hazard. Over the course of their discussion students are challenged to reach a consensus decision despite the presence of strongly opposed positions. Key to the success of the exercise are (1) the presence of a facilitator and recorder for each meeting, (2) the provision of unique information for each interested party, and (3) the division of the class into multiple meeting groups, such that everyone is required to participate and individual groups can evolve to different conclusions. The exercise can be completed in a single hour and twenty minute classroom session that is divided into four parts: an introduction, a meeting between members of the same interested party to discuss strategy, a meeting between different interested parties, and a debriefing session. This framework can be readily translated to any classroom hazard problem. In our experience, students have responded positively to the use of role-playing to supplement lectures.

  3. Moving beyond the hazard ratio in quantifying the between-group difference in survival analysis.

    PubMed

    Uno, Hajime; Claggett, Brian; Tian, Lu; Inoue, Eisuke; Gallo, Paul; Miyata, Toshio; Schrag, Deborah; Takeuchi, Masahiro; Uyama, Yoshiaki; Zhao, Lihui; Skali, Hicham; Solomon, Scott; Jacobus, Susanna; Hughes, Michael; Packer, Milton; Wei, Lee-Jen

    2014-08-01

    In a longitudinal clinical study to compare two groups, the primary end point is often the time to a specific event (eg, disease progression, death). The hazard ratio estimate is routinely used to empirically quantify the between-group difference under the assumption that the ratio of the two hazard functions is approximately constant over time. When this assumption is plausible, such a ratio estimate may capture the relative difference between two survival curves. However, the clinical meaning of such a ratio estimate is difficult, if not impossible, to interpret when the underlying proportional hazards assumption is violated (ie, the hazard ratio is not constant over time). Although this issue has been studied extensively and various alternatives to the hazard ratio estimator have been discussed in the statistical literature, such crucial information does not seem to have reached the broader community of health science researchers. In this article, we summarize several critical concerns regarding this conventional practice and discuss various well-known alternatives for quantifying the underlying differences between groups with respect to a time-to-event end point. The data from three recent cancer clinical trials, which reflect a variety of scenarios, are used throughout to illustrate our discussions. When there is not sufficient information about the profile of the between-group difference at the design stage of the study, we encourage practitioners to consider a prespecified, clinically meaningful, model-free measure for quantifying the difference and to use robust estimation procedures to draw primary inferences. PMID:24982461
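
    One of the model-free alternatives commonly discussed in this literature is the restricted mean survival time (RMST): the area under the survival curve up to a prespecified horizon tau, whose between-group difference reads directly as event-free time gained. A self-contained sketch with hypothetical data follows; it illustrates the general technique, not the specific analyses in the article.

      import numpy as np

      def km_curve(times, events):
          # Kaplan-Meier survival estimate; events: 1 = event, 0 = censored.
          t = np.asarray(times, dtype=float)
          d = np.asarray(events, dtype=int)
          curve, s = [], 1.0
          for u in np.unique(t[d == 1]):
              at_risk = np.sum(t >= u)
              deaths = np.sum((t == u) & (d == 1))
              s *= 1.0 - deaths / at_risk
              curve.append((u, s))
          return curve

      def rmst(times, events, tau):
          # Restricted mean survival time: area under the KM curve up to tau.
          area, prev_t, prev_s = 0.0, 0.0, 1.0
          for u, s in km_curve(times, events):
              if u > tau:
                  break
              area += prev_s * (u - prev_t)
              prev_t, prev_s = u, s
          return area + prev_s * (tau - prev_t)

      t1, e1 = [4, 7, 12, 15, 20, 24], [1, 1, 0, 1, 0, 1]   # hypothetical arm A
      t0, e0 = [2, 5, 8, 11, 14, 22], [1, 1, 1, 0, 1, 1]    # hypothetical arm B
      print(rmst(t1, e1, tau=18) - rmst(t0, e0, tau=18))     # months gained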

  4. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L.

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize, uncertainty in estimates of seismic hazard in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among the various methods that can be used in evaluating seismic hazard at plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  5. Controls and guidance: Space

    NASA Technical Reports Server (NTRS)

    Dibattista, John D.

    1988-01-01

    The Space Controls and Guidance Research and Technology Program is directed toward enabling the next generation of space transportation systems, large future spacecraft, and space systems such as the Space Station to have large communication antennas and high-precision segmented-reflector astrophysical telescopes. The new generation of transportation vehicles has demanding requirements to provide an order-of-magnitude reduction in cost as well as an increase in capability. The future orbital facilities have demanding control requirements for pointing and stabilization, momentum management, build-up and growth accommodation, and disturbance management. To address these advanced requirements, the research and development program is designed to provide the generic technology base to support the implementation of advanced guidance, navigation, and control. The area of computational controls will be stressed in order to develop cost-effective, high-speed, high-fidelity control system simulation, analysis, and synthesis tools. The thrust of this work will be to develop methods and software to enable analysis and real-time hardware-in-the-loop simulation of complex spacecraft for control design certification. To address future orbital facility requirements, an advanced technology program is underway in system identification, distributed control, integrated controls/structures design methods, and advanced sensors and actuators. Because the behavior of large deployable/assembled spacecraft with low weight per unit area is greatly influenced by the ground environment, the testing and verification activity is both ground- and space-based.

  6. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
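
    In discrete time, the link between a nonstationary exceedance series and its failure-time metrics can be illustrated in a few lines: with p[t] the probability of exceeding the design event in year t, reliability is a running product and the mean time to first exceedance is one plus the sum of the reliability curve. The trend coefficients below are illustrative placeholders; the paper itself works from the generalized Pareto magnitude model.

      import numpy as np

      def failure_metrics(p):
          # p[t] = P(design event exceeded in year t+1). Reliability R(n) is the
          # probability of no exceedance through year n; the mean time to first
          # exceedance is E[T] = 1 + sum_n R(n) for discrete waiting times.
          reliability = np.cumprod(1.0 - np.asarray(p, dtype=float))
          return reliability, 1.0 + reliability.sum()

      years = np.arange(1, 1001)
      _, t_stat = failure_metrics(np.full(1000, 0.01))                 # stationary
      _, t_trend = failure_metrics(np.clip(0.01 + 2e-4 * years, 0, 1)) # with trend
      print(f"E[T]: stationary ~ {t_stat:.0f} yr, with trend ~ {t_trend:.0f} yr")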

  7. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  8. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  9. Unified powered flight guidance

    NASA Technical Reports Server (NTRS)

    Brand, T. J.; Brown, D. W.; Higgins, J. P.

    1973-01-01

    A complete revision of the orbiter powered flight guidance scheme is presented. A unified approach to powered flight guidance was taken to accommodate all phases of exo-atmospheric orbiter powered flight, from ascent through deorbit. The guidance scheme was changed from the previous modified version of the Lambert Aim Point Maneuver Mode used in Apollo to one that employs linear tangent guidance concepts. This document replaces the previous ascent phase equation document.
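
    The linear tangent concept mentioned above steers the vehicle so that the tangent of the thrust pitch angle varies linearly in time. A minimal sketch of only that steering law follows; the coefficients are hypothetical placeholders for values a guidance computer would re-solve each cycle from the boundary conditions:

        import numpy as np

        def linear_tangent_pitch(t, a_coef, b_coef):
            # Linear tangent steering law: tan(theta) = A + B*t. The
            # coefficients here are hypothetical; a flight computer
            # re-solves them each guidance cycle from the two-point
            # boundary-value problem (current state -> cutoff targets).
            return np.arctan(a_coef + b_coef * t)

        # Pitch history over a notional 400 s exo-atmospheric burn:
        t = np.linspace(0.0, 400.0, 5)
        print(np.degrees(linear_tangent_pitch(t, a_coef=1.2, b_coef=-0.004)))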

  10. Médecins Sans Frontières' Clinical Guidance mobile application: analysis of a new electronic health tool

    PubMed Central

    Wright, V.; Dalwai, M.; Smith, R. Vincent

    2015-01-01

    Many health care workers lack access to clinical support tools in rural and resource-limited settings. To address this gap, the Médecins Sans Frontières (MSF) Clinical Guidelines manual was converted into a static mobile health reference application (app) entitled MSF Guidance. The app's utility and growth were examined: within 6 months of its launch the app had been downloaded in 150 countries, with demonstrated retention among new and existing users. With over 3500 downloads and 36 000 sessions amounting to 250 000 screen views, MSF Guidance is a new mobile health platform with widely demonstrated utility, including potential use as an epidemiological tool: clinical conditions investigated by app users were found to correlate with geographical outbreaks. These findings show that mobile apps can be used to disseminate health information effectively. PMID:26767171

  11. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling: what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
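
    For readers unfamiliar with the calculation an engine like OpenQuake performs for each source, the sketch below shows the generic PSHA kernel for a single fault at a single site: a Gutenberg-Richter magnitude-frequency model combined with a lognormal ground-motion model. This is not the OpenQuake API, and every parameter value is invented for illustration:

        import numpy as np
        from scipy.stats import norm

        # Toy PSHA kernel for one fault and one site (not the OpenQuake
        # API): combine truncated Gutenberg-Richter occurrence rates with
        # a made-up lognormal ground-motion model to get an annual rate
        # of exceeding a ground-motion level z.
        mags = np.arange(5.0, 7.6, 0.1)
        a_val, b_val = 3.0, 1.0                                # assumed G-R parameters
        cum_rates = 10.0 ** (a_val - b_val * mags)             # rate of M >= m
        occ_rates = cum_rates - np.append(cum_rates[1:], 0.0)  # per magnitude bin

        def gm_median(m, r_km=20.0):
            # Placeholder attenuation: ln(PGA, g) linear in M, decaying with R.
            return np.exp(-4.0 + 0.9 * m - 1.2 * np.log(r_km + 10.0))

        sigma_ln, z = 0.6, 0.2                                 # aleatory sigma, PGA level (g)
        p_exceed = norm.sf((np.log(z) - np.log(gm_median(mags))) / sigma_ln)
        rate = np.sum(occ_rates * p_exceed)                    # annual exceedance rate
        print(rate, 1.0 - np.exp(-rate * 50.0))                # and P(exceed in 50 yr)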

  12. Design and tolerance analysis of two null corrector designs for the Space Telescope fine guidance aspheric collimating mirror

    NASA Technical Reports Server (NTRS)

    Friedman, I.; Casas, R. E.

    1982-01-01

    The collimating mirror within the Fine Guidance Subsystem of the Space Telescope's Pointing Control System is aspherized in order to correct the pupil aberration. A null corrector is needed to test the collimating mirror in autocollimation. Triplet and doublet null corrector designs are subjected to tolerance sensitivity analyses, and the doublet design is chosen despite its more restricted tolerances because of its compactness and simplicity.

  13. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is comprised of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for

  14. Review Article "Valuating the intangible effects of natural hazards - review and analysis of the costing methods"

    NASA Astrophysics Data System (ADS)

    Markantonis, V.; Meyer, V.; Schwarze, R.

    2012-05-01

    The "intangible" or "non-market" effects are those costs of natural hazards which are not, or at least not easily measurable in monetary terms, as for example, impacts on health, cultural heritage or the environment. The intangible effects are often not included in costs assessments of natural hazards leading to an incomplete and biased cost assessment. However, several methods exist which try to estimate these effects in a non-monetary or monetary form. The objective of the present paper is to review and evaluate methods for estimating the intangible effects of natural hazards, specifically related to health and environmental effects. Existing methods are analyzed and compared using various criteria, research gaps are identified, application recommendations are provided, and valuation issues that should be addressed by the scientific community are highlighted.

  15. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; /SLAC

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. However, very few studies have addressed the radiation protection issues raised by laser-induced X-rays. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.
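
    As a rough illustration of how the effective electron temperature scales with laser intensity, one commonly used relation is the ponderomotive (Wilks) scaling; treating it as an assumption here, not necessarily the relation used in the SLAC analysis:

        import math

        def hot_electron_temp_mev(intensity_w_cm2, wavelength_um):
            # Ponderomotive (Wilks) scaling for the effective hot-electron
            # temperature; one common choice, assumed for this sketch.
            a0_term = intensity_w_cm2 * wavelength_um ** 2 / 1.37e18
            return 0.511 * (math.sqrt(1.0 + a0_term) - 1.0)

        # MEC-class example from the abstract: 2.4e18 W/cm^2 with a
        # Ti:sapphire wavelength of roughly 0.8 um.
        print(hot_electron_temp_mev(2.4e18, 0.8))  # ~0.23 MeV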

  16. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland because of their high frequency and potentially devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where tephra will be deposited during future eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as a large urban area or important infrastructure. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout, focused on a target area that includes the municipality of Reykjavík and the Keflavík international airport. This approach inverts the more common perspective in which the hazard analysis is focused on the source (the volcanic system) and follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  17. A hazard rate analysis of fertility using duration data from Malaysia.

    PubMed

    Chang, C

    1988-01-01

    Data from the Malaysia Fertility and Family Planning Survey (MFLS) of 1974 were used to investigate the effects of biological and socioeconomic variables on fertility based on the hazard rate model. Another study objective was to investigate the robustness of the findings of Trussell et al. (1985) by comparing the findings of this study with theirs. The hazard rate of conception for the jth fecundable spell of the ith woman, hij, is determined by duration dependence, tij, measured by the waiting time to conception; unmeasured heterogeneity (HETi); time-invariant variables, Yi (race, cohort, education, age at marriage); and time-varying variables, Xij (age, parity, opportunity cost, income, child mortality, child sex composition). In this study, all the time-varying variables were constant over a spell. An asymptotic χ² test for the equality of constant hazard rates across birth orders, allowing for time-invariant variables and heterogeneity, showed the importance of time-varying variables and duration dependence. Under the assumption of fixed-effects heterogeneity and the Weibull distribution for the duration of waiting time to conception, the empirical results revealed a negative parity effect, a negative impact from male children, and a positive effect from child mortality on the hazard rate of conception. The estimates of step functions for the hazard rate of conception showed parity-dependent fertility control, evidence of heterogeneity, and the possibility of nonmonotonic duration dependence. In a hazard rate model with piecewise-linear-segment duration dependence, socioeconomic variables such as cohort, child mortality, income, and race had significant effects after controlling for the length of the preceding birth. The duration dependence was consistent with the common finding, i.e., first increasing and then decreasing at a slow rate. The effects of education and opportunity cost on fertility were insignificant.
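
    A minimal sketch of the kind of model described: a Weibull baseline hazard with multiplicative covariate effects. The shape, scale, and coefficient values below are illustrative placeholders, not the MFLS estimates (only the coefficient signs mirror the reported direction of effects):

        import numpy as np

        def weibull_hazard(t, shape_k, scale_lam):
            # Baseline Weibull hazard: h0(t) = (k/lam) * (t/lam)**(k-1);
            # k > 1 gives a rising, k < 1 a falling, hazard of conception.
            return (shape_k / scale_lam) * (t / scale_lam) ** (shape_k - 1.0)

        def conception_hazard(t, x, beta, shape_k=1.3, scale_lam=12.0):
            # Proportional-hazards form h_ij(t) = h0(t) * exp(x'beta);
            # all parameter values here are illustrative placeholders.
            return weibull_hazard(t, shape_k, scale_lam) * np.exp(x @ beta)

        # Hypothetical covariates: [parity, has_male_child, child_deaths];
        # the coefficient signs mirror the abstract's reported effects.
        x = np.array([2.0, 1.0, 0.0])
        beta = np.array([-0.10, -0.15, 0.20])
        print(conception_hazard(np.array([3.0, 6.0, 12.0]), x, beta))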

  18. Development and Analysis of a Hurricane Hazard Model for Disaster Risk Assessment in Central America

    NASA Astrophysics Data System (ADS)

    Pita, G. L.; Gunasekera, R.; Ishizawa, O. A.

    2014-12-01

    Hurricane and tropical storm activity in Central America has consistently caused thousands of casualties, significant population displacement, and substantial property and infrastructure losses over the past decades. As a component for estimating future potential losses, we present a new regional probabilistic hurricane hazard model for Central America. Currently, there are very few openly available hurricane hazard models for Central America. The resultant hazard model will be used in conjunction with exposure and vulnerability components as part of a World Bank project to create country disaster risk profiles that will help improve risk estimation and provide decision makers with better tools to quantify disaster risk. This paper describes the hazard model methodology, which involves the development of a wind field model that simulates gust speeds at terrain height at a fine resolution. The HURDAT dataset has been used in this study to create synthetic events that assess average hurricane landfall angles and their variability at each location. The hazard model also estimates the average track angle at multiple geographical locations in order to provide a realistic range of possible hurricane paths that will be used for risk analyses in all the Central American countries. This probabilistic hurricane hazard model is then also useful for relating synthetic wind estimates to loss and damage data to develop and calibrate existing empirical building vulnerability curves. To assess accuracy and applicability, modeled results are evaluated against historical events, their tracks and wind fields. Deeper analyses of results are also presented with special reference to Guatemala. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the
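
    Parametric hurricane wind-field models of this kind often build on a radial wind profile such as Holland (1980); the sketch below assumes that profile (it is not necessarily the exact formulation of this model) with illustrative storm parameters:

        import numpy as np

        def holland_gradient_wind(r_km, p_drop_pa=5000.0, r_max_km=40.0,
                                  b=1.4, rho=1.15):
            # Holland (1980) gradient wind speed (m/s) at radius r from
            # the eye, neglecting Coriolis; all parameters illustrative.
            r = np.maximum(r_km, 1e-3)
            x = (r_max_km / r) ** b
            return np.sqrt(p_drop_pa * b / rho * x * np.exp(-x))

        # Wind speed inside, at, and well outside the radius of maximum wind:
        print(holland_gradient_wind(np.array([10.0, 40.0, 100.0])))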

  19. Flood hazards analysis based on changes of hydrodynamic processes in fluvial systems of Sao Paulo, Brazil.

    NASA Astrophysics Data System (ADS)

    Simas, Iury; Rodrigues, Cleide

    2016-04-01

    The metropolis of Sao Paulo, with its 7,940 km² and over 20 million inhabitants, is increasingly being consolidated with disregard for the dynamics of its fluvial systems and the natural limitations imposed by fluvial terraces, floodplains and slopes. Events such as floods and flash floods have become particularly persistent, mainly in socially and environmentally vulnerable areas. The Aricanduva River basin was selected as the ideal area for the development of the flood hazard analysis since it presents the main geological and geomorphological features found in the urban site. According to studies carried out under the Anthropic Geomorphology approach in São Paulo, studying this phenomenon requires taking into account the original hydromorphological systems and their functional conditions, as well as the dimensions in which the anthropic factor changes the balance between the main variables of surface processes. Considering those principles, an alternative model of geographical data was proposed that enabled identification of the role of different driving forces in the spatial conditioning of certain flood events. Spatial relationships between different variables, such as anthropogenic and original morphology, were analyzed for that purpose in addition to climate data. The surface hydrodynamic tendency spatial model conceived for this study takes as key variables: 1- The land use present at the observed date combined with the predominant lithological group, represented by a value ranging 0-100, based on indexes of the National Soil Conservation Service (NSCS-USA) and the Hydraulic Technology Center Foundation (FCTH-Brazil), to determine the resulting balance of runoff/infiltration. 2- The original slope, applying thresholds above which there is a greater tendency for runoff (in percent). 3- The minimal features of relief, combining the curvature of the surface in plan and profile. Those three key variables were combined in a Geographic Information System in a series of
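
    Read as a grid overlay, the three key variables might be combined as in the sketch below; the weights and thresholds are pure placeholders, since the abstract does not state the actual combination rule:

        import numpy as np

        # Hypothetical combination of the three key variables on a common
        # grid; the study's actual weights/thresholds are not given.
        runoff_index = np.random.uniform(0, 100, (4, 4))     # land use x lithology
        slope_pct    = np.random.uniform(0, 30, (4, 4))      # original slope (%)
        curvature    = np.random.choice([-1, 0, 1], (4, 4))  # concave/flat/convex

        tendency = (runoff_index / 100.0                     # runoff vs infiltration
                    + (slope_pct > 8.0).astype(float)        # assumed runoff threshold
                    + (curvature < 0).astype(float))         # concavity concentrates flow
        print(tendency)  # higher values = stronger surface hydrodynamic tendency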

  20. A guide to CERCLA site assessment. Environmental Guidance

    SciTech Connect

    Not Available

    1994-02-01

    This Pocket Guide is a condensed version of information provided in three EPA documents: Guidance for Performing Preliminary Assessments Under CERCLA, Guidance for Performing Site Inspections Under CERCLA, and Hazard Ranking System Guidance Manual. Additionally, the guide provides a DOE perspective on site assessment issues and information on the Federal Agency Hazardous Waste Compliance Docket as well as data sources for DOE site assessments. The guide is intended to present this information in a simple, portable, and direct manner that will allow the user to effectively focus on those aspects of the site assessment process of interest. The guide is not intended as a substitute for the three EPA guidance documents mentioned previously. DOE investigators should be thoroughly familiar with the EPA guidance before conducting site assessments. Use this pocket guide as an overview of procedures and requirements and as a field guide.

  1. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    NASA Astrophysics Data System (ADS)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis of induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, for the next ten years. We calculate hazard with both a conventional and a physics-based approach, and we estimate site-specific hazard. We convert hazard to risk of nuisance and damage to structures per year and map the risk. For the conventional PSHA we assume the past ten years is indicative of hazard for the next ten years from M<4.5 earthquakes. Larger earthquakes are added from evaluation of b-values. For the physics-based approach, we utilize computations (Heidbach and Altmann, 2013) to calculate pressure as a function of position throughout The Geysers for the next 10 years. We then use another geo-mechanical modeling code (Bachmann et al., 2012) to calculate the number of earthquakes that will occur. We identify amplification factors for specific sites within each geologic unit from recordings of noise. Then, we interpolate within each geologic unit onto finely gridded points. All grid points within a unit are weighted by distance from each data collection point. The process is repeated for all the other types of geologic units until the entire area is gridded and every grid point is assigned a hazard value. We found that nuisance and damage risks calculated by the conventional and physics-based approaches provided almost identical results. This is very surprising since they were calculated by completely independent means: the conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground
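
    The distance weighting of site amplification factors within a geologic unit is commonly implemented as inverse-distance weighting; the sketch below assumes that scheme (the study's exact weighting is not specified), with hypothetical site coordinates and factors:

        import numpy as np

        def idw_amplification(grid_xy, sample_xy, sample_amp, power=2.0):
            # Inverse-distance weighting of measured amplification factors
            # onto grid points of the same geologic unit (an assumed
            # scheme for this sketch, not the study's exact one).
            d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-6) ** power
            return (w @ sample_amp) / w.sum(axis=1)

        sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # noise-recording sites
        amps = np.array([1.2, 2.0, 1.5])                        # measured factors
        grid = np.array([[0.5, 0.5], [0.9, 0.1]])               # points to fill in
        print(idw_amplification(grid, sites, amps))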

  2. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... table I417-2 falls into the category of unique or uncommon propellants, such as those identified in... conditions: (i) Surface wind speed of 2.9 knots with a wind speed increase of 1.0 knot per 1000 feet of... distances such that any potentially affected populated area is outside the toxic hazard area. A...

  3. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... table I417-2 falls into the category of unique or uncommon propellants, such as those identified in... conditions: (i) Surface wind speed of 2.9 knots with a wind speed increase of 1.0 knot per 1000 feet of... distances such that any potentially affected populated area is outside the toxic hazard area. A...

  4. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... table I417-2 falls into the category of unique or uncommon propellants, such as those identified in... conditions: (i) Surface wind speed of 2.9 knots with a wind speed increase of 1.0 knot per 1000 feet of... distances such that any potentially affected populated area is outside the toxic hazard area. A...

  5. DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory

    The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...

  6. Case history and hazard analysis of two lake-damming landslides in the Himalayas

    NASA Astrophysics Data System (ADS)

    Weidinger, Johannes T.

    1998-04-01

    In investigating the hazard and case history of natural dams in the Himalayas, two landslide sites and their former dammed lakes in the hinterland were visited between May 1994 and October 1995 and analysed from the geomorphological, geotechnical, geohydrological, tectonic, sedimentary and climatic points of view. One of the examples studied, the landslide in the valley of Birahi Ganga (Northern India), is one of the most impressive examples of recent hazards in alpinotype high mountain regions. This study was complemented by a study of the Ghatta Khola landslide (Western Nepal). In both cases, lithotectonic and climatic conditions led to the destabilisation and failure of carbonate bedrock. The occurrence of lakes dammed over long periods behind the barriers is of great importance, because after the sometimes fatal landslide event itself, one is confronted with a secondary natural hazard: the possibility of a major flood due to failure of the dam. For this reason, additional attention was given to the preparatory causal factors of the origin of the two lakes (damming of the river by the landslide) and their stepwise disappearance (through secondary landslides within the barrier and sedimentation into the basin). It is shown that due to very special circumstances in the Himalayas (availability of sediments, heavy rainfall) there is a progressive decrease in the potential hazard to the landscape and to human beings lower down the main valleys over a period of only a few decades.

  7. Hazard analysis and critical control point evaluation of school food programs in Bahrain.

    PubMed

    Ali, A A; Spencer, N J

    1996-03-01

    Hazard analyses were conducted in six food preparation sites and 16 school canteens in the State of Bahrain. Sandwiches made with cheese, meat, eggs, liver, and beef burgers were prepared in small shops or a bakery outside schools. Foods were cooked between 4 and 5 A.M. Time-temperature exposure during cooking was adequate to kill vegetative microbes but not their spores, and potential for recontamination existed from the hands of food workers, utensils, and the cloths and sponges used for wiping. All foods were left at room temperature before they were transported in vans to schools, where they were also kept at room temperature, between 17 and 41 degrees C. Air temperature inside the canteens during this investigation was between 18.5 and 28 degrees C with a relative humidity of 65 to 70%. The hazard analyses included observation of operations inside school canteens and sites of food preparation, measurement of temperatures, and interviews with workers and consumers (teachers, students). Hazards were primarily associated with preparation of foods long before they were consumed, physical touching of products, and holding foods at room temperature after preparation. Holding foods at room temperature would have allowed germination of bacterial spores and multiplication of microbes. Reheating of foods was not practiced. Health promoters must be aware of these hazards and need to educate food workers, administrators, and the public on methods of prevention.

  8. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    PubMed

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. In contrast to truck accidents, a single train accident may involve multiple hazardous materials cars derailing and releasing contents, with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train, and tank car safety design. The effects of train speed, tank car safety design and tank car positions in a train were evaluated with respect to the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrence. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail.
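
    The published model conditions on several train-level factors; as a purely illustrative reduced form, the release-count distribution can be sketched as independent Bernoulli trials over the tank cars in a train (every probability below is a made-up placeholder, not an estimate from the paper):

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_releases(n_trials=100_000, n_tank_cars=20,
                              p_derail=0.25, p_release_given_derail=0.1):
            # Toy reduced form: each tank car independently derails and
            # then releases. The paper's model additionally conditions on
            # speed, accident cause, position of the first car derailed,
            # placement in the train, and tank car safety design.
            derailed = rng.random((n_trials, n_tank_cars)) < p_derail
            released = derailed & (rng.random((n_trials, n_tank_cars))
                                   < p_release_given_derail)
            counts = released.sum(axis=1)
            return np.bincount(counts, minlength=n_tank_cars + 1) / n_trials

        dist = simulate_releases()
        print(dist[:5])  # P(0..4 cars releasing) per derailment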

  9. ON-SITE MERCURY ANALYSIS OF SOIL AT HAZARDOUS WASTE SITES BY IMMUNOASSAY AND ASV

    EPA Science Inventory

    Two field methods for Hg, immunoassay and anodic stripping voltammetry (ASV), that can provide onsite results for quick decisions at hazardous waste sites were evaluated. Each method was applied to samples from two Superfund sites that contain high levels of Hg; Sulphur Bank Me...

  10. Analysis of safety precautions for coal and gas outburst-hazardous strata

    SciTech Connect

    Hudecek, V.

    2008-09-15

    The author analyses coal and gas outbursts and generalizes the available data on approaches to addressing these gas-dynamic events, carried out within the framework of the Czech Republic grant 'Estimate of the Safety Precautions for Coal and Gas Outburst Hazardous Strata'.

  11. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but more detailed analysis could be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed the Bayesian analysis of Italian data sets by four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). Then we illustrate the results on their performance evaluated by Bayes factor, predictive information criteria and retrospective predictive analysis.
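
    One of the criteria alluded to (Watanabe's WAIC) is easy to sketch from posterior draws. The sketch below assumes a likelihood that factorizes over observations, which is exactly the condition that fails for dependent event data and motivates the Ando-Tsay decomposition; the input matrix is a simulated placeholder:

        import numpy as np
        from scipy.special import logsumexp

        def waic(loglik):
            # loglik: (S posterior draws) x (N observations) matrix of
            # pointwise log-likelihoods. Valid when the likelihood
            # factorizes over observations; for dependent event data (as
            # in stress release models) the Ando-Tsay conditional
            # decomposition is the workaround discussed above.
            n_draws = loglik.shape[0]
            lppd = (logsumexp(loglik, axis=0) - np.log(n_draws)).sum()
            p_waic = loglik.var(axis=0, ddof=1).sum()  # effective parameters
            return -2.0 * (lppd - p_waic)

        rng = np.random.default_rng(1)
        print(waic(rng.normal(-1.0, 0.3, size=(4000, 50))))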

  13. Religiousness and Levels of Hazardous Alcohol Use: A Latent Profile Analysis.

    PubMed

    Jankowski, Peter J; Hardy, Sam A; Zamboanga, Byron L; Ham, Lindsay S; Schwartz, Seth J; Kim, Su Yeong; Forthun, Larry F; Bersamin, Melina M; Donovan, Roxanne A; Whitbourne, Susan Krauss; Hurley, Eric A; Cano, Miguel Ángel

    2015-10-01

    Prior person-centered research has consistently identified a subgroup of highly religious participants that uses significantly less alcohol when compared to the other subgroups. The construct of religious motivation is absent from existing examinations of the nuanced combinations of religiousness dimensions within persons, and alcohol expectancy valuations have yet to be included as outcome variables. Variable-centered approaches have found religious motivation and alcohol expectancy valuations to play a protective role against individuals' hazardous alcohol use. The current study examined latent religiousness profiles and hazardous alcohol use in a large, multisite sample of ethnically diverse college students. The sample consisted of 7412 college students aged 18-25 (mean age = 19.77 years, SD = 1.61; 75% female; 61% European American). Three latent profiles were derived from measures of religious involvement, salience, and religious motivations: Quest-Intrinsic Religiousness (highest levels of salience, involvement, and quest and intrinsic motivations; lowest level of extrinsic motivation), Moderate Religiousness (intermediate levels of salience, involvement, and motivations) and Extrinsic Religiousness (lowest levels of salience, involvement, and quest and intrinsic motivations; highest level of extrinsic motivation). The Quest-Intrinsic Religiousness profile scored significantly lower on hazardous alcohol use, positive expectancy outcomes, positive expectancy valuations, and negative expectancy valuations, and significantly higher on negative expectancy outcomes, compared to the other two profiles. The Extrinsic and Moderate Religiousness profiles did not differ significantly on positive expectancy outcomes, negative expectancy outcomes, negative expectancy valuations, or hazardous alcohol use. The results advance existing research by demonstrating that the protective influence of religiousness on college students' hazardous alcohol use may involve high levels on

  14. Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India

    USGS Publications Warehouse

    Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.

    2004-01-01

    We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region at the 2%-probability-of-exceedance-in-50-years hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs at the 10%-in-50-years hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models that consider fault interevent times of 800 years are considerably higher than other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves when compared to the crustal interplate relations used in these previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. © 2004 Elsevier B.V. All rights reserved.
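
    The 266-533 year figure follows from treating the three faults as independent Poisson sources whose rates add; a quick check, together with the standard rate-to-probability conversion behind hazard levels like "2% in 50 years":

        import math

        # Three faults, each with an 800- or 1600-year recurrence interval,
        # treated as independent Poisson sources: rates add, so the combined
        # recurrence interval is one third of the per-fault interval.
        for interval_yr in (800.0, 1600.0):
            print(interval_yr, interval_yr / 3.0)   # -> 266.7 and 533.3 years

        # Poisson conversion behind map hazard levels: P = 1 - exp(-rate * T).
        print(1.0 - math.exp(-50.0 / 2475.0))       # ~0.02, i.e. "2% in 50 years"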

  15. Debris-flow susceptibility and hazard assessment at a regional scale from GIS analysis

    NASA Astrophysics Data System (ADS)

    Bertrand, M.; Liébault, F.; Piégay, H.

    2012-12-01

    Small torrents of the Southern French Alps are prone to extreme events. Depending on the rainfall conditions, the sediment supply from hillslopes, and the gravitational energy, these events can take different forms, from floods to debris flows. Debris flows are recognized as the most dangerous phenomena and may have dramatic consequences for exposed people and infrastructure. As a first step of hazard assessment, we evaluated debris-flow susceptibility, i.e., the likelihood that an event occurs in an area under particular physical conditions, not including the temporal dimension. The susceptibility is determined by (i) the morphometric controls of small upland catchments on debris-flow triggering and propagation, and by (ii) sediment supply conditions, i.e., the erosion patterns feeding the channels. The morphometric controls are evaluated with indicators calculated from basic topographic variables. The sediment supply is evaluated by considering the cumulated surface of erosion areas connected to the hydrographic network. We developed a statistical model to predict the geomorphic responses of the catchments (fluvial vs. debris-flow) and we apply this model within a GIS for regional-scale prediction. The model is based on two morphometric indicators, the fan/channel slope and the Melton ruggedness index, and is built on a wide set of data covering the Southern French Alps. We developed a GIS procedure to extract the indicators automatically using a 25 m DEM and the hydrographic network as raw data. This model and its application have been validated with historical data. Sediment sources feeding debris-flow prone torrents are identified by first automatically mapping the erosion patches from infrared orthophoto analysis and then identifying the ones connected to the stream network. A classification method has been developed (segmentation into homogeneous objects classified with a neural network algorithm) and validated with expert interpretation on the
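
    Both indicators are cheap to compute: the Melton ruggedness index is basin relief divided by the square root of basin area, typically paired with fan or channel slope. The thresholds in the sketch below are generic literature-style values assumed for illustration, not the fitted parameters of this model:

        import math

        def melton_index(relief_m, area_km2):
            # Melton ruggedness: basin relief (km) / sqrt(basin area, km^2).
            return (relief_m / 1000.0) / math.sqrt(area_km2)

        def likely_response(relief_m, area_km2, fan_slope_deg):
            # Illustrative classification; the thresholds are assumptions
            # for this sketch, not the regional model's fitted values.
            debris_flow_prone = (melton_index(relief_m, area_km2) > 0.5
                                 and fan_slope_deg > 4.0)
            return "debris-flow" if debris_flow_prone else "fluvial"

        print(melton_index(1200.0, 4.0), likely_response(1200.0, 4.0, 6.0))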

  16. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    NASA Astrophysics Data System (ADS)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas, where there is a large number of (inter)dependent technological systems whose damage could cause the failure or malfunctioning of further services, spreading the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) that enables operational risk prediction for Critical Infrastructures (CI) by predicting the occurrence of natural events (from long-term weather to short-term nowcast predictions), correlating the intrinsic vulnerabilities of CI elements with the strengths of the different events, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, in which point damages to CI elements are translated into micro (local area) or meso (regional) scale service outages. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, using an approximate System-of-Systems model describing systemic interactions, the focus is on raising awareness. The DSS has enabled the development of a novel simulation framework for predicting the shake maps originating from a given seismic event, considering shock wave propagation in inhomogeneous media and the damages subsequently produced, by estimating building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict the runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario

  17. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity, from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established so many source description formats and variations thereof; this means that conceptually equivalent source models are often expressed in different ways. Despite the resultant practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domain of most ontology work to date, earthquake sources can be described by a very precise mathematical framework. Another unique aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well-formed and properly described; additionally, the source will be used for performing calculations. Representation and manipulation of complex mathematical objects presents a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium, contains the conceptual definitions and relationships necessary for source translation services. For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double
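
    The translation the abstract ends on, from strike, dip, rake, and seismic moment to a double-couple moment tensor, is a standard textbook formula; a sketch in the Aki & Richards convention (x=north, y=east, z=down), with illustrative fault angles:

        import numpy as np

        def double_couple_tensor(strike, dip, rake, m0=1.0):
            # Moment tensor for a shear dislocation, Aki & Richards
            # convention (x=north, y=east, z=down); angles in degrees.
            f, d, l = np.radians([strike, dip, rake])
            mxx = -m0 * (np.sin(d) * np.cos(l) * np.sin(2 * f)
                         + np.sin(2 * d) * np.sin(l) * np.sin(f) ** 2)
            mxy = m0 * (np.sin(d) * np.cos(l) * np.cos(2 * f)
                        + 0.5 * np.sin(2 * d) * np.sin(l) * np.sin(2 * f))
            mxz = -m0 * (np.cos(d) * np.cos(l) * np.cos(f)
                         + np.cos(2 * d) * np.sin(l) * np.sin(f))
            myy = m0 * (np.sin(d) * np.cos(l) * np.sin(2 * f)
                        - np.sin(2 * d) * np.sin(l) * np.cos(f) ** 2)
            myz = -m0 * (np.cos(d) * np.cos(l) * np.sin(f)
                         - np.cos(2 * d) * np.sin(l) * np.cos(f))
            mzz = m0 * np.sin(2 * d) * np.sin(l)
            return np.array([[mxx, mxy, mxz], [mxy, myy, myz], [mxz, myz, mzz]])

        # Illustrative case: north-striking normal fault dipping 45 degrees.
        print(double_couple_tensor(0.0, 45.0, -90.0))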

  18. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Torrential processes like flooding, heavy bedload transport or debris flows in steep mountain channels emerge during intense, highly localized rainfall events. They pose a serious risk to the densely populated Alpine region. Hydrogeomorphic hazards are profoundly nonlinear, threshold-mediated phenomena that frequently cause costly damage to infrastructure and people. Thus, in the context of climate change, there is an ever-rising interest in whether the sediment cascades of small alpine catchments react to changing precipitation patterns and how the climate signal is propagated through the fluvial system. We intend to answer the following research questions: (i) What are the critical meteorological characteristics triggering torrential events in the Eastern Alps of Austria? (ii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond; which factors control this internal susceptibility? (iii) Do torrential processes show an increase in magnitude and frequency or a shift in seasonality in the recent past? (iv) Which future changes can be expected under different climate scenarios? Quantifications of bedload transport in small alpine catchments are rare and often associated with high uncertainties. Detailed knowledge, though, exists for the Schöttlbach catchment, a 71 km² study area in Styria in the Eastern Alps. The torrent has been monitored since a heavy precipitation event resulted in a disastrous flood in July 2011. Sediment mobilisation from slopes as well as within-channel storage and fluxes are regularly measured by photogrammetric methods and sediment impact sensors (SIS). The associated hydro-meteorological conditions are known from a dense station network. Changing states of connectivity can thus be related to precipitation and internal dynamics (sediment availability, cut-and-fill cycles). The site-specific insights are then conceptualized for application at a broader scale. Therefore, a Styria-wide database of torrential

  19. VIIRS Unique Fires Compared to the NOAA Hazard Mapping System Fire Analysis

    NASA Astrophysics Data System (ADS)

    Ruminski, M.; Liddick, K.

    2014-12-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument on the Suomi National Polar-orbiting Partnership (S-NPP) satellite provides radiometric measurements for automated fire detection. The baseline VIIRS Active Fire Product (AFP) is very similar to the collection 4 legacy fire detection algorithm developed for the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft and is expected to become operational and validated in the Fall of 2014. VIIRS (imagery and the AFP) will soon be incorporated into NESDIS' operational Hazard Mapping System (HMS) fire and smoke analysis. The HMS incorporates a wide variety of satellite data for use in fire detection, including GOES-East and GOES-West at least every 15 minutes, five NOAA and METOP polar orbiting satellites with the Advanced Very High Resolution Radiometer (AVHRR) instrument, and MODIS Aqua/Terra. The HMS utilizes the automated fire detections from each of the sensors, which are then quality controlled by an analyst. The VIIRS AFP became available for evaluation with the HMS in the Spring of 2014. The AFP was compared with the final quality-controlled HMS product over the contiguous US between 8 April and 8 June 2014, which is primarily the agricultural and prescribed fire season, in order to determine the number of VIIRS unique fires. In making the comparison, any VIIRS AFP fire that was within 4 km of an HMS fire was not considered unique, due to navigational accuracy and the 4 km nominal resolution of GOES. Any VIIRS fire that was within 2 km of a power plant or a known false-detection location was also not considered. Based on these criteria there were 5,876 unique VIIRS AFP locations compared to 71,705 HMS detections, approximately 8 percent of the HMS total. These extra locations potentially represent additional emissions that could affect air quality. The geographic distribution resembled the burning pattern during this period, with the majority over the
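
    The 4 km uniqueness rule is a simple nearest-neighbour screen; a sketch of that filter using haversine distances follows (the coordinates are made up, and the 2 km power-plant screening would be applied the same way):

        import numpy as np

        def unique_viirs_fires(viirs_deg, hms_deg, radius_km=4.0):
            # Count VIIRS detections with no HMS detection within
            # radius_km, mirroring the 4 km rule in the abstract.
            # Inputs: (N, 2) and (M, 2) arrays of lat/lon in degrees.
            earth_radius_km = 6371.0
            v = np.radians(viirs_deg)[:, None, :]
            h = np.radians(hms_deg)[None, :, :]
            dlat, dlon = v[..., 0] - h[..., 0], v[..., 1] - h[..., 1]
            a = (np.sin(dlat / 2) ** 2
                 + np.cos(v[..., 0]) * np.cos(h[..., 0]) * np.sin(dlon / 2) ** 2)
            d = 2 * earth_radius_km * np.arcsin(np.sqrt(a))  # haversine, km
            return int((d.min(axis=1) > radius_km).sum())

        viirs = np.array([[36.10, -95.90], [36.50, -95.20]])  # hypothetical
        hms = np.array([[36.11, -95.91]])
        print(unique_viirs_fires(viirs, hms))  # 1 unique detection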

  20. Reading Guidance: Death and Grief.

    ERIC Educational Resources Information Center

    Smith, Alice Gullen

    1989-01-01

    Gives guidelines for the librarian using reading guidance (similar to bibliotherapy). Provides a nine-item annotated bibliography of novels for children and adolescents on the subject of death and grief. Appends an embryo list of categories suitable for content analysis of any file librarians might wish to keep on books suitable for use in this…