Science.gov

Sample records for accident analysis methodology

  1. A structured methodology for waste management facility accident analysis

    SciTech Connect

    Mueller, C.J.; Folga, S.M.; Roglans-Ribas, J.; Nabelssi, B.; Mishima, J.

    1998-06-01

    Guidance from the US Department of Energy (DOE) for the performance of accident analysis in support of environmental impact assessments calls for a graded approach that considers frequencies as well as consequences of accidents, focuses on high-risk scenarios, and avoids bounding analyses that can obfuscate comparisons of alternative actions. This guidance reflects the fact that at the heart of an environmental impact statement is a comparative analysis of alternatives, including the proposed action; this analysis should address the environmental impacts in proportion to their potential significance, avoid addressing insignificant impacts in detail, and focus analysis resources to be as cost-effective as possible. Accordingly, a probabilistic risk analysis-based methodology to satisfy DOE guidance was developed and implemented for the DOE Waste Management Programmatic Environmental Impact Statement (WM PEIS). The methods are described and illustrated, and an outline of the computational framework is presented. This methodology, although developed for the WM PEIS, is, of course, applicable to general safety analyses. The implementation of the methods for the WM PEIS is summarized, and the extension of the methods to site-specific applications is explained.
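
    A rough illustration of the graded, risk-based screening described above: the sketch below ranks hypothetical accident scenarios by expected risk (frequency times consequence) rather than by bounding consequence alone. This is a minimal Python sketch; the scenario names and numbers are invented, not taken from the WM PEIS.

        # Rank scenarios by expected risk (frequency x consequence),
        # the screening idea behind a graded, risk-based approach.
        scenarios = [
            # (name, frequency per year, consequence) -- invented values
            ("drum fire", 1e-3, 50.0),
            ("forklift puncture", 1e-2, 5.0),
            ("facility-wide fire", 1e-6, 5000.0),
        ]

        for name, freq, cons in sorted(scenarios,
                                       key=lambda s: s[1] * s[2],
                                       reverse=True):
            print(f"{name:20s} expected risk = {freq * cons:.2e} per year")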

  2. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small group discourse was developed, and the use of this method on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  3. Accident characterization methodology

    SciTech Connect

    Camp, A.L.; Harper, F.T.

    1986-01-01

    The Nuclear Regulatory Commission (NRC) is preparing NUREG-1150 to examine the risk from a selected group of nuclear power plants. NUREG-1150 will provide technical bases for comparison of NRC research to industry results and resolution of numerous severe accident issues. In support of NUREG-1150, Sandia National Laboratories has directed the production of Level 3 Probabilistic Risk Assessments (PRAs) for the Surry, Sequoyah, Peach Bottom, and Grand Gulf nuclear power plants. The Accident Sequence Evaluation Program (ASEP) at Sandia has been responsible for the Level 1 portion of the analyses, which includes estimation of core damage frequency and characterization of the dominant sequences. The ASEP analyses are being documented in NUREG/CR-4550. The purpose of this paper is to briefly describe and evaluate the methodology utilized in these analyses. The methodology will soon be published in more detail as Reference 5. The results produced for NUREG/CR-4550 using this methodology are summarized in another paper to be presented at this conference.

  4. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Sharp, G.L.; McCracken, R.T.

    2003-05-13

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, "Safety Basis Requirements," requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, "Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants" as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  5. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Gregg L. Sharp; R. T. McCracken

    2003-06-01

    The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  6. Methodology for coupling physics and systems codes for accident simulations

    SciTech Connect

    Beard, C.; Hilton, P.; Lider, S.; Risher, D.

    2006-07-01

    Westinghouse has developed the RAVE™ three-dimensional core transient analysis system for PWR non-LOCA accident analysis. Methodologies have been developed for the use of this code and approved by the US NRC. This paper discusses the linkages between the codes for nuclear and thermal-hydraulic analysis of the core and the system analysis code, along with the associated software and methodology decisions. (authors)

  7. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact, lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Due to the complexity of modeling the full RPS response deterministically over the dynamic variables involved, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, and launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
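
    A minimal sketch of the stochastic evaluation described here, assuming invented distributions for the accident environment and for the RPS release response; none of the functions, probabilities, or factors below come from the report.

        import random

        random.seed(1)

        def sample_consequence():
            """One Monte Carlo trial: sample a hypothetical accident
            environment and RPS response, return a consequence measure."""
            blast = random.lognormvariate(0.0, 0.5)    # invented environment
            release_fraction = min(1.0, 0.01 * blast)  # invented response model
            dose_per_unit_release = 2.0                # invented pathway factor
            return release_fraction * dose_per_unit_release

        p_accident = 1e-2   # assumed launch-accident probability (invented)
        trials = 100_000
        mean_cons = sum(sample_consequence() for _ in range(trials)) / trials

        # Risk = accident likelihood times expected consequence.
        print(f"estimated risk: {p_accident * mean_cons:.3e}")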

  8. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  9. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, together with statistical tables giving a comparison of the types and causes of accidents in the military services on the one hand and in civil aviation on the other, and explanations of some of the important differences noted in these tables.

  10. Accident Tolerant Fuel Analysis

    SciTech Connect

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.
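
    The safety-margin idea at the center of RISMC can be pictured as comparing a sampled "load" (e.g., the peak cladding temperature reached in a transient) against a sampled "capacity" (the temperature the cladding can tolerate). This is a toy sketch with invented normal distributions; a real analysis would obtain both distributions from simulation and data.

        import random

        random.seed(2)

        def margin_loss_probability(trials=100_000):
            """Fraction of trials in which load exceeds capacity, i.e.,
            an estimate of the probability the safety margin is lost."""
            failures = 0
            for _ in range(trials):
                load = random.gauss(1800.0, 150.0)      # invented peak temp, K
                capacity = random.gauss(2200.0, 100.0)  # invented limit, K
                if load > capacity:
                    failures += 1
            return failures / trials

        print(f"P(load > capacity) ~ {margin_loss_probability():.4f}")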

  11. Accident tolerant fuel analysis

    SciTech Connect

    Smith, Curtis; Chichester, Heather; Johns, Jesse; Teague, Melissa; Tonks, Michael; Youngblood, Robert

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional "accident-tolerant" (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

  12. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  13. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    SciTech Connect

    Brereton, S.; Shinn, J.; Hesse, D.; Kaninich, D.; Lazaro, M.; Mubayi, V.

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  14. Decision-problem state analysis methodology

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but faulty resolution appears to be one of the major accident areas cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.

  15. Nuclear fuel cycle facility accident analysis handbook

    SciTech Connect

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  16. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
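
    The central bookkeeping of a Regional Shelter Analysis — combining where people are with the protection their buildings afford — reduces to a population-weighted average of dose reduction factors. The fractions and protection factors below are invented placeholders, not the report's values.

        # Population-weighted fallout protection (invented numbers).
        # A protection factor PF reduces the outdoor dose by 1/PF.
        population = {
            "outdoors":         (0.05, 1.0),   # (fraction of people, PF)
            "wood-frame house": (0.55, 3.0),
            "masonry building": (0.30, 10.0),
            "basement":         (0.10, 40.0),
        }

        outdoor_dose = 100.0  # arbitrary units
        avg_dose = sum(frac * outdoor_dose / pf
                       for frac, pf in population.values())
        print(f"population-average dose: {avg_dose:.1f} vs {outdoor_dose} outdoors")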

  17. Behavior Analysis: Methodological Foundations.

    ERIC Educational Resources Information Center

    Owen, James L.

    Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline or…

  18. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    SciTech Connect

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.; Ellison, Paul B.; Francis, M.; Gabor, John D.; Gauntt, R.; Henry, C.; Linthicum, R.; Luangdilok, W.; Lutz, R.; Paik, C.; Plys, M.; Rabiti, Cristian; Rempe, J.; Robb, K.; Wachowiak, R.

    2015-01-31

    The overall objective of this study was to conduct a technology gap evaluation on accident tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist, given the current state of light water reactor (LWR) severe accident research, and additionally augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy’s (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  19. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M.; Medford, G.T.

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
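
    Latin hypercube sampling, mentioned here as the uncertainty-propagation scheme, divides each uncertain parameter into equal-probability strata, draws one value per stratum, and pairs the strata at random across parameters. A minimal sketch (not the NUREG-1150 implementation):

        import random

        def latin_hypercube(n_samples, n_params, rng=random.Random(3)):
            """Return n_samples points in [0,1)^n_params with one draw from
            each of n_samples equal-probability strata per parameter."""
            columns = []
            for _ in range(n_params):
                col = [(i + rng.random()) / n_samples for i in range(n_samples)]
                rng.shuffle(col)  # pair strata randomly across parameters
                columns.append(col)
            return list(zip(*columns))

        for point in latin_hypercube(5, 2):
            print(point)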

  20. Probabilistic analysis of air carrier accidents

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In order to estimate the potential risks due to carbon fibers (CF) released from aircraft accidents, it was necessary to quantify the probability of an accident or incident at a major hub airport. This probability was contingent upon various conditions surrounding the incident including the phase of operation, aircraft type, and the weather conditions. The type of accident predicted was categorized according to its location relative to the runway and the severity of damage sustained. The methodology utilized to estimate the probability of a specific type of accident is outlined and the various models that were developed in the course of this work are described.

  21. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility. PMID:16126337
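
    The "risk matrix" selection step can be pictured as a lookup over frequency and consequence classes in which scenarios falling in the higher-risk cells are retained as reference scenarios. The class boundaries and retained cells below are invented, not the ARAMIS calibration.

        # Invented 4x4 risk matrix: rows = frequency class (rare -> frequent),
        # columns = consequence class (minor -> catastrophic). True means the
        # scenario is kept as a reference accident scenario.
        RISK_MATRIX = [
            [False, False, False, True],
            [False, False, True,  True],
            [False, True,  True,  True],
            [True,  True,  True,  True],
        ]

        def is_reference_scenario(freq_class, cons_class):
            """Classes are 1-based indices into the matrix."""
            return RISK_MATRIX[freq_class - 1][cons_class - 1]

        print(is_reference_scenario(2, 3))  # True
        print(is_reference_scenario(1, 1))  # False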

  22. A methodology for the transfer of probabilities between accident severity categories

    SciTech Connect

    Whitlow, J. D.; Neuhauser, K. S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or experienced judgement are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn will allow the accident probability to be appropriately transferred to a different category scheme.
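
    The transfer step amounts to spreading each old category's probability over the common parameter (impact speed, say) according to the assumed probability-versus-parameter relationship, then collecting the mass that falls inside each new category's bounds. A sketch under an invented exponential relationship:

        import math

        # Known accident probabilities for an old severity scheme (invented),
        # with categories defined by impact-speed ranges in mph.
        old_scheme = {(0, 30): 0.9, (30, 120): 0.1}
        new_scheme = [(0, 15), (15, 60), (60, 120)]

        # Assumed probability-vs-speed relationship: exponential fall-off
        # (in practice obtained from numerical models or judgement).
        LAM = 1.0 / 30.0
        def mass(lo, hi):
            return math.exp(-LAM * lo) - math.exp(-LAM * hi)

        def transferred(new_lo, new_hi):
            """Distribute each old category's probability within its bounds
            in proportion to the assumed relationship, then collect the
            overlap with the new category."""
            total = 0.0
            for (old_lo, old_hi), p in old_scheme.items():
                lo, hi = max(old_lo, new_lo), min(old_hi, new_hi)
                if lo < hi:
                    total += p * mass(lo, hi) / mass(old_lo, old_hi)
            return total

        for lo, hi in new_scheme:
            print(f"P(new {lo}-{hi} mph) = {transferred(lo, hi):.3f}")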

  23. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  24. Calculation of relative tube/tube support plate displacements in steam generators under accident condition loads using non-linear dynamic analysis methodologies

    SciTech Connect

    Smith, R.E.; Waisman, R.; Hu, M.H.; Frick, T.M.

    1995-12-01

    A non-linear analysis has been performed to determine relative motions between tubes and tube support plates (TSPs) during a steam line break (SLB) event for steam generators. The SLB event results in blowdown of steam and water out of the steam generator. The fluid blowdown generates pressure drops across the TSPs, resulting in out-of-plane motion. The SLB-induced pressure loads are calculated with a computer program that uses a drift-flux modeling of the two-phase flow. In order to determine the relative tube/TSP motions, a non-linear dynamic time-history analysis is performed using a structural model that considers all of the significant component members relative to the tube support system. The dynamic response of the structure to the pressure loads is calculated using a special purpose computer program. This program links the various substructures at common degrees of freedom (DOF) into a combined mass and stiffness matrix. The program accounts for structural non-linearities, including potential tube and TSP interaction at any given tube position. The program also accounts for structural damping as part of the dynamic response. Incorporating all of the above effects, the equations of motion are solved to give TSP displacements at the reduced set of DOF. Using the displacement results from the dynamic analysis, plate stresses are then calculated using the detailed component models. Displacements from the dynamic analysis are imposed as boundary conditions at the DOF locations, and the finite element program then solves for the overall distorted geometry. Calculations are also performed to assure that assumptions regarding elastic response of the various structural members and support points are valid.
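
    The governing non-linearity — a tube that responds freely until it closes the gap to the support plate and contact stiffness engages — can be illustrated with a one-degree-of-freedom time-history integration. This toy model uses invented parameters and a semi-implicit Euler update; it shows gap contact only, not the multi-substructure solution described in the paper.

        # Toy 1-DOF gap-contact dynamics under a step (SLB-like) load.
        m, k, c = 1.0, 100.0, 0.5        # invented mass, stiffness, damping
        k_contact, gap = 1.0e4, 0.01     # invented contact stiffness and gap
        dt, steps, force = 1.0e-4, 20000, 5.0

        x, v, x_max = 0.0, 0.0, 0.0
        for _ in range(steps):
            # Contact spring engages only once the gap is closed.
            f_contact = -k_contact * (x - gap) if x > gap else 0.0
            a = (force - c * v - k * x + f_contact) / m
            v += a * dt                  # semi-implicit Euler update
            x += v * dt
            x_max = max(x_max, x)

        print(f"peak displacement {x_max:.4f} vs gap {gap}")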

  25. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  26. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U.S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  27. Learning from Accident Analysis: The Dynamics Leading Up to a Rafting Accident.

    ERIC Educational Resources Information Center

    Hovelynck, Johan

    1998-01-01

    Analysis of a case study of a whitewater rafting accident reveals that such accidents tend to result from multiple actions. Many events leading up to such accidents include procedural and process factors, suggesting that hard-skills technical training is an insufficient approach to accident prevention. Contains 26 references. (SAS)

  28. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    SciTech Connect

    O'Kula, K.R.; Chung, D.Y.

    1999-07-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven subgroups chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, in improving the pedigree of high-use computer models, and in the development of the user-friendly Accident Analysis Guidebook (AAG). All of these initiatives have improved the analytical toolkit for best complying with DOE orders and standards shaping safety analysis reports (SARs) and related documentation. Major support for these objectives has been through DOE/DP-45.

  29. FSAR fire accident analysis for a plutonium facility

    SciTech Connect

    Lam, K.

    1997-06-01

    The Final Safety Analysis Report (FSAR) for a plutonium facility, as required by DOE Orders 5480.23 and 5480.22, has recently been completed and approved. The facility processes and stores radionuclides such as Pu-238, Pu-239, enriched uranium, and, to a lesser degree, other actinides. This facility produces heat sources. DOE Order 5480.23 and DOE-STD-3009-94 require analysis of different types of accidents (operational accidents such as fires, explosions, spills, and criticality events, and natural phenomena such as earthquakes). The accidents that were analyzed quantitatively, or the Evaluation Basis Accidents (EBAs), were selected based on a multi-step screening process that utilizes extensively the Hazards Analysis (HA) performed for the facility. In the HA, specific accident scenarios, with estimated frequency and consequences, were developed for each identified hazard associated with facility operations and activities. Analysis of the EBAs and comparison of their consequences to the evaluation guidelines established the safety envelope for the facility and identified the safety-class structures, systems, and components. This paper discusses the analysis of the fire EBA. This fire accident was analyzed in relatively great detail in the FSAR because its potential off-site consequences are more severe than those of other events. In the following, a description of the scenario is first given, followed by a brief summary of the methodology for calculating the source term. Finally, the authors discuss how a key parameter affecting the source term, the leak path factor, was determined; this determination is the focus of the paper.

  30. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worth while if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  31. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  32. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  33. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of activity to produce accident rates, which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  34. Accident analysis for US fast burst reactors

    SciTech Connect

    Paternoster, R.; Flanders, M.; Kazi, H.

    1994-09-01

    In the US fast burst reactor (FBR) community there has been increasing emphasis and scrutiny on safety analysis and understanding of possible accident scenarios. This paper summarizes recent work in these areas at the different US FBR sites. At this time, all of the FBR facilities have updated, or are in the process of updating and refining, their accident analyses. This effort is driven by two objectives: to obtain more realistic scenarios for emergency response procedures and contingency plans, and to determine compliance with changing regulatory standards.

  35. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  36. Analysis of commercial mini-bus accidents.

    PubMed

    Hamed, M M; Jaradat, A S; Easa, S M

    1998-09-01

    This paper presents a study of mini-bus traffic accidents aimed at gaining insight into the factors affecting accident occurrence and severity. Understanding these factors can help to bring forth realistic strategies to improve the safety of these buses. Two disaggregate models related to the time until accident occurrence and the number of accident injuries were specified and estimated. The models were estimated using data collected from 438 mini-bus drivers in Jordan. The estimated models yielded significant associations between the independent variables and both the time until accident occurrence (first, second and third accidents) and their corresponding number of injuries (accident severity). Higher accident rates were associated with drivers who were unmarried, took too few rest breaks and had short time intervals since previous accidents. Lower accident rates were associated with drivers who had long bus-driving and private vehicle-driving experience. The results indicate that the longer a mini-bus driver goes without an accident, the less likely it is that he will have an accident. The results also indicate that previous accident type affects the duration of the upcoming traffic accident. Greater accident severity was associated with single-vehicle accidents, rural intercity routes and speeding. Accident severity decreased and the time between two accidents increased when the previous accident was severe. The results seem to indicate that post- and immediate accident history affect the severity of upcoming accidents. Seven recommendations based on these findings are made in an attempt to improve mini-bus safety. PMID:9678210
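
    The duration analysis described — time until the next accident, with covariates shifting the risk — is commonly formulated as a parametric survival model. The sketch below evaluates a Weibull proportional-hazards survival function with invented coefficients; it illustrates the model class, not the authors' estimated model.

        import math

        def p_no_accident(t_months, rest_breaks, experience_yrs,
                          shape=1.2, scale=24.0, b_rest=-0.10, b_exp=-0.05):
            """P(no accident by time t) under a Weibull proportional-hazards
            model. The negative (invented) coefficients encode the findings
            that rest breaks and driving experience reduce accident risk."""
            hr = math.exp(b_rest * rest_breaks + b_exp * experience_yrs)
            cum_hazard = (t_months / scale) ** shape
            return math.exp(-cum_hazard * hr)

        print(p_no_accident(12, rest_breaks=0, experience_yrs=1))
        print(p_no_accident(12, rest_breaks=4, experience_yrs=10))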

  37. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  38. A review of risk analysis and helicopter air ambulance accidents.

    PubMed

    Nix, Sam; Buckner, Steven; Cercone, Richard

    2014-01-01

    The Federal Aviation Administration announced a final rule in February 2014 that includes a requirement for helicopter air ambulance operators to institute preflight risk analysis programs. This qualitative study examined risk factors that were described in 22 preliminary, factual, and probable cause helicopter air ambulance accident and incident reports that were initiated by the National Transportation Safety Board between January 1, 2011, and December 31, 2013. Insights into the effectiveness of existing preflight risk analysis strategies were gained by comparing these risk factors with the preflight risk analysis guidance that is published by the Federal Aviation Administration in the Flight Standards Information Management System. When appropriate, a deeper understanding of the human factors that may have contributed to occurrences was gained through methodologies that are described in the Human Factors Analysis and Classification System. The results of this study suggest that there are some vulnerabilities in existing preflight risk analysis guidelines that may affect safety in the helicopter air ambulance industry. The results also provided evidence that human factors likely contributed to most of the helicopter air ambulance accidents and incidents that occurred during the study period. The results of this study suggest that effective risk analysis programs should provide pilots with both preflight and in-flight resources. PMID:25179955

  39. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  40. Canister storage building design basis accident analysis documentation

    SciTech Connect

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  41. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  42. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, "Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR)." All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  43. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows the inputting of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology; specifically, whether consequence uncertainty could be larger than previously evaluated, such that site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.

  44. PERSPECTIVES ON DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    K.; Lowrie, Jonathan; Thoman, David; Keller, Austin

    2008-07-30

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
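
    For an inhalation pathway, the "simple dose calculation" referenced reduces to a product of source term, atmospheric dilution, breathing rate, and a dose conversion factor. The values below are placeholders chosen only to make the units work, not the paper's inputs.

        # Inhalation dose ~ Q * (chi/Q) * BR * DCF (illustrative values only).
        Q     = 1.0e-3   # respirable source term released, Ci
        chi_q = 1.0e-4   # atmospheric dispersion factor at receptor, s/m^3
        BR    = 3.3e-4   # breathing rate, m^3/s
        DCF   = 2.0e6    # dose conversion factor, rem per Ci inhaled

        print(f"receptor dose: {Q * chi_q * BR * DCF:.3e} rem")

        # Halving the assumed breathing rate halves the dose -- the kind of
        # input sensitivity the paper proposes to reassess.
        print(f"with BR/2:     {Q * chi_q * (BR / 2) * DCF:.3e} rem")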

  45. Categorizing accident sequences in the external radiotherapy for risk analysis

    PubMed Central

    2013-01-01

    Purpose: This study identifies accident sequences from past accidents in order to support the application of risk analysis to external radiotherapy. Materials and Methods: This study reviews 59 accident cases from two retrospective safety analyses that extensively collected incidents in external radiotherapy. The two accident analysis reports are investigated to identify accident sequences, including initiating events, failures of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, by type of failure in the safety measures, and by type of undesirable consequence and the number of affected patients. The accident sequences are then grouped into several categories on the basis of similarity of progression. As a result, these cases can be categorized into 14 groups of accident sequences. Results: The results indicate that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage performed prior to the main treatment process. They also show that human error is the largest contributor to initiating events as well as to failures of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion: This study is expected to provide insights into accident sequences for prospective risk analysis through the review of experience. PMID:23865005
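
    An event tree of the kind the study illustrates for a calibration-initiated sequence is quantified by multiplying the initiating-event frequency by the branch probabilities of each safety measure succeeding or failing. The frequencies, probabilities, and safety measures below are invented for illustration.

        # Invented event-tree quantification for a mis-calibration initiator.
        f_init = 0.1         # mis-calibration events per year (invented)
        p_qa_fail = 0.05     # independent QA check misses the error (invented)
        p_ivd_fail = 0.2     # in-vivo dosimetry misses the error (invented)

        sequences = {
            "caught by QA check":          f_init * (1 - p_qa_fail),
            "caught by in-vivo dosimetry": f_init * p_qa_fail * (1 - p_ivd_fail),
            "undetected mistreatment":     f_init * p_qa_fail * p_ivd_fail,
        }
        for name, freq in sequences.items():
            print(f"{name:28s} {freq:.4f} per year")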

  46. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, an indicator of the risk of accident, was compared among different occupations, between age groups, and between the sexes. Results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:2131982
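
    Whether some workers are disproportionately accident-prone can be probed by comparing the reported counts against a homogeneous Poisson model, the standard reference for such frequency distributions. The sketch uses the counts from the abstract; the workforce size is an invented assumption, and the Poisson comparison is not part of the original paper.

        import math

        observed = {1: 5837, 2: 208, 3: 21, 4: 2}  # workers by accident count
        accidents = sum(k * n for k, n in observed.items())
        injured = sum(observed.values())

        lam = accidents / 50000.0  # assumed workforce of 50,000 (invented)
        p = lambda k: math.exp(-lam) * lam ** k / math.factorial(k)
        p_injured = 1.0 - p(0)

        # Compare shares among injured workers: observed vs Poisson.
        for k in sorted(observed):
            print(f"k={k}: observed {observed[k] / injured:.4f}  "
                  f"Poisson {p(k) / p_injured:.4f}")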

  47. Methodological development for selection of significant predictors explaining fatal road accidents.

    PubMed

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose remains an open research question. In this paper we propose a methodological development for model selection which addresses both explanatory-variable selection and adequate model selection. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator experienced the greatest reduction internationally during the indicated years, making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. PMID:26928290

  48. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

  49. Study of the possibility of using the LANL PSA methodology for RBMK accident probability research

    SciTech Connect

    Petrin, S.V.; Yuferev, V.Y.; Zlobin, A.M.

    1995-12-31

    The reactor facility probabilistic safety analysis (PSA) methodologies used at the U.S. LANL and the RF NIKIET are considered. The methodologies are compared in order to reveal their similarities and differences and to determine the possibility of using the LANL technique for RBMK-type reactor safety analysis. It is found that at the PSA-1 level the methodologies are practically identical. At LANL, the PHA and HAZOP hazards analysis methods are used for a more complete specification of the initiating event list, which can also be useful in performing PSA for RBMK reactors. Exchange of information regarding the methodology for detecting dependent faults and for considering the impact of the human factor on reactor safety is considered worthwhile. It is also regarded as useful to carry out a comparative analysis of results for test problems or PSA fragments using the various computer programs employed at NIKIET and LANL.

  10. Aircraft accidents: method of analysis

    NASA Technical Reports Server (NTRS)

    1937-01-01

    This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

  11. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2015-08-01

    Mining is a hazardous industry, and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, rates of fatal accidents and reportable incidents have not shown corresponding levels of decline. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in an appreciable reduction in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on the location, time, type, and cost of the accident, the victim, the nature of the injury, and personal and environmental factors. Such information can be generated from the data available in the standard coded accident report form. This paper presents a web-based application for the analysis of accidents in Indian mines during 2001-2013. An accident database prototype (SafeStat), developed by the authors as an intranet application based on TCP/IP, is also discussed.
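
    A minimal sketch of the kind of coded accident record a database such as SafeStat would store, using Python's built-in sqlite3; the table layout and field names are hypothetical, chosen only to mirror the inputs listed above (location, time, type, cost, victim, nature of injury).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accident (
        id INTEGER PRIMARY KEY,
        mine TEXT, location TEXT,
        occurred_at TEXT,          -- ISO date/time
        accident_type TEXT,        -- e.g. fall of roof, haulage
        nature_of_injury TEXT,
        victim_age INTEGER,
        cost REAL
    )""")
conn.execute("INSERT INTO accident VALUES (1, 'Mine A', 'Panel 3', "
             "'2012-07-14T10:30', 'fall of roof', 'fatal', 42, 1.5e6)")

# Frequency of each accident type: the kind of summary such a tool reports
for row in conn.execute("""SELECT accident_type, COUNT(*)
                           FROM accident GROUP BY accident_type"""):
    print(row)
```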

  12. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  13. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  14. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  15. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify all the possible scenarios that may follow a seismic event, to evaluate their credibility, and finally to assess their expected consequences. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences likely to be generated in large industrial facilities. The methodology requires a limited amount of additional data with respect to a conventional QRA, and yields with limited effort a preliminary quantitative assessment of the contribution of scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that scenarios initiated by seismic events may have a relevant influence on industrial risk, both by raising the overall expected frequency of single scenarios and by causing specific severe scenarios that simultaneously involve several plant units. PMID:17276591
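
    Fragility curves of the kind the procedure relies on are commonly lognormal in peak ground acceleration (PGA), and the scenario frequency is then the seismic exceedance frequency multiplied by the damage probability. A minimal sketch, with all parameter values illustrative rather than taken from the paper:

```python
import numpy as np
from scipy.stats import norm

def damage_probability(pga, median, beta):
    """Lognormal fragility: P(damage | PGA) with median capacity and
    log-standard-deviation beta."""
    return norm.cdf(np.log(pga / median) / beta)

pga = 0.3          # g, severity of the seismic event considered
f_seismic = 1e-3   # /year, exceedance frequency of that severity
p_dam = damage_probability(pga, median=0.7, beta=0.5)  # illustrative tank
print(f"P(damage) = {p_dam:.3f}, "
      f"scenario frequency = {f_seismic * p_dam:.2e} /year")
```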

  16. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that leads to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  17. The Methodology of Data Envelopment Analysis.

    ERIC Educational Resources Information Center

    Sexton, Thomas R.

    1986-01-01

    The methodology of data envelopment analysis (DEA), a linear programming-based method, is described. Other procedures often used for measuring relative productive efficiency are discussed in relation to DEA, including ratio analysis and multiple regression analysis. The DEA technique is graphically illustrated for only two inputs and one output.…
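
    As a concrete illustration of the linear programming at the heart of DEA, the following sketch solves the input-oriented CCR model for each decision-making unit, with two inputs and one output as in the record's graphical illustration; the data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA: for unit o, minimize theta subject to
#   sum_j lam_j * x_ij <= theta * x_io   (each input i)
#   sum_j lam_j * y_rj >= y_ro           (each output r), lam >= 0
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])  # 2 inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # 1 output
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    c = np.r_[1.0, np.zeros(n)]              # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[o], X.T]                 # X.T @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros(s), -Y.T]         # -Y.T @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"unit {o}: efficiency = {res.x[0]:.3f}")
```

    Units on the efficient frontier score 1.0; interior units score below 1, the score being the proportional input contraction that would make them efficient.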

  18. Scenario analysis of freight vehicle accident risks in Taiwan.

    PubMed

    Tsai, Ming-Chih; Su, Chien-Chih

    2004-07-01

    This study develops a quantitative risk model that uses the Generalized Linear Interactive Model (GLIM) to analyze major freight vehicle accidents in Taiwan. Eight scenarios are established by interacting three categorical variables of driver age, vehicle type, and road type, each of which contains two levels. A database of 2,043 major accidents occurring between 1994 and 1998 in Taiwan is used to fit and calibrate the model parameters. The empirical results indicate that accident rates of freight vehicles in Taiwan were high in scenarios involving trucks and non-freeway systems, while accident consequences were severe in scenarios involving mature drivers or non-freeway systems. The empirical evidence also shows no significant relationship between accident rates and accident consequences. This stresses that safety studies describing risk merely as accident rates, rather than as the combination of accident rates and consequences, might lead to biased risk perceptions. Finally, the study recommends using the number of vehicles as an alternative measure of traffic exposure in commercial vehicle risk analysis. The merit of this measure is that it is simple and thus reliable; moreover, the resulting risk, expressed as fatalities per vehicle, provides clear and direct policy implications for insurance practices and safety regulations. PMID:15094423
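
    GLIM fits of this kind can be reproduced with a modern Poisson GLM in which the three two-level factors are predictors and traffic exposure enters as an offset. A minimal sketch on synthetic data; the column names, rates, and sample size are placeholders, not the paper's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "driver":   rng.choice(["young", "mature"], 400),
    "vehicle":  rng.choice(["truck", "trailer"], 400),
    "road":     rng.choice(["freeway", "non_freeway"], 400),
    "exposure": rng.uniform(0.5, 5.0, 400),   # vehicle-years at risk
})
# Synthetic truth: higher accident rate off the freeway
rate = np.where(df.road == "non_freeway", 0.8, 0.3)
df["accidents"] = rng.poisson(rate * df.exposure)

model = smf.glm("accidents ~ driver + vehicle + road",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df.exposure)).fit()
print(model.summary().tables[1])
```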

  19. An analysis of pileup accidents in highway systems

    NASA Astrophysics Data System (ADS)

    Chang, Jau-Yang; Lai, Wun-Cing

    2016-02-01

    A pileup is a multi-vehicle collision that occurs in a lane and is produced by successive following vehicles; it is a special type of collision on highways. The probability of occurrence of a pileup is lower than that of other accidents in highway systems. However, the injuries and damage caused by a pileup are often serious. In this paper, we analyze the occurrence of pileup accidents by considering three types of dangerous collisions in highway systems: rear-end collisions, lane-changing collisions, and double lane-changing collisions. We simulate four road driving strategies to investigate the relationships between the different vehicle collisions and pileup accidents. The simulation and analysis show that double lane-changing collisions increase the occurrence of pileup accidents. Additionally, we found that the probability of occurrence of pileup accidents can be reduced when vehicle speeds are suitably constrained in highway systems.
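
    The paper's simulations are not specified in the abstract, so the following is only a toy stand-in: a Monte Carlo chain-collision model in which, after an initial two-vehicle crash, each successive follower joins the pileup with a probability that grows with speed. It reproduces the qualitative finding that constraining speed shortens pileups; the functional form and constants are invented.

```python
import numpy as np

def pileup_sizes(n_trials, speed_kmh, p0=0.05, k=0.004, seed=0):
    """Toy chain-collision model: after a lead collision, each successive
    follower fails to stop with probability p = min(1, p0 + k*speed)."""
    rng = np.random.default_rng(seed)
    p = min(1.0, p0 + k * speed_kmh)
    sizes = []
    for _ in range(n_trials):
        size = 2                      # the two vehicles of the initial crash
        while rng.random() < p:
            size += 1                 # one more follower joins the pileup
        sizes.append(size)
    return np.array(sizes)

for v in (80, 100, 120):              # km/h
    s = pileup_sizes(10_000, v)
    print(f"{v} km/h: mean pileup size {s.mean():.2f}, "
          f"P(size >= 4) = {(s >= 4).mean():.3f}")
```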

  20. Analysis of Credible Accidents for Argonaut Reactors

    SciTech Connect

    Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

    1981-04-01

    Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors. They are: • insertion of excess reactivity • catastrophic rearrangement of the core • explosive chemical reaction • graphite fire • fuel-handling accident. A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MW·s, which is insufficient to cause fuel melting even with conservative assumptions. Although a precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such a rearrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably cause some damage to the reactor, but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

  1. Disposal Criticality Analysis Methodology Topical Report

    SciTech Connect

    D.G. Horton

    1998-11-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  2. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    SciTech Connect

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.; Schulz, R.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)

  3. Case for integral core-disruptive accident analysis

    SciTech Connect

    Luck, L B; Bell, C R

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included.

  4. When a Crash Is Really an Accident: A Concept Analysis.

    PubMed

    Knechel, Nancy

    2015-01-01

    The debate over using the word accident has encouraged some groups to adopt the word crash, while other groups retain accident. This article addresses the inconsistent and interchangeable use of the terms accident and crash. This conceptual analysis used a Critical Review Method, with Critical Theory as the theoretical framework. A literature search was conducted in MEDLINE and CINAHL for articles published through 2011. An extensive review of the literature was followed by purposive sampling of articles published in 2011 across countries, disciplines, and contexts. Forty-seven articles were read in their entirety, resulting in 2 themes for accident: intent and injury. Seven articles were critically analyzed for intent, injury, and underrepresented margins of society (5 articles using the term accident, 1 using crash and accident interchangeably, and 1 using only crash). There was congruency on injury across all 7 articles. Results were mixed for intent and for the incorporation of marginalized people. Although there is evidence that use of the word accident should be maintained when the event could not reasonably have been prevented, the theoretical framework highlights that this will likely perpetuate the conceptual confusion. The recommendation is to (1) identify the mechanism of injury, (2) identify the event as intentional versus nonintentional, and (3) identify the event as preventable versus nonpreventable. PMID:26574946

  5. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank (SST). The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine whether they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent, in order to identify and evaluate safety-class structures, systems, and components. A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than in a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.
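
    Consequence calculations of this type conventionally follow the five-factor source-term formula (Q = MAR × DR × ARF × RF × LPF) combined with an atmospheric dispersion factor and an inhalation dose conversion. A minimal sketch with purely illustrative parameter values, not those of the documented analysis:

```python
# Five-factor source term (DOE-HDBK-3010 style): Q = MAR * DR * ARF * RF * LPF
MAR = 4.0e5     # g, material at risk (illustrative waste mass involved)
DR  = 0.1       # damage ratio
ARF = 1e-3      # airborne release fraction for the detonation
RF  = 0.5       # respirable fraction
LPF = 1.0       # leak path factor (open release)
Q = MAR * DR * ARF * RF * LPF          # respirable release, g

chi_Q  = 2.0e-5  # s/m^3, offsite atmospheric dispersion factor (illustrative)
BR     = 3.3e-4  # m^3/s, receptor breathing rate
D_conv = 1.0e5   # rem per gram inhaled, unit-dose conversion (illustrative)

dose = Q * chi_Q * BR * D_conv         # rem TEDE at the receptor
print(f"respirable release {Q:.1f} g, offsite dose {dose:.2e} rem "
      f"(compare with the 25 rem Evaluation Guideline)")
```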

  6. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area (FMEF/FAA) is proposed to be used for fabrication of mixed-oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The revised accidents were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

  7. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires, and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics, and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.

  8. Hanford Waste Tank Bump Accident and Consequence Analysis

    SciTech Connect

    BRATZEL, D.R.

    2000-06-20

    This report provides a new evaluation of the Hanford tank bump accident analysis and consequences for incorporation into the Authorization Basis. The analysis scope is for the safe storage of waste in its current configuration in single-shell and double-shell tanks.

  9. Intelligent speed adaptation: accident savings and cost-benefit analysis.

    PubMed

    Carsten, O M J; Tate, F N

    2005-05-01

    The UK External Vehicle Speed Control (EVSC) project has made a prediction of the accident savings with intelligent speed adaptation (ISA) and estimated the costs and benefits of national implementation. The best prediction of accident reduction was that fitting all vehicles with a simple mandatory system, with which it would be impossible for vehicles to exceed the speed limit, would save 20% of injury accidents and 37% of fatal accidents. A more complex version of the mandatory system, including a capability to respond to current network and weather conditions, would result in a reduction of 36% in injury accidents and 59% in fatal accidents. The implementation path recommended by the project would lead to compulsory usage in 2019. The cost-benefit analysis carried out showed that the benefit-cost ratios for this implementation strategy were in a range from 7.9 to 15.4, i.e., the payback for the system could be up to 15 times the cost of implementing and running it. PMID:15784194

  10. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    SciTech Connect

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
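
    The "efficient sampling procedure" used with Monte Carlo propagation in NUREG-1150 was Latin hypercube sampling (LHS). A minimal sketch of LHS-based uncertainty propagation through a toy frequency-times-consequence risk model; the distributions and ranges are illustrative only.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=0):
    """One stratified uniform(0,1) draw per equiprobable bin and variable,
    with the strata independently permuted across variables."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Propagate two uncertain inputs through a toy risk model
u = latin_hypercube(100, 2)
freq = 10 ** (-5 + 2 * u[:, 0])        # accident frequency, 1e-5..1e-3 /yr
cons = 10 ** (1 + 2 * u[:, 1])         # consequence, 10..1000 person-rem
risk = freq * cons
print(f"mean risk {risk.mean():.2e}, "
      f"95th percentile {np.percentile(risk, 95):.2e} person-rem/yr")
```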

  11. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples) and more detailed definitions of some of the terms. 42 refs.
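
    A core piece of the THERP/ASEP machinery is the adjustment of a nominal human error probability (HEP) for dependence between successive tasks. The standard NUREG/CR-1278 dependence equations are simple enough to state directly; the 0.003 nominal HEP below is only an example value.

```python
def conditional_hep(p, dependence):
    """THERP (NUREG/CR-1278) conditional error probability for task N
    given failure on task N-1, at the stated dependence level."""
    return {
        "zero":     p,
        "low":      (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,
        "complete": 1.0,
    }[dependence]

for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s}: {conditional_hep(0.003, level):.4f}")
```

    Even low dependence raises a 0.003 nominal HEP to about 0.05, which is why dependence modeling dominates many HRA results.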

  12. Accident analysis of the windowless target system

    SciTech Connect

    Bianchi, F.; Ferri, R.

    2006-07-01

    Transmutation systems are able to reduce the radio-toxicity and amount of High-Level Wastes (HLW), which are the main concerns related to the peaceful use of nuclear energy, and therefore they should make nuclear energy more easily acceptable by population. A transmutation system consists of a sub-critical fast reactor, an accelerator and a Target System, where the spallation reactions needed to sustain the chain reaction take place. Three options were proposed for the Target System within the European project PDS-XADS (Preliminary Design Studies on an Experimental Accelerator Driven System): window, windowless and solid. This paper describes the constraints taken into account in the design of the windowless Target System for the large Lead-Bismuth-Eutectic cooled XADS and deals with the results of the calculations performed to assess the behaviour of the target during some accident sequences related to pump trips. (authors)

  13. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium-cooled steel structural ring and tungsten divertors, a thin-walled, helium-cooled vacuum vessel, and a room-temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component, determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, the structures are kept adequately cool by the passive safety system.

  14. Human factors review for Severe Accident Sequence Analysis (SASA)

    SciTech Connect

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper discusses work conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate the contributions of human factors data and methods to SASA analyses. A descriptive model was developed, called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis focused primarily on six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological releases beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

  15. Impact of meltdown-accident modeling developments on PWR analysis

    SciTech Connect

    Haskin, F.E.; Shaffer, C.J.

    1982-01-01

    Models recently incorporated into a new version of the MARCH computer code have altered earlier estimates of some parameter responses during meltdown accidents of PWRs. These models include improved fuel-coolant heat transfer models, new core radiative heat transfer models, improved in-vessel flashing models, improved models for burning of combustible gases in containment, and CORCON1 models for core-concrete interactions. Studies performed with these models have been used to identify and, in some cases, bound meltdown accident analysis uncertainties.

  16. Core Disruptive Accident Analysis using ASTERIA-FBR

    NASA Astrophysics Data System (ADS)

    Ishizu, Tomoko; Endo, Hiroshi; Yamamoto, Toshihisa; Tatewaki, Isao

    2014-06-01

    JNES is developing a core disruptive accident analysis code, ASTERIA-FBR, which tightly couples thermal-hydraulics and neutronics to simulate core behavior during core disruptive accidents of fast breeder reactors (FBRs). ASTERIA-FBR consists of the three-dimensional thermal-hydraulics calculation module CONCORD, the fuel pin behavior calculation module FEMAXI-FBR, and the space-time neutronics module Dynamic-GMVP or PARTISN/RKIN. This paper describes a comparison between the characteristics of GMVP and PARTISN and summarizes the challenging issues in applying Dynamic-GMVP to the calculation of an unprotected loss-of-flow (ULOF) event, a typical initiator of core disruptive accidents in FBRs. The statistical error included in the calculation results may affect the predicted super-prompt criticality during a ULOF event and thus the amount of released energy.

  17. Mass Spectrometry Methodology in Lipid Analysis

    PubMed Central

    Li, Lin; Han, Juanjuan; Wang, Zhenpeng; Liu, Jian’an; Wei, Jinchao; Xiong, Shaoxiang; Zhao, Zhenwen

    2014-01-01

    Lipidomics is an emerging field in which the structures, functions, and dynamic changes of lipids in cells, tissues, or body fluids are investigated. Owing to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis, combined with bioinformatics technology, has greatly advanced the study of lipidomics. Among these methods, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, the MS-based methodology for lipid analysis is introduced. It is believed that, along with the rapid development of MS and its further application to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and used in the study of the mechanisms of disease. PMID:24921707

  18. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  19. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 crew statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep; and the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as by the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; the captain also testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data support the hypothesis that fatigue was a factor that affected the crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that fatigue had an impact on specific actions involved in the occurrence of the accident.

  20. Analysis of drug combinations: current methodological landscape.

    PubMed

    Foucquier, Julie; Guedj, Mickael

    2015-06-01

    Combination therapies exploit the chances for better efficacy, decreased toxicity, and reduced development of drug resistance and owing to these advantages, have become a standard for the treatment of several diseases and continue to represent a promising approach in indications of unmet medical need. In this context, studying the effects of a combination of drugs in order to provide evidence of a significant superiority compared to the single agents is of particular interest. Research in this field has resulted in a large number of papers and revealed several issues. Here, we propose an overview of the current methodological landscape concerning the study of combination effects. First, we aim to provide the minimal set of mathematical and pharmacological concepts necessary to understand the most commonly used approaches, divided into effect-based approaches and dose-effect-based approaches, and introduced in light of their respective practical advantages and limitations. Then, we discuss six main common methodological issues that scientists have to face at each step of the development of new combination therapies. In particular, in the absence of a reference methodology suitable for all biomedical situations, the analysis of drug combinations should benefit from a collective, appropriate, and rigorous application of the concepts and methods reviewed here. PMID:26171228

  3. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  4. Requirements Analysis in the Value Methodology

    SciTech Connect

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, whether prescribed by regulation or by the customer. This paper provides insight into the level of rigor applied to a requirements analysis step and gives some examples of tools and techniques utilized to ease the management of the requirements and the functions those requirements support for highly complex problems.

  5. Consequence analysis of a steam generator tube rupture accident

    SciTech Connect

    Wu, J.M.; Chuang, C.F.

    1984-12-01

    A flashing droplet model was developed to examine the rupture flow of reactor coolant and its transport through the steam generator during a steam generator tube rupture accident. The model includes flashing flow; droplet formation; droplet removal by tube bundles, bubble scrubbing, steam separators, and steam dryers; and droplet size change by evaporation and condensation. The calculation follows the actual sequence of events during the accident. The reactor coolant droplets escaping from the steam generator are used to estimate the radioactivity released into the environment. The steam generator tube rupture accident that occurred at the Prairie Island Plant on October 2, 1979, was studied using the model. The model estimated a release of 204 µCi of ¹³¹I-equivalent activity. The U.S. Nuclear Regulatory Commission estimated a 210-µCi release, assuming an iodine partition factor of 1/100 in the steam generator. The model was also used to analyze a hypothetical steam generator tube rupture accident coupled with loss of offsite power in a large 1100-MW (electric) Westinghouse four-loop plant. The model estimated that 45 Ci of ¹³¹I-equivalent activity could be released through the relief valves, which were stuck open for 30 min. This is eight times higher than the estimate in the Westinghouse safety analysis report, which used a uniform mixing model.

  6. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation. PMID:19819365
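
    For the FTA side of the combined approach, top-event probabilities for independent basic events reduce to products and complements over AND/OR gates. A minimal sketch, with a toy tree loosely patterned on the flooding incident; the structure and probabilities are invented, not those of the case study.

```python
from functools import reduce

def gate_or(*ps):   # P(A or B or ...) for independent events
    return 1 - reduce(lambda acc, p: acc * (1 - p), ps, 1.0)

def gate_and(*ps):  # P(A and B and ...) for independent events
    return reduce(lambda acc, p: acc * p, ps, 1.0)

# Toy tree for "basement flooded":
#   (valve left open AND alarm missed) OR pipe rupture
p_valve_open   = 1e-2   # operator omission (task analysis feeds this HEP)
p_alarm_missed = 5e-2   # operator fails to respond to level alarm
p_pipe_rupture = 1e-4   # hardware failure
p_top = gate_or(gate_and(p_valve_open, p_alarm_missed), p_pipe_rupture)
print(f"P(flooding) = {p_top:.2e}")
```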

  7. Accident analysis of heavy water cooled thorium breeder reactor

    SciTech Connect

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

    Thorium has lately attracted considerable attention because it is accumulating as a by-product of large-scale rare earth mining. The objective of this research is to analyze the transient behavior of a heavy water cooled thorium breeder designed by Tokai University and the Tokyo Institute of Technology. It is an oxide-fueled, PWR-type reactor with heavy water as the primary coolant. An example of the optimized core has a relatively small moderator-to-fuel volume ratio (MFR) of 0.6, and the characteristics of the core are a burn-up of 67 GWd/t, a breeding ratio of 1.08, a burn-up reactivity loss during cycles of < 0.2% dk/k, and a negative coolant reactivity coefficient. One type of reactor accident examined here is the Unprotected Transient Over Power (UTOP), caused by withdrawal of a control rod, which results in a positive reactivity insertion and a rapid increase in reactor power. Another accident type is the Unprotected Loss of Flow (ULOF), caused by failure of the coolant pumps. In analyzing reactor accidents, the calculation of the neutron distribution in the reactor is the most important factor. The most complete description of the neutron distribution is the Boltzmann transport equation; however, solving this equation is very difficult, so the space-time diffusion equation is commonly used. Usually, the space-time diffusion equation is solved by employing a point kinetics approach. However, this approach is less accurate for a spatially heterogeneous reactor or for large reactivity inputs. A direct method is therefore used here to solve the space-time diffusion equation, which considers the spatial factor in detail during the accident simulation. The set of equations obtained from the fully implicit finite-difference method is solved using iterative methods. The UTOP accident is modeled by a decrease in the macroscopic absorption cross-section that produces a large external reactivity, and the ULOF accident by a decrease in coolant flow. The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accidents remain below their limits, so the reactor stays in a safe condition.
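
    The direct method described, a fully implicit finite-difference solution of the space-time diffusion equation, can be illustrated in one energy group and one dimension. The sketch below omits delayed neutrons and feedback for brevity, and all constants are illustrative (a slightly supercritical slab, so the power grows):

```python
import numpy as np

# One-group 1-D slab: (1/v) dphi/dt = D d2phi/dx2 + (nu*Sig_f - Sig_a) phi,
# fully implicit in time, zero-flux boundaries.
nx, L = 50, 100.0                         # mesh points, slab width (cm)
dx, dt, v = L / (nx - 1), 1e-4, 2.2e5     # cm, s, neutron speed cm/s
D, sig_a, nu_sig_f = 1.0, 0.012, 0.014    # cm, 1/cm, 1/cm

A = np.zeros((nx, nx))
for i in range(1, nx - 1):
    A[i, i - 1] = A[i, i + 1] = -D / dx**2
    A[i, i] = 1.0 / (v * dt) + 2.0 * D / dx**2 + sig_a - nu_sig_f
A[0, 0] = A[-1, -1] = 1.0                 # phi = 0 boundary rows

x = np.linspace(0.0, L, nx)
phi = np.sin(np.pi * x / L)               # fundamental-mode initial shape
p0 = phi.sum()
for _ in range(200):                      # advance 20 ms
    b = phi / (v * dt)
    b[0] = b[-1] = 0.0
    phi = np.linalg.solve(A, b)           # implicit step (direct solve here)
print(f"power after 20 ms: {phi.sum() / p0:.1f}x initial")
```

    In a production code the linear system would be solved iteratively, as the abstract notes, and coupled to fuel and coolant feedback at each step.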

  9. Offsite radiological consequence analysis for the bounding flammable gas accident

    SciTech Connect

    CARRO, C.A.

    2003-03-19

    The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  10. A general methodology for population analysis

    NASA Astrophysics Data System (ADS)

    Lazov, Petar; Lazov, Igor

    2014-12-01

    For a given population with N the current and M the maximum number of entities, modeled by a Birth-Death Process (BDP) with M+1 states, we introduce a utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and an information parameter ν, which can be interpreted as the population information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. With these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution pn=Prob{N=n}, n=0,1,…,M, of the quantity N, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; here, by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology. Indeed, for a population of infinite size, most of the key quantities and results that emerge in this methodology for populations of finite size vanish.
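
    The equilibrium distribution of a finite BDP follows from detailed balance, p(n)·λ(n) = p(n+1)·μ(n+1), after which the population entropy is the usual expectation of -ln p(n). A minimal sketch for the constant-rate-ratio case, where the utilization parameter ρ fixes the whole distribution; M and ρ below are example values.

```python
import numpy as np

def bdp_equilibrium(birth, death):
    """Equilibrium pmf of a finite birth-death process via detailed balance:
    p[n+1] = p[n] * birth[n] / death[n], where death[n] is the rate out of
    state n+1 (mu_{n+1})."""
    p = np.ones(len(birth) + 1)
    for n in range(len(birth)):
        p[n + 1] = p[n] * birth[n] / death[n]
    return p / p.sum()

M, rho = 10, 0.8                            # states 0..M, utilization rho
p = bdp_equilibrium([rho] * M, [1.0] * M)   # constant birth/death rate ratio
entropy = -(p * np.log(p)).sum()            # population entropy, nats
print(f"mean size {np.dot(np.arange(M + 1), p):.2f}, "
      f"entropy {entropy:.3f} nats")
```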

  11. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur". At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which often are limited and rare.

  12. Comprehensive Analysis of Two Downburst-Related Aircraft Accidents

    NASA Technical Reports Server (NTRS)

    Shen, J.; Parks, E. K.; Bach, R. E.

    1996-01-01

    Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F1 and F2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
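
    The F-factor itself is compact enough to compute directly. The sketch below uses the commonly cited definition F = (dWx/dt)/g - Wh/V; the sign conventions and all numerical values are assumptions for illustration, not data from the DFW or Charlotte recordings.

        G = 9.81  # gravitational acceleration, m/s^2

        def f_factor(dwx_dt: float, wh: float, v: float) -> float:
            """Wind-shear hazard index F = (dWx/dt)/g - Wh/V.
            dwx_dt: rate of change of the tailwind component along the
                    flight path (m/s^2, positive for increasing tailwind)
            wh:     vertical wind (m/s, positive for an updraft)
            v:      true airspeed (m/s)
            Positive F means the wind field is eroding climb capability."""
            return dwx_dt / G - wh / v

        # Hypothetical downburst encounter: tailwind growing at 2 m/s^2,
        # an 8 m/s downdraft, 70 m/s approach speed.
        f1 = f_factor(2.0, 0.0, 70.0)    # horizontal-gradient part (F1)
        f2 = f_factor(0.0, -8.0, 70.0)   # vertical-wind part (F2)
        print(f"F1 = {f1:.3f}, F2 = {f2:.3f}, F = {f1 + f2:.3f}")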

  13. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    SciTech Connect

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at the Fukushima Daiichi nuclear plant illustrates the need for continuous improvement through developing and implementing technologies that contribute to safe, reliable, and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuels contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis covering most of the proposed enhanced accident tolerant fuel concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation, and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and may be updated as the designs converge on their respective final versions.

  14. A grounded theory model for analysis of marine accidents.

    PubMed

    Mullai, Arben; Paulsson, Ulf

    2011-07-01

    The purpose of this paper was to design a conceptual model for analysis of marine accidents. The model is grounded on large amounts of empirical data, i.e. the Swedish Maritime Administration database, which was thoroughly studied. This database contains marine accidents organized by ship and variable. The majority of variables are non-metric and some have never been analyzed because of the large number of values. Summary statistics were employed in the data analysis. In order to develop a conceptual model, the database variables were clustered into eleven main categories or constructs, which were organized according to their properties and connected with the path diagram of relationships. For demonstration purposes, one non-metric and five metric variables were selected, namely fatality, ship's properties (i.e. age, gross register tonnage, and length), number of people on board, and marine accidents. These were analyzed using the structural equation modeling (SEM) approach. The combined prediction power of the 'ship's properties' and 'number of people on board' independent variables accounted for 65% of the variance of the fatality. The model development was largely based on the data contained in the Swedish database. However, as this database shares a number of variables in common with other databases in the region and the world, the model presented in this paper could be applied to other datasets. The model has both theoretical and practical values. Recommendations for improvements in the database are also suggested. PMID:21545894

  15. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten-year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  16. Cost analysis methodology: Photovoltaic Manufacturing Technology Project

    SciTech Connect

    Whisnant, R.A. )

    1992-09-01

    This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goals of the project: lowering costs and thus hastening entry into larger-scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology, required under Phase 1, that will allow subcontractors to be ranked and evaluated during Phase 2.

  17. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advancement in technology, new and sophisticated vehicle models are available, and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes are caused, wholly or partly, by human-induced factors. GIS technology has become an indispensable tool for proper traffic accident analysis. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote location, type, and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, was selected as the study area. According to police records, 1531 accidents occurred during 2009-2013. The most accidents occurred in 2009 and the most deaths in 2013. Cars, jeeps, autos, pickups, and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. Although road safety is a critical issue, it is handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer City as a case study. If this type of database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.
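
    Hotspot extraction of the kind described here is commonly implemented as kernel density estimation over geocoded accident points. The sketch below is a generic illustration on synthetic coordinates (the Ajmer police records are not reproduced here), with an arbitrary top-1% density threshold for flagging candidate hotspots.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Synthetic accident coordinates in projected metres; a real study
        # would load geocoded police records instead.
        rng = np.random.default_rng(0)
        xy = np.vstack([rng.normal(0, 500, 300), rng.normal(0, 500, 300)])

        kde = gaussian_kde(xy)  # kernel density surface over the city
        gx, gy = np.mgrid[-1500:1500:100j, -1500:1500:100j]
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

        # Flag the densest 1% of grid cells as candidate hotspots.
        hotspots = density > np.quantile(density, 0.99)
        print(f"{hotspots.sum()} candidate hotspot cells")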

  18. A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION

    SciTech Connect

    Ortner, T; Mukesh Gupta, M

    2007-04-13

    The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

  19. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979, and an initial version of this report, issued later in 1979, as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references, and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  20. Consequence analysis for a steam generator tube rupture accident

    SciTech Connect

    Chuang, C.F.

    1983-01-01

    Radioiodine release to the environment following a PWR steam generator tube rupture accident is studied. The objective is to develop a realistic model that may be used to examine the rupture flow of primary coolant and the transport process of leaked primary coolant in the steam generator. Estimated radioiodine release from the model is first compared with past accident data. The result indicates that the NRC staff and the licensee may have underestimated the actual releases in the Prairie Island steam generator tube rupture accident. Results also show that a tube rupture without offsite power available may yield a radioiodine release 8 times larger than the value given in the SAR analysis. Possible modifications of the recovery operations and equipment design are recommended in order to mitigate the radioiodine release: (1) trip the reactor as soon as possible, (2) maintain reactor coolant pump operation, (3) depressurize the primary system as quickly as possible, (4) maintain feedwater flow properly, (5) actuate safety injection to draw borated water sooner, (6) do not isolate the faulty loop too early, and (7) modify the steam dryer design to include horizontal zig-zag plates.
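
    The rupture-flow part of such a model starts from a simple orifice estimate of the primary-to-secondary leak. The sketch below shows only that single-phase starting point; the discharge coefficient, tube size, density, and pressure differential are assumed values, and a realistic SGTR model must add two-phase (flashing) corrections and iodine transport on top of it.

        import math

        def tube_rupture_flow(cd: float, area_m2: float, rho: float, dp_pa: float) -> float:
            """Incompressible orifice estimate of the break flow (kg/s):
            mdot = Cd * A * sqrt(2 * rho * dP). Single-phase only; real
            SGTR models add flashing corrections."""
            return cd * area_m2 * math.sqrt(2.0 * rho * dp_pa)

        # Hypothetical double-ended rupture of one 19 mm ID tube at a
        # 9 MPa primary-to-secondary pressure differential.
        area = 2 * math.pi * (0.019 / 2) ** 2   # two break ends
        print(f"{tube_rupture_flow(0.61, area, 740.0, 9.0e6):.1f} kg/s")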

  1. Injury pattern analysis of helicopter wire strike accidents (-Gz load).

    PubMed

    Farr, W D; Ruehle, C J; Posey, D M; Wagner, G N

    1985-12-01

    Injury patterns in rotary wing aircraft wire strike accidents were reviewed to determine mechanisms of injury. U.S. Army Safety Center data showed that between 1 January 1974 and 31 August 1981 there were 167 wire strikes involving Army helicopters, which resulted in 60 injuries and 34 fatalities at a cost of $12,809,100. Updated data on all military rotary wing aircraft accidents investigated between 1978 and 1982 were screened by the Division of Aerospace Pathology to determine the mechanisms of injury to flight deck personnel. From 13 December 1978 to 23 June 1982, three types of rotary wing aircraft were in eight fatal accidents. These mishaps accounted for 28 casualties: 14 fatalities and 14 injuries. Aviators comprised 64.4% of the fatalities. Injury pattern analysis showed 100% had major head and neck injuries, with 66% having basilar skull fractures. Two-thirds had associated mandibular fractures or evidence of impact forces transmitted through the mandible to the skull. The same number had wedge-shaped chin lacerations from impact with the cyclic control stick. We postulate transmission of lethal impact forces primarily in the +Gz direction through the mandible to the skull. This suggests either improper use and/or failure of the seat and restraint systems. PMID:4084179

  2. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from both an economic and an environmental/safety standpoint. The latter is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns center on a design's behavior during the worst-case accident scenario, the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine, in particular the acute whole-body early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations, including the mobilization of activation products, is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products that is mobilized and thus becomes the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is a better design from an accident safety standpoint than the PCA-Li tokamak. It is also found that doses derived from temperature-dependent mobilization data differ from those predicted using set mobilization categories such as those that involve Piet fractions, which demonstrates the need for more experimental data on fusion materials. Possible future improvements and modifications to BESAFE II are discussed in Chapter 6, for example, adding environmental indices such as a waste disposal index. The biggest improvement to BESAFE II would be an enlarged database of activation product mobilization covering a larger spectrum of fusion reactor materials. The ultimate goal is for BESAFE II to become part of a systems design program that would include economic factors and allow both safety and the cost of electricity to influence design.
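
    The thermal-response step that drives the mobilization calculation can be illustrated with a deliberately crude bound: decay-heat input into an uncooled, adiabatic region. This is a generic scoping sketch with assumed power, mass, and heat-capacity values; BESAFE II's actual region-by-region LOCA model is far more detailed.

        def decay_power(p0_w: float, t_s: float) -> float:
            """Rough decay heat from a Way-Wigner-type scaling,
            P(t) ~ 0.066 * P0 * t**-0.2 (t in seconds after shutdown).
            A scoping correlation; design codes use tabulated decay curves."""
            return 0.066 * p0_w * max(t_s, 1.0) ** -0.2

        def adiabatic_heatup(p0_w: float, mass_kg: float, cp_j_kg_k: float,
                             t_end_s: float, t0_k: float = 600.0,
                             dt: float = 10.0) -> float:
            """Temperature of an uncooled region after t_end_s, ignoring all
            heat losses -- an upper bound on the LOCA thermal response."""
            temp, t = t0_k, 0.0
            while t < t_end_s:
                temp += decay_power(p0_w, t) * dt / (mass_kg * cp_j_kg_k)
                t += dt
            return temp

        # Hypothetical 3 GW(th) device, 5e5 kg of structure at 500 J/(kg K).
        print(f"T after 1 h: {adiabatic_heatup(3.0e9, 5.0e5, 500.0, 3600.0):.0f} K")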

  3. Fault seal analysis: Methodology and case studies

    SciTech Connect

    Badley, M.E.; Freeman, B.; Needham, D.T.

    1996-12-31

    Fault seal can arise from reservoir/non-reservoir juxtaposition or by development of fault rock of high entry-pressure. The methodology for evaluating these possibilities uses detailed seismic mapping and well analysis. A "first-order" seal analysis involves identifying reservoir juxtaposition areas over the fault surface, using the mapped horizons and a refined reservoir stratigraphy defined by isochores at the fault surface. The "second-order" phase of the analysis assesses whether the sand-sand contacts are likely to support a pressure difference. We define two lithology-dependent attributes, "Gouge Ratio" and "Smear Factor". Gouge Ratio is an estimate of the proportion of fine-grained material entrained into the fault gouge from the wall rocks. Smear Factor methods estimate the profile thickness of a ductile shale drawn along the fault zone during faulting. Both of these parameters vary over the fault surface, implying that faults cannot simply be designated "sealing" or "non-sealing". An important step in using these parameters is to calibrate them in areas where across-fault pressure differences are explicitly known from wells on both sides of a fault. Our calibration for a number of datasets shows remarkably consistent results despite their diverse settings (e.g. Brent Province, Niger Delta, Columbus Basin). For example, a Shale Gouge Ratio of c. 20% (volume of shale in the slipped interval) is a typical threshold between minimal across-fault pressure difference and significant seal.
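
    The abstract's own gloss of Shale Gouge Ratio (the volume of shale in the slipped interval) translates into a few lines of code. The stratigraphy below is invented for illustration, and partial beds at the edge of the slipped interval are ignored for brevity; the 20% threshold echoes the calibration quoted above.

        import numpy as np

        def shale_gouge_ratio(vshale: np.ndarray, dz_m: np.ndarray, throw_m: float) -> float:
            """Shale Gouge Ratio at a point on a fault:
            SGR = sum(Vsh_i * dz_i) / throw * 100%,
            summed over beds that have slipped past the point. Beds only
            partially inside the slipped interval are ignored here."""
            slipped = dz_m.cumsum() <= throw_m
            return float((vshale[slipped] * dz_m[slipped]).sum() / throw_m * 100.0)

        # Hypothetical alternating sand/shale section, 50 m of throw.
        vsh = np.array([0.10, 0.80, 0.20, 0.90, 0.15])
        dz = np.array([10.0, 15.0, 10.0, 10.0, 5.0])
        sgr = shale_gouge_ratio(vsh, dz, throw_m=50.0)
        print(f"SGR = {sgr:.0f}% -> {'significant seal' if sgr > 20 else 'minimal seal'}")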

  5. Traffic accident analysis using GIS: a case study of Kyrenia City

    NASA Astrophysics Data System (ADS)

    Kara, Can; Akçit, Nuhcan

    2015-06-01

    Traffic accidents are a major cause of death in urban environments, so analyzing the locations of traffic accidents and their causes is crucial. To this end, accident patterns and hotspot distributions are analyzed using geographic information technology. Locations of the traffic accidents in the years 2011, 2012, and 2013 are combined to generate the kernel density map of Kyrenia City. This analysis aims to find high-density intersections and segments within the city. Additionally, the spatial autocorrelation methods Local Moran's I and Getis-Ord Gi are employed. The results are discussed in detail for further analysis. Finally, required changes at numerous intersections are suggested to decrease the potential risks of high-density accident locations.
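
    Global Moran's I, one of the autocorrelation statistics named here, can be computed from first principles. The sketch below builds it directly from a toy adjacency matrix rather than assuming any particular GIS library; the segment counts are hypothetical.

        import numpy as np

        def morans_i(y: np.ndarray, w: np.ndarray) -> float:
            """Global Moran's I: I = (n / S0) * (z' W z) / (z' z),
            with z the mean-centred values and W row-standardized."""
            z = y - y.mean()
            w = w / w.sum(axis=1, keepdims=True)  # row-standardize
            return len(y) / w.sum() * (z @ w @ z) / (z @ z)

        # Toy network: 4 road segments; neighbours share an endpoint.
        counts = np.array([12.0, 10.0, 2.0, 1.0])
        adjacency = np.array([[0, 1, 0, 0],
                              [1, 0, 1, 0],
                              [0, 1, 0, 1],
                              [0, 0, 1, 0]], dtype=float)
        print(f"Moran's I = {morans_i(counts, adjacency):.3f}")  # > 0: clustering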

  6. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  7. Extension of ship accident analysis to multiple-package shipments

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.

    1997-11-01

    Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of 10s or 100s of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings, and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.
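
    One simple way to frame "what fraction of many packagings fails" is a binomial model, sketched below. This is an illustration only: it assumes independent, identical failure probabilities, whereas the paper's method ties the failure probability to the collision energy remaining after the hold is compacted.

        from math import comb

        def prob_at_least_k_failures(n: int, k: int, p: float) -> float:
            """P(at least k of n packagings fail), with each packaging an
            independent Bernoulli trial of crush-failure probability p."""
            return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

        # Hypothetical hold of 100 packagings with a 5% failure probability
        # at a given collision severity: chance that 10% or more fail.
        print(f"{prob_at_least_k_failures(100, 10, 0.05):.3f}")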

  8. A Review of Citation Analysis Methodologies for Collection Management

    ERIC Educational Resources Information Center

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  10. Offsite radiological consequence analysis for the bounding aircraft crash accident

    SciTech Connect

    OBERG, B.D.

    2003-03-22

    The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses", Appendix A, Evaluation Guideline of 25 rem. The potential of aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, "Accident Analysis for Aircraft Crash into Hazardous Facilities". The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, "Assessment of Aircraft Crash Frequency for the Hanford Site 200 Area Tank Farms": (1) the total aircraft crash frequency is "extremely unlikely"; (2) the general aviation crash frequency is "extremely unlikely"; (3) the helicopter crash frequency is "beyond extremely unlikely"; and (4) for the Hanford Site 200 Areas, the crash frequency of other aircraft types, commercial or military, into each aboveground facility or any underground facility is "beyond extremely unlikely". As the potential of aircraft crash into the 200 Area tank farms is more frequent than "beyond extremely unlikely", consequence analysis of the aircraft crash is required.
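
    DOE-STD-3014 screens facilities with a four-factor frequency formula of the general form F = N x P x f(x,y) x A. The sketch below encodes that structure; every numerical input is hypothetical, not a value from RPP-11736.

        def crash_frequency(n_ops: float, p_crash: float, f_xy_per_km2: float,
                            eff_area_km2: float) -> float:
            """Aircraft impact frequency (per year) in the spirit of the
            DOE-STD-3014 four-factor formula, F = N * P * f(x, y) * A:
            n_ops:        flight operations per year near the site
            p_crash:      crash probability per operation
            f_xy_per_km2: crash-location probability per km^2 at the facility
            eff_area_km2: effective (fly-in plus skid) facility area, km^2"""
            return n_ops * p_crash * f_xy_per_km2 * eff_area_km2

        f = crash_frequency(n_ops=5.0e4, p_crash=4.0e-6,
                            f_xy_per_km2=1.0e-2, eff_area_km2=0.02)
        print(f"F = {f:.1e} per year")  # screened against a frequency guideline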

  11. Cross-database analysis to identify relationships between aircraft accidents and incidents

    NASA Astrophysics Data System (ADS)

    Nazeri, Zohreh

    Air transportation systems are designed to ensure that aircraft accidents are rare events. To minimize these accidents, factors causing or contributing to accidents must be understood and prevented. Despite many efforts by the aviation safety community to reduce the accidents, accident rates have been stable for decades. One explanation could be that direct and obvious causes that previously occurred with a relatively high frequency have already been addressed over the past few decades. What remains is a much more difficult challenge---identifying less obvious causes, combinations of factors that, when occurring together, may potentially lead to an accident. Contributions of this research to the aviation safety community are two-fold: (1) The analyses conducted in this research, identified significant accident factors. Detection and prevention of these factors, could prevent potential future accidents. The holistic study made it possible to compare the factors in a common framework. The identified factors were compared and ranked in terms of their likelihood of being involved in accidents. Corrective actions by the FAA Aviation Safety Oversight (ASO), air carrier safety offices, and the aviation safety community in general, could target the high-ranked factors first. The aviation safety community can also use the identified factors as a benchmark to measure and compare safety levels in different periods and/or at different regions. (2) The methodology established in this study, can be used by researchers in future studies. By applying this methodology to the safety data, areas prone to future accidents can be detected and addressed. Air carriers can apply this methodology to analyze their proprietary data and find detailed safety factors specific to their operations. The Factor Support Ratio metric introduced in this research, can be used to measure and compare different safety factors. (Abstract shortened by UMI.)
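
    The abstract names a Factor Support Ratio metric without defining it. The sketch below implements one plausible reading, namely how much more often a factor appears among accident records than among incident records; both the definition and the records are hypothetical illustrations.

        def factor_support_ratio(factor: str, accidents: list, incidents: list) -> float:
            """Hypothetical reading of the Factor Support Ratio: the factor's
            relative frequency in accident records divided by its relative
            frequency in incident records. (The dissertation's exact
            definition is not given in the abstract.)"""
            acc = sum(factor in r for r in accidents) / len(accidents)
            inc = sum(factor in r for r in incidents) / len(incidents)
            return acc / inc if inc else float("inf")

        accidents = [{"fatigue", "icing"}, {"fatigue"}, {"cargo_shift"}]
        incidents = [{"fatigue"}, {"icing"}, {"icing"}, {"paperwork"}]
        print(f"FSR(fatigue) = {factor_support_ratio('fatigue', accidents, incidents):.2f}")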

  12. Analysis of surface powered haulage accidents, January 1990--July 1996

    SciTech Connect

    Fesak, G.M.; Breland, R.M.; Spadaro, J.

    1996-12-31

    This report addresses surface haulage accidents that occurred between January 1990 and July 1996 involving haulage trucks (including over-the-road trucks), front-end-loaders, scrapers, utility trucks, water trucks, and other mobile haulage equipment. The study includes quarries, open pits and surface coal mines utilizing self-propelled mobile equipment to transport personnel, supplies, rock, overburden material, ore, mine waste, or coal for processing. A total of 4,397 accidents were considered. This report summarizes the major factors that led to the accidents and recommends accident prevention methods to reduce the frequency of these accidents.

  13. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  14. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study examined only a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  15. Integral Test and Engineering Analysis of Coolant Depletion During a Large-Break Loss-of-Coolant Accident

    SciTech Connect

    Kim, Yong Soo; Park, Chang Hwan; Bae, Byoung Uhn; Park, Goon Cherl; Suh, Kune Yull; Lee, Un Chul

    2005-02-15

    This study concerns the development of an integrated calculation methodology with which to continually and consistently analyze the progression of an accident from the design-basis accident phase via core uncovery to the severe accident phase. The depletion rate of reactor coolant inventory was experimentally investigated after the safety injection failure during a large-break loss-of-coolant accident utilizing the Seoul National University Integral Test Facility (SNUF), which is scaled down to 1/6.4 in length and 1/178 in area from the APR1400 [Advanced Power Reactor 1400 MW(electric)]. The experimental results showed that the core coolant inventory decreased five times faster before than after the extinction of sweepout in the reactor downcomer, which is induced by the incoming steam from the intact cold legs. The sweepout occurred on top of the spillover from the downcomer region and expedited depletion of the core coolant inventory. The test result was simulated with the MAAP4 severe accident analysis code. The calculation results of the original MAAP4 deviated from the test data in terms of coolant inventory distribution in the test vessel. After the calculation algorithm of coolant level distribution was improved by including the subroutine of pseudo pressure buildup, which accounts for the differential pressure between the core and downcomer in MAAP4, the core melt progression was delayed by hundreds of seconds, and the code prediction was in reasonable agreement with the overall behavior of the SNUF experiment.

  16. Geographical information systems aided traffic accident analysis system case study: city of Afyonkarahisar.

    PubMed

    Erdogan, Saffet; Yilmaz, Ibrahim; Baybura, Tamer; Gullu, Mevlut

    2008-01-01

    Geographical Information System (GIS) technology has been a popular tool for visualization of accident data and analysis of hot spots on highways, and many traffic agencies have been using GIS for accident analysis. Accident analysis studies aim at the identification of high-rate accident locations and safety-deficient areas on the highways, so that traffic officials can implement precautionary measures and provisions for traffic safety. Since accident reports in Turkey are prepared in textual format, analyzing accident results is difficult. In our study, we developed a system transforming these textual data into tabular form; the tabular data were then georeferenced onto the highways. The hot spots on the highways within the Afyonkarahisar administrative border were then explored and determined with two different methods, kernel density analysis and repeatability analysis, and accident conditions at these hot spots were examined. The hot spots determined with the two methods reflect genuinely problematic places such as crossroads and junction points. Many previous studies introduced GIS only as a visualization tool for accident locations. The importance of this study was to use GIS as a management system for accident analysis and determination of hot spots in Turkey with statistical analysis methods. PMID:18215546

  17. Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident

    PubMed Central

    Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

    2012-01-01

    In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms. PMID:22496858
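
    Radiocesium activities measured on different dates are usually decay-corrected before comparison. Below is a minimal sketch using the standard half-lives of 134Cs (about 2.06 y) and 137Cs (about 30.2 y); the projection reuses the 79 kBq/kg figure from the abstract purely as an example.

        import math

        HALF_LIFE_Y = {"Cs-134": 2.065, "Cs-137": 30.17}  # years

        def decay_correct(a0_bq_kg: float, nuclide: str, years: float) -> float:
            """Activity after `years` of decay: A = A0 * exp(-ln2 * t / T_half)."""
            lam = math.log(2.0) / HALF_LIFE_Y[nuclide]
            return a0_bq_kg * math.exp(-lam * years)

        # The 79 kBq/kg DW Cs-137 reading, projected five years ahead.
        print(f"{decay_correct(79e3, 'Cs-137', 5.0):.2e} Bq/kg DW")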

  18. Accident sequence analysis for sites producing and storing explosives.

    PubMed

    Papazoglou, Ioannis A; Aneziris, Olga; Konstandinidou, Myrto; Giakoumatos, Ieronymos

    2009-11-01

    This paper presents a QRA-based approach for assessing and evaluating the safety of installations handling explosive substances. Comprehensive generic lists of immediate causes and initiating events of detonation and deflagration of explosive substances, as well as safety measures preventing these explosions, are developed. Initiating events and corresponding measures are grouped under the more general categories of explosion due to shock wave, mechanical energy, thermal energy, electrical energy, chemical energy, and electromagnetic radiation. Generic accident sequences are developed using event trees. This analysis is adapted to plant-specific conditions, and potential additional protective measures are rank-ordered in terms of the induced reduction in explosion frequency, with uncertainty also included. This approach has been applied to 14 plants in Greece with very satisfactory results. PMID:19819362
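
    Quantifying one branch of such an event tree reduces to multiplying the initiating-event frequency by the failure probabilities of the safety measures along that branch. The sketch below shows the arithmetic with hypothetical numbers and an independence assumption that a real analysis would have to justify.

        def sequence_frequency(init_freq_per_yr: float, barrier_fail_probs: list) -> float:
            """Frequency of one accident sequence: the initiating-event
            frequency times the failure probability of every safety
            measure in the branch (barriers assumed independent)."""
            f = init_freq_per_yr
            for p in barrier_fail_probs:
                f *= p
            return f

        # Hypothetical: mechanical-impact initiator at 0.1/yr, three barriers.
        print(f"explosion frequency ~ {sequence_frequency(0.1, [0.05, 0.1, 0.2]):.1e} per year")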

  19. Hypothetical accident conditions thermal analysis of the 5320 package

    SciTech Connect

    Hensel, S.J.; Gromada, R.J.

    1995-12-31

    An axisymmetric model of the 5320 package was created to perform hypothetical accident conditions (HAC) thermal calculations. The analyses assume the 5320 package contains 359 grams of plutonium-238 (203 Watts) in the form of an oxide powder at a minimum density of 2.4 g/cc or at a maximum density of 11.2 g/cc. The solution from a non-solar 100 F ambient steady-state analysis was used as the initial conditions for the fire transient. A 30 minute 1,475 F fire transient followed by cooling via natural convection and thermal radiation to a 100 F non-solar environment was analyzed to determine peak component temperatures and vessel pressures. The 5320 package was considered to be horizontally suspended within the fire during the entire transient.

  20. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

  1. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    SciTech Connect

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-10-15

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  2. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improvement in the training of new pilots and improving the safety awareness of private pilots.

  3. Major Accidents (Gray Swans) Likelihood Modeling Using Accident Precursors and Approximate Reasoning.

    PubMed

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-07-01

    Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. PMID:26032965

  4. Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System

    SciTech Connect

    WILLIAMS, J.C.

    2000-09-15

    Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

  5. Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.

    PubMed

    Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

    2007-10-01

    Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of the diving related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. We here present an analysis of more than 300 patients we have treated in the past four years. Between January 2002 and October 2005, 306 patients presented in our department with otorhinological disorders after diving, or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow up treatment, assessment of fitness to dive and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results whereas pathology of the tympanic membrane and ossicular chain often require surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving-related disorders is an important field for the treatment of divers, the main need of divers seems to be assessment and recovery of their fitness to dive. PMID:17639445

  6. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

    Su'ud, Zaki; Anshari, Rio

    2012-06-06

    The loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), with particular reference to the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completely performed, cooling, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes reactor fuel and other core temperatures to increase and can lead to reactor core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation was performed to calculate the pressure, water level, and temperature distributions in the reactors during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core covered for about 3.2 hours, and the core became fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the SRVs. The average coolant mass flow corresponding to this event was 20 kg/s, which could keep the core covered for about 73 hours, and the core became fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the High Pressure Coolant Injection (HPCI) system, and the SRVs. The average water mass flow corresponding to this event was 15 kg/s, which could keep the core covered for about 37 hours, and the core became fully uncovered 40 hours later.
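
    The headline numbers here (makeup flow versus hours to core uncovery) follow from a coolant mass balance. The sketch below shows the simplest version of that balance; the inventory, decay power, and latent-heat values are assumptions for illustration, not the paper's inputs.

        def hours_to_uncovery(water_mass_kg: float, decay_power_w: float,
                              makeup_kg_s: float, h_fg_j_kg: float = 1.5e6) -> float:
            """Hours until the core uncovers, from a simple boil-off balance:
            net loss rate = decay_power / h_fg - makeup flow. h_fg is the
            latent heat at the prevailing pressure (assumed value)."""
            net_loss = decay_power_w / h_fg_j_kg - makeup_kg_s
            if net_loss <= 0:
                return float("inf")  # makeup keeps pace; no uncovery
            return water_mass_kg / net_loss / 3600.0

        # Hypothetical unit-1-like numbers, not the paper's actual inputs.
        print(f"{hours_to_uncovery(2.0e5, 2.0e7, makeup_kg_s=2.0):.1f} h")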

  8. Methodological Aspects Regarding The Organizational Stress Analysis

    NASA Astrophysics Data System (ADS)

    Irimie, Sabina; Pricope (Muntean), Luminița Doina; Pricope, Sorin; Irimie, Sabin Ioan

    2015-07-01

    This work presents a methodological study of occupational stress analysis in the educational field, as part of a larger study. The objective is to establish whether significant relations exist between stressors and effects, that is, differences among the indicators of occupational stress for teaching staff in primary and gymnasium schools, taking note of each specific condition: the institution as an entity, the working community, the discipline taught, the geographic and administrative district (urban/rural), and the quantification of the stress level.

  9. MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents

    SciTech Connect

    Foppe, T.L.; Peterson, V.L.

    1993-10-01

    The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

  10. PWR integrated safety analysis methodology using multi-level coupling algorithm

    NASA Astrophysics Data System (ADS)

    Ziabletsev, Dmitri Nickolaevich

    Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for realistic modeling of the plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accident (REA) and the main steam line break (MSLB), which constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety in a particular transient and need to be evaluated as accurately as possible. Modern 3D neutronics/T-H coupled codes estimate the safety margins coarsely, on an assembly level, i.e., for an average fuel pin. More accurate prediction of the safety margins requires the evaluation of the transient fuel rod response involving locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example the highest power loaded fuel assembly. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two potential challenges are faced in the development of a refined methodology for evaluation of local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with consistent cross-section modeling. The second is the development of a multi-level coupling algorithm for the T-H boundary and feedback data exchange between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling between 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved introduction of an improved method capable of efficiently restoring the fine pin-power distribution with a high degree of accuracy. In order to apply the methodology to evaluate the safety margins on a pin level, a refined on-line hot channel model was developed accounting for cross-flow effects. Finally, this methodology was applied to best-estimate safety analysis to more accurately calculate the thermal safety margins occurring during a design basis accident in a PWR.

  11. An analysis of three weather-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Fujita, T. T.; Caracena, F.

    1977-01-01

    Two aircraft accidents in 1975, one at John F. Kennedy International Airport in New York City on 24 June and the other at Stapleton International Airport in Denver on 7 August, were examined in detail. A third accident on 23 June 1976 at Philadelphia International Airport is being investigated. Amazingly, there was a spearhead echo just to the north of each accident site. The echoes formed from 5 to 50 min in advance of the accident and moved faster than other echoes in the vicinity. These echoes were photographed by National Weather Service radars, 130-205 km away. At closer ranges, however, one or more circular echoes were depicted by airborne and ground radars. These cells were only 3-5 km in diameter, but they were accompanied by downdrafts of extreme intensity, called downbursts. All accidents occurred as aircraft, either descending or climbing, lost altitude while experiencing strong wind shear inside downburst cells.

  12. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T. )

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

  13. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    SciTech Connect

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  14. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010, at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBAs), and beyond design basis accidents (BDBAs). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user base and the experimental validation base were decaying away quickly.

  15. Probabilistic analysis of accident precursors in the nuclear industry.

    PubMed

    Hulsmans, M; De Gelder, P

    2004-07-26

    Feedback of operating experience has always been an important issue in the nuclear industry. A probabilistic safety analysis (PSA) can be used as a tool to analyse how an operational event might have developed adversely in order to obtain a quantitative assessment of the safety significance of the event. This process is called PSA-based event analysis (PSAEA). A comprehensive set of PSAEA guidelines was developed by an international project. The main characteristics of this methodology are summarised. This approach to analyse incidents can be used to meet different objectives of utilities or nuclear regulators. The paper describes the main objectives and the experiences of the Belgian nuclear regulatory organisation AVN with the application of PSA-based event analysis. Some interesting aspects of the process of PSAEA are further developed and underlined. Several case studies are discussed and an overview of the obtained results is given. Finally, the interest of a broad and interactive forum on PSAEA is highlighted. PMID:15231351
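
    The quantitative core of PSAEA is the comparison of the nominal accident probability with its conditional value given the observed event. A minimal toy calculation in Python (all probabilities invented for illustration; real analyses re-quantify full PSA models rather than a two-train formula):

      # Toy PSA-based event analysis: nominal vs. conditional core damage probability
      p_initiator = 1.0e-2                      # per demand, assumed
      p_train = 1.0e-3                          # failure probability of each redundant train, assumed
      p_cd_nominal = p_initiator * p_train**2   # both trains must fail
      p_cd_event = p_initiator * 1.0 * p_train  # event: train A observed unavailable
      importance = p_cd_event / p_cd_nominal
      print(f"nominal {p_cd_nominal:.2e}, conditional {p_cd_event:.2e}, factor {importance:.0f}")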

  16. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.

  17. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972-2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), struck by thrown/falling objects (12.9%), structural collapses (9.9%) and electrocutions (7.5%) rank in the first four places. The accidents were most likely to occur between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers, while employees were responsible for almost one third of all cases. PMID:24077446

  18. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  19. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES

    SciTech Connect

    J. R. KAMM; ET AL

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. (13), which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.
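
    Two of the four methods are easy to sketch on a one-dimensional signal: the Fourier power spectrum and the second-order structure function. The Python sketch below uses a random-walk surrogate in place of the experimental intensity fields:

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.cumsum(rng.standard_normal(1024))    # surrogate disordered signal

      # Fourier power spectrum of the mean-removed signal
      power = np.abs(np.fft.rfft(x - x.mean()))**2
      freq = np.fft.rfftfreq(x.size)

      # Second-order structure function S2(r) = <|x(t+r) - x(t)|^2>
      def structure_function(sig, max_lag=100, order=2):
          lags = np.arange(1, max_lag)
          return lags, np.array([np.mean(np.abs(sig[r:] - sig[:-r])**order) for r in lags])

      lags, s2 = structure_function(x)
      print(freq[np.argmax(power[1:]) + 1], s2[:5])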

  20. [Free will and neurobiology: a methodological analysis].

    PubMed

    Brücher, K; Gonther, U

    2006-04-01

    Whether or not the neurobiological basis of mental processes is compatible with the philosophical postulate of free will is a matter of intense debate today. What is the meaning of those frequently quoted experiments concerning voluntary action? Both convictions, being autonomous subjects and exercising a strong influence on the world by applying the sciences, have become central to modern human self-conception. Now these two views are growing apart and appear contradictory, because neurobiology tries to reveal the illusory character of free will. In order to cope with this ostensible dichotomy, it is advisable to return to the core of scientific thinking, i.e., to reflection about truth and methods. The neurobiological standpoint referring to Libet, as well as the philosophical approaches to free will, must be analysed with their preconceptions and context conditions in mind. Hence Libet's experiments can be criticised on different levels: methods, methodology and epistemology. Free will is a highly complex system, not a simple fact. Taking these complicated details into account, it is possible to define conditions of compatibility and to use the term free will in a meaningful way, negotiating the obstacles of pure chance and determinism. PMID:16671159

  1. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NRS? To answer this question we applied a number of different tools: (i) Trajectory Modelling, to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling, for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis, to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis, to construct annual, monthly, and seasonal NRS impact indicators identifying the most impacted geographical regions; (v) Specific Case Studies, to estimate consequences for the environment and the population after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition, to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; and (vii) Risk Evaluation and Mapping, to analyse socio-economic consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
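
    As an illustration of step (iv), a probability field can be built by binning trajectory (or plume) positions on a geographical grid. The Python sketch below does this with synthetic trajectory endpoints; the release coordinates, spreads, and grid are invented for the example, not taken from the study:

      import numpy as np

      rng = np.random.default_rng(1)
      # Surrogate endpoints of multiyear forward trajectories from one risk site
      lon = 10.0 + rng.normal(8.0, 6.0, 5000)    # eastward drift assumed (westerlies)
      lat = 60.0 + rng.normal(0.0, 4.0, 5000)

      # Probability field: fraction of trajectories ending in each grid cell
      H, lon_edges, lat_edges = np.histogram2d(lon, lat, bins=(36, 18),
                                               range=[[-10, 50], [45, 75]])
      prob = H / H.sum()
      i, j = np.unravel_index(prob.argmax(), prob.shape)
      print(f"most impacted cell ~({lon_edges[i]:.0f}E, {lat_edges[j]:.0f}N), "
            f"p={prob[i, j]:.3f}")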

  2. RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

  3. Radiochemical Analysis Methodology for uranium Depletion Measurements

    SciTech Connect

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  4. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

  5. Nonresponse analysis and adjustment in a mail survey on car accidents.

    PubMed

    Tivesten, Emma; Jonsson, Sofia; Jakobsson, Lotta; Norin, Hans

    2012-09-01

    Statistical accident data plays an important role in traffic safety development involving the road system, vehicle design, and driver education. Vehicle manufacturers use data from accident mail surveys as an integral part of the product development process. Low response rates have, however, led to concerns about whether estimates from a mail survey can be trusted as a source for making strategic decisions. The main objective of this paper was to investigate nonresponse bias in a mail survey addressing driver behaviour in accident situations. Insurance data, available for both respondents and nonrespondents, were used to analyze, as well as adjust for, nonresponse. Response propensity was investigated using descriptive statistics and logistic regression analyses. The survey data were then weighted using inverse propensity weights. Two specific examples of survey estimates are addressed, namely driver vigilance and driver distraction just before the accident. The results reveal that driver age and accident type were the most influential variables for nonresponse weighting. Driver gender and the size of the town where the driver resides also had some influence, but not for all survey variables investigated. The main conclusion of this paper is that nonresponse weighting can increase confidence in accident data collected by a mail survey, especially when response rates are low. Weighting has a moderate influence on this survey, but a larger influence may be expected if applied to a more diverse driver population. The development of auxiliary data collection can further improve accident mail survey methodology in the future. PMID:22664706
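
    A minimal sketch of the adjustment described, using synthetic data and scikit-learn (the variable names and response mechanism are invented for illustration): the response propensity is estimated by logistic regression on auxiliary variables available for everyone, and respondents are then weighted by the inverse of their estimated propensity.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 2000
      age = rng.integers(18, 85, n)                   # auxiliary variable (insurance data)
      severe = rng.integers(0, 2, n)                  # accident type indicator
      # Synthetic mechanism: response propensity rises with age and severity
      p_resp = 1 / (1 + np.exp(-(-1.5 + 0.03 * age + 0.5 * severe)))
      responded = rng.random(n) < p_resp

      X = np.column_stack([age, severe])
      model = LogisticRegression().fit(X, responded)
      phat = model.predict_proba(X)[:, 1]             # estimated response propensity

      # Inverse propensity weights for respondents only
      w = 1.0 / phat[responded]
      distracted = rng.random(responded.sum()) < 0.2  # synthetic survey variable
      print(f"naive {distracted.mean():.3f}, "
            f"propensity-weighted {np.average(distracted, weights=w):.3f}")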

  6. Accident investigation: Analysis of aircraft motions from ATC radar recordings

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1976-01-01

    A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.
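
    The first step of such an analysis, recovering a ground track and groundspeed from range, azimuth, and Mode-C altitude, can be sketched as follows. The radar returns are synthetic, a flat-earth projection is assumed, and simple finite differencing is used; the actual technique additionally smooths the data and combines it with winds-aloft information to derive airspeed and attitude angles:

      import numpy as np

      # Synthetic ATC radar returns: slant range (m), azimuth (rad), Mode-C altitude (m)
      t = np.arange(0.0, 60.0, 4.0)                  # radar sweep interval ~4 s
      slant = 20000.0 + 80.0 * t
      az = 0.5 + 0.002 * t
      alt = 1500.0 + 5.0 * t

      # Project to local Cartesian coordinates (radar at origin)
      ground_range = np.sqrt(np.maximum(slant**2 - alt**2, 0.0))
      x = ground_range * np.sin(az)
      y = ground_range * np.cos(az)

      # Finite-difference groundspeed and track heading
      vx, vy = np.gradient(x, t), np.gradient(y, t)
      groundspeed = np.hypot(vx, vy)
      heading = np.degrees(np.arctan2(vx, vy)) % 360.0
      print(f"final groundspeed {groundspeed[-1]:.0f} m/s, track {heading[-1]:.0f} deg")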

  7. BNL severe accident sequence experiments and analysis program

    SciTech Connect

    Greene, G.A.; Ginsberg, T.; Tutu, N.K.

    1985-01-01

    A major source of containment pressurization during severe accidents is the transfer of stored energy from the hot core material to available cooling water. One mode of thermal interaction involves the quench of superheated beds of debris which could be present in the reactor cavity following melt-through or failure of the reactor vessel. This work supports development of models of superheated bed quench phenomena which are to be incorporated into containment analysis computer codes such as MARCH, CONTAIN, and MEDICI. A program directed towards characterization of the behavior of superheated debris beds has been completed. This work addressed the quench of superheated debris which is postulated to exist in the reactor cavity of a PWR following melt ejection from the primary system. The debris is assumed to be cooled by a pool of water overlying the bed of hot debris. This work has led to the development of models to predict rate of steam generation during the quench process and, in addition, the ability to assess the coolability of the debris during the transient quench process. A final report on this work has been completed. This report presents a brief description of some relevant results and conclusions. 15 refs.

  8. Offsite Radiological Consequence Analysis for the Bounding Flammable Gas Accident

    SciTech Connect

    CARRO, C.A.

    2003-07-30

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank. The calculation applies reasonably conservative input parameters in accordance with DOE-STD-3009, Appendix A, guidance. Revision 1 incorporates comments received from the Office of River Protection.

  9. A method for modeling and analysis of directed weighted accident causation network (DWACN)

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ding, Jing

    2015-11-01

    Using complex network theory to analyze accidents is an effective way to understand the causes of accidents in complex systems. In this paper, a novel method is proposed to establish a directed weighted accident causation network (DWACN) for the Rail Accident Investigation Branch (RAIB) in the UK, based on complex network theory and the event chains of accidents. The DWACN is composed of 109 nodes, which denote causal factors, and 260 directed weighted edges, which represent complex interrelationships among factors. The statistical properties of directed weighted complex networks are applied to reveal the critical factors, the key event chains and the important classes in the DWACN. Analysis results demonstrate that the DWACN has the characteristics of small-world networks, with short average path length and high weighted clustering coefficient, and displays the properties of scale-free networks, in that the cumulative degree distribution follows an exponential function. This modeling and analysis method can assist in discovering the latent rules of accidents and the features of fault propagation, so as to reduce accidents. This paper is a further development of research on accident analysis methods using complex networks.
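
    The kind of statistics the paper reports can be reproduced on a toy network with networkx; the edges below are invented stand-ins for the RAIB causal factors, not the actual 109-node DWACN:

      import networkx as nx

      # Toy directed weighted accident causation network (illustrative edges only)
      edges = [("fatigue", "signal passed at danger", 3),
               ("poor visibility", "signal passed at danger", 2),
               ("signal passed at danger", "collision", 4),
               ("track defect", "derailment", 5),
               ("speeding", "derailment", 2),
               ("derailment", "casualties", 3),
               ("collision", "casualties", 4)]
      G = nx.DiGraph()
      G.add_weighted_edges_from(edges)

      # Critical factors ranked by weighted out-degree (influence on downstream events)
      out_strength = dict(G.out_degree(weight="weight"))
      print(sorted(out_strength.items(), key=lambda kv: -kv[1])[:3])

      # Small-world style statistics on the undirected projection
      U = G.to_undirected()
      print(nx.average_shortest_path_length(U), nx.average_clustering(U, weight="weight"))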

  10. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  11. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    SciTech Connect

    Kohout, E.F.; Folga, S.; Mueller, C.; Nabelssi, B.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft Windows. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  12. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South-to-North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level, and the factors to which pollution accidents are most sensitive have been deduced. The case in which these sensitive factors take the states most likely to lead to accidents has also been simulated. PMID:26433361
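
    The inference involved is ordinary Bayesian-network enumeration. A hand-rolled three-node toy in Python (accident leading to leakage leading to pollution, with invented probabilities rather than the paper's survey-based tables) looks like this:

      # Toy Bayesian Network: traffic accident (A) -> pollutant leakage (L) -> pollution (P)
      p_A = 0.01                                  # P(traffic accident on bridge), assumed
      p_L_given_A = {True: 0.3, False: 0.001}     # leakage probability given accident state
      p_P_given_L = {True: 0.9, False: 0.0}       # pollution given leakage reaches the canal

      def p_pollution():
          # Marginalize over the unobserved accident and leakage states
          total = 0.0
          for a in (True, False):
              pa = p_A if a else 1 - p_A
              for l in (True, False):
                  pl = p_L_given_A[a] if l else 1 - p_L_given_A[a]
                  total += pa * pl * p_P_given_L[l]
          return total

      print(f"P(water pollution) = {p_pollution():.2e}")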

  13. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
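
    The rank-ordering step reduces to sorting concepts by their benefit-to-cost ratio and tracking the cumulative ratio in that order. A toy Python sketch with invented concept names and values:

      # Rank-order system concepts by benefit-to-cost ratio; the cumulative ratio
      # indicates the payoff of implementing the concepts in the preferred order.
      concepts = [("datalink clearances", 40.0, 10.0),
                  ("4D FMS guidance", 90.0, 45.0),
                  ("wake-vortex spacing", 25.0, 5.0)]   # (name, benefit, cost), assumed units

      ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
      cum_b = cum_c = 0.0
      for name, b, c in ranked:
          cum_b, cum_c = cum_b + b, cum_c + c
          print(f"{name:22s} B/C={b / c:4.1f}  cumulative B/C={cum_b / cum_c:4.2f}")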

  14. Pedestrian accident analysis with a silicone dummy block.

    PubMed

    Lee, Youngnae; Park, Sungji; Yoon, Seokhyun; Kong, Youngsu; Goh, Jae-Mo

    2012-07-10

    When a car is parked in an inclined plane in a parking lot, the car can roll down the slope and cause a pedestrian accident, even when the angle of inclination is small. A rolling car on a gentle slope seems to be easily halted by human power to prevent damage to the car or a possible accident. However, even if the car rolls down very slowly, it can cause severe injuries to a pedestrian, especially when the pedestrian cannot avoid the rolling car. In an accident case that happened in our province, a pedestrian was injured by a rolling car, which had been parked on a slope the night before. The accident occurred in the parking lot of an apartment complex. The parking lot seemed almost flat with the naked eye. We conducted a rolling test with the accident vehicle at the site. The car was made to roll down the slope by purely gravitational pull and was made to collide with the silicone block leaning against the retaining wall. Silicone has characteristics similar to those of a human body, especially with respect to stiffness. In the experiment, we measured the shock power quantitatively. The results showed that a rolling car could severely damage the chest of a pedestrian, even if it moved very slowly. PMID:22455985

  15. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the vertical tail plane (VTP) from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads at least 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

  16. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
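
    In the first approach, the hazard integral has the same form as in PSHA, with the ground-motion term replaced by a displacement attenuation function. In generic PSHA notation (a sketch of the standard form, not necessarily the paper's exact symbols), the annual rate at which displacement D at the site exceeds a value d is

      \nu(D > d) = \sum_i \lambda_i \iint P(D > d \mid m, r)\, f_i(m)\, f_i(r \mid m)\, dm\, dr

    where \lambda_i is the rate of earthquakes on source i, f_i(m) and f_i(r|m) are the magnitude and distance densities, and P(D > d | m, r) is the fault displacement attenuation function.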

  17. Statistical Methodology Content Analysis of Selected Educational Research Journals.

    ERIC Educational Resources Information Center

    Kennedy, Robert L.

    Sixty-seven educational research journals were investigated to determine the frequency of usage of inferential statistical techniques therein. The most frequently used statistical methodologies in the literature reviewed, which utilized inferential approaches, are the following: analysis of variance, correlation, t-test, multiple analysis of…

  18. NMR methodologies in the analysis of blueberries.

    PubMed

    Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

    2014-06-01

    An NMR analytical protocol based on complementary high- and low-field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberry aqueous and organic extracts, as well as targeted NMR analysis focused on anthocyanins and other phenols, are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared, showing a better recovery of the lipidic fraction with the microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-L-rhamnopyranosyl quercetin were identified in the solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field-cycling NMR. 1H depth profiles, T2 transverse relaxation times and dispersion profiles were found to be sensitive to withering. PMID:24668393

  19. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  20. Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor

    SciTech Connect

    Yulianti, Yanti; Su'ud, Zaki; Waris, Abdul; Khotimah, S. N.; Shafii, M. Ali

    2010-12-23

    Research on the fast-transient, spatially non-homogeneous accident analysis of a two-dimensional nuclear reactor has been carried out, with the aim of predicting reactor behavior during an accident. In the present study, the space-time diffusion equation is solved using direct methods that treat the spatial dependence in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved with the iterative ADI (Alternating Direction Implicit) method. The accident is initiated by a decrease in the macroscopic absorption cross-section, which produces a large external reactivity insertion. The reactor power reaches a peak value before the reactor settles into a new equilibrium. The change in reactor temperature produces a negative Doppler feedback reactivity, which reduces the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, so the reactor stays in a safe condition.
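
    For reference, a one-group form of the space-time diffusion equation with a single delayed-neutron precursor family (a generic textbook form assumed here; the paper may use a different group structure) is

      \frac{1}{v}\frac{\partial \phi}{\partial t} = \nabla \cdot D \nabla \phi + \left[(1-\beta)\nu\Sigma_f - \Sigma_a\right]\phi + \lambda C, \qquad \frac{\partial C}{\partial t} = \beta\nu\Sigma_f\phi - \lambda C

    The postulated accident enters as a reduction of \Sigma_a, and fully implicit finite differencing of these equations on the (r, z) mesh yields the linear systems that the ADI iteration solves.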

  1. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  2. Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

    2014-05-01

    Atmospheric dispersion models are used in response to accidental releases with two purposes: minimising the population exposure during the accident, and complementing field measurements for the assessment of short- and long-term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimates of the emitted quantities as a function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which IRSN's operational long-distance atmospheric dispersion model ldX derives. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: high-dimensional inputs; correlated inputs or inputs with complex structures; high-dimensional output; and a multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet a few variables, such as the horizontal diffusion coefficient or the cloud thickness, were found to have a weak influence on most outputs and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of model performance computed on a set of gamma dose rate observations. This original approach is of particular interest since observations could later be used to calibrate the probability distributions of the input variables. Indeed, only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on the time matching of emission peaks was elaborated to complement classical statistical scores, which were dominated by deposit gamma dose rates and almost insensitive to lower-atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may be sufficient to represent an essential part of the overall uncertainty.
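
    The Morris method itself is compact enough to sketch: random one-at-a-time trajectories yield elementary effects per input, whose mean absolute value (mu*) ranks influence and whose standard deviation flags nonlinearity or interactions. The surrogate model and bounds below are invented; the study applied the screening to the full dispersion model:

      import numpy as np

      def scale(u, bounds):
          lo, hi = np.asarray(bounds, dtype=float).T
          return lo + u * (hi - lo)

      def morris_screening(model, bounds, n_traj=20, levels=4, seed=0):
          """One-at-a-time Morris elementary effects (basic variant)."""
          rng = np.random.default_rng(seed)
          k = len(bounds)
          delta = levels / (2.0 * (levels - 1))
          effects = [[] for _ in range(k)]
          for _ in range(n_traj):
              x = rng.integers(0, levels // 2, k) / (levels - 1)   # base point in [0,1]^k
              y = model(scale(x, bounds))
              for i in rng.permutation(k):                          # perturb one input at a time
                  x_new = x.copy()
                  x_new[i] += delta
                  y_new = model(scale(x_new, bounds))
                  effects[i].append((y_new - y) / delta)
                  x, y = x_new, y_new
          ee = [np.asarray(e) for e in effects]
          mu_star = np.array([np.mean(np.abs(e)) for e in ee])      # overall influence
          sigma = np.array([np.std(e) for e in ee])                 # nonlinearity/interactions
          return mu_star, sigma

      # Toy surrogate: output strongly sensitive to x0, weakly to x2
      f = lambda x: x[0]**2 + 0.5 * x[1] + 0.01 * x[2]
      mu_star, sigma = morris_screening(f, [(0, 1)] * 3)
      print(mu_star.round(3), sigma.round(3))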

  3. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

    This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, "K Basins Safety Analysis Report", and Revision 4 of HNF-SD-SNF-TSR-001, "Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins". These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  4. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    SciTech Connect

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor-quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF, even with severe cladding breaches, for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U.S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific and cask-loading-specific conditions could be performed to demonstrate that the release is within the allowable leak rates of the cask.

  5. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  6. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    SciTech Connect

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be applied recursively to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
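
    For the quantitative step (3), a common choice is a variance-based first-order index, which can be estimated by the Monte Carlo pick-freeze scheme sketched below (the toy model is invented, and this is a generic estimator rather than necessarily the one implemented in PSUADE):

      import numpy as np

      def sobol_first_order(model, k, n=20000, seed=0):
          """Monte Carlo estimate of first-order Sobol indices (pick-freeze scheme)."""
          rng = np.random.default_rng(seed)
          A = rng.random((n, k))
          B = rng.random((n, k))
          yA = model(A)
          var = yA.var()
          s1 = np.empty(k)
          for i in range(k):
              Ci = B.copy()
              Ci[:, i] = A[:, i]        # x_i "frozen"; all other inputs resampled
              yC = model(Ci)
              s1[i] = (np.mean(yA * yC) - yA.mean() * yC.mean()) / var
          return s1

      # Toy model: y = x0 + 0.5*x1 + 0.1*x2 with independent uniform inputs
      g = lambda X: X[:, 0] + 0.5 * X[:, 1] + 0.1 * X[:, 2]
      print(sobol_first_order(g, 3).round(2))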

  7. Realistic Small- and Intermediate-Break Loss-of-Coolant Accident Analysis Using WCOBRA/TRAC

    SciTech Connect

    Bajorek, Stephen M.; Petkov, Nikolay; Ohkawa, Katsuhiro; Kemper, Robert M.; Ginsberg, Arthur P.

    2001-10-15

    Since the 1988 Appendix K Rulemaking change, there has been significant interest in the development of codes and methodologies for 'best-estimate' analysis of loss-of-coolant accidents (LOCAs). Most development has been directed toward large-break (LB) LOCAs (LBLOCAs), since for most pressurized water reactors (PWRs), the LBLOCA generates the limiting peak cladding temperature (PCT). As plants age, are uprated, and continue to seek improved operating efficiencies, the small break (SB) LOCA (SBLOCA) and intermediate-break (IB) LOCA (IBLOCA) can become a concern. Modifications have been made to the WCOBRA/TRAC-MOD7A code to enable it to make realistic calculations of SBLOCAs and IBLOCAs in a Westinghouse PWR. The MOD7A version has recently been approved for use as part of the Westinghouse best-estimate LOCA methodology for LBLOCAs. Thus, the modifications and improvements potentially allow LOCA calculations ranging from SBLOCAs to LBLOCAs using a single code version. The WCOBRA/TRAC-MOD7A, Rev. 4 SB02 version was used to calculate the transient response of a four-loop PWR for a range of break sizes located at the bottom of one of the cold legs. The break sizes ranged from a 0.051-m (2-in.) to a 0.406-m (16-in.) equivalent hole diameter. Each calculation was performed assuming American Nuclear Society (ANS) 1979 decay heat. The plant input assumed the loss of one train of safety injection as well as a power shape that was highly top skewed, which imposed some conservatism on the calculations but allowed a meaningful comparison to Appendix K-type analysis results. The realistic SBLOCA and IBLOCA results showed significantly reduced PCTs compared to those typically obtained from Appendix K LOCA calculations. The realistic results also can be categorized into four separate types of breaks, from a conventional slowly draining SBLOCA to an LBLOCA.

  8. 'Doing' health policy analysis: methodological and conceptual reflections and challenges.

    PubMed

    Walt, Gill; Shiffman, Jeremy; Schneider, Helen; Murray, Susan F; Brugha, Ruairi; Gilson, Lucy

    2008-09-01

    The case for undertaking policy analysis has been made by a number of scholars and practitioners. However, there has been much less attention given to how to do policy analysis, what research designs, theories or methods best inform policy analysis. This paper begins by looking at the health policy environment, and some of the challenges to researching this highly complex phenomenon. It focuses on research in middle and low income countries, drawing on some of the frameworks and theories, methodologies and designs that can be used in health policy analysis, giving examples from recent studies. The implications of case studies and of temporality in research design are explored. Attention is drawn to the roles of the policy researcher and the importance of reflexivity and researcher positionality in the research process. The final section explores ways of advancing the field of health policy analysis with recommendations on theory, methodology and researcher reflexivity. PMID:18701552

  9. The Data Analysis In Epidemiology: A Methodology To Reduce Complexity

    NASA Astrophysics Data System (ADS)

    Esposito, F.

    1982-11-01

    The paper presents an attempt to define and test a systematic methodology for studying medical phenomena in which large sets of variables of an essentially qualitative kind are involved. The methodology consists of the organized application of the various methods of data analysis and multivariate statistics to epidemiology, in order to analyze phenomena that are very complex in the number and variety of the variables involved, to evaluate environmental influence, and to understand the evolution of the phenomenon. The data analysis techniques play the role both of tools for synthesizing and validating indexes and of methods for formulating models of disease propagation by defining the prognostic function. The methodology is applied to define the epidemiological model of essential arterial hypertension.

  10. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  11. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    ERIC Educational Resources Information Center

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  12. Interpersonal Dynamics in a Simulated Prison: A Methodological Analysis

    ERIC Educational Resources Information Center

    Banuazizi, Ali; Movahedi, Siamak

    1975-01-01

    A critical overview is presented of the Stanford Prison Experiment, conducted by Zimbardo and his coinvestigators in which they attempted a structural analysis of the problems of imprisonment. Key assumptions are questioned, primarily on methodological grounds, which casts doubts on the plausibility of the experimenters' final causal inferences.…

  13. Social Network Analysis: A New Methodology for Counseling Research.

    ERIC Educational Resources Information Center

    Koehly, Laura M.; Shivy, Victoria A.

    1998-01-01

    Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…

  14. 3D analysis of the reactivity insertion accident in VVER-1000

    SciTech Connect

    Abdullayev, A. M.; Zhukov, A. I.; Slyeptsov, S. M.

    2012-07-01

    Fuel parameters such as peak enthalpy and temperature during a rod ejection accident are calculated. The calculations are performed with the 3D neutron kinetics code NESTLE and the 3D thermal-hydraulic code VIPRE-W. Both hot zero power and hot full power cases were studied for an equilibrium cycle with Westinghouse hex fuel in a VVER-1000. It is shown that the use of a 3D methodology can significantly increase safety margins with respect to current criteria and meet future criteria. (authors)

  15. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    USGS Publications Warehouse

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  16. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for long-term storage of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

  17. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    SciTech Connect

    Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

    2009-05-23

    Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about the methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  18. Two methodologies for optical analysis of contaminated engine lubricants

    NASA Astrophysics Data System (ADS)

    Aghayan, Hamid; Bordatchev, Evgueni; Yang, Jun

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect that arises when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where the a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray-scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity, and width non-uniformity coefficient, are newly proposed. Variations in the contaminant concentration and the use of different contaminants lead to changes in these parameters, which are measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectra, transfer function, coherence function, etc.) are used for the analysis of combined object-lubricant images. Both proposed methodologies utilize the comparison of measured parameters and calculated object shape-based and statistical characteristics for fresh and contaminated lubricants. The developed methodologies are verified experimentally, showing an ability to distinguish lubricants with 0%, 3%, 7% and 10% water and coolant contamination. This proves the potential applicability of the developed methodologies for on-line measurement, monitoring and control of the engine lubricant condition.
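
    A minimal sketch in Python of the statistical branch of such an analysis: compare the image of a known periodic object seen through fresh versus contaminated lubricant via a normalized cross-correlation peak. The signals are synthetic stand-ins; the additive-noise distortion model is an assumption, not the paper's imaging setup.

      import numpy as np

      x = np.linspace(0, 10 * np.pi, 2000)
      reference = np.sign(np.sin(x))                         # known periodic object (square grating)
      rng = np.random.default_rng(0)
      distorted = reference + 0.8 * rng.normal(size=x.size)  # assumed distortion by contaminants

      def norm_xcorr_peak(a, b):
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return np.correlate(a, b, mode="full").max() / a.size

      print("fresh vs fresh:       ", norm_xcorr_peak(reference, reference))   # ~1.0
      print("fresh vs contaminated:", norm_xcorr_peak(reference, distorted))   # drops as contamination grows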

  19. Intelligent signal analysis methodologies for nuclear detection, identification and attribution

    NASA Astrophysics Data System (ADS)

    Alamaniotis, Miltiadis

    Detection and identification of special nuclear materials can be fully performed with a radiation detector-spectrometer. Due to several physical and computational limitations, development of fast and accurate radioisotope identifier (RIID) algorithms is essential for automated radioactive source detection and characterization. The challenge is to identify individual isotope signatures embedded in an aggregation of spectral signatures. In addition, background and isotope spectra overlap, further complicating the signal analysis. These concerns are addressed, in this thesis, through a set of intelligent methodologies recognizing signature spectra and the background spectrum and, subsequently, identifying radionuclides. Initially, a method for detection and extraction of signature patterns is accomplished by means of fuzzy logic. The fuzzy logic methodology is applied to three types of radiation signal processing applications, where it exhibits high positive detection, a low false alarm rate and very short execution time, while outperforming the maximum likelihood fitting approach. In addition, an innovative Pareto optimal multiobjective fitting of gamma ray spectra using evolutionary computing is presented. The methodology exhibits perfect identification while performing better than single-objective fitting. Lastly, an innovative kernel-based machine learning methodology was developed for estimating the natural background spectrum in gamma ray spectra. The novelty of the methodology lies in the fact that it implements a data-based approach and does not require any explicit physics modeling. Results show that the kernel-based method adequately estimates the gamma background, but the algorithm's performance exhibits a strong dependence on the selected kernel.
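
    A minimal sketch in Python of the data-driven background idea described above: a Nadaraya-Watson (Gaussian-kernel) regression smooths a gamma-ray spectrum into a background estimate with no physics model. The spectrum is synthetic and the bandwidth is an assumption; the thesis's actual kernel machinery is not reproduced here.

      import numpy as np

      channels = np.arange(1024, dtype=float)
      continuum = 200.0 * np.exp(-channels / 400.0)                   # smooth background
      peak = 80.0 * np.exp(-0.5 * ((channels - 300.0) / 4.0) ** 2)    # isotope signature
      rng = np.random.default_rng(1)
      counts = rng.poisson(continuum + peak).astype(float)

      h = 40.0  # kernel bandwidth (channels), wide enough to smooth across narrow peaks
      w = np.exp(-0.5 * ((channels[:, None] - channels[None, :]) / h) ** 2)
      bg_estimate = (w @ counts) / w.sum(axis=1)                      # Nadaraya-Watson smoother

      residual = counts - bg_estimate                                 # peaks stand out above background
      print("residual at peak channel 300:", residual[300])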

  20. Approaches to accident analysis in recent US Department of Energy environmental impact statements

    SciTech Connect

    Mueller, C.; Folga, S.; Nabelssi, B.

    1996-12-31

    A review of accident analyses in recent US Department of Energy (DOE) Environmental Impact Statements (EISs) was conducted to evaluate the consistency among approaches and to compare these approaches with existing DOE guidance. The review considered several components of an accident analysis: the overall scope, which in turn should reflect the scope of the EIS; the spectrum of accidents considered; the methods and assumptions used to determine frequencies or frequency ranges for the accident sequences; and the assumptions and technical bases for developing radiological and chemical atmospheric source terms and for calculating the consequences of airborne releases. The review also considered the range of results generated with respect to impacts on various worker and general populations. In this paper, the findings of these reviews are presented and methods are recommended for improving consistency among EISs and bringing them more into line with existing DOE guidance.

  1. Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Birkeland, Paul W.

    2002-01-01

    Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently, commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It will likely cause any safety and reliability gains incorporated into reusable launch vehicle (RLV) design to be masked by the conservative assumptions made to "bound the problem." Consequently, for the launch industry to mature, a different approach to RLV risk analysis must be adopted. This paper presents such a methodology for Kistler's K-1 reusable launch vehicle, developing an approach to risk analysis that represents an amalgamation of the two approaches. This methodology provides flexibility to the launch industry that will enable the regulatory environment to more efficiently accommodate new technologies and approaches. The paper also presents a derivation of an appropriate assessment threshold that is the equivalent of the currently accepted 30-in-a-million casualty expectation.
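
    A hedged sketch in Python of a casualty-expectation (Ec) calculation of the kind used to license ELVs; the failure modes, probabilities, casualty areas, and population density are invented for illustration and are not K-1 data.

      failure_modes = [
          # (probability of failure mode per mission, casualty area in km^2) -- hypothetical
          (1.0e-3, 1.0e-4),   # e.g., early-ascent breakup
          (5.0e-4, 5.0e-4),   # e.g., stage-impact debris dispersion
      ]
      population_density = 50.0  # persons per km^2 under the flight corridor (assumed)

      Ec = sum(p * area * population_density for p, area in failure_modes)
      print(f"casualty expectation per mission: {Ec:.2e}")  # compared against the 30-in-a-million threshold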

  2. Methodological aspects of exhaled breath condensate collection and analysis.

    PubMed

    Rosias, Philippe

    2012-06-01

    The collection and analysis of exhaled breath condensate (EBC) may be useful for the management of patients with chronic respiratory disease at all ages. It is a promising technique due to its apparent simplicity and non-invasiveness. EBC does not disturb an ongoing respiratory inflammation. However, the methodology remains controversial, as it is not yet standardized. The current diversity of the methods used to collect and preserve EBC, the analytical pitfalls and the high degree of within-subject variability are the main issues that hamper further development into a clinically useful technique. In order to facilitate the process of standardization, a simplified schematic approach is proposed. An update of available data identified open issues on EBC methodology. These issues were then classified into three separate conditions related to their influence before, during or after the condensation process: (1) pre-condenser conditions related to subject and/or environment; (2) condenser conditions related to condenser equipment; and (3) post-condenser conditions related to preservation and/or analysis. This simplified methodological approach highlights the potential influence of the many techniques used before, during and after condensation of exhaled breath. It may also serve as a methodological checklist for a more systematic approach to EBC research and development. PMID:22522968

  3. Comparative analysis of EPA cost-benefit methodologies

    SciTech Connect

    Poch, L.; Gillette, J.; Veil, J.

    1998-05-01

    In recent years, reforming the regulatory process has received much attention from diverse groups such as environmentalists, the government, and industry. A cost-benefit analysis can be a useful way to organize and compare the favorable and unfavorable impacts a proposed action might have on society. Since 1981, two Executive Orders have required the U.S. Environmental Protection Agency (EPA) and other regulatory agencies to perform cost-benefit analyses in support of regulatory decision making. At the EPA, a cost-benefit analysis is published as a document called a regulatory impact analysis (RIA). This report reviews cost-benefit methodologies used by three EPA program offices: Office of Air and Radiation, Office of Solid Waste, and Office of Water. These offices were chosen because they promulgate regulations that affect the policies of this study's sponsor (U.S. Department of Energy, Office of Fossil Energy) and the technologies it uses. The study was conducted by reviewing 11 RIAs recently published by the three offices and by interviewing staff members in the offices. To draw conclusions about the EPA cost-benefit methodologies, their components were compared with those of a standard methodology (i.e., those that should be included in a comprehensive cost-benefit methodology). This study focused on the consistency of the approaches as well as their strengths and weaknesses, since differences in the cost-benefit methodologies themselves or in their application can cause confusion and preclude consistent comparison of regulations both within and among program offices.

  4. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, James Robert

    2002-05-01

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  5. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, J. R.

    2002-02-05

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  6. [Fatal skiing accidents: a forensic analysis taking the example of Salzburg].

    PubMed

    Kunz, Sebastian N; Keller, Thomas; Grove, Christina; Lochner, Stefanie; Monticelli, Fabio

    2015-01-01

    The rising popularity of Alpine skiing in recent years has led to an increase in skiing accidents, some with fatal outcomes. In this paper, all fatal skiing accidents from the autopsy material of the Institute of Forensic Medicine of the Paris Lodron University Salzburg were evaluated and compared with statistical data from the Alpine Police. From the winter season 2005/2006 through 2013/2014, the victims of 22 fatal skiing accidents were autopsied. The age of the male and female victims ranged between 12 and 71 years. The main cause of death was craniocerebral and chest trauma. A relevant blood alcohol concentration was detected in only one case. Together with trauma-biomechanical and technical experts, forensic medicine serves as a necessary clarification interface between the investigating authorities and the judiciary. Determining the cause and manner of death as well as reconstructing the accident is the main task of the forensic pathologist. The present study shows that in the county of Salzburg, only a small percentage of fatal skiing accidents are evaluated from a forensic and trauma-biomechanical point of view. Thus the possibilities of an interdisciplinary accident analysis are not always fully utilized. PMID:26419087

  7. Injury patterns of seniors in traffic accidents: A technical and medical analysis

    PubMed Central

    Brand, Stephan; Otte, Dietmar; Mueller, Christian Walter; Petri, Maximilian; Haas, Philipp; Stuebig, Timo; Krettek, Christian; Haasper, Carl

    2012-01-01

    AIM: To investigate the actual injury situation of seniors in traffic accidents and to evaluate the different injury patterns. METHODS: Injury data, environmental circumstances and crash circumstances of accidents were collected shortly after the accident event at the scene. With these data, a technical and medical analysis was performed, including Injury Severity Score, Abbreviated Injury Scale and Maximum Abbreviated Injury Scale. The method of data collection is named the German In-Depth Accident Study and can be seen as representative. RESULTS: A total of 4430 injured seniors in traffic accidents were evaluated. The incidence of sustaining severe injuries to the extremities, head and maxillofacial region was significantly higher in the group of elderly people compared to younger age groups (P < 0.05). The number of accident-related injuries was higher in the group of seniors compared to other groups. CONCLUSION: Seniors are more likely to be involved in traffic accidents and to sustain serious to severe injuries compared to other groups. PMID:23173111

  8. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

    The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study was conducted to understand factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. Results of this study indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14% of the factors, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions solely directed toward train drivers may not be adequate. A more comprehensive approach to minimizing the accidents should be adopted that addresses all four aspects of HFACS. PMID:22317372

  9. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Abstract Background The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods The Iranian Social Security Organization (ISSO) accident database containing 21,864 cases between the years 2007-2011 was applied in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to the risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night-shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion It is recommended that laborers, young workers, and weekend and night-shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662
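
    The abstract quotes TAR and TSI values without reproducing their formulas. A minimal sketch in Python under the assumption that TAR is accidents per exposed workers and TSI is severe outcomes per accident, both as percentages; the counts below are hypothetical.

      groups = {
          # group: (accidents, exposed workers, severe outcomes) -- hypothetical counts
          "15-19 years": (134, 1000, 2),
          "all ages":    (2510, 100000, 40),
      }
      for name, (accidents, workers, severe) in groups.items():
          tar = 100.0 * accidents / workers   # assumed Total Accident Rate definition
          tsi = 100.0 * severe / accidents    # assumed Total Severity Index definition
          print(f"{name}: TAR = {tar:.2f}%, TSI = {tsi:.2f}%")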

  10. [Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].

    PubMed

    de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena

    2014-12-01

    The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and/or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in accident studies by providing useful conceptual support from the data collection stage through the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in users' grasp of the concepts, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, and in intervening to change the determinants of these events. PMID:25388176

  11. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  12. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
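
    A common form for facility-accident radiological source terms in DOE safety analysis (cf. DOE-HDBK-3010) multiplies five factors; whether the WM PEIS framework used exactly this form is not stated in the abstract. A minimal sketch in Python with illustrative values only:

      MAR = 1.0e3   # material at risk (g)
      DR  = 0.1     # damage ratio: fraction of MAR affected by the accident
      ARF = 1.0e-3  # airborne release fraction
      RF  = 0.5     # respirable fraction of the airborne material
      LPF = 0.1     # leak path factor through confinement

      source_term = MAR * DR * ARF * RF * LPF  # respirable grams released to the atmosphere
      print(f"respirable release: {source_term:.2e} g")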

  13. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J.

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for Peach Bottom Atomic Power Station Unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  14. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  15. Simplifying multivariate survival analysis using global score test methodology

    NASA Astrophysics Data System (ADS)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz

    2015-12-01

    In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve multiple endpoints, and this situation further complicates the analysis of survival data. In the case of tumor patients, endpoints concerning survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For each patient, these endpoints are correlated, and the estimation of the correlation between two score statistics is fundamental to the derivation of overall treatment advantage. In this paper, the bivariate survival analysis method using the global score test methodology is extended to the multivariate setting.
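
    A hedged sketch, in LaTeX notation, of the generic combined (global) score statistic that such a methodology builds on; the paper's exact covariance estimator is not reproduced in the abstract.

      % S_j: score statistic for endpoint j; V: estimated covariance of S = (S_1,...,S_k);
      % w: weight vector (w = 1 gives the unweighted global test).
      Z = \frac{w^{\top} S}{\sqrt{w^{\top} V w}},
      \qquad V_{ij} = \widehat{\operatorname{Cov}}(S_i, S_j),
      \qquad Z \overset{H_0}{\sim} \mathcal{N}(0,1) \ \text{(asymptotically)}.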

  16. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    SciTech Connect

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

    2012-03-18

    Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is performed with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and that no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
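
    A post-processing sketch in Python in the spirit described above: given an axial profile of local heat flux from a RELAP5 run and a critical heat flux (CHF) limit, report the minimum CHFR over the channel. The flat CHF value is a placeholder for the Sudo-Kaminaga correlation, whose mass-flux- and subcooling-dependent form is not reproduced here; geometry and fluxes are assumptions.

      import numpy as np

      z = np.linspace(0.0, 0.6, 25)                  # axial nodes (m), assumed channel height
      q_local = 1.2e6 * np.sin(np.pi * z / 0.6)      # local heat flux (W/m^2), chopped-sine stand-in
      q_chf = np.full_like(z, 3.0e6)                 # CHF from correlation (W/m^2), placeholder

      chfr = np.divide(q_chf, q_local, out=np.full_like(z, np.inf), where=q_local > 0)
      i = np.argmin(chfr)
      print(f"minimum CHFR = {chfr[i]:.2f} at z = {z[i]:.2f} m")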

  17. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2005-07-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model embedment effects when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) a description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) a comparison of the analysis results between the two methods for the different DOBs, and (3) a performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The assessment resulting from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than standard practice would normally allow.

  18. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including and managing uncertainties are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks, as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary systems-level risk analysis. This research synthesizes an integrated approach for developing a risk analysis method. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin hypercube sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary environment consists of the interface between configuration and sizing analysis outputs and aerodynamic parameter computations. Uncertainties are analyzed for both simulation tools and their associated input parameters. Uncertainties are then propagated across the design environment, and a robust design optimization is performed over the range of a critical input parameter. The results of this research indicate that including uncertainties in design processes may require modification of design constraints previously considered acceptable in deterministic analyses.
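
    A minimal sketch in Python of the propagation step: Latin hypercube samples over uncertain inputs are pushed through a stand-in for the multidisciplinary system response. The toy response function and input ranges are assumptions, not the Langley study's models.

      import numpy as np
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=2, seed=0)
      unit = sampler.random(n=1000)
      lo, hi = np.array([0.28, 0.95]), np.array([0.36, 1.10])   # assumed ranges: drag coeff., dry-mass factor
      X = qmc.scale(unit, lo, hi)

      def payload_margin(cd, mass_factor):                      # hypothetical system response
          return 1000.0 * (1.05 - mass_factor) - 2000.0 * (cd - 0.30)

      y = payload_margin(X[:, 0], X[:, 1])
      print(f"mean margin: {y.mean():.1f}, P(margin < 0) = {(y < 0).mean():.2f}")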

  19. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  20. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed that allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system was negative because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  1. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a proximate analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
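
    A minimal sketch in Python of the confidence-interval idea: for n replicate TG determinations of a property, the maximum sampling error at confidence level 1 - alpha is the half-width of the Student-t interval. The replicate values are invented; the paper's sampling-map machinery is not reproduced.

      import numpy as np
      from scipy import stats

      ash = np.array([5.1, 4.8, 5.3, 5.0, 4.9, 5.2])   # % ash from replicate TG runs (hypothetical)
      n, alpha = ash.size, 0.05
      half_width = stats.t.ppf(1 - alpha / 2, df=n - 1) * ash.std(ddof=1) / np.sqrt(n)
      print(f"ash content = {ash.mean():.2f}% +/- {half_width:.2f}% (95% confidence)")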

  2. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  3. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the /open quotes/Maximum Credible Accident/close quotes/ concept

    SciTech Connect

    Ricci, E.; McLean, R.B.

    1988-09-01

    The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactive uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.
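
    An order-of-magnitude sketch in Python of the closing step of such an analysis: converting a collective dose to the exposed population into foreseeable premature cancer deaths with a linear risk coefficient. Both numbers are placeholders, not the report's values; the ICRP-style coefficient is an assumption.

      collective_dose = 10.0        # person-Sv to drinking-water users (placeholder)
      risk_coefficient = 5.0e-2     # fatal cancers per person-Sv, ICRP-style linear coefficient (assumed)
      deaths = collective_dose * risk_coefficient
      print(f"foreseeable premature cancer deaths: {deaths:.1f}")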

  4. Analysis of fission product release behavior during the TMI-2 accident

    SciTech Connect

    Petti, D. A.; Adams, J. P.; Anderson, J. L.; Hobbins, R. R.

    1987-01-01

    An analysis of fission product release during the Three Mile Island Unit 2 (TMI-2) accident has been initiated to provide an understanding of fission product behavior that is consistent with both the best-estimate accident scenario and fission product results from the ongoing sample acquisition and examination efforts. "First principles" fission product release models are used to describe release from intact, disrupted, and molten fuel. Conclusions relating to fission product release, transport, and chemical form are drawn. 35 refs., 12 figs., 7 tabs.

  5. Status and validation of the SAS4A accident-analysis code system

    SciTech Connect

    Wider, H.U.; Bordner, G.L.; Briggs, L.L.; Cahalan, J.E.; Dunn, F.E.; Miles, K.J.; Morris, E.E.; Prohammer, F.G.; Tentner, A.M.; Weber, D.P.

    1982-01-01

    The SAS4A code system is a new tool for analyzing the initial phase of Hypothetical Core Disruptive Accidents (HCDAs) up to gross melting or failures of the subassembly walls. The objective in the development of SAS4A is to provide improved analytical models which represent experimentally demonstrated modes of material response in such accident scenarios. This paper discusses recent improvements in the phenomenological models, gives examples of verification and validation efforts that have been conducted, and illustrates the whole core-analysis implications of using this refined modeling capability.

  6. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  7. A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies

    SciTech Connect

    Woodward, C S; Estep, D; Sandelin, J; Wang, H

    2009-02-26

    This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through the use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to the stationary Navier-Stokes equations.
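
    For orientation, the generic adjoint-weighted-residual template behind such a posteriori estimates, in LaTeX notation; the report's scheme-specific derivations for Lax-Friedrichs, discontinuous Galerkin, and MAC are not reproduced here.

      % u: exact solution, u_h: numerical solution, J: output functional,
      % R(u_h): residual of the (modified) equation, phi: adjoint solution.
      J(u) - J(u_h) \approx \bigl( R(u_h),\, \varphi \bigr),
      \qquad L^{*} \varphi = \frac{\partial J}{\partial u}.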

  8. Defect analysis methodology for contact hole grapho epitaxy DSA

    NASA Astrophysics Data System (ADS)

    Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Kawakami, Shinichiro; Yamauchi, Takashi; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kitano, Takahiro

    2014-04-01

    Next-generation lithography technology is required to meet the needs of advanced design nodes. Directed Self Assembly (DSA) is gaining momentum as an alternative or complementary technology to EUV lithography. We investigate defectivity in 2x-nm patterning of contacts, for contact-hole assembly at 25 nm or less, by grapho-epitaxy DSA technology with guide patterns printed using immersion ArF negative tone development. This paper discusses the development of an analysis methodology for DSA with optical wafer inspection, based on defect source identification, sampling and filtering methods, supporting the process development efficiency of DSA processes and tools.

  9. Analysis of dental materials as an aid to identification in aircraft accidents

    SciTech Connect

    Wilson, G.S.; Cruickshanks-Boyd, D.W.

    1982-04-01

    The failure to achieve positive identification of aircrew following an aircraft accident need not prevent a full autopsy and toxicological examination to ascertain possible medical factors involved in the accident. Energy-dispersive electron microprobe analysis provides morphological, qualitative, and accurate quantitative analysis of the composition of dental amalgam. Wet chemical analysis can be used to determine the elemental composition of crowns, bridges and partial dentures. Unfilled resin can be analyzed by infrared spectroscopy. Detailed analysis of filled composite restorative resins has not yet been achieved in the as-set condition to permit discrimination between manufacturers' products. Future work will involve filler studies and pyrolysis of the composite resins by thermogravimetric analysis to determine percentage weight loss when the sample examined is subjected to a controlled heating regime. With these available techniques, corroborative evidence achieved from the scientific study of materials can augment standard forensic dental results to obtain a positive identification.

  10. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  11. SOCRAT: The System of Codes for Realistic Analysis of Severe Accidents

    SciTech Connect

    Bolshov, Leonid; Strizhov, Valery

    2006-07-01

    A computer code for the analysis of severe accidents has been under development in the Russian Federation for a long time. The main feature distinguishing this code from known severe accident codes such as MELCOR and ASTEC is a consistent realization of the mechanistic approach to modeling melt progression processes, including beyond-design-basis accidents with severe core damage. The motivation for the development comes from the new design requirements for the safety of nuclear power plants with improved economic factors, from the modernization of existing NPPs, and from the development of instructions for accident management and emergency planning. Realistic assessments of nuclear power plant safety require best-estimate codes that can describe the melt progression processes accompanying a severe accident at the nuclear installation and the behavior of the containment under abnormal conditions (in particular, the rates of steam and hydrogen release and the relocation of molten materials to the concrete cavity after failure of the reactor vessel). The developed computer codes were used for the safety justification of NPPs with the new generation of VVER-type reactors, such as the Tianwan NPP in China and the Kudankulam NPP in India. In particular, this code system was used for the justification of the hydrogen safety system and for the analysis of core degradation and relocation of the molten core to the core catcher used for guaranteed localization of the melt and prevention of ex-vessel melt progression. The considered system of codes, recently named SOCRAT, provides a self-consistent analysis of in-vessel processes and of the processes running in the containment, including the melt localization device. In the paper, the structure of the computer code SOCRAT is presented, the functionality of the separate parts of the code is described, and results of verification of different models of the code are considered. (authors)

  12. Methodology, the matching law, and applied behavior analysis

    PubMed Central

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are developed for their extension to traditional rate-of-response variables measured across time. Although steady-state and relative-rate-of-response strategies are appropriate to the experimental analysis of many behavioral phenomena, these methods are rarely used by applied behavior analysts and further separate the basic and applied areas. PMID:22478657
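
    For reference, the relation at issue, in LaTeX notation: strict matching between response rates B_i and reinforcement rates R_i, and the generalized form with fitted sensitivity a and bias b.

      \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2},
      \qquad
      \log\frac{B_1}{B_2} = a \log\frac{R_1}{R_2} + \log b .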

  13. Segment clustering methodology for unsupervised Holter recordings analysis

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, it implicitly involves other problems related to the large amount of unlabelled data, which entails a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points, then characterizing and clustering the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed homogeneity criterion. This framework compensates for the high computational cost of Holter analysis, making its implementation for real-time applications possible. The performance of the method is measured on the records of the MIT/BIH arrhythmia database and, taking advantage of the database labels, achieves high values of sensitivity and specificity for a broad range of heartbeat types recommended by the AAMI.
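
    A minimal sketch in Python of the segment framework: split the beat-feature stream into balanced segments, cluster each segment separately, then merge per-segment clusters whose centroids agree (one simple homogeneity criterion). The features are synthetic, and k-means stands in for the paper's unsupervised clustering, which is not reproduced here.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      beats = np.vstack([rng.normal(0, 1, (5000, 4)),    # dominant beat type (synthetic features)
                         rng.normal(4, 1, (500, 4))])    # minority beat type
      rng.shuffle(beats)

      n_segments, k = 10, 3
      centroids = []
      for segment in np.array_split(beats, n_segments):  # balanced segments
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(segment)
          centroids.append(km.cluster_centers_)

      # merge step: cluster the per-segment centroids into global beat classes
      merged = KMeans(n_clusters=2, n_init=10, random_state=0).fit(np.vstack(centroids))
      print("global class centroids:\n", merged.cluster_centers_)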

  14. Formal Analysis of an Airplane Accident in N{?}-Labeled Calculus

    NASA Astrophysics Data System (ADS)

    Mizutani, Tetsuya; Igarashi, Shigeru; Ikeda, Yasuwo; Shio, Masayuki

    N{?}-labeled calculus is a formal system for the representation, verification and analysis of time-concerned recognition, knowledge, belief and decision of humans or computer programs, together with related external physical or logical phenomena. In this paper, a formal verification and analysis of the JAL near-miss accident is presented as an example of cooperating systems controlling continuously changing objects, including human factors such as misunderstanding or incorrect recognition.

  15. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.

    PubMed

    Baka, Aikaterini D; Uzunoglu, Nikolaos K

    2014-09-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  16. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines

    PubMed Central

    Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

    2014-01-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  17. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    NASA Astrophysics Data System (ADS)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th, 2011, a high-magnitude earthquake and consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in time and extent. After scram started at all power stations affected by the earthquake, diesel generators began operation as designed until tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to a station blackout condition in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules to account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can be beneficial to nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

  18. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  19. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
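
    The workflow named in this abstract (a Latin hypercube design propagated through the consequence model, followed by regression-based sensitivity measures) can be sketched in a few lines. The sketch below is a hedged illustration only: the parameter names, their ranges, and the toy dose_model() stand in for MACCS and its 87 variables.

      # Illustrative Latin hypercube sampling / regression sensitivity sketch.
      # dose_model() and the parameter ranges are hypothetical stand-ins.
      import numpy as np
      from scipy.stats import qmc

      def dose_model(x):
          # toy consequence model standing in for a MACCS food-pathway dose
          return 3.0 * x[:, 0] + 0.5 * x[:, 1] * x[:, 2] + 0.1 * x[:, 1]

      sampler = qmc.LatinHypercube(d=3, seed=42)
      unit = sampler.random(n=200)                       # LHS design on [0, 1)^3
      x = qmc.scale(unit, [1e-4, 1e-3, 1e-3], [1e-2, 1e-1, 1e-2])
      y = dose_model(x)

      # standardized regression coefficients as a simple sensitivity measure
      xs = (x - x.mean(axis=0)) / x.std(axis=0)
      ys = (y - y.mean()) / y.std()
      src, *_ = np.linalg.lstsq(xs, ys, rcond=None)
      for name, c in zip(["v_dep", "soil_to_plant", "feed_to_milk"], src):
          print(f"{name}: SRC = {c:+.3f}")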

  20. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    SciTech Connect

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald; Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    The Department of Energy has assigned to Sandia National Laboratories the responsibility of producing a Safety Analysis Report (SAR) for the plutonium-dioxide fueled Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) proposed to be used in the Mars Science Laboratory (MSL) mission. The National Aeronautics and Space Administration (NASA) is anticipating a launch in fall of 2009, and the SAR will play a critical role in the launch approval process. As in past safety evaluations of MMRTG missions, a wide range of potential accident conditions differing widely in probability and severity must be considered, and the resulting risk to the public will be presented in the form of probability distribution functions of health effects in terms of latent cancer fatalities. The basic descriptions of accident cases will be provided by NASA in the MSL SAR Databook for the mission, and on the basis of these descriptions, Sandia will apply a variety of sophisticated computational simulation tools to evaluate the potential release of plutonium dioxide, its transport to human populations, and the consequent health effects. The first step in carrying out this project is to evaluate the existing computational analysis tools (computer codes) for suitability to the analysis and, when appropriate, to identify areas where modifications or improvements are warranted. The overall calculation of health risks can be divided into three levels of analysis. Level A involves detailed simulations of the interactions of the MMRTG or its components with the broad range of insults (e.g., shrapnel, blast waves, fires) posed by the various accident environments. There are a number of candidate codes for this level; they are typically high-resolution computational simulation tools that capture details of each type of interaction and that can predict damage and plutonium dioxide release for a range of choices of controlling parameters. Level B utilizes these detailed results to study many thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.

  1. A New Methodology of Spatial Cross-Correlation Analysis

    PubMed Central

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
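
    The global coefficient described here can be written as a spatial quadratic form in the standardized variables. A minimal sketch, assuming a weights matrix normalized to unit total weight and the reading C = z_x' W z_y (the toy data below are illustrative, not the paper's China dataset):

      # Global spatial cross-correlation as a quadratic form in standardized
      # variables; W is a spatial weights matrix normalized to unit total weight.
      import numpy as np

      def global_cross_corr(x, y, W):
          zx = (x - x.mean()) / x.std()
          zy = (y - y.mean()) / y.std()
          W = W / W.sum()                 # normalize total weight to 1
          return zx @ W @ zy

      # toy example: four regions on a line, rook adjacency
      W = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
      x = np.array([1.0, 2.0, 3.0, 4.0])  # e.g., urbanization level
      y = np.array([1.1, 1.9, 3.2, 3.8])  # e.g., economic output
      print(global_cross_corr(x, y, W))   # positive: neighboring values co-vary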

  2. [Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].

    PubMed

    Onishchenko, G G

    2013-01-01

    More than 25 years have passed since the Chernobyl accident of 1986. Fourteen subjects of the Russian Federation, with a total area of more than 50,000 km2 and a current population of 1.5 million people, were exposed to radioactive contamination. A system for the comprehensive evaluation of radiation doses to the population affected by the Chernobyl accident, comprising 11 guidance documents, has now been created. Methodological support is provided for assessing average annual, accumulated, and predicted radiation doses to the population and its critical groups, as well as doses to the thyroid gland. The relevance of analyzing the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society with the participation of relevant ministries and agencies and leading academic institutions of Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia. PMID:24340594

  3. Pressure vessels and piping design, analysis, and severe accidents. PVP-Volume 331

    SciTech Connect

    Dermenjian, A.A.

    1996-12-31

    The primary objective of the Design and Analysis Committee of the ASME Pressure Vessels and Piping Division is to provide a forum for the dissemination of information and the advancement of current theories and practices in the design and analysis of pressure vessels, piping systems, and components. This volume is divided into six sections: power plant piping and supports (parts 1-3), applied dynamic response analysis, severe accident analysis, and student papers. Separate abstracts were prepared for 22 papers in this volume.

  4. Computational methodology for ChIP-seq analysis

    PubMed Central

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452
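
    The statistical core of most peak callers the review covers is an enrichment test of ChIP read counts against a depth-scaled input control. A minimal sketch, assuming a simple Poisson model over fixed windows (the counts, pseudo-count, and threshold are illustrative, not a specific tool's defaults):

      # Poisson-enrichment peak calling over fixed windows: flag windows where
      # ChIP counts greatly exceed a depth-scaled input control. Illustrative
      # stand-in for MACS-style callers, not their actual implementation.
      import numpy as np
      from scipy.stats import poisson

      def call_peaks(chip, ctrl, pseudo=1.0, alpha=1e-5):
          lam = ctrl * (chip.sum() / ctrl.sum()) + pseudo  # scaled background
          pvals = poisson.sf(chip - 1, lam)                # P(X >= observed)
          return np.where(pvals < alpha)[0]                # enriched window indices

      chip = np.array([5, 7, 60, 6, 4, 55, 5])
      ctrl = np.array([6, 6,  7, 5, 6,  6, 6])
      print(call_peaks(chip, ctrl))                        # -> [2 5]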

  5. Aspects of uncertainty analysis in accident consequence modeling

    SciTech Connect

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data.

  6. A flammability and combustion model for integrated accident analysis. [Advanced light water reactors

    SciTech Connect

    Plys, M.G.; Astleford, R.D.; Epstein, M. )

    1988-01-01

    A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs.
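
    The record does not reproduce the paper's correlation itself; as a hedged illustration of the kind of accepted combining method the abstract alludes to, the classical Le Chatelier rule estimates a mixture's lower flammability limit from the limits of its pure fuels (the blend fractions and limits below are textbook-style illustrative values, not data from the report):

      # Le Chatelier mixing rule for the lower flammability limit (LFL) of a
      # fuel blend; illustrative values, not the report's correlation.
      def le_chatelier_lfl(fractions, lfls):
          """fractions: mole fractions of the fuels (summing to 1);
          lfls: LFLs of the pure fuels in vol%."""
          return 1.0 / sum(f / l for f, l in zip(fractions, lfls))

      # hydrogen/carbon monoxide blend (approximate LFLs in air: 4.0 and 12.5 vol%)
      print(le_chatelier_lfl([0.6, 0.4], [4.0, 12.5]))   # ~5.5 vol%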

  7. Marine Propeller Analysis with an Immersed Boundary LES methodology

    NASA Astrophysics Data System (ADS)

    Schroeder, Seth; Balaras, Elias

    2011-11-01

    Modern marine propeller design and analysis techniques employ a wide range of computational methods to balance time constraints and accuracy requirements. Potential flow and RANS based methods have historically comprised the tools necessary for design and global performance analysis. However, for analysis where unsteady flow phenomena are of interest, eddy resolving methodologies are required. In the present study we use the large-eddy simulation (LES) approach coupled with an immersed-boundary (IB) method to perform computations of the flow around a rotating propeller at laboratory Reynolds numbers. Compared to classical boundary conforming strategies our formulation eliminates meshing overhead time and allows for body motion without additional treatments such as overset or dynamic meshing. The structured Cartesian grid also allows for a non-dissipative solver which conserves mass, momentum and energy. We will focus on the Italian Ship Model Basin (INSEAN) E1619 propeller and compare the predictions of our method to experimental results. The E1619 propeller is a 7-bladed submarine stock propeller that has been the subject of extensive experimental testing and previous computational studies. Supported by ONR.

  8. Exposure rate response analysis of criticality accident detector at Savannah River Site

    SciTech Connect

    Zino, J.F.

    1995-01-01

    This analysis investigated the exposure response behavior of gamma-ray ionization chambers used in the criticality accident systems at the Savannah River Site (SRS). The project consisted of performing exposure response measurements with a calibrated {sup 137}Cs source for benchmarking of the MCNP Monte Carlo code. MCNP was then used to extrapolate the ion chamber's response to gamma-rays with energies outside the current domain of measured data for criticality fission sources.

  9. Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.

    PubMed

    Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

    2013-10-01

    Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions as well as the causes of non-compliance with SMS. PMID:23764875

  10. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    SciTech Connect

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.

  11. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

  12. Review of accident analysis calculations, 232-Z seismic scenario

    SciTech Connect

    Ballinger, M.Y.

    1993-05-01

    The 232-Z Building houses what was previously the incinerator facility, which is no longer in service. It is constructed out of concrete blocks and is approximately 37 ft wide by 57 ft long. The building has a single story over the process areas and two stories over the service areas at the north end of the building. The respective roofs are 15 ft and 19 ft above grade and consist of concrete over a metal decking, with insulation and a built-up asphalt gravel covering. This facility is assumed to collapse in the seismic event evaluated in the safety analyses, resulting in the release of a portion of the residual plutonium inventory remaining in the building. The seismic scenario for 232-Z assumes that the block concrete walls collapse, allowing the roof to fall, crushing the contaminated duct and gloveboxes within. This paper is a review of the scenario and methods used to calculate the source term from the seismic event as presented in the Plutonium Finishing Plant Final Safety Analysis Report (WHC 1991), also referred to as the PFP FSAR. Alternate methods of estimating the source term are presented. The calculation of source terms based on the mechanisms of release expected in the worst-case scenario is recommended.
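
    For context, DOE facility safety analyses of this kind typically build an airborne source term from the five-factor formula of DOE-HDBK-3010; whether the PFP FSAR used exactly this form is not stated in the record, so the following is a schematic statement only:

      \[
      \mathrm{ST} = \mathrm{MAR} \times \mathrm{DR} \times \mathrm{ARF} \times \mathrm{RF} \times \mathrm{LPF}
      \]

    where MAR is the material at risk, DR the damage ratio, ARF the airborne release fraction, RF the respirable fraction, and LPF the leak path factor of the release pathway.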

  13. Space Shuttle Columbia Post-Accident Analysis and Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a break-up at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles (1,038 km) long and 10 miles (16 km) wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left-wing reinforced carbon/carbon composite leading-edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

  14. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  15. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  16. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.
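
    The elicitation produces one subjective distribution per expert per variable; these must then be aggregated into a single library distribution. A minimal sketch, assuming equal-weight pooling of linearly interpolated expert CDFs (the aggregation scheme and the quantiles below are assumptions for illustration, not the report's protocol):

      # Equal-weight pooling of expert-elicited (5th, 50th, 95th) quantiles
      # into one mixture CDF; illustrative stand-in for the formal procedure.
      import numpy as np

      experts = np.array([
          [0.1, 1.0, 10.0],   # expert 1's quantiles for one input variable
          [0.5, 2.0,  8.0],   # expert 2
          [0.2, 0.8,  5.0],   # expert 3
      ])

      def pooled_cdf(x, experts, probs=(0.05, 0.50, 0.95)):
          cdfs = [np.interp(x, e, probs, left=0.0, right=1.0) for e in experts]
          return np.mean(cdfs, axis=0)

      x = np.linspace(0.0, 12.0, 5)
      print(pooled_cdf(x, experts))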

  17. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  18. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  19. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or the socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users, thereby providing a basis for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss from traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities, Khartoum and Nyala, using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudanese pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
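
    In WTP-CV studies of this kind, the VOSL is the mean stated payment divided by the stated reduction in fatality risk:

      \[
      \mathrm{VOSL} = \overline{\mathrm{WTP}} / \Delta p
      \]

    For example (illustrative numbers, not the study's data), a mean WTP of US$10 per year for an annual fatality-risk reduction of 2 in 10,000 gives VOSL = 10 / 0.0002 = US$50,000, which is of the same order as the US$0.019-0.101 million range reported above.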

  20. Calculation notes in support of TWRS FSAR spray leak accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document contains the detailed calculations that support the spray leak accident analysis in the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The consequence analyses in this document form the basis for the selection of controls to mitigate or prevent spray leaks throughout TWRS. Pressurized spray leaks can occur due to a breach in containment barriers along transfer routes during waste transfers. Spray leaks are of particular safety concern because, depending on leak dimensions and waste pressure, they can be relatively efficient generators of dispersible-sized aerosols that can transport downwind to onsite and offsite receptors. Waste is transferred between storage tanks and between processing facilities and storage tanks in TWRS through a system of buried transfer lines. Pumps for transferring waste and jumpers and valves for rerouting waste are located inside below-grade pits and structures that are normally covered. Pressurized spray leaks can emanate to the atmosphere through breaches in waste transfer equipment inside these structures should the structures be uncovered at the time of the leak. Pressurized spray leaks can develop through holes or cracks in transfer piping, valve bodies, or pump casings caused by such mechanisms as corrosion, erosion, thermal stress, or water hammer. Leaks through degraded valve packing, jumper gaskets, or pump seals can also result in pressurized spray releases. Mechanisms that can degrade seals, packing, and gaskets include aging, radiation hardening, and thermal stress. Another common cause of spray leaks inside transfer enclosures is misaligned jumpers caused by human error. A spray leak inside a DST valve pit during a transfer of aging waste was selected as the bounding, representative accident for detailed analysis. Sections 2 through 5 below develop this representative accident using the DOE-STD-3009 format. Section 2 describes the unmitigated and mitigated accident scenarios evaluated to determine the need for safety-class SSCs or TSR controls. Section 3 develops the source terms associated with the unmitigated and mitigated accident scenarios. Section 4 estimates the radiological and toxicological consequences for the unmitigated and mitigated scenarios. Section 5 compares the radiological and toxicological consequences against the TWRS evaluation guidelines. Section 6 extrapolates from the representative accident case to other represented spray leak sites to assess the conservatism in using the representative case to define controls for other postulated spray leak sites throughout TWRS. Section 7 discusses the sensitivities of the consequence analyses to the key parameters and assumptions used in the analyses. Conclusions are drawn in Section 8. The analyses herein pertain to spray leaks initiated by internal mechanisms (e.g., corrosion, erosion, thermal stress). External initiators of spray leaks (e.g., excavation accidents) and natural phenomena initiators (e.g., seismic events) are to be covered in separate accident analyses.

  1. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.

  2. Methodology for proactive signal warrant analysis. Final report

    SciTech Connect

    Ziliaskopoulos, T.; Donnally, J.L.

    1997-11-18

    Typically, transportation agencies perform traffic studies only after conditions have deteriorated to an unacceptable level, accidents have taken place, and customer complaints have mounted (reactive policy). By that time, problems have already begun to occur; unacceptable delays and accidents have been experienced for long periods before the problem is detected. Many of these problems could be avoided if intersections with a high probability of warranting a signal in the future were identified in advance (proactive policy). This study produces a procedure to determine the intersections that will warrant a signal and the time at which the warrant will be met. A prioritized list of all intersections in the region (state, district, or county) shows the need for traffic signals.
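
    The proactive idea reduces to projecting traffic growth at each intersection and flagging the year its volume crosses a warrant threshold. A minimal sketch under assumed exponential growth (the growth model, threshold, and volumes are illustrative assumptions, not the report's procedure):

      # Years until an intersection's peak-hour volume reaches the signal
      # warrant threshold, assuming simple exponential traffic growth.
      import math

      def years_until_warrant(current_vol, growth_rate, warrant_vol):
          if current_vol >= warrant_vol:
              return 0.0
          return math.log(warrant_vol / current_vol) / math.log(1.0 + growth_rate)

      # illustrative: 600 veh/h today, 3%/yr growth, warrant at 800 veh/h
      print(round(years_until_warrant(600, 0.03, 800), 1))   # -> 9.7 years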

  3. Analysis on the Density Driven Air-Ingress Accident in VHTRs

    SciTech Connect

    Eung Soo Kim; Chang Oh; Richard Schultz; David Petti

    2008-11-01

    Air ingress following a pipe rupture is considered the most serious accident in VHTRs due to potential problems such as core heat-up, loss of structural integrity, and toxic gas release. Previously, it was believed that the main air-ingress mechanism in this accident is the molecular diffusion process between the reactor core and the cavity. However, according to some recent studies, there is another, faster air-ingress process that had not been considered before: density-driven stratified flow. The potential for density-driven stratified air ingress into the VHTR following a large-break LOCA was first described in the NGNP Methods Technical Program, based on stratified flow studies performed with liquids. Density-gradient-driven stratified flow in advanced reactor systems has been the subject of active research for well over a decade, since density-gradient-dominated stratified flow is an inherent characteristic of the passive systems used in advanced reactors. Recently, Oh et al. performed a CFD analysis of stratified flow in the VHTR and showed that this effect can significantly accelerate the air-ingress process. They also proposed to replace the original air-ingress scenario based on molecular diffusion with one based on stratified flow. This paper focuses on the effect of stratified flow on the results of the air-ingress accident in VHTRs.
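
    For orientation on why the stratified mechanism is so much faster than diffusion, the speed of a buoyancy-driven exchange flow through a breach scales with the reduced gravity; a standard lock-exchange estimate (an illustration, not the paper's CFD result) is

      \[
      g' = g\,\frac{\rho_{\mathrm{air}} - \rho_{\mathrm{He}}}{\rho_{\mathrm{air}}}, \qquad u \approx \tfrac{1}{2}\sqrt{g' H}
      \]

    With the large air-helium density contrast, g' is close to g, so for a breach height H of order 1 m the intrusion speed is of order 1 m/s, orders of magnitude faster than molecular diffusion across the same opening.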

  4. SAS4A: A computer model for the analysis of hypothetical core disruptive accidents in liquid metal reactors

    SciTech Connect

    Tentner, A.M.; Birgersson, G.; Cahalan, J.E.; Dunn, F.E.; Kalimullah; Miles, K.J.

    1987-01-01

    To ensure that the public health and safety are protected under any accident conditions in a Liquid Metal Fast Breeder Reactor (LMFBR), many accidents are analyzed for their potential consequences. The SAS4A code system, described in this paper, provides such an analysis capability, including the ability to analyze low-probability events such as Hypothetical Core Disruptive Accidents (HCDAs). The SAS4A code system has been designed to simulate all the events that occur in an LMFBR core during the initiating phase of a Hypothetical Core Disruptive Accident. During such postulated accident scenarios as the Loss-of-Flow and Transient Overpower events, a large number of interrelated physical phenomena occur within a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling, and fuel and cladding melting and relocation. Due to the strong neutronic feedback present in a nuclear reactor, these events can significantly influence the reactor power. The SAS4A code system is used in the safety analysis of nuclear reactors in order to estimate the energetic potential of very low probability accidents. The results of SAS4A simulations are also used by reactor designers in order to build safer reactors and eliminate the possibility of any accident which could endanger the public safety.

  5. Responsibility analysis: a methodology to study the effects of drugs in driving.

    PubMed

    Robertson, M D; Drummer, O H

    1994-04-01

    In order to study the role of drugs in driving, a responsibility analysis was developed to allow an assessment of the driver's culpability or responsibility in an accident. Factors possibly mitigating drivers' responsibility in each accident were identified and scored. Factors considered were: condition of road, condition of vehicle, driving conditions, accident type, witness observations, road law obedience, difficulty of task, and level of fatigue. If a sufficient number of mitigating factors was identified, a driver would be found to be either partly or totally exonerated from blameworthiness and scored as either a contributory or a nonculpable driver. If drugs present in a driver contributed to accident causation, it would be expected that they would be overrepresented in culpable drivers, i.e., those drivers not exonerated from blame. A total of 341 driver fatalities occurring in Victoria were analysed for blood alcohol content (BAC). Twenty-nine percent had a BAC over .05% (the legal limit in Victoria). Alcohol-positive drivers were statistically overrepresented in the culpable group (p < .001), in single-vehicle accidents (p < .05), and in those accidents in which vehicles left the road for no apparent reason (p < .001). Odds-ratio estimation of the relative risk for culpable and nonculpable drivers showed that the relative risk rose disproportionately with BAC. PMID:8198693
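
    The core statistic of responsibility analysis is the odds ratio of exposure between culpable and nonculpable drivers. A schematic calculation (the counts are illustrative, not the study's data):

      # Odds ratio of alcohol/drug exposure in culpable vs. nonculpable drivers.
      def odds_ratio(exp_culp, unexp_culp, exp_nonculp, unexp_nonculp):
          return (exp_culp / unexp_culp) / (exp_nonculp / unexp_nonculp)

      # e.g., 80 of 240 culpable drivers alcohol-positive vs. 10 of 90 nonculpable
      print(odds_ratio(80, 160, 10, 80))   # -> 4.0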

  6. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    NASA Astrophysics Data System (ADS)

    Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

    2001-05-01

    Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied to simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss-of-coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and the confinement building itself). Even though confinement failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics, and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  7. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  8. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning toward establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  9. Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems

    SciTech Connect

    Peplow, Douglas E.; Wetzel, Larry

    2012-01-01

    Determining the optimum placement to minimize the number of detectors for a criticality accident alarm system (CAAS) in a large manufacturing facility is a complex problem. There is typically a target for the number of detectors that can be used over a given zone of the facility. A study to optimize detector placement typically begins with some initial guess at the placement of the detectors and is followed by either predictive calculations of accidents at specific locations or adjoint calculations based on preferred detector locations. Within an area of a facility, there may be a large number of potential criticality accident sites. For any given placement of the detectors, the list of accident sites can be reduced to a smaller number of locations at which accidents may be difficult for detectors to detect. Developing the initial detector placement and determining the list of difficult accident locations are both based on the practitioner's experience. Simulations following fission particles released from an accident location are called 'forward calculations.' These calculations can be used to answer the question 'where would an alarm be triggered?' by an accident at a specified location. Conversely, 'adjoint calculations' start at a detector site using the detector response function as a source and essentially run in reverse. These calculations can be used to answer the question 'where would an accident be detected?' by a specified detector location. If the number of accidents, P, is much less than the number of detectors, Q, then forward simulations may be more convenient and less time-consuming. If Q is large or the detectors are not placed yet, then a mesh tally of dose observed by a detector at any location must be computed over the entire zone. If Q is much less than P, then adjoint calculations may be more efficient. Adjoint calculations employing a mesh tally can be even more advantageous because they do not rely on a list of specific difficult-to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE. 
An adjoint-based approach could be performed with Denovo alone, although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose rate in order to alarm; therefore, a simple yes/no plot could be more useful to the analyst.
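
    The placement question itself (cover every credible accident site with at least one alarming detector, using as few detectors as possible) has the structure of a set-cover problem over a precomputed dose matrix. A hedged sketch of that selection step only, with an illustrative dose matrix standing in for forward or adjoint transport results (this is not the MAVRIC/Monaco workflow itself):

      # Greedy set-cover selection of detector sites. dose[d][a] is the dose
      # rate at candidate detector site d from an accident at site a; a
      # detector alarms when that dose exceeds its trip point.
      def place_detectors(dose, trip_point):
          n_det, n_acc = len(dose), len(dose[0])
          uncovered = set(range(n_acc))
          chosen = []
          while uncovered:
              best = max(range(n_det),
                         key=lambda d: sum(dose[d][a] >= trip_point
                                           for a in uncovered))
              newly = {a for a in uncovered if dose[best][a] >= trip_point}
              if not newly:
                  raise ValueError("some accident sites cannot be detected")
              chosen.append(best)
              uncovered -= newly
          return chosen

      dose = [[5.0, 0.1, 0.2],    # candidate detector site 0
              [0.2, 4.0, 3.0],    # candidate detector site 1
              [0.1, 0.3, 6.0]]    # candidate detector site 2
      print(place_detectors(dose, trip_point=1.0))   # -> [1, 0]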

  10. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident

    SciTech Connect

    Aldrich, D.C.; Blond, R.M.

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

  11. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    The purposes of this volume (AMD) are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  12. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports and compares the method of analysis proposed therein to the methods used today.

  13. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA), which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis, SUSA/XSUSA. For the present study, a UOX/MOX mixed-core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis, 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters, and variations of the two-group cross sections). Results of the uncertainty and sensitivity analysis are presented for safety-important global and local parameters. (authors)

  14. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)

    SciTech Connect

    Whitehead, D.; Darby, J.; Yakle, J.

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

  15. Methodologies for analysis of patterning in the mouse RPE sheet

    PubMed Central

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one, adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell-cell contact in the sheet. To validate the software, human- and computer-analyzed results were compared. Whether tallied manually or automatically with software, the resulting cell measurements were in close agreement. We compared normal with diseased RPE cells during aging with quantitative cell size and shape metrics. Subtle differences between the RPE sheet characteristics of young and old mice were identified. The IRBP-/- mouse RPE sheet did not differ from C57BL/6J (wild type, WT), suggesting that IRBP does not play a direct role in maintaining the health of the RPE cell, while the slow loss of photoreceptor (PhR) cells previously established in this knockout does support a role in the maintenance of PhR cells. Rd8 mice exhibited several measurable changes in patterns of RPE cells compared to WT, suggesting a slow degeneration of the RPE sheet that had not previously been noticed in rd8. Conclusions An optimized dissection method and a series of programs were used to establish a rapid and hands-off analysis. The software-aided, high-sampling-size approach performed as well as trained human scorers, but was considerably faster and easier. This method allows tens to hundreds of thousands of cells to be analyzed, each with 23 metrics. With this combination of dissection and image analysis of the RPE sheet, we can now analyze cell-to-cell interactions of immediate neighbors. In the future, we may be able to observe interactions of second, third, or higher ring neighbors and analyze tension in sheets, which might be expected to deviate from normal near large bumps in the RPE sheet caused by drusen or when large frank holes in the RPE sheet are observed in geographic atrophy. This method and software can be readily applied to other aspects of vision science, neuroscience, and epithelial biology where patterns may exist in a sheet or surface of cells. PMID:25593512
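    As an illustration of the kind of per-cell shape metrics described above, the sketch below uses scikit-image rather than the authors' Photoshop/CellProfiler/Matlab pipeline; the labeled segmentation is a toy placeholder, not real flatmount data:

    ```python
    import numpy as np
    from skimage import measure

    # Hypothetical labeled segmentation of an RPE flatmount
    # (each integer label marks the pixels of one cell).
    labels = np.zeros((200, 200), dtype=int)
    labels[10:60, 10:60] = 1    # toy "cell" 1
    labels[10:60, 70:140] = 2   # toy "cell" 2

    # A few of the per-cell shape descriptors the study tabulates.
    for region in measure.regionprops(labels):
        print(region.label, region.area, region.eccentricity, region.solidity)
    ```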

  16. Analysis of Maximum Reasonably Foreseeable Accidents for the Yucca Mountain Draft Environmental Impact Statement (DEIS)

    SciTech Connect

    S.B. Ross; R.E. Best; S.J. Maheras; T.I. McSweeney

    2001-08-17

    Accidents could occur during the transportation of spent nuclear fuel and high-level radioactive waste. This paper describes the risks and consequences to the public from accidents that are highly unlikely but that could have severe consequences. The impacts of these accidents would include those to a collective population and to hypothetical maximally exposed individuals (MEIs). This document discusses accidents and conditions with a probability of occurring greater than 1 in 10 million per year, called "maximum reasonably foreseeable accidents"; accidents and conditions less likely than this are not considered reasonably foreseeable.
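    The screening criterion quoted above is a simple frequency cut; a minimal sketch (the accident list and frequencies are hypothetical):

    ```python
    # "Maximum reasonably foreseeable" screening per the stated criterion of
    # 1 in 10 million per year; scenarios and frequencies are hypothetical.
    THRESHOLD = 1.0e-7  # events per year

    accidents = {
        "rail collision with long-duration fire": 3.0e-7,
        "truck crash with immersion": 5.0e-8,
    }
    foreseeable = {k: f for k, f in accidents.items() if f > THRESHOLD}
    print(foreseeable)  # only scenarios above the screening frequency are analyzed
    ```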

  17. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Glaessgen, Edward H.; Mason, Brian H.; Krishnamurthy, Thiagarajan; Davila, Carlos G.

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs; higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, the 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, the 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

  18. Cost-Utility Analysis: Current Methodological Issues and Future Perspectives

    PubMed Central

    Nuijten, Mark J. C.; Dubois, Dominique J.

    2011-01-01

    The use of cost-effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost-effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost-effectiveness threshold (Cost/QALY). In this review, we focus on key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127

  19. Hospital multifactor productivity: a presentation and analysis of two methodologies.

    PubMed

    Cylus, Jonathan D; Dickensheets, Bridget A

    In response to recent discussions regarding the ability of hospitals to achieve gains in productivity, we present two methodologies that attempt to measure multifactor productivity (MFP) in the hospital sector. We analyze each method and conclude that the inconsistencies in their outcomes make it difficult to estimate a precise level of MFP that hospitals have historically achieved. Our goal in developing two methodologies is to inform the debate surrounding the ability of hospitals to achieve gains in MFP, as well as to highlight some of the challenges that exist in measuring hospital MFP. PMID:18435223

  20. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. Yet the results of a comprehensive study of more than 600 well-documented major failures in offshore structures between 1988 and 2005 show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method for ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT its crew conducted is discussed. The risk analysis methodology consists of three approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, against the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework for analyzing the causal factors of the mismatches identified in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision-making model is introduced to quantify a section of the conceptual framework and to analyze the impact of different decision-making biases on negative pressure test results. Along with corroborating the findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are the root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the greatest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the conceptual framework are not specific to the NPT: most have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations and in high-risk operations in other industries. In addition, the proposed rational decision-making model introduces a quantitative structure for analyzing the results of a conducted NPT. The model provides parametrically derived formulas to determine a cut-off point value that assists personnel in accepting or rejecting an implemented negative pressure test. Moreover, it enables analysts to assess the decision-making biases involved in interpreting a conducted negative pressure test, as well as the organizational root causes of those biases. In general, although the integrated methodology proposed in this dissertation was developed for risk assessment of the contribution of human and organizational factors to negative pressure test misinterpretation, it can be generalized and is potentially useful for other well control situations, both offshore and onshore (e.g., hydraulic fracturing). It can also be applied to the analysis of other high-risk operations, not only in the oil and gas industry but also in industries such as nuclear power, aviation, and transportation.
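    The cut-off-point idea can be illustrated with a toy Bayesian decision rule: accept the test only while the expected loss of accepting is below the expected loss of rejecting. All distributions, priors, and costs below are hypothetical placeholders, not the dissertation's calibrated model:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical observables: NPT pressure build-up (psi) under each state.
    mu_ok, sd_ok = 0.0, 5.0      # build-up if the barrier is sealed
    mu_bad, sd_bad = 40.0, 15.0  # build-up if the barrier leaks
    p_bad = 0.05                 # assumed prior probability of a failed barrier
    cost_miss = 1.0e9            # cost of accepting a bad test (blowout-scale)
    cost_false_alarm = 5.0e6     # cost of rejecting a good test (rig time)

    def posterior_bad(x):
        like_bad = norm.pdf(x, mu_bad, sd_bad) * p_bad
        like_ok = norm.pdf(x, mu_ok, sd_ok) * (1.0 - p_bad)
        return like_bad / (like_bad + like_ok)

    # Cut-off: largest build-up at which accepting still has the lower expected loss.
    xs = np.linspace(0.0, 60.0, 601)
    pb = posterior_bad(xs)
    accept = pb * cost_miss < (1.0 - pb) * cost_false_alarm
    cutoff = xs[np.argmin(accept)]  # first reading where "reject" becomes optimal
    print(f"accept the NPT only below ~{cutoff:.1f} psi build-up")
    ```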

  1. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    SciTech Connect

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by that potential adversary. However, too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skill, and a greatly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the IT security research community, in both the black hat and white hat communities. Once a solid understanding of cutting-edge security research is established, emerging trends in attack methodology can be identified, and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

  2. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single injury variable, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), considered one at a time in relation to various traffic accident factors; such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, airbag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. PMID:21094332
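    Quantification method II scores dummy-coded categorical predictors against a categorical outcome, much as a linear discriminant analysis does. The sketch below uses scikit-learn's LDA on a toy table as an analogous computation, not the authors' implementation or data:

    ```python
    import pandas as pd
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Hypothetical crash records with categorical predictors and outcome.
    df = pd.DataFrame({
        "speed":     ["low", "high", "high", "low", "high", "low"],
        "direction": ["front", "side", "front", "rear", "side", "front"],
        "airbag":    ["yes", "no", "no", "yes", "no", "yes"],
        "injury":    ["minor", "severe", "severe", "minor", "severe", "minor"],
    })
    # Item-category (dummy) coding, as in quantification method II.
    X = pd.get_dummies(df[["speed", "direction", "airbag"]])
    y = df["injury"]

    lda = LinearDiscriminantAnalysis().fit(X, y)
    # The discriminant weights play the role of the method's category scores.
    for name, w in zip(X.columns, lda.coef_[0]):
        print(f"{name:20s} {w:+.2f}")
    ```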

  3. Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR

    SciTech Connect

    Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T.; Shirakawa, N.

    2012-07-01

    Evaluating the consequences of severe accidents is the most important safety licensing issue for the reactor core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in an optimum condition from the viewpoint of reactivity. This characteristic might induce a super-prompt criticality due to core geometry change during a core disruptive accident (CDA). Previous CDA analysis codes have been modeled in plural phases depending on the mechanism driving a super-prompt criticality, and subsequent events are calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing the cross-check analysis code, which is another required scheme to confirm the validity of the evaluation results prepared by applicants, in the safety licensing procedure of the planned high-performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the needs driving ASTERIA-FBR development, outlines the major modules, and reports the model validation status. (authors)

  4. Natural phenomena risk analysis - an approach for the tritium facilities 5480.23 SAR natural phenomena hazards accident analysis

    SciTech Connect

    Cappucci, A.J. Jr.; Joshi, J.R.; Long, T.A.; Taylor, R.P.

    1997-07-01

    A Tritium Facilities (TF) Safety Analysis Report (SAR) has been developed which is compliant with DOE Order 5480.23. The 5480.23 SAR upgrades and integrates the safety documentation for the TF into a single SAR for all of the tritium processing buildings. As part of the TF SAR effort, natural phenomena hazards (NPH) were analyzed. A cost effective strategy was developed using a team approach to take advantage of limited resources and budgets. During development of the Hazard and Accident Analysis for the 5480.23 SAR, a strategy was required to allow maximum use of existing analysis and to develop a cost effective graded approach for any new analysis in identifying and analyzing the bounding accidents for the TF. This approach was used to effectively identify and analyze NPH for the TF. The first part of the strategy consisted of evaluating the current SAR for the RTF to determine what NPH analysis could be used in the new combined 5480.23 SAR. The second part was to develop a method for identifying and analyzing NPH events for the older facilities which took advantage of engineering judgment, was cost effective, and followed a graded approach. The second part was especially challenging because of the lack of documented existing analysis considered adequate for the 5480.23 SAR and a limited budget for SAR development and preparation. This paper addresses the strategy for the older facilities.

  5. A Content Analysis of News Media Coverage of the Accident at Three Mile Island.

    ERIC Educational Resources Information Center

    Stephens, Mitchell; Edison, Nadyne G.

    A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

  6. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Occupational accidents result in permanent disabilities and deaths and create economic and workday losses. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs' offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer's V statistic (Vc) were used to determine the association between influencing factors and the type of injury as the occupational accident outcome. Results: There was no significant association between marital status or time of day and the type of injury. However, activity sector, cause of accident, victim's education, age of victim, and victim's experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly for occupational accidents, correctly identifying high-risk groups and the factors influencing accidents is the key to successful interventions. The results of this study can increase accident awareness and enable workplace management to select and prioritize problem areas and safety system weaknesses. PMID:26568848

  7. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
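    The contrast between the two strategies can be sketched in a few lines of Monte Carlo: abort initiation time is sampled along with the other parameters, and the minimum margin per ascent-time bin exposes any pinch-point. The performance metric, bounds, and distributions below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 10_000

    # Full-envelope approach: initiation time is dispersed with the other
    # LAS parameters (bounds and distributions are hypothetical).
    abort_time_s = rng.uniform(0.0, 120.0, N)   # anywhere on the ascent
    motor_impulse = rng.normal(1.0, 0.03, N)    # relative dispersion
    winds_ms = rng.normal(0.0, 10.0, N)

    # Placeholder performance metric (e.g., a separation-distance margin).
    margin = (50.0 - 0.2 * np.abs(abort_time_s - 55.0)
              + 30.0 * (motor_impulse - 1.0) - 0.5 * np.abs(winds_ms))

    # A pinch-point shows up as the ascent-time bin with the lowest margin.
    bins = np.linspace(0.0, 120.0, 13)
    idx = np.digitize(abort_time_s, bins)
    for b in range(1, 13):
        m = margin[idx == b]
        if m.size:
            print(f"{bins[b-1]:5.0f}-{bins[b]:5.0f} s: min margin {m.min():6.1f}")
    ```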

  8. Analysis of the SL-1 Accident Using RELAP5-3D

    SciTech Connect

    Francisco, A.D.; Tomlinson, E.T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people, and destroying the reactor core. The SL-1 reactor, a 3 MW{sub t} boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).

  9. Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report

    SciTech Connect

    Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

    1986-09-01

    The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I plant. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

  10. Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.

    PubMed

    Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

    2014-01-01

    The aim of this study was the identification of accident scenarios and causes through analysis of existing reports of recognized agricultural occupational accidents involving tractors, self-propelled harvesting machinery, and materials handling machinery from 2008 to 2010. Based on a literature review of past accident analyses, narrative text analysis was chosen as the appropriate method. A narrative analysis of the text fields of the accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about accident scenarios and causes. This narrative analysis of reports was conducted for the first time and yielded initial insights for identifying antecedents of accidents and potential opportunities for technically based intervention. A literature and internet search was done to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery, the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. The direct comparison of the analyzed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain, and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery, and materials handling machinery showed similarities in terms of causes, circumstances, and consequences. Certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, suited to each specific machine type, would be necessary. PMID:24738521

  11. [Analysis of hospital clinical services to elderly victims of accidents and violence].

    PubMed

    de Lima, Maria Luiza Carvalho; de Souza, Edinilsa Ramos; Acioli, Raquel Moura Lins; Bezerra, Eduardo Duque

    2010-09-01

    The growth of the elderly population and their more active lives bring greater exposure to accidents and violence in this population. A diagnostic analysis of hospital services for emergency and urgent care of elderly victims of accidents and violence was carried out in five Brazilian capitals. The research was based on the principles of triangulation of methods, combining quantitative approaches, through questionnaires applied to managers and pre-hospital, hospital, and rehabilitation service professionals, and qualitative approaches, through interviews with managers, professionals, and those in charge of elderly health. Judged against the guidelines of the policies studied, none of the capitals met all requirements: care was deficient, characterized by a lack of structure to accommodate a companion for the elderly, of referrals to reference services, of specific clinical protocols, of reporting sheets, of support for the elderly, of job training, and of a defined flow for this population. The findings showed that the selected health services do not have an appropriate and integral profile for caring for the elderly population, demonstrating the need to adapt these services to fulfill the guidelines of the policies reviewed. PMID:20922278

  12. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors

    SciTech Connect

    Pate-Cornell, M.E.

    1993-04-01

    The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable "act of God" but of an accumulation of errors and questionable decisions, most of them rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using a risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of those decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in managing the tradeoff between productivity and safety, mistakes in the management of personnel on board, and errors of judgment in the process by which financial pressures are applied to the production sector (i.e., the oil companies' definition of profit centers), resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., adding redundancies to a safety system) and also include improvements to management practices. 18 refs., 4 figs.

  14. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  15. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  16. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  17. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges for evaluating their performance under the Hypothetical Accident Conditions (HAC) specified in Title 10, Part 71 of the Code of Federal Regulations (10CFR71). Testing is often limited by cost, the difficulty of preparing test units, and the limited availability of facilities that can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture, as specified in 10CFR71. The structural integrity of the containment vessel is justified based on comparison of the analytical results with the stress criteria specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

  18. A Common Methodology for Safety and Reliability Analysis for Space Reactor Missions

    SciTech Connect

    Frank, Michael V.

    2006-01-20

    The thesis of this paper is that the methodology of probabilistic risk management (PRM) has the capability to integrate both safety and reliability analyses for space nuclear missions. Practiced within a decision analysis framework, the concept of risk and the overall methodology of PRM are not dependent on whether the outcome affects mission success or mission safety. This paper presents the methodology by means of simplified examples.

  19. An analysis of low-flow ground water sampling methodology

    SciTech Connect

    Sevee, J.E.; White, C.A.; Maher, D.J.

    2000-03-31

    Low-flow ground water sampling methodology can minimize well disturbance and aggravated colloid transport into samples obtained from monitoring wells. However, in low hydraulic conductivity formations, low-flow sampling can cause excessive drawdown that results in screen desaturation and high ground water velocities in the vicinity of the well, carrying unwanted colloids and soil into ground water samples taken from the well; ground water velocities may increase severalfold above those of the natural setting. To examine the drawdown behavior of a monitoring well, mathematical relationships can be developed that allow prediction of the steady-state drawdown for constant low-flow pumping rates based on well geometry and aquifer properties. The equations also estimate the time necessary to reach drawdown equilibrium, and they can be used to estimate the relative contribution of water entering a sampling device from either the well standpipe or the aquifer. Such equations can be useful in planning a low-flow sampling program and may suggest when to collect a water sample. In low hydraulic conductivity formations, the equations suggest that drawdown may not stabilize for typical well depths, violating the minimal-drawdown requirement of the low-flow technique. In such cases, it may be more appropriate to collect a slug or passive sample from the well screen, under the assumption that the water in the well screen is in equilibrium with the surrounding aquifer.
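    As an illustrative stand-in for the drawdown relationships the paper develops (the exact equations are not reproduced in the abstract), the standard Cooper-Jacob approximation shows how drawdown grows in a low-transmissivity formation even at low pumping rates; all parameter values are hypothetical:

    ```python
    import math

    def cooper_jacob_drawdown(Q, T, S, r_w, t):
        """Approximate drawdown (m) at the well after pumping time t (s).

        Cooper-Jacob approximation, used here as an illustrative stand-in for
        the paper's relationships. Q [m3/s], T [m2/s], S [-], r_w [m].
        """
        return (Q / (4.0 * math.pi * T)) * math.log(2.25 * T * t / (r_w**2 * S))

    # Hypothetical low-conductivity monitoring well, pumped at ~100 mL/min.
    Q, T, S, r_w = 1.7e-6, 1.0e-6, 1.0e-4, 0.025
    for hours in (1, 10, 100):
        s = cooper_jacob_drawdown(Q, T, S, r_w, hours * 3600.0)
        print(f"after {hours:4d} h: drawdown ~ {s:5.2f} m")
    ```

    Because the drawdown grows logarithmically with time in this approximation, it never fully levels off, which is consistent with the paper's caution about the minimal-drawdown requirement in low hydraulic conductivity formations.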

  1. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  2. [Cranial injuries among children in the county of Ringkøbing. 2. Analysis of accidents].

    PubMed

    Hansen, T B; Pless, S; Gravers, M

    1991-10-14

    Questionnaires were sent concerning all patients aged 0-15 years who were admitted to the Ringkøbing County Hospitals with head injuries during the period 1.1.1982 to 31.8.1989. The object of the questionnaires was to elucidate the circumstances of the accident that resulted in hospitalization. Questionnaires concerning 988 patients (67.5%) were returned. A definite accumulation of accidents was observed during the period 12.00-18.00, accounting for 55.2% of all accidents. The majority of the accidents (72.3%) occurred out of doors, most of these in transport areas. Bicycle accidents were the most common, occurring most frequently in children over the age of four years; for younger children, accidents in the home were the most common. Accidents in playgrounds and during sports constituted only 13% of the total material. A relatively large proportion of the children (42.9%) were under supervision, usually by their own family, when the accident occurred. On the basis of this investigation, the authors recommend that prophylactic measures in the County of Ringkøbing be directed towards bicycle accidents among older children and home accidents among smaller children. PMID:1949320

  3. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities, is actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model comprises causal factors from the domains of human factors, aircraft system component failures, and the atmospheric environment. The multiple interdependent causal factors are expressed in an object-oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is the analysis of human causal factors in the model, including contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop the human-related causal factors. Preliminary results from the baseline LOCAF model are also presented.
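    A Bayesian belief network of this kind combines causal-factor probabilities through conditional probability tables. The noisy-OR sketch below is a drastically simplified, hypothetical stand-in for the LOCAF model; the structure, probabilities, and mitigation example are invented for illustration:

    ```python
    # Toy noisy-OR combination of independent LOC causal factors.
    priors = {"crew_error": 0.02, "system_failure": 0.005, "icing": 0.01}
    link = {"crew_error": 0.30, "system_failure": 0.20, "icing": 0.15}  # P(LOC | cause alone)

    def p_loc(priors, link):
        # Noisy-OR: LOC is avoided only if every cause fails to trigger it.
        p_no_loc = 1.0
        for cause, p_cause in priors.items():
            p_no_loc *= 1.0 - p_cause * link[cause]
        return 1.0 - p_no_loc

    baseline = p_loc(priors, link)
    # Evaluate a hypothetical safety product that halves the crew-error rate.
    mitigated = p_loc({**priors, "crew_error": 0.01}, link)
    print(f"P(LOC) baseline {baseline:.2e} -> mitigated {mitigated:.2e}")
    ```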

  4. Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I

    NASA Astrophysics Data System (ADS)

    Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

    2014-03-01

    Part I of this publication deals with the analysis of fission product releases following the Fukushima Dai-ichi accident. Reactor core damage is assessed relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km from the nuclear power plant. On the basis of a comparison between the reactor core inventories at the time of reactor shutdown and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The assessed 133Xe release at the time of reactor shutdown (8 × 10^18 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of the reactor cores.
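    The ~80% figure is a simple ratio of the assessed release to the core inventory; a back-of-envelope check (the inventory value below is a rounded, hypothetical figure chosen only to be consistent with the quoted fraction):

    ```python
    # Back-of-envelope check of the ~80% release fraction quoted above.
    release_xe133 = 8.0e18            # Bq, assessed release at shutdown
    core_inventory_xe133 = 1.0e19     # Bq, hypothetical rounded inventory
    print(f"released fraction ~ {release_xe133 / core_inventory_xe133:.0%}")
    ```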

  5. Accident safety analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-01

    The purpose of the accident safety analysis is to identify and analyze a range of credible events, their causes and consequences, and to provide technical justification for the conclusions that (1) uranium billets, fuel assemblies, uranium scrap, and drums of chips and fines can be safely stored in the 300 Area N Reactor Fuel Fabrication and Storage Facility; (2) the contaminated equipment, high-efficiency particulate air (HEPA) filters, ductwork, stacks, sewers, and sumps can be cleaned (decontaminated) and/or removed; (3) the new concretion process in the 304 Building will be able to operate; and (4) limited fuel handling and packaging associated with removal of stored uranium is acceptable, all without undue risk to the public, employees, or the environment.

  6. PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.

    PubMed

    Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

    2014-10-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501
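    Latent class analysis typically selects the number of classes by comparing fit criteria such as the BIC across candidate models. The sketch below uses a Gaussian mixture on synthetic continuous severity scores as a stand-in for a true LCA on the study's mixed indicators; all data and cluster locations are invented:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Hypothetical (PTSD, depression) severity scores for 249 victims.
    X = np.vstack([
        rng.normal([5, 3], 2.0, size=(150, 2)),    # "resilient"-like cluster
        rng.normal([25, 10], 4.0, size=(60, 2)),   # "mild/moderate"-like cluster
        rng.normal([45, 18], 5.0, size=(39, 2)),   # "severe"-like cluster
    ])

    # Choose the number of classes by BIC, as in LCA model selection.
    for k in range(1, 6):
        gm = GaussianMixture(n_components=k, random_state=0).fit(X)
        print(f"{k} classes: BIC = {gm.bic(X):,.0f}")
    ```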

  7. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed the General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population as a result of postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences that might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release, and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  8. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions for consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions for measurable atmospheric dispersion and deposition parameters were successfully elicited from the experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
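    The dispersion model named above is the standard Gaussian plume. A minimal sketch of propagating uncertain dispersion parameters through it (the plume formula is the textbook form; the source term, wind speed, and lognormal sigma distributions are hypothetical, not elicited values):

    ```python
    import numpy as np

    def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
        """Gaussian plume concentration (Bq/m3) with ground reflection.

        Q source strength [Bq/s], u wind speed [m/s], H release height [m].
        sigma_y, sigma_z [m] are the dispersion parameters, taken here at the
        downwind distance of interest; their uncertainty is what the expert
        elicitation addressed.
        """
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                    + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
        return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Propagate hypothetical uncertainty in sigma_y, sigma_z at ~1 km downwind.
    rng = np.random.default_rng(7)
    sy = rng.lognormal(np.log(80.0), 0.3, 1000)   # m
    sz = rng.lognormal(np.log(40.0), 0.4, 1000)   # m
    chi = gaussian_plume(Q=1.0e10, u=5.0, y=0.0, z=0.0, H=50.0,
                         sigma_y=sy, sigma_z=sz)
    print(f"centerline ground concentration: median {np.median(chi):.2e} Bq/m3, "
          f"95th pct {np.percentile(chi, 95):.2e} Bq/m3")
    ```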

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; the codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model, along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions for consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions for measurable atmospheric dispersion and deposition parameters were successfully elicited from the experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  10. Interpretation methodology and analysis of in-flight lightning data

    NASA Technical Reports Server (NTRS)

    Rudolph, T.; Perala, R. A.

    1982-01-01

    A methodology is presented whereby electromagnetic measurements of in-flight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F-106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial, and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning-induced transient upset of new-technology low-level microcircuitry, which is being used in increasing quantities in modern and future avionics. In-flight lightning data are analyzed, and the lightning environments incident upon the F-106B are determined.

  11. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  12. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  13. The XMM Cluster Survey: X-ray analysis methodology

    NASA Astrophysics Data System (ADS)

    Lloyd-Davies, E. J.; Romer, A. Kathy; Mehrtens, Nicola; Hosmer, Mark; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G.; Hilton, Matt; Liddle, Andrew R.; Viana, Pedro T. P.; Campbell, Heather C.; Collins, Chris A.; Dubois, E. Naomi; Freeman, Peter; Harrison, Craig D.; Hoyle, Ben; Kay, Scott T.; Kuwertz, Emma; Miller, Christopher J.; Nichol, Robert C.; Sahlén, Martin; Stanford, S. A.; Stott, John P.

    2011-11-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3675 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg2. Of these, 993 candidates are detected with >300 background-subtracted X-ray photon counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these candidates, as well as to estimate redshifts from the X-ray data alone. A total of 587 (122) X-ray temperatures to a typical accuracy of <40 (<10) per cent have been measured to date. We also present the methodology adopted for determining the selection function of the survey, and show that the extended source detection algorithm is robust to a range of cluster morphologies by inserting mock clusters derived from hydrodynamical simulations into real XMM images. These tests show that the simple isothermal β-profile is sufficient to capture the essential details of the cluster population detected in the archival XMM observations. The redshift follow-up of the XCS cluster sample is presented in a companion paper, together with a first data release of 503 optically confirmed clusters.
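
    For reference, the isothermal β-model mentioned above describes the projected cluster surface brightness as S(r) = S0[1 + (r/rc)^2]^(0.5 - 3β). The short sketch below evaluates it for illustrative parameter values; the normalization, core radius, and β are assumptions, not XCS results.

        import numpy as np

        def beta_profile(r, s0, r_core, beta):
            """Isothermal beta-model surface brightness: S(r) = S0 * [1 + (r/rc)^2]**(0.5 - 3*beta)."""
            return s0 * (1.0 + (r / r_core) ** 2) ** (0.5 - 3.0 * beta)

        r = np.linspace(0.0, 1000.0, 6)                      # projected radius (kpc, illustrative)
        print(beta_profile(r, s0=1.0, r_core=150.0, beta=2.0 / 3.0))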

  14. Methodologic research needs in environmental epidemiology: data analysis.

    PubMed Central

    Prentice, R L; Thomas, D

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analysis of aggregate data (ecologic) studies. PMID:8206041

  15. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

    A novel probabilistic analysis method for assessing structural reliability is presented; it combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
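
    A minimal sketch of the fast-convolution idea, assuming a limit state already linearized as g = a0 + Z1 + Z2 with independent inputs: each input PDF is tabulated on a grid, the PDF of the sum is obtained by numerical convolution, and the failure probability P(g < 0) follows by integration. The distributions and coefficients are illustrative, not the authors' examples.

        import numpy as np

        dx = 0.01
        x = np.arange(-15.0, 15.0, dx)

        pdf1 = np.exp(-0.5 * (x / 1.5) ** 2) / (1.5 * np.sqrt(2.0 * np.pi))  # Z1 ~ Normal(0, 1.5^2)
        pdf2 = np.exp(x / 0.5 - np.exp(x / 0.5)) / 0.5                       # Z2 ~ mirrored Gumbel(0, 0.5)

        pdf_sum = np.convolve(pdf1, pdf2) * dx                 # PDF of Z1 + Z2 on a doubled grid
        g = 2.0 * x[0] + dx * np.arange(len(pdf_sum)) + 3.0    # shift by the constant term a0 = 3.0
        pf = pdf_sum[g < 0.0].sum() * dx                       # failure probability P(g < 0)
        print(f"P(g < 0) ~ {pf:.3e}")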

  16. A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains

    SciTech Connect

    Burgherr, P.; Hirschberg, S.

    2008-07-01

    This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
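
    As a concrete illustration of the frequency-consequence (F-N) curves used above, the sketch below computes annual exceedance frequencies from a hypothetical list of severe (≥5 fatalities) accident records; the fatality counts and observation period are invented for the example, not ENSAD data.

        import numpy as np

        fatalities = np.array([5, 7, 9, 12, 15, 23, 31, 45, 80, 134, 260, 500, 2500])  # per event
        years = 32.0                                   # observation period, e.g. 1969-2000

        # F-N curve: for each consequence level N, the annual frequency of events with >= N fatalities.
        levels = np.sort(np.unique(fatalities))
        freq = np.array([(fatalities >= n).sum() for n in levels]) / years

        for n, f in zip(levels, freq):
            print(f"N >= {n:5d} fatalities: F = {f:.3f} events/year")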

  17. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    SciTech Connect

    J. Scaglione

    1999-09-09

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).

  18. Investigating accident causation through information network modelling.

    PubMed

    Griffin, T G C; Young, M S; Stanton, N A

    2010-02-01

    Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction. PMID:20099174

  19. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  20. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  1. Accident management information needs

    SciTech Connect

    Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R.

    1990-04-01

    In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on measurement availability, including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for a more comprehensive information needs assessment performed during the analysis and development of specific strategies for use in accident management for prevention and mitigation. 3 refs., 16 figs., 7 tabs.

  2. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  3. Methodology for statistical analysis of SENCAR mouse skin assay data.

    PubMed Central

    Stober, J A

    1986-01-01

    Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent in the data collected in the SENCAR bioassay system are described. These measures take into account animal survival, tumor multiplicity, and tumor regression. Statistical methods for the analysis of these measures to test for a positive dose response and a dose-response relationship are discussed. Sample data from representative initiation/promotion studies are used to compare the response measures and methods of analysis. PMID:3780632

  4. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since these may cause failure mechanisms such as debonding or delamination.
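
    The following sketch shows the flavor of one step in such an analysis: checking a ply strain state against maximum-strain allowables and degrading the associated stiffnesses on failure. The strain values, allowables, and knock-down factor are illustrative assumptions, not values from the COMET implementation.

        # Illustrative ply strain state and maximum-strain allowables (not values from the paper).
        strain = {"e11": 0.009, "e22": 0.002, "g12": 0.015}     # current in-plane strains
        allow = {"e11": 0.0105, "e22": 0.0058, "g12": 0.014}    # strain allowables

        def max_strain_failures(strain, allow):
            """Return the strain components that violate the maximum strain criterion."""
            return [k for k in strain if abs(strain[k]) > allow[k]]

        props = {"E1": 140e9, "E2": 10e9, "G12": 5e9}           # ply moduli, Pa
        degrade_map = {"e11": "E1", "e22": "E2", "g12": "G12"}  # failure mode -> stiffness to degrade

        for mode in max_strain_failures(strain, allow):
            props[degrade_map[mode]] *= 1e-6                    # ply-discount: near-zero residual stiffness

        print("degraded properties:", props)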

  5. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M. (Fermilab)

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  6. Breath analysis in disease diagnosis: methodological considerations and applications.

    PubMed

    Lourenço, Célia; Turner, Claire

    2014-01-01

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. Analysis of the concentrations of volatile organic compounds (VOCs) in breath with an acceptable accuracy are assessed by means of using analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. "Breath fingerprinting", indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

  8. Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

  9. Consequence analysis of a hypothetical contained criticality accident in the Hanford Critical Mass Laboratory

    SciTech Connect

    Gore, B.F.; Strenge, D.L.; Mishima, J.

    1984-12-01

    The original hazards summary report (i.e., SAR) for the CML addressed the consequences of a hypothetical accidental critical excursion occurring with the experimental assembly room open. That report indicated that the public would receive insignificant radiation exposure regardless of the type of atmospheric condition, while plant personnel could possibly receive exposures greater than the annual exposure limits for radiation workers when a strong inversion existed. This analysis investigates the consequences of a hypothetical criticality accident occurring with the experimental assembly room sealed. Due to the containment capabilities designed and built into the critical assembly room, the consequences are greatly reduced below those presented in HW-66266. Despite the incorporation of many extremely conservative assumptions to simplify the analysis, the radiation doses predicted for personnel 100 meters or more distant from the CML are found to be smaller than the annual radiation dose limit for members of the public in uncontrolled areas during routine, nonaccident operations. Therefore, the results of this analysis demonstrate that the occurrence of a hypothetical critical excursion within the sealed experimental assembly room at the Hanford Critical Mass Laboratory presents only a small, acceptable risk to personnel and facilities in the area, and no additional safety systems or controls are needed for the continued safe operation of the CML. 11 references, 4 tables. (ACR)

  10. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra, A.; Baek, J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  11. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    PubMed

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool, the full cost accounting (FCA) method, to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It also supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. PMID:26613351
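
    A minimal sketch of the standard-cost-times-actual-quantity idea described above: for each waste stream, a flexible budget is computed from standard unit costs and actual tonnages, and the variance against actual cost flags off-standard performance. All figures are invented for illustration, not taken from the study.

        standard_cost = {"paper": 95.0, "glass": 70.0, "organic": 120.0, "residual": 60.0}  # EUR/tonne
        actual_tonnes = {"paper": 410.0, "glass": 150.0, "organic": 520.0, "residual": 900.0}
        actual_cost = {"paper": 41000.0, "glass": 12000.0, "organic": 66000.0, "residual": 51000.0}

        for stream in standard_cost:
            budget = standard_cost[stream] * actual_tonnes[stream]   # flexible budget at standard cost
            variance = actual_cost[stream] - budget
            flag = "off-standard" if abs(variance) > 0.05 * budget else "within standard"
            print(f"{stream:9s} budget {budget:8.0f}  actual {actual_cost[stream]:8.0f}  "
                  f"variance {variance:+8.0f}  ({flag})")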

  12. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
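
    The sketch below illustrates one common aggregation scheme, equal-weight linear pooling: each expert's elicited percentiles are fitted to a parametric distribution, and the aggregate is an equal-weight mixture of the expert distributions. The lognormal fit and the quantile values are assumptions for illustration; the report's own calibration and aggregation procedure differs in detail.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical elicited 5th/50th/95th percentiles from three experts for one parameter.
        expert_quantiles = [(0.8, 2.0, 6.0), (0.5, 1.5, 4.0), (1.0, 3.0, 9.0)]

        def sample_expert(q05, q50, q95, size):
            """Fit a lognormal to an expert's 5th/50th/95th percentiles and draw samples."""
            mu = np.log(q50)
            sigma = (np.log(q95) - np.log(q05)) / (2.0 * 1.645)  # 90% interval spans +/-1.645 sigma
            return rng.lognormal(mu, sigma, size)

        # Equal-weight linear pool: the aggregate distribution is a mixture of the expert distributions.
        pooled = np.concatenate([sample_expert(*q, 30_000) for q in expert_quantiles])
        print("aggregated 5th/50th/95th percentiles:", np.percentile(pooled, [5, 50, 95]))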

  13. The Wheels of Misfortune: A Time Series Analysis of Bicycle Accidents on a College Campus.

    ERIC Educational Resources Information Center

    Johnson, Mark S.; And Others

    1978-01-01

    The effectiveness of engineering and policy interventions in reducing bicycle accidents on the campus of the University of California at Santa Barbara was investigated. None of the bikeway modifications were found to have been effective in reducing bicycle accidents. (Author/GDC)

  14. Emergency drinking water treatment during source water pollution accidents in China: origin analysis, framework and technologies.

    PubMed

    Zhang, Xiao-Jian; Chen, Chao; Lin, Peng-Fei; Hou, Ai-Xin; Niu, Zhang-Bin; Wang, Jun

    2011-01-01

    China has suffered frequent source water contamination accidents in the past decade, which has resulted in severe consequences to the water supply of millions of residents. The origins of typical cases of contamination are discussed in this paper as well as the emergency response to these accidents. In general, excessive pursuit of rapid industrialization and the unreasonable location of factories are responsible for the increasing frequency of accidental pollution events. Moreover, insufficient attention to environmental protection and rudimentary emergency response capability has exacerbated the consequences of such accidents. These environmental accidents triggered or accelerated the promulgation of stricter environmental protection policy and the shift from economic development mode to a more sustainable direction, which should be regarded as the turning point of environmental protection in China. To guarantee water security, China is trying to establish a rapid and effective emergency response framework, build up the capability of early accident detection, and develop efficient technologies to remove contaminants from water. PMID:21133359

  15. Social Judgment Analysis: Methodology for Improving Interpersonal Communication and Understanding.

    ERIC Educational Resources Information Center

    Rohrbaugh, John; Harmon, Joel

    Research has found the Social Judgment Analysis (SJA) approach, with its focus on judgment policy and cognitive feedback, to be a significant factor in developing group member agreement and improving member performance. A controlled experiment was designed to assess the relative quality of the judgment making process provided by SJA.…

  16. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between rankings of cited items using the two methods is not statistically significant and that use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  17. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem (wheat vs. nonwheat), since this simplifies the estimation problem considerably. The error analysis and the sample size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problems of crop acreage estimation and error analysis are discussed.

  18. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  19. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  20. Probabilistic analysis of accidents involving pyrophoric particle accumulation in a closed system containing high explosives

    SciTech Connect

    Bott, T.F.; Eisenhawer, S.W.

    1997-07-01

    This paper describes a probabilistic analysis of an unusual safety concern encountered during safety analysis of a closed system containing high explosive (HE). In this application, a pyrophoric material could be generated inside the system by corrosion of system components. This pyrophoric material may form extremely small particles that could be transported through a complex path to a high explosive component, where they could collect on exposed surfaces as a result of routine handling operations. A potentially serious accident could be initiated if the system is opened for maintenance and oxygen reaches the pyrophoric particles that are resting on the surface of the high explosive. The pyrophoric particles react vigorously in the presence of oxygen and will ignite the high explosive if there is a sufficient concentration of the particles in a localized area. This paper reports on models for estimating the probability of achieving hazardous concentrations of pyrophoric particles on the explosive surface. These probability estimates are based on occupancy likelihood and geometric models. The probability models provide a means for exploring the effects of material behavior, heat transfer, and material transport on the likelihood of achieving unwanted concentrations of pyrophoric particles on the explosive surface.
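
    A toy version of such an occupancy model is sketched below: particle arrivals on the explosive surface are drawn from a Poisson distribution, deposited uniformly over discretized surface cells, and a trial counts as hazardous when any one cell collects a threshold number of particles. All parameter values are invented for illustration; the paper's models are physically derived.

        import numpy as np

        rng = np.random.default_rng(7)
        n_trials = 50_000
        mean_particles = 40.0   # expected particles transported per handling interval (assumed)
        n_cells = 100           # discretized regions of the explosive surface (assumed)
        threshold = 5           # particles in one cell treated as a hazardous local concentration

        hazardous = 0
        for _ in range(n_trials):
            n = rng.poisson(mean_particles)
            if n and np.bincount(rng.integers(0, n_cells, size=n)).max() >= threshold:
                hazardous += 1

        print(f"P(hazardous local concentration) ~ {hazardous / n_trials:.4f}")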

  1. Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2004-01-01

    This is a graphic presentation of the aerothermodynamic analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by the OVEWG. Graphics presented are: STS-107 entry trajectory and timeline (first off-nominal event to post-LOS); indications from OI telemetry data; aero/aerothermal/thermal analysis process; selected STS-107 side fuselage/OMS pod off-nominal temperatures; leading edge structural subsystem; relevant forensics evidence; external aerothermal environments; STS-107 pre-entry EOM3 heating profile; surface heating and temperatures; Orbiter wing leading edge damage survey; internal aerothermal environments; Orbiter wing CAD model; aerodynamic flight reconstruction; chronology of aerodynamic/aerothermodynamic contributions; acreage TPS tile damage; larger OML perturbations; missing RCC panel(s); localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams, and CAIB requests for study were addressed.

  2. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

  3. The first steps towards a standardized methodology for CSP electricity yield analysis.

    SciTech Connect

    Wagner, Michael; Hirsch, Tobias (Institute of Technical Thermodynamics, Stuttgart, Germany); Benitez, Daniel; Eck, Markus (Institute of Technical Thermodynamics, Stuttgart, Germany); Ho, Clifford Kuofei

    2010-08-01

    The authors have founded a temporary international core team to prepare a SolarPACES activity aimed at the standardization of a methodology for electricity yield analysis of CSP plants. This core team has drafted a structural framework for a standardized methodology and the standardization process itself. The structural framework has to assure that the standardized methodology is applicable to all conceivable CSP systems, can be used on all levels of the project development process and covers all aspects affecting the electricity yield of CSP plants. Since the development of the standardized methodology is a complex task, the standardization process has been structured in work packages, and numerous international experts covering all aspects of CSP yield analysis have been asked to contribute to this process. These experts have teamed up in an international working group with the objective to develop, document and publish standardized methodologies for CSP yield analysis. This paper summarizes the intended standardization process and presents the structural framework of the methodology for CSP yield analysis.

  4. Operator error and system deficiencies: analysis of 508 mining incidents and accidents from Queensland, Australia using HFACS.

    PubMed

    Patterson, Jessica M; Shappell, Scott A

    2010-07-01

    Historically, mining has been viewed as an inherently high-risk industry. Nevertheless, the introduction of new technology and a heightened concern for safety has yielded marked reductions in accident and injury rates over the last several decades. In an effort to further reduce these rates, the human factors associated with incidents/accidents need to be addressed. A modified version of the Human Factors Analysis and Classification System was used to analyze incident and accident cases from across the state of Queensland to identify human factor trends and system deficiencies within mining. An analysis of the data revealed that skill-based errors were the most common unsafe act and showed no significant differences across mine types. However, decision errors did vary across mine types. Findings for unsafe acts were consistent across the time period examined. By illuminating human causal factors in a systematic fashion, this study has provided mine safety professionals the information necessary to further reduce mine incidents/accidents. PMID:20441855

  5. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management.

    PubMed

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents (the frequency and quantities of chemicals involved, the frequency and number of people poisoned, the frequency and number of people affected, the frequency and duration of pollution, and the frequency and length of the pollution zone) were used to estimate the cumulative probabilities. The probabilities of occurrences of various types based on origin and causes were also summarized from these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8% of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced. PMID:26739714

  6. Finite element methodology for transient conduction/forced-convection thermal analysis

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Wieting, A. R.

    1979-01-01

    Finite element methodology for steady-state thermal analysis of convectively cooled structures has been extended for transient analysis. The finite elements are based on representing the fluid passages by fluid bulk-temperature nodes and fluid-solid interface nodes. The formulation of the finite element equations for a typical flow passage is based on the weighted residual method with upwind weighting functions. Computer implementation of the convective finite element methodology using explicit and implicit time integration algorithms is described. Accuracy and efficiency of the methodology are evaluated by comparisons with analytical solutions and finite-difference lumped-parameter analyses. The comparative analyses demonstrate that finite element conduction/convection methodology may be used to predict transient temperatures with an accuracy equal or superior to that of the lumped-parameter finite-difference method.
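
    The sketch below captures the transient fluid-passage idea in a deliberately simplified finite-difference form: a bulk fluid temperature is advected with upwind differencing and exchanges heat with a fixed-temperature wall under explicit time integration. It is an analogue of, not a reproduction of, the paper's upwind-weighted finite element formulation, and all values are illustrative.

        import numpy as np

        nx, dx = 50, 0.02        # nodes and spacing along the passage (m)
        u, dt = 1.0, 0.01        # flow speed (m/s) and time step (s); CFL = u*dt/dx = 0.5
        h_exch = 2.0             # h*A/(m*cp): convective exchange coefficient (1/s), assumed
        T_wall, T_in = 400.0, 300.0   # wall and inlet temperatures (K)

        T = np.full(nx, T_in)
        for _ in range(1000):
            T_up = np.concatenate(([T_in], T[:-1]))          # upwind (upstream) nodal values
            T = T + dt * (-u * (T - T_up) / dx + h_exch * (T_wall - T))

        print("outlet temperature after 10 s:", round(T[-1], 2), "K")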

  7. The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

  8. Methodology for Establishment of Integrated Flood Analysis System

    NASA Astrophysics Data System (ADS)

    Kim, B.; Sanders, B. F.; Kim, K.; Han, K.; Famiglietti, J. S.

    2012-12-01

    Flood risk management efforts face considerable uncertainty in flood hazard delineation as a consequence of changing climatic conditions, including shifts in precipitation, soil moisture, and land use. These changes can confound efforts to characterize flood impacts over decadal time scales and thus raise questions about the true benefits and drawbacks of alternative flood management projects, both structural and non-structural. Here we report an integrated flood analysis system that is designed to bring climate change information into a flood risk context and to characterize flood hazards in both rural and urban areas. A distributed rainfall-runoff model, the one-dimensional (1D) NWS-FLDWAV model, the 1D Storm Water Management Model (SWMM), and the two-dimensional (2D) BreZo model are coupled. The distributed model, which uses multi-directional flow allocation and real-time updating, performs rainfall-runoff analysis in ungauged watersheds, and its outputs are taken as boundary conditions for the FLDWAV model, which is employed for 1D river hydraulic routing and for predicting the overflow discharge at overtopped levees. In addition, SWMM is chosen to analyze storm sewer flow in urban areas, and BreZo is used to estimate the inundation zones, depths, and velocities on the land surface due to surcharge flow in the sewer system or overflow at levees. Overflow computed by FLDWAV or surcharge flow computed by SWMM becomes a point source in BreZo. Applications in Korea and California are presented.

  9. Three-dimensional visualization and analysis methodologies: a current perspective.

    PubMed

    Udupa, J K

    1999-01-01

    Three-dimensional (3D) imaging was developed to provide both qualitative and quantitative information about an object or object system from images obtained with multiple modalities including digital radiography, computed tomography, magnetic resonance imaging, positron emission tomography, single photon emission computed tomography, and ultrasonography. Three-dimensional imaging operations may be classified under four basic headings: preprocessing, visualization, manipulation, and analysis. Preprocessing operations (volume of interest, filtering, interpolation, registration, segmentation) are aimed at extracting or improving the extraction of object information in given images. Visualization operations facilitate seeing and comprehending objects in their full dimensionality and may be either scene-based or object-based. Manipulation may be either rigid or deformable and allows alteration of object structures and of relationships between objects. Analysis operations, like visualization operations, may be either scene-based or object-based and deal with methods of quantifying object information. There are many challenges involving matters of precision, accuracy, and efficiency in 3D imaging. Nevertheless, 3D imaging is an exciting technology that promises to offer an expanding number and variety of applications. PMID:10336203

  10. The effect of gamma-ray transport on afterheat calculations for accident analysis

    SciTech Connect

    Reyes, S.; Latkowski, J.F.; Sanz, J.

    2000-05-01

    Radioactive afterheat is an important source term for the release of radionuclides in fusion systems under accident conditions. Heat transfer calculations are used to determine time-temperature histories in regions of interest, but the true source term needs to be the effective afterheat, which accounts for the transport of penetrating gamma rays. Without consideration of photon transport, accident temperatures may be overestimated in some regions and underestimated in others. The importance of this effect is demonstrated for a simple, one-dimensional problem. The significance of the effect depends strongly on the accident scenario being analyzed.

  11. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.

  12. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    SciTech Connect

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
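
    A minimal sketch of the adaptation: for a lumped thermal network C dT/dt = -K T, the eigenvalues and eigenvectors of C^-1 K play the role of modal frequencies and mode shapes, giving the dominant time constants of energy exchange among the thermal masses. The capacitance and conductance values below are invented for illustration.

        import numpy as np

        C = np.diag([5.0e6, 2.0e7])          # thermal capacitances, J/K (illustrative)
        K = np.array([[300.0, -100.0],
                      [-100.0, 150.0]])      # conductances to ambient and between masses, W/K

        evals, evecs = np.linalg.eig(np.linalg.solve(C, K))
        order = np.argsort(evals)
        for lam, vec in zip(evals[order], evecs[:, order].T):
            print(f"time constant {1.0 / lam / 3600.0:7.2f} h, mode shape {vec}")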

  13. Review of current analysis methodology for reinforced concrete structural evaluations

    SciTech Connect

    Sharma, S.; Reich, M.; Chang, T.Y.

    1983-04-01

    This report presents an in-depth review of current modeling techniques and nonlinear analysis procedures for reinforced concrete structural evaluations. A detailed discussion is given of numerous mathematical models that have been proposed in the literature for modeling the nonlinear material behavior of concrete and steel and for representing the complex concrete-steel interaction effects. Finite element discretizations of reinforced concrete structures based on discrete, embedded and smeared approaches are described. Special modeling techniques for representing such effects as cracking, crushing, tension stiffening, shear transfer, concrete-steel bond degradation and loss of prestress are discussed in detail. Further, various static and dynamic nonlinear solution procedures are reviewed in terms of solution accuracy and computational efficiency. Numerical examples are presented to demonstrate the extent of variations in analytical predictions that can result when different modeling approaches are used.

  14. Methodology for analysis and simulation of large multidisciplinary problems

    NASA Technical Reports Server (NTRS)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  16. A faster reactor transient analysis methodology for PCs

    SciTech Connect

    Ott, K.O. (School of Nuclear Engineering)

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis for a GW-BASIC program, LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.
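
    A minimal sketch of the prompt jump approximation for one delayed-neutron group: setting dP/dt ≈ 0 gives P = λc/(β - ρ) and dc/dt = λρc/(β - ρ), where c is the precursor concentration scaled by the neutron generation time. The feedback coefficient, heat-balance constant, and reactivity step below are invented for illustration, and the model is far simpler than the LTC lumped formulation.

        beta, lam = 0.0035, 0.08      # delayed-neutron fraction and precursor decay constant (1/s)
        alpha = -5.0e-6               # temperature feedback coefficient (dk/k per K), assumed
        rho_ext = 0.001               # external reactivity step, well below prompt critical
        k_heat = 5.0                  # temperature rise rate per unit of excess power (K/s), assumed

        P, T0 = 1.0, 700.0            # initial normalized power and temperature (K)
        T = T0
        c = beta * P / lam            # precursor level consistent with the initial steady state
        dt = 0.01
        for _ in range(int(2000 / dt)):
            rho = rho_ext + alpha * (T - T0)     # strong negative feedback keeps rho << beta
            c += dt * lam * rho * c / (beta - rho)
            P = lam * c / (beta - rho)           # prompt jump relation
            T += dt * k_heat * (P - 1.0)         # crude lumped heat balance around unit power

        print(f"final power ~ {P:.3f}, asymptotic temperature rise ~ {T - T0:.1f} K")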

  17. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  18. Analysis methodology and recent results of the IGS network combination

    NASA Astrophysics Data System (ADS)

    Ferland, R.; Kouba, J.; Hutchison, D.

    2000-11-01

    A working group of the International GPS Service (IGS) was created to look after Reference Frame (RF) issues and contribute to the densification and improvement of the International Terrestrial Reference Frame (ITRF). One important objective of the Reference Frame Working Group is to generate consistent IGS station coordinates and velocities, Earth Rotation Parameters (ERPs), and geocenter estimates, along with the appropriate covariance information. These parameters have a direct impact on other IGS products, such as the estimation of GPS satellite ephemerides, as well as satellite and station clocks. The required information is available weekly from the Analysis Centers (ACs) (cod, emr, esa, gfz, jpl, ngs, sio) and from the Global Network Associate Analysis Centers (GNAACs) (jpl, mit, ncl) in a "Software Independent Exchange Format" (SINEX). The ACs also contribute daily ERPs as part of their weekly submissions. The procedure in place simultaneously combines the weekly station coordinate, geocenter, and daily ERP estimates. A cumulative solution containing station coordinates and velocities is also updated with each weekly combination. This provides a convenient way to closely monitor the quality of the estimated station coordinates and to have an up-to-date cumulative solution available at all times. To provide some necessary redundancy, the weekly station coordinate solution is compared against the GNAAC solutions. Each of the three GNAACs uses its own software, allowing independent verification of the combination process. The RMS of the coordinate differences in the north, east, and up components between the AC/GNAAC solutions and the ITRF97 Reference Frame Stations are 4-10 mm, 5-20 mm, and 6-25 mm, respectively. The station velocities within continental plates are compared to the NNR-NUVEL1A plate motion model (DeMets et al., 1994). The north, east, and up velocity RMS are 2 mm/y, 3 mm/y, and 8 mm/y. Note that NNR-NUVEL1A assumes a zero vertical velocity.
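
    The core of the weekly combination can be sketched as an inverse-covariance (weighted least-squares) merge of the AC estimates; the Python sketch below uses invented numbers and omits the Helmert transformation parameters that a real SINEX combination also estimates.

        import numpy as np

        def combine(estimates, covariances):
            """Inverse-variance combination of several (3,) coordinate estimates."""
            weights = [np.linalg.inv(c) for c in covariances]
            info = sum(weights)                      # combined information matrix
            x = np.linalg.solve(info, sum(w @ e for w, e in zip(weights, estimates)))
            return x, np.linalg.inv(info)            # combined estimate and covariance

        # two hypothetical AC solutions for one station (metres)
        e1, c1 = np.array([1.0, 2.0, 3.0]), np.diag([4e-6, 9e-6, 2.5e-5])
        e2, c2 = np.array([1.002, 1.999, 3.004]), np.diag([1e-6, 4e-6, 1.6e-5])
        x, cov = combine([e1, e2], [c1, c2])
        print(x)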

  19. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach we have taken in performing the safety analysis for the IAPR concept.

  20. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  1. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationships between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents shows great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model, which suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions at upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents. PMID:18329391
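
    The kind of level-to-level association test such HFACS studies rely on can be sketched as below (Python; the 2x2 counts are invented for illustration, not the paper's data).

        from scipy.stats import fisher_exact

        # association between a precondition for unsafe acts (e.g., adverse
        # mental state) and the occurrence of an unsafe act across accidents
        table = [[18, 5],    # precondition present: unsafe act yes / no
                 [7, 11]]    # precondition absent:  unsafe act yes / no
        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.1f}, p = {p_value:.3f}")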

  2. Assessment of ISLOCA risk-methodology and application to a combustion engineering plant

    SciTech Connect

    Kelly, D.L.; Auflick, J.L.; Haney, L.N.

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

  3. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  4. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson, gradient descent, conjugate gradient, proximal gradient, coordinate descent, and genetic algorithm-based methods, while the latter cover the graph cuts, belief propagation, tree-reweighted message passing, linear programming, maximum margin learning, simulated annealing, and iterated conditional modes methods. We also discuss the minimal surface, primal-dual, and multi-objective optimization methods. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We therefore give an overview of those applications as well. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26186171
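
    As a minimal instance of the continuous (gradient-based) family surveyed here, the sketch below minimizes a toy quadratic denoising energy by gradient descent (Python; image and parameters invented).

        import numpy as np

        # gradient descent on E(u) = 0.5*||u - f||^2 + 0.5*mu*||grad u||^2,
        # whose gradient is (u - f) - mu * Laplacian(u)
        def denoise(f, mu=2.0, step=0.05, iters=300):
            u = f.copy()
            for _ in range(iters):
                lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                       np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
                u -= step * ((u - f) - mu * lap)   # step chosen below the stability bound
            return u

        f = np.random.default_rng(0).normal(size=(64, 64))  # noisy "image"
        print(f.std(), denoise(f).std())  # smoothed field has lower variance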

  5. Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS

    NASA Astrophysics Data System (ADS)

    Bhagyaiah, M.; Shrinagesh, B.

    2014-06-01

    Globalization has impacted many developing countries across the world, and India is one country that has benefited greatly. Increased economic activity raised the consumption levels of people across the country, creating scope for growth in travel and transportation. The increase in vehicles over the last 10 years has put a great deal of pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 there has been an increase of 202 percent in two-wheeler and 286 percent in four-wheeler vehicles, with no road expansion. Motor vehicle crashes are a common cause of death, disability, and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. There has been an increasing trend in road accidents in Hyderabad over the past few years. GIS helps in locating accident hotspots and in analyzing the trend of road accidents in Hyderabad.
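
    A common way to locate such hotspots in a GIS is kernel density estimation over the accident point locations; a minimal sketch (Python, synthetic coordinates in the Hyderabad area, not the study's data) follows.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        pts = rng.normal(loc=[78.45, 17.40], scale=0.02, size=(300, 2))  # lon, lat
        kde = gaussian_kde(pts.T)                      # density surface over the points
        gx, gy = np.mgrid[78.38:78.52:100j, 17.33:17.47:100j]
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
        i, j = np.unravel_index(density.argmax(), density.shape)
        print("densest cell near:", gx[i, j], gy[i, j])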

  6. Analysis of labour accidents in tunnel construction and introduction of prevention measures.

    PubMed

    Kikkawa, Naotaka; Itoh, Kazuya; Hori, Tomohito; Toyosawa, Yasuo; Orense, Rolando P

    2015-12-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents involving rock fall events at work sites. We also introduced accident prevention measures against rock fall events. PMID:26027707

  7. Analysis of labour accidents in tunnel construction and introduction of prevention measures

    PubMed Central

    KIKKAWA, Naotaka; ITOH, Kazuya; HORI, Tomohito; TOYOSAWA, Yasuo; ORENSE, Rolando P.

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents involving rock fall events at work sites. We also introduced accident prevention measures against rock fall events. PMID:26027707

  8. Identification of Behavior Based Safety by Using Traffic Light Analysis to Reduce Accidents

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Nasution, M. I.

    2016-01-01

    This work presents the safety assessment of a case study in an important area of field production in the oil and gas industry, namely behavior-based safety (BBS). The company set up a rigorous BBS intervention program that is implemented and deployed continually. In this case, observers were asked to hold discussions with workers during observation and to put to them a set of predetermined questions related to work behavior. Traffic Light Analysis (TLA), a risk assessment tool, was used to determine the estimated score of the BBS questionnaire. The standardization of the TLA appraisal in this study is based on the Regulation of the Minister of Labor and Occupational Safety and Health No. PER.05/MEN/1996. The results show that some points fall under 84%, placing them in the yellow category; these should be corrected immediately by the company to prevent existing bad behavior of workers. The application of BBS is expected to increase safety performance at work over time and to be effective in reducing accidents.
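
    The banding step can be sketched as below (Python). Only the 84% yellow boundary is stated in the abstract; the other cut-points follow common SMK3 audit bands (red below 60%, green at 85% and above) and should be read as assumptions.

        def tla_category(score_pct):
            """Traffic Light Analysis band for a BBS observation score (%)."""
            if score_pct >= 85.0:
                return "green"    # acceptable; maintain
            if score_pct >= 60.0:
                return "yellow"   # correct immediately
            return "red"          # serious nonconformance

        scores = {"PPE use": 92.0, "housekeeping": 78.5, "tool handling": 55.0}
        for item, pct in scores.items():
            print(item, pct, tla_category(pct))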

  9. A QUALITATIVE APPROACH TO UNCERTAINTY ANALYSIS FOR THE PWR ROD EJECTION ACCIDENT

    SciTech Connect

    DIAMOND,D.J.; ARONSON,A.; YANG,C.

    2000-06-19

    In order to understand best-estimate calculations of the peak local fuel enthalpy during a rod ejection accident, an assessment of the uncertainty has been completed. The analysis took into account point kinetics parameters that would be available from a three-dimensional core model, together with engineering judgment as to the uncertainty in those parameters. Sensitivity studies for those parameters were carried out using the best-estimate code PARCS. The results showed that the uncertainty (corresponding to one standard deviation) in local fuel enthalpy would be determined primarily by the uncertainty in ejected rod worth and delayed neutron fraction. For an uncertainty in the former of 8% and the latter of 5%, the uncertainty in fuel enthalpy varied from 51% to 69% for control rod worth varying from $1.2 to $1.0. Also considered in the uncertainty were the errors introduced by uncertainties in the Doppler reactivity coefficient, the fuel pellet specific heat, and assembly and fuel pin peaking factors.
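
    The way such input uncertainties combine can be sketched by first-order propagation in quadrature (Python; the sensitivities below are placeholders, not the PARCS results).

        import numpy as np

        # (relative 1-sigma input uncertainty, sensitivity of enthalpy to input)
        inputs = {
            "ejected rod worth":        (0.08, 6.0),
            "delayed neutron fraction": (0.05, 4.0),
            "Doppler coefficient":      (0.10, 1.0),
            "pellet specific heat":     (0.05, 0.5),
            "peaking factors":          (0.05, 1.0),
        }
        variance = sum((u * s) ** 2 for u, s in inputs.values())
        print(f"fuel enthalpy uncertainty ~ {100.0 * np.sqrt(variance):.0f}%")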

  10. Sensitivity analysis of a ship accident at a deep-ocean site in the northwest Atlantic

    SciTech Connect

    Kaplan, M.F.

    1985-04-01

    This report presents the results of a sensitivity analysis for an HLW ship accident occurring in the Nares Abyssal Plain in the northwestern Atlantic. Waste form release rate, canister lifetime, and sorption in the water column (partition coefficients) were varied. Also investigated was the relative importance of the dose from the food chain and from seaweed in the diet. Peak individual doses and integrated collective doses for populations were the units of comparison. In accordance with international guidelines on radiological protection, the comparisons of different options were carried out over "all time"; the study uses a million-year time frame. Of the parameters studied, partition coefficients have the most pronounced effect on collective dose. Variations in partition coefficients affect the shape of the collective dose curve over the entire time frame. Peak individual doses decrease markedly when the value for the sorption of americium is increased, but show no increase when less sorption is assumed. Waste form release rates and canister lifetimes affect collective doses only in periods prior to 20,000 years; hence, comparisons of these options need not be carried out beyond 20,000 years. Waste form release rates below 10^-3/yr (the nominal value) affect individual doses in a linear manner, i.e., an order-of-magnitude reduction in release rate leads to an order-of-magnitude reduction in peak individual dose. Little reduction in peak individual doses is seen with canister lifetimes extended beyond the nominal 100 years. 32 refs., 14 figs., 16 tabs.

  11. Severe accident analysis of TMI-1 seal LOCA scenario using MELCOR 1.8.2

    SciTech Connect

    Alammar, M.A.

    1994-12-31

    The pump seal LOCA scenario for the Three Mile Island Unit 1 Nuclear Power Plant is analyzed using the NRC's MELCOR 1.8.2 code. This scenario was a major contributor to containment failure for the LOCA group in the IPE Level 2 analysis, which was done using EPRI's MAAP3B code. The main purpose of this paper is to see whether the conclusions regarding the impact of this scenario on containment performance would have been different had MELCOR been used instead of MAAP3B. The major areas addressed were in-vessel and ex-vessel phenomena. For the in-vessel part, three major stages of a severe accident were investigated: (1) thermal-hydraulic behavior before the core uncovers; (2) core heatup, relocation, and hydrogen generation; and (3) lower head failure. For the ex-vessel part, the following were addressed: (1) corium-concrete interaction; (2) containment failure; and (3) source term release. It is shown that the same conclusions are reached with regard to containment performance and its impact on Level 2 results.

  12. Landscape equivalency analysis: methodology for estimating spatially explicit biodiversity credits.

    PubMed

    Bruggeman, Douglas J; Jones, Michael L; Lupi, Frank; Scribner, Kim T

    2005-10-01

    We propose a biodiversity credit system for trading endangered species habitat that is designed to minimize and reverse the negative effects of habitat loss and fragmentation, the leading cause of species endangerment in the United States. Given the increasing demand for land, approaches that explicitly balance economic goals against conservation goals are required. The Endangered Species Act balances these conflicts based on the cost to replace habitat. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects of habitat fragmentation. Mitigating the effects of land development on biodiversity requires decisions that recognize regional ecological effects resulting from local economic decisions. We propose Landscape Equivalency Analysis (LEA), a landscape-scale approach similar to habitat equivalency analysis (HEA), as an accounting system to calculate conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local decisions. Credits purchased by public agencies or NGOs for purposes other than mitigating a take create a net investment in natural capital, leading to habitat defragmentation. Credits calculated by LEA use metapopulation genetic theory to estimate sustainability criteria against which all trades are judged. The approach is rooted in well-accepted ecological, evolutionary, and economic theory, which helps compensate for the degree of uncertainty regarding the effects of habitat loss and fragmentation on endangered species. LEA requires the application of greater scientific rigor than is typically applied to endangered species management on private lands, but it provides an objective, conceptually sound basis for achieving the often conflicting goals of economic efficiency and long-term ecological sustainability. PMID:16132443

  13. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted a static approach to delineate occupant kinematics, very few studies have attempted a dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using a 50th-percentile Hybrid III anthropomorphic dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on the chest with cinch plate, shoulder belt under the left arm, and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle, indicating that a securely fastened lap belt holds the dummy, with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by the various shoulder harness positions when the lap belt is securely fastened. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

  14. Comparative analysis of two weight-of-evidence methodologies for integrated sediment quality assessment.

    PubMed

    Khosrovyan, A; Rodríguez-Romero, A; Antequera Ramos, M; DelValls, T A; Riba, I

    2015-02-01

    The results of sediment quality assessment by two different weight-of-evidence methodologies were compared. Both methodologies used the same dataset, but because their criteria and procedures differed, the results emphasized different aspects of sediment contamination. One of the methodologies integrated the data by means of a multivariate analysis and indicated the bioavailability of contaminants and their spatial distribution. The other methodology, used in the dredged material management framework recently proposed in Spain, evaluated sediment toxicity in general by assigning categories. Despite the differences in the interpretation and presentation of results, the methodologies evaluated sediment risk similarly, taking into account chemical concentrations and toxicological effects. Comparison of the results of different approaches is important to define their limitations and thereby avoid implications of potential environmental impacts from different management options, as in the case of dredged material risk assessment. The consistent results of these two methodologies emphasize the validity and robustness of the integrated, weight-of-evidence approach to sediment quality assessment. Limitations of the methodologies are discussed. PMID:25016337

  15. Behavior of an heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR

    SciTech Connect

    Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D.; Struwe, D.; Pfrang, W.; Ponomarev, A.

    2012-07-01

    In the framework of a substantial improvement in FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R&D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attaining a negative value - allows a significant improvement of the core behavior during an unprotected loss-of-flow accident. The physical behavior of such a core is also of interest, before and beyond the (possible) onset of Na boiling. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na plena with a B{sub 4}C plate, and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident initiated by an unprotected loss of flow is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective but is not by itself sufficient to avoid Na boiling and, hence, to prevent the core from entering the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison to a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) melts by the end of this phase. A sensitivity analysis shows that, owing to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control-rod drive-line expansion feedback effect, finally prevents the core from entering sodium boiling. This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning the neutronic burn-up scheme, the modeling of the diagrid effect and the control-rod drive-line expansion feedbacks, and the thermal-hydraulic behavior of the primary and secondary systems. (authors)

  16. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    SciTech Connect

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  17. Analysis of criticality accident alarm system coverage in the X-700, X-705, and X-720 facilities at the Portsmouth Gaseous Diffusion plant

    SciTech Connect

    Skapik, C.W.; Dobelbower, M.C.; Woollard, J.E.

    1995-12-01

    Several ancillary facilities at the PORTS site provide additional services for the uranium enrichment cascade process, such as maintenance and decontamination operations. These facilities include the X-700 Maintenance Facility, the X-705 Decontamination Facility, and the X-720 Maintenance and Stores Facility. As uranium operations are performed within these facilities, the potential for a criticality accident exists. In the event of a criticality accident within one of these facilities, a Criticality Accident Alarm System (CAAS) is in place to detect the accident and sound an alarm. In this report, an analysis was performed to verify that the existing CAAS at PORTS provides complete criticality accident coverage in the X-700, X-705, and X-720 facilities. The analysis determined that the X-705 and X-720 facilities have complete CAAS coverage; the X-700 facility has not been shown to have complete coverage at this time.

  18. Domino effect in chemical accidents: main features and accident sequences.

    PubMed

    Darbra, R M; Palacios, Adriana; Casal, Joaquim

    2010-11-15

    The main features of domino accidents in process/storage plants and in the transportation of hazardous materials were studied through an analysis of 225 accidents involving this effect. Data on these accidents, which occurred after 1961, were taken from several sources. Aspects analyzed included the accident scenario, the type of accident, the materials involved, the causes and consequences and the most common accident sequences. The analysis showed that the most frequent causes are external events (31%) and mechanical failure (29%). Storage areas (35%) and process plants (28%) are by far the most common settings for domino accidents. Eighty-nine per cent of the accidents involved flammable materials, the most frequent of which was LPG. The domino effect sequences were analyzed using relative probability event trees. The most frequent sequences were explosion→fire (27.6%), fire→explosion (27.5%) and fire→fire (17.8%). PMID:20709447

  19. Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J.; Laub, T.W.

    1992-06-01

    This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10{sup {minus}11}/yr to 10{sup {minus}5}/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10{sup {minus}9}/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not address an estimate of uncertainties; therefore, conclusions or decisions made as a result of this report should be made with caution.

  20. Modeling and analysis of the unprotected loss-of-flow accident in the Clinch River Breeder Reactor

    SciTech Connect

    Morris, E.E.; Dunn, F.E.; Simms, R.; Gruber, E.E.

    1985-01-01

    The influence of fission-gas-driven fuel compaction on the energetics resulting from a loss-of-flow accident was estimated with the aid of the SAS3D accident analysis code. The analysis was carried out as part of the Clinch River Breeder Reactor licensing process. The TREAT tests L6, L7, and R8 were analyzed to assist in the modeling of fuel motion and the effects of plenum fission-gas release on coolant and clad dynamics. Special, conservative modeling was introduced to evaluate the effect of fission-gas pressure on the motion of the upper fuel pin segment following disruption. For the nominal sodium-void worth, fission-gas-driven fuel compaction did not adversely affect the outcome of the transient. When uncertainties in the sodium-void worth were considered, however, it was found that if fuel compaction occurs, loss-of-flow driven transient overpower phenomenology could not be precluded.

  1. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  2. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    SciTech Connect

    Georgievskiy, Vladimir

    2007-07-01

    The efficacy of decisions concerning remedial actions is considered for cases in which off-site radiological monitoring in the early and/or intermediate phases of an accident was absent or uninformative. There are examples of such situations in the former Soviet Union where many people were exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelyabinsk-65' (the Kyshtym accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and/or intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring alone; decisions of this kind may be substantially erroneous. For these cases it is proposed to reconstruct the radiological data of the early and intermediate phases of the accident and to develop decisions concerning remedial actions on the basis of both the reconstructed data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory; in particular, pasture-cow-milk monitoring had not been performed. All official dose estimates were based on measurements of {sup 137}Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the accident). For the retrospection of the radiological data, a dynamic model was developed, with a structure similar to that of the Pathway and Farmland models; its parameters were identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses were estimated for the early, intermediate, and late phases of the accident. The main results are as follows: 75-93% of the committed effective dose was formed during the first year after the accident; 85-90% of the damage from radiation exposure was formed during the first year, and only 10-15% will be formed during the following 50 years (the late phase); agricultural remedial actions in Ukraine (the most effective kind) address the damage from consumption of production contaminated in the late phase, i.e., they target only about 10% of the total damage from radiation exposure; and medical countermeasures can reduce the radiation exposure damage by an order of magnitude more than agricultural countermeasures. Thus, retrospection of the accident substantially changed the preferred type of remedial action and offered a chance to increase the effectiveness of spending by an order of magnitude. This example illustrates that, in order to optimize remedial actions, retrospective reconstruction of accident data should be used whenever monitoring in the early and/or intermediate phases is unsatisfactory. (author)
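
    The pasture-cow-milk pathway that such a dynamic model reconstructs can be sketched as a first-order compartment chain (Python; the rate constants are illustrative, not the parameters identified for Russia and Ukraine).

        import numpy as np

        dt, days = 0.1, 120.0
        lam_eff = np.log(2.0) / 14.0   # effective pasture loss rate, weathering + decay (1/d)
        intake, f_milk, lam_milk = 45.0, 0.01, 0.1  # kg/d grass, d/L transfer coeff., 1/d
        pasture, milk = 1000.0, 0.0    # Bq/kg grass, Bq/L milk
        for _ in range(int(days / dt)):
            pasture -= dt * lam_eff * pasture            # activity leaves the pasture
            milk += dt * lam_milk * (f_milk * intake * pasture - milk)
        print(milk)  # milk concentration after 120 d, Bq/L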

  3. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis that can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
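
    The 'real-time systems-analysis object' can be sketched as a small state machine (Python; the class shape is illustrative, not the paper's notation).

        class AnalysisObject:
            """Entity with states and state-transition rules."""

            def __init__(self, name, initial, transitions):
                self.name, self.state = name, initial
                self.transitions = transitions   # {(state, event): next_state}

            def handle(self, event):
                # unknown events leave the state unchanged
                self.state = self.transitions.get((self.state, event), self.state)

        valve = AnalysisObject("valve", "closed",
                               {("closed", "open_cmd"): "opening",
                                ("opening", "limit_switch"): "open"})
        valve.handle("open_cmd")
        valve.handle("limit_switch")
        print(valve.state)  # open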

  4. Accident investigation

    NASA Technical Reports Server (NTRS)

    Laynor, William G. Bud

    1987-01-01

    The National Transportation Safety Board (NTSB) has attributed wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the other accidents, two nonfatal ones were encounters with a frontal-system shear, and one fatal accident was the result of a terrain-induced wind shear. These accidents are discussed with reference to helping the pilot avoid the wind shear or, where avoidance is impossible, to fly through it safely.

  5. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established, and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and of AIS in the head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may have affected the ISS. The distributions of AIS in the head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More detailed investigations of fatal pedestrian accidents need to be conducted. PMID:25287805

  6. Testing and analysis of structural integrity of electrosleeved tubes under severe accident transients

    SciTech Connect

    Majumdar, S.

    1999-12-10

    The structural integrity of flawed steam generator tubing with Electrosleeves{trademark} under simulated severe accident transients was analyzed by analytical models that used available material properties data and results from high-temperature tests conducted on Electrosleeved tubes. The Electrosleeve material is almost pure Ni and derives its strength and other useful properties from its nanocrystalline microstructure, which is stable at reactor operating temperatures. However, it undergoes rapid grain growth at the high temperatures expected during severe accidents, resulting in a loss of strength and a corresponding decrease in flow stress. The magnitude of this decrease depends on the time-temperature history during the accident. Failure tests were conducted at ANL and FTI on internally pressurized Electrosleeved tubes with 80% and 100% throughwall machined axial notches in the parent tubes that were subjected to simulated severe accident temperature transients. The test results, together with the analytical model, were used to estimate the unaged flow stress curve of the Electrosleeve material at high temperatures. Failure temperatures for Electrosleeved tubes with throughwall and part-throughwall axial cracks of various lengths in the parent tubes were calculated for a postulated severe accident transient.

  7. ‘Doing’ health policy analysis: methodological and conceptual reflections and challenges

    PubMed Central

    Walt, Gill; Shiffman, Jeremy; Schneider, Helen; Murray, Susan F; Brugha, Ruairi; Gilson, Lucy

    2008-01-01

    The case for undertaking policy analysis has been made by a number of scholars and practitioners. However, much less attention has been given to how to do policy analysis and to what research designs, theories, or methods best inform it. This paper begins by looking at the health policy environment and some of the challenges to researching this highly complex phenomenon. It focuses on research in middle- and low-income countries, drawing on some of the frameworks, theories, methodologies, and designs that can be used in health policy analysis, giving examples from recent studies. The implications of case studies and of temporality in research design are explored. Attention is drawn to the roles of the policy researcher and the importance of reflexivity and researcher positionality in the research process. The final section explores ways of advancing the field of health policy analysis, with recommendations on theory, methodology, and researcher reflexivity. PMID:18701552

  8. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

    The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated, and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
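
    The update underlying such historical-data frequency estimates is a Jeffreys-prior Bayesian update of a Poisson rate, which yields a Gamma(n + 0.5, T) posterior for n events in T reactor-years of exposure; a sketch with invented counts (Python):

        from scipy.stats import gamma

        n, T = 2, 500.0                  # hypothetical SLOCA count and exposure (rx-yr)
        mean = (n + 0.5) / T             # posterior mean frequency
        lo, hi = gamma.ppf([0.05, 0.95], n + 0.5, scale=1.0 / T)
        print(f"mean {mean:.1e}/yr, 90% interval [{lo:.1e}, {hi:.1e}]/yr")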

  9. APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

  10. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  11. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and to thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  12. The failure analysis of composite material flight helmets as an aid in aircraft accident investigation.

    PubMed

    Caine, Y G; Bain-Ungerson, O; Schochat, I; Marom, G

    1991-06-01

    Understanding why a flying helmet fails to maintain its integrity during an accident can contribute to an understanding of the mechanism of injury and even of the accident itself. We performed a post-accident evaluation of failure modes in glass and aramid fibre-reinforced composite helmets. Optical and microscopic (SEM) techniques were employed to identify specific fracture mechanisms. They were correlated with the failure mode. Stress and energy levels were estimated from the damage extent. Damage could be resolved into distinct impact, flexure and compression components. Delamination was identified as a specific mode, dependent upon the matrix material and bonding between the layers. From the energy dissipated in specific fracture mechanisms we calculated the minimum total energy imparted to the helmet-head combination and the major injury vector (MIV) direction and magnitude. The level of protection provided by the helmet can also be estimated. PMID:1859350

  13. Analysis of hospitalization occurred due to motorcycles accidents in São Paulo city

    PubMed Central

    Gorios, Carlos; Armond, Jane de Eston; Rodrigues, Cintia Leci; Pernambuco, Henrique; Iporre, Ramiro Ortiz; Colombo-Souza, Patrícia

    2015-01-01

    OBJECTIVE: To characterize the motorcycle accidents that occurred in the city of São Paulo, SP, Brazil in the year 2013, with emphasis on information about hospital admissions from SIH/SUS. METHODS: This is a retrospective cross-sectional study. The study covered 5,597 motorcyclists traumatized in traffic accidents during the year 2013 in the city of São Paulo. A survey was conducted using secondary data from the Hospital Information System of the Unified Health System (SIH/SUS). RESULTS: In 2013, in the city of São Paulo, there were 5,597 admissions of motorcyclists traumatized in traffic accidents, of whom 89.8% were male. The admission diagnoses were leg fracture, femur fracture, and intracranial injury. CONCLUSION: This study confirms other preliminary studies on several points, among which stands out the higher prevalence of male young adults. Level of Evidence II, Retrospective Study. PMID:26327804

  14. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

  15. Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.

    PubMed

    Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

    2011-07-01

    The consequences of the Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires detailed geospatial analysis of information, with the ability to extrapolate across different scales, for application to risk assessment models and decision-making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the 1986 Chernobyl accident in Soviet Ukraine. PMID:21608109

  16. Cost analysis and financial risk profile for severe reactor accidents at Waterford-3

    SciTech Connect

    Cutbush, J.D.; Abbott, E.C.; Carpenter, W.L. Jr.

    1992-01-01

    To support Louisiana Power and Light Company (LP&L) in determining an appropriate level of nuclear property insurance for Waterford Steam Electric Station, Unit 3 (Waterford-3), ABZ, Incorporated, performed a series of cost analyses and developed a financial risk profile. This five-month study, conducted in 1991, identified the potential Waterford-3 severe reactor accidents and described each from a cleanup perspective, estimated the cost and schedule to clean up after each accident, developed a probability distribution of the associated financial exposure, and developed a profile of financial risk as a function of insurance coverage.

  17. Experiments on natural circulation during PWR severe accidents and their analysis

    SciTech Connect

    Sehgal, B.R.; Stewart, W.A.; Sha, W.T.

    1988-01-01

    Buoyancy-induced natural circulation flows will occur during the early part of PWR high-pressure accident scenarios. These flows affect several key parameters; in particular, the course of such accidents will most probably change due to local failures occurring in the primary coolant system (PCS) before substantial core degradation. Natural circulation flow patterns were measured in a one-seventh scale PWR PCS facility at the Westinghouse R&D laboratories. The measured flow and temperature distributions are reported in this paper. The experiments were analyzed with the COMMIX code, and good agreement was obtained between data and calculations. 10 refs., 8 figs., 2 tabs.

  18. Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.

    SciTech Connect

    Salay, Michael; Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

    2008-10-01

    Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant accident (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and low-population zone (LPZ) are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using the AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at approximately two hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and concerning robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.
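
    The proposed scaling can be sketched as below (Python; the scale factor, leak rate, and concentrations are placeholders, not values from the MELCOR/RADTRAD analyses): the containment aerosol concentration is multiplied by a vessel-to-containment factor while the gap and early in-vessel source terms are present, and is used directly thereafter.

        def msiv_release(times_hr, c_containment, leak_rate_m3_hr, scale=10.0):
            """Integrate activity released through a leaking MSIV."""
            released = 0.0
            for i in range(1, len(times_hr)):
                dt = times_hr[i] - times_hr[i - 1]
                c = c_containment[i]
                c_eff = scale * c if times_hr[i] <= 2.0 else c  # first ~2 h
                released += c_eff * leak_rate_m3_hr * dt
            return released

        print(msiv_release([0.0, 1.0, 2.0, 4.0], [1.0, 0.8, 0.5, 0.2], 0.3))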

  19. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  20. 76 FR 35431 - Federal Need Analysis Methodology for the 2012-2013 Award Year

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    From the Federal Register Online via the Government Publishing Office. DEPARTMENT OF EDUCATION. Federal Need Analysis Methodology for the 2012-2013 Award Year; Correction. In notice document 2010-12812, appearing on pages 30139 through 30142 in the issue of Tuesday, May 24, 2011, make the following...

  1. A Quasi Meta-Analysis of Youth and Career Research Methodologies.

    ERIC Educational Resources Information Center

    Bernes, Kerry

    A quasi meta-analysis approach was used to examine the research methodologies used to study issues related to youth (ages 13-25) and careers. Psychlit, ERIC, Dissertation Abstracts, and four journals were searched to identify articles for the study. A total of 67 articles from 18 different sources were analyzed. Eighty-seven percent were from…

  2. Analysis of Folk Tales for Prosocial Behavior among Slaves: Exploration of a Methodology.

    ERIC Educational Resources Information Center

    Harrison, Algea Othella

    The purpose of this paper is to discuss some of the methodological problems in using a content analysis of slave folk tales as a source for incidences of prosocial behaviors. Forty tales taken from four collections of black folktales will be included in this study. The categories for scoring prosocial behavior include cooperativeness, altruism,…

  3. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    PubMed Central

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  4. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    SciTech Connect

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goal of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  5. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1993-01-01

    In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.

  6. What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?

    PubMed

    Tivesten, Emma; Wiberg, Henrik

    2013-03-01

    Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey; the second was insurance claims documents, consisting predominantly of claim forms completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and the filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. PMID:23314359

  7. Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II

    NASA Astrophysics Data System (ADS)

    Hu, G.; Zhao, S.; Ruan, K.

    2012-01-01

    In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload, TOPAZ-II, can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellants, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the whole toxic beryllium radial reflector.

  8. Severe Accident Sequence Analysis Program: Anticipated transient without scram simulations for Browns Ferry Nuclear Plant Unit 1

    SciTech Connect

    Dallman, R J; Gottula, R C; Holcomb, E E; Jouse, W C; Wagoner, S R; Wheatley, P D

    1987-05-01

    An analysis of five anticipated transients without scram (ATWS) was conducted at the Idaho National Engineering Laboratory (INEL). The five detailed deterministic simulations of postulated ATWS sequences were initiated from a main steamline isolation valve (MSIV) closure. The subject of the analysis was the Browns Ferry Nuclear Plant Unit 1, a boiling water reactor (BWR) of the BWR/4 product line with a Mark I containment. The simulations yielded insights into the possible consequences resulting from an MSIV closure ATWS. An evaluation of the effects of plant safety systems and operator actions on accident progression and mitigation is presented.

  9. A modeling methodology for the analysis of concurrent systems and computations

    SciTech Connect

    Kapelnikov, A.; Muntz, R.R.; Ercegovac, M.D. (Dept. of Computer Science)

    1989-06-01

    A modeling methodology is described for evaluating the performance of distributed, multiple-computer systems. The approach employs a set of analytical tools to obtain an estimate of the average execution time of a parallel implementation of a program (or transaction) in a distributed environment. These tools are based on an amalgamation of queueing network theory and graph models of program behavior. Hierarchical application of heuristic optimization techniques facilitates the analysis of large and complex programs. A realistic example is used to illustrate the practical application of the methodology.
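
    The abstract does not give the model details; as a toy illustration of this style of analysis, the sketch below walks a small task graph in topological order and inflates each task's service time by an M/M/1-style response-time factor for an assumed processor utilization. The graph, task times, and utilization are invented.

```python
# Toy illustration of combining a task-graph model with queueing effects
# (all numbers invented): precedence comes from a small program graph, and
# contention for a shared processor inflates each task's service time by
# an M/M/1-style response-time factor 1/(1 - utilization).
tasks = {            # task: (service time in s, predecessors)
    "A": (1.0, []),
    "B": (2.0, ["A"]),
    "C": (1.5, ["A"]),
    "D": (0.5, ["B", "C"]),
}

utilization = 0.6                     # assumed processor utilization
inflate = 1.0 / (1.0 - utilization)   # mean response time / service time

finish = {}
for name in ("A", "B", "C", "D"):     # topological order of the graph
    service, preds = tasks[name]
    start = max((finish[p] for p in preds), default=0.0)
    finish[name] = start + service * inflate

print(f"estimated mean execution time: {finish['D']:.2f} s")
```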

  10. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    SciTech Connect

    Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.
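
    For readers unfamiliar with the life table method named above, a minimal actuarial sketch follows. The interval counts are invented for illustration, not the TMI registry data: each gestational-week interval contributes a conditional probability of continuing, with the usual half-withdrawal correction.

```python
# Minimal life-table (actuarial) sketch for cumulative fetal-loss incidence.
# Invented counts: for each gestational-week interval, pregnancies at risk
# entering the interval, losses, and withdrawals (e.g., lost to follow-up).
intervals = [        # (week, at_risk_entering, losses, withdrawn)
    (5, 500, 10, 4),
    (6, 486, 12, 6),
    (7, 468, 9, 5),
    (8, 454, 7, 8),
]

surv = 1.0
for week, n, d, w in intervals:
    n_eff = n - w / 2.0          # actuarial correction for withdrawals
    surv *= 1.0 - d / n_eff      # conditional probability of continuing
cum_incidence = 1.0 - surv
print(f"estimated cumulative incidence of loss: {cum_incidence:.1%}")
```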

  11. Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.

    ERIC Educational Resources Information Center

    Dunwoody, Sharon; And Others

    Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

  12. Analysis of general aviation accidents during operations under instrument flight rules

    NASA Technical Reports Server (NTRS)

    Bennett, C. T.; Schwirzke, Martin; Harm, C.

    1990-01-01

    A report is presented to describe some of the errors that pilots make during flight under IFR. The data indicate that there is less risk during the approach and landing phase of IFR flights, as compared to VFR operations. Single-pilot IFR accident rates continue to be higher than two-pilot IFR accident rates, reflecting the high work load of IFR operations.

  13. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) of the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progression, as postulated by the limited plant data. This work focused on evaluating the uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.

  14. A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2007-01-01

    Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high level classifications in longitudinal studies of accident reports. Our results suggest the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

  15. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  17. A new analysis methodology for the motion of self-propelled particles and its application

    NASA Astrophysics Data System (ADS)

    Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

    2011-03-01

    The self-propelled particle (SPP) on the microscale in solution is a growing field of study, with potential applications in nanomedicine and nanorobots. However, little detailed quantitative analysis of the motion of SPPs has been performed so far, because their self-propelled motion is strongly coupled to Brownian motion, which makes the extraction of intrinsic propulsion mechanisms problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of the SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. Then, we apply our analysis methodology to the ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
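
    The abstract does not disclose the decomposition itself; one common route to separating the two components, shown here purely as an assumed stand-in, is to fit the short-time mean-squared displacement of an active Brownian particle, MSD(t) = 4Dt + v^2 t^2, to recover the diffusion coefficient D and propulsion speed v.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a 2D active Brownian trajectory: persistent propulsion whose
# direction diffuses, plus translational noise (all parameters invented).
dt, n = 0.01, 20_000
v_true, D_true, D_rot = 2.0, 0.1, 0.5
theta = np.cumsum(rng.normal(0, np.sqrt(2 * D_rot * dt), n))
steps = (v_true * dt * np.c_[np.cos(theta), np.sin(theta)]
         + rng.normal(0, np.sqrt(2 * D_true * dt), (n, 2)))
pos = np.cumsum(steps, axis=0)

# Short-time MSD and least-squares fit of MSD(t) = 4 D t + v^2 t^2.
lags = np.arange(1, 50)
msd = np.array([np.mean(np.sum((pos[l:] - pos[:-l]) ** 2, axis=1))
                for l in lags])
t = lags * dt
A = np.c_[4 * t, t ** 2]                    # design matrix for (D, v^2)
(D_fit, v2_fit), *_ = np.linalg.lstsq(A, msd, rcond=None)
print(f"D = {D_fit:.3f} (true {D_true}), v = {np.sqrt(v2_fit):.2f} "
      f"(true {v_true})")
```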

  18. Accident analysis for high-level waste management alternatives in the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Folga, S.; Mueller, C.; Roglans-Ribas, J.

    1994-02-01

    A comparative generic accident analysis was performed for the programmatic alternatives for high-level waste (HLW) management in the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). The key facilities and operations of the five major HLW management phases were considered: current storage, retrieval, pretreatment, treatment, and interim canister storage. A spectrum of accidents covering the risk-dominant accidents was analyzed. Preliminary results are presented for HLW management at the Hanford site. A comparison of these results with those previously advanced shows fair agreement.

  19. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  1. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    SciTech Connect

    D. A. Brownson

    2002-09-26

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  2. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    SciTech Connect

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the frequency of aircraft crashes into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  3. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules, but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology involving both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as the cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that General Aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle in the implementation of safety-enhancing equipment that could reduce Loss of Control accidents. Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
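
    To make the quantitative step concrete, the snippet below runs the same kind of Chi-Square test of independence on a certification-category by accident-cause table; the counts are invented placeholders, not the study's data, and the category and cause labels simply mirror those named above.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical accident counts: rows are certification categories
# (Part 23, CAR 3, Light Sport, Experimental-Amateur Built); columns are
# causes (Loss of Control, CFIT, Engine Failure, Structural Failure).
counts = np.array([
    [120, 25,  60, 10],
    [150, 30,  55, 12],
    [ 40,  8,  25,  5],
    [130, 20, 110, 15],
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
# A small p-value indicates cause frequencies differ across categories.
```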

  4. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Among the various radioactive nuclides emitted from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. The recognition of the risk posed by Iodine-131 originated from the experience of the Chernobyl accident, based on epidemiological study [1]. It is thus important to investigate the detailed deposition distribution of I-131 to evaluate the radiation dose due to I-131 and to watch for influences on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after the accident; by the time the risk of I-131 was recognized in the Chernobyl case, several years had passed since the accident. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful, because iodine and cesium behave differently owing to their different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which like I-131 is a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990's, but the analytical technique, especially AMS (Accelerator Mass Spectrometry), had not been well developed and available AMS facilities were limited. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2]; calculated estimates of the isotopic ratio showed scattered results. At the FDNPP accident, on the other hand, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. We measured soil samples selected from a series of soil collections taken from every 2 km (or 5 km, in the distant area) meshed region around FDNPP, conducted by the Japanese Ministry of Science and Education in June 2011. So far more than 500 samples have been measured, and the I-129 deposition amounts were determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated to be less than 30%, including the uncertainty of the nominal value of the standard reference material used, that of the I-129/I-131 ratio estimation, that of the "representativeness" of the analyzed sample for its region, etc. The isotopic ratio I-129/I-131 from the reactor was estimated [3] (to be 22.3 ± 6.3 as of March 11, 2011) from a series of samples collected by a group of The University of Tokyo on the 20th of April, 2011, for which the I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile of the accident-derived I-129 in soil and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 depositions were calculated, and the distribution map is being constructed. Various fine structures of the distribution have come into view. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
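
    A back-of-the-envelope sketch of the reconstruction step follows, using the published ratio of 22.3 and the half-lives quoted above; the I-129 deposition density and deposition date are invented inputs, and the published ratio is assumed here to be an atom ratio.

```python
import numpy as np

# Back-of-the-envelope reconstruction of I-131 deposition from a measured
# I-129 deposition density. The ratio 22.3 is the published estimate quoted
# above (assumed to be an atom ratio); the I-129 value and deposition date
# are invented inputs.
n129 = 2.0e12                     # measured I-129 atoms per m^2 (hypothetical)
R = 22.3                          # I-129/I-131 atom ratio at shutdown
T_half_131 = 8.02 * 86400.0       # I-131 half-life in seconds
lam131 = np.log(2.0) / T_half_131

# I-129 (half-life 1.57E+7 yr) is effectively stable on this time scale, so
# the measured atoms equal those deposited. Activity follows from A = lambda*N.
n131_shutdown = n129 / R
a131_shutdown = lam131 * n131_shutdown          # Bq/m^2 at shutdown

t_dep = 4.0 * 86400.0             # assumed deposition ~4 days after shutdown
a131_deposition = a131_shutdown * np.exp(-lam131 * t_dep)
print(f"reconstructed I-131 deposition: {a131_deposition:.2e} Bq/m^2")
```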

  5. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes an analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes, and of the erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  6. Methodological considerations for the harmonization of non-cholesterol sterol bio-analysis.

    PubMed

    Mackay, Dylan S; Jones, Peter J H; Myrie, Semone B; Plat, Jogchum; Lütjohann, Dieter

    2014-04-15

    Non-cholesterol sterols (NCS) are used as surrogate markers of cholesterol metabolism which can be measured from a single blood sample. Cholesterol precursors are used as markers of endogenous cholesterol synthesis and plant sterols are used as markers of cholesterol absorption. However, most aspects of NCS analysis show wide variability among researchers within the area of biomedical research. This variability in methodology is a significant contributor to variation between reported NCS values and hampers the confidence in comparing NCS values across different research groups, as well as the ability to conduct meta-analyses. This paper summarizes the considerations and conclusions of a workshop where academic and industrial experts met to discuss NCS measurement. Highlighted is why each step in the analysis of NCS merits critical consideration, with the hopes of moving toward more standardized and comparable NCS analysis methodologies. Alkaline hydrolysis and liquid-liquid extraction of NCS followed by parallel detection on GC-FID and GC-MS is proposed as an ideal methodology for the bio-analysis of NCS. Furthermore the importance of cross-comparison or round robin testing between various groups who measure NCS is critical to the standardization of NCS measurement. PMID:24674990

  7. Third annual Warren K. Sinclair keynote address: retrospective analysis of impacts of the Chernobyl accident.

    PubMed

    Balonov, Mikhail

    2007-11-01

    The accident at the Chernobyl Nuclear Power Plant in 1986 was the most severe in the history of the nuclear industry, causing a huge release of radionuclides over large areas of Europe. The recently completed Chernobyl Forum concluded that after a number of years, along with the reduction of radiation levels and accumulation of humanitarian consequences, severe social and economic depression of the affected regions and associated psychological problems of the general public and the workers had become the most significant problem to be addressed by the authorities. The majority of the >600,000 emergency and recovery operation workers and five million residents of the contaminated areas in Belarus, Russia, and Ukraine received relatively minor radiation doses, comparable with natural background levels. An exception is a cohort of several hundred emergency workers who received high radiation doses, of whom 28 died in 1986 due to acute radiation sickness. Apart from the dramatic increase in thyroid cancer incidence among those exposed to radioiodine at a young age and some increase in leukemia in the most exposed workers, there is no clearly demonstrated increase in somatic diseases due to radiation. There was, however, an increase in psychological problems among the affected population, compounded by the social disruption that followed the break-up of the Soviet Union. Despite the unprecedented scale of the Chernobyl accident, its consequences on the health of people are far less severe than those of the atomic bombings of the cities of Hiroshima and Nagasaki. Studying the consequences of the Chernobyl accident has made an invaluable scientific contribution to the development of nuclear safety, radioecology, radiation medicine and protection, and also the social sciences. The Chernobyl accident initiated the global nuclear and radiation safety regime. PMID:18049216

  8. Thermodynamic analysis of spent pyrochemical salts in the stored condition and in viable accident scenarios

    SciTech Connect

    Axler, K.M.

    1994-03-01

    This study involves examining "spent" electrorefining (ER) salts in the form present after usage (as stored), and then after exposure to water in a proposed accident scenario. Additionally, the equilibrium composition of the salt after extended exposure to air was also calculated by computer modeling and those results are also presented herein. It should be noted that these salts are extremely similar to spent MSE salts from the Rocky Flats MSE campaigns using NaCl-KCl-MgCl{sub 2}.

  9. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with the long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear-Test-Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq, emitted during the explosions of units 1, 2 and 3. The estimated total source term represents a large fraction of the core inventory, which was about 8 × 10^18 Bq at the time of reactor shutdown; this result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. Neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) can be estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

  10. SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    Brad J. Merrill; Shannon M Bragg-Sitton

    2013-09-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g. fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR's reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

  11. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  12. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  14. An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident

    SciTech Connect

    El-Genk, M.S.; Paramonov, D.

    1993-01-10

    An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor, following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during the system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180[degree] outward at their maximum speed of 1.4[degree]/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

  15. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error-determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given, and some comparisons are made in order to determine which may be better in this context. Results show generally low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
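
    As an illustration of comparing two error-propagation approaches of the kind mentioned above, the sketch below propagates sampling uncertainties through a simple ash-fraction calculation, once by first-order (Gaussian) propagation and once by Monte Carlo; the masses and uncertainties are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative prompt-analysis quantities (hypothetical values): ash mass
# and total sample mass, each with its standard uncertainty.
m_ash, s_ash = 1.25, 0.04      # g
m_tot, s_tot = 25.0, 0.30      # g

# Approach 1: first-order (Gaussian) propagation for f = m_ash / m_tot.
f = m_ash / m_tot
s_f_linear = f * np.sqrt((s_ash / m_ash) ** 2 + (s_tot / m_tot) ** 2)

# Approach 2: Monte Carlo propagation of the same ratio.
samples = (rng.normal(m_ash, s_ash, 100_000)
           / rng.normal(m_tot, s_tot, 100_000))
s_f_mc = samples.std()

print(f"ash fraction = {f:.4f}")
print(f"u(linear) = {s_f_linear:.5f}, u(Monte Carlo) = {s_f_mc:.5f}")
```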

  16. Social representations, correspondence factor analysis and characterization questionnaire: a methodological contribution.

    PubMed

    Lo Monaco, Grégory; Piermattéo, Anthony; Guimelli, Christian; Abric, Jean-Claude

    2012-11-01

    The characterization questionnaire is inspired by Q-sort methodologies (i.e. qualitative sorting). It consists of asking participants to give their opinion on a list of items by sorting them into categories depending on their level of characterization of the object. This technique allows us to obtain distributions for each item and each response modality (i.e. characteristic vs. not chosen vs. not characteristic). This contribution intends to analyze these frequencies by means of correspondence factor analysis. The originality of this contribution lies in the fact that this kind of analysis has never been used to process data collected by means of this questionnaire. The procedure will be detailed and exemplified by means of two empirical studies on social representations of the good wine and the good supermarket. The interest of such a contribution will be discussed from both a methodological point of view and an applications perspective. PMID:23156928
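
    A minimal numerical sketch of correspondence analysis applied to such an item-by-modality table follows; the counts are invented, and the decomposition shown is the textbook SVD of standardized residuals rather than the authors' own implementation.

```python
import numpy as np

# Hypothetical item-by-modality frequency table from a characterization
# questionnaire (rows: items; columns: characteristic / not chosen /
# not characteristic). Counts are invented for illustration.
N = np.array([[40, 30, 10],
              [12, 28, 40],
              [25, 35, 20],
              [ 5, 15, 60]], dtype=float)

P = N / N.sum()                       # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses

# Standardized residuals; their SVD yields the factorial axes.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * sv) / np.sqrt(r)[:, None]     # principal row coordinates
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]  # principal column coordinates
inertia = sv ** 2 / (sv ** 2).sum()             # share of inertia per axis

print("axis inertia:", np.round(inertia, 3))
print("item coordinates on axis 1:", np.round(row_coords[:, 0], 3))
```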

  17. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  18. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Datamining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis. PMID:22549283

  19. Methodology for social accountability: multiple methods and feminist, poststructural, psychoanalytic discourse analysis.

    PubMed

    Phillips, D A

    2001-06-01

    Bridging the gap between the individual and social context, methodology that aims to surface and explore the regulatory function of discourse on subjectivity production moves nursing research beyond the individual level in order to theorize social context and its influence on health and well-being. This article describes the feminist, poststructural, psychoanalytic discourse analysis and multiple methods used in a recent study exploring links between cultural discourses of masculinity, performativity of masculinity, and practices of male violence. PMID:11393249

  20. Methodology for the analysis of pollutant emissions from a city bus

    NASA Astrophysics Data System (ADS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.
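
    As a toy version of the category (sequence) analysis described above, the snippet below classifies 1 Hz records into idle, acceleration, cruise, and deceleration sequences using simple speed and acceleration thresholds, then compares mean emissions per category; the speed and NOx traces and the thresholds are invented.

```python
import numpy as np

# Hypothetical 1 Hz on-board records: vehicle speed (km/h) and an
# instantaneous pollutant mass flow (mg/s), invented for illustration.
speed = np.array([0, 0, 5, 12, 20, 28, 30, 30, 24, 15, 6, 0, 0], float)
nox   = np.array([2, 2, 9, 14, 18, 20, 8, 7, 3, 2, 2, 2, 2], float)

acc = np.gradient(speed / 3.6)   # acceleration in m/s^2 at 1 Hz

def category(v_kmh, a_ms2):
    # Simple sequence classification by speed/acceleration thresholds.
    if v_kmh < 0.5:
        return "idle"
    if a_ms2 > 0.1:
        return "acceleration"
    if a_ms2 < -0.1:
        return "deceleration"
    return "cruise"

cats = np.array([category(v, a) for v, a in zip(speed, acc)])
for c in ("idle", "acceleration", "cruise", "deceleration"):
    mask = cats == c
    if mask.any():
        print(f"{c:12s} mean NOx = {nox[mask].mean():5.1f} mg/s, "
              f"share of total = {nox[mask].sum() / nox.sum():5.1%}")
```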

  1. Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work with the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or of the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
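
    As a minimal concrete instance of the 'mixture' described above, the sketch below integrates continuous dynamics dx/dt = f(x, u) with Euler steps while a discrete mode variable switches when a guard condition is crossed; the dynamics, thresholds, and mode names are invented and are not taken from the ATC work itself.

```python
# Toy hybrid system: continuous dynamics dx/dt = f(x, u) integrated with
# Euler steps, plus a discrete mode that switches when the state crosses
# a threshold. All values are hypothetical.
def f(x, u):
    return -0.5 * x + u

dt, T = 0.01, 10.0
x, mode = 0.0, "climb"       # mode plays the role of the discrete state
for _ in range(int(T / dt)):
    u = 1.0 if mode == "climb" else -0.2
    x += dt * f(x, u)        # continuous part
    # discrete part: a guard condition triggers the mode transition
    if mode == "climb" and x >= 1.5:
        mode = "hold"

print(f"final mode = {mode}, final state = {x:.3f}")
```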

  2. A systematic review and analysis of factors associated with methodological quality in laparoscopic randomized controlled trials.

    PubMed

    Antoniou, Stavros Athanasios; Andreou, Alexandros; Antoniou, George Athanasios; Bertsias, Antonios; Köhler, Gernot; Koch, Oliver Owen; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-01-01

    Several methods for assessment of methodological quality in randomized controlled trials (RCTs) have been developed during the past few years. Factors associated with quality in laparoscopic surgery have not been defined to date. The aim of this study was to investigate the relationship between bibliometric factors and the methodological quality of laparoscopic RCTs. The PubMed search engine was queried to identify RCTs on minimally invasive surgery published in 2012 in the 10 highest impact factor surgery journals and the 5 highest impact factor laparoscopic journals. Eligible studies were blindly assessed by two independent investigators using the Scottish Intercollegiate Guidelines Network (SIGN) tool for RCTs. Univariate and multivariate analyses were performed to identify potential associations with methodological quality. A total of 114 relevant RCTs were identified. More than half of the trials were of high or acceptable quality. Half of the reports provided comparative demographic data and only 21% performed intention-to-treat analysis. RCTs with a sample size of at least 60 patients presented higher methodological quality (p = 0.025). Upon multiple regression, reporting on preoperative care and the experience level of surgeons were independent factors of quality. PMID:25896540

  3. Inferring Functional Neural Connectivity with Phase Synchronization Analysis: A Review of Methodology

    PubMed Central

    Sun, Junfeng; Li, Zhijun; Tong, Shanbao

    2012-01-01

    Functional neural connectivity is drawing increasing attention in neuroscience research. To infer functional connectivity from observed neural signals, various methods have been proposed. Among them, phase synchronization analysis is an important and effective one, which examines the relationship of instantaneous phase between neural signals while neglecting the influence of their amplitudes. In this paper, we review advances in the methodology of phase synchronization analysis. In particular, we discuss the definitions of instantaneous phase, the indexes of phase synchronization and their significance tests, the issues that may affect the detection of phase synchronization, and the extensions of phase synchronization analysis. In practice, phase synchronization analysis may be affected by observational noise, insufficient samples of the signals, volume conduction, and the reference used in recording neural signals. We make comments and suggestions on these issues so as to better apply phase synchronization analysis to inferring functional connectivity from neural signals. PMID:22577470
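
    A self-contained sketch of one widely used phase synchronization index, the phase-locking value computed from Hilbert-transform instantaneous phases, is given below on synthetic signals; real recordings would normally be band-pass filtered first, and the signals and parameters here are invented.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(1)

# Two synthetic "neural" signals sharing a 10 Hz component with a fixed
# phase lag, plus independent noise (stand-ins for real recordings).
fs = 500.0
t = np.arange(0, 5, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

# Instantaneous phases from the analytic signals (Hilbert transform).
phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))

# Phase-locking value: modulus of the mean unit phase-difference vector;
# values near 1 indicate strong phase synchronization.
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"PLV = {plv:.3f}")
```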

  4. Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)

    NASA Astrophysics Data System (ADS)

    Coote, A. M.; Bernknopf, R.; Smart, A.

    2013-12-01

    Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step in convincing decision makers to authorise investment. The geospatial and earth observation industry has, however, been slow to embrace economic analysis, although a growing number of studies published in the last few years have applied economic principles to this domain. They have adopted a variety of different approaches, including: - Computable General Equilibrium Modelling (CGE) - Revealed preference, stated preference (Willingness to Pay surveys) - Partial Analysis - Simulations - Cost-benefit analysis (with and without risk analysis) This paper critically reviews these approaches and assesses their applicability to different situations and their suitability for meeting multiple objectives.
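
    As a concrete instance of the last approach listed, cost-benefit analysis with and without risk, the sketch below computes a deterministic net present value and a Monte Carlo NPV distribution for a hypothetical Earth observation programme; the discount rate, cost and benefit streams, and uncertainty level are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical programme cash flows in M$ per year (assumed values).
rate = 0.035                                     # social discount rate
costs = np.array([120.0, 40, 10, 10, 10, 10])
benefits = np.array([0.0, 30, 60, 80, 80, 80])

years = np.arange(costs.size)
disc = 1.0 / (1.0 + rate) ** years

npv_point = ((benefits - costs) * disc).sum()    # CBA without risk

# CBA with risk: benefits uncertain, 25% relative standard deviation.
sims = rng.normal(benefits, 0.25 * benefits, size=(10_000, benefits.size))
npv_dist = ((sims - costs) * disc).sum(axis=1)

print(f"point NPV = {npv_point:.1f} M$")
print(f"P(NPV < 0) = {(npv_dist < 0).mean():.1%}")
```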

  5. Verification of the three-dimensional thermal-hydraulic models of the TRAC accident-analysis code. [PWR

    SciTech Connect

    Motley, F.

    1982-01-01

    The Transient Reactor Analysis Code (TRAC) being developed at Los Alamos National Laboratory provides a best-estimate prediction of the response of light water reactors or test facilities to postulated accident sequences. One of the features of the code is the ability to analyze the vessel and its heated core in three dimensions. The code is being used to analyze the results of tests in a large-scale reflood test facility built in Japan, known as the Cylindrical Core Test Facility (CCTF). Two test runs have been analyzed that are useful for verification of the three-dimensional analysis capability of the TRAC code. One test began with an initial temperature skew across the heated core. The second test had a large radial power skew between the central and peripheral assemblies. The good agreement between the calculation and the experiment for both of these experiments demonstrates the three-dimensional analysis capability of the TRAC code.

  6. Finite-element methodology for thermal analysis of convectively cooled structures

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A finite-element method for steady-state thermal analysis of convectively cooled structures is presented. The method is based on representing the coolant passages by finite elements with fluid bulk temperature nodes and fluid/structure interface nodes. Four finite elements are described: two basic elements (mass transport and surface convection) and two integrated elements for application to discrete-tube and plate-fin convectively cooled structural configurations. Comparative finite-element and lumped-parameter thermal analyses of several convectively cooled structures demonstrate the practicality of utilizing finite-element methodology for the thermal analysis of realistic convectively cooled structures.
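
    A minimal numerical sketch of the mass-transport plus surface-convection idea follows for a single coolant passage: fluid bulk temperatures are marched along the passage, and wall (interface) temperatures follow from a local convection balance. The geometry, heat flux, and coefficients are invented, and the simple marching balance stands in for a full finite-element assembly.

```python
import numpy as np

# One coolant passage discretized into elements (all values hypothetical).
n = 20                      # number of passage elements
L = 1.0                     # passage length, m
q = 5.0e4                   # applied wall heat flux, W/m^2
h = 1.2e3                   # convection coefficient, W/(m^2 K)
w = 0.01                    # convective width of the passage, m
mdot_cp = 15.0              # coolant capacity rate m_dot * c_p, W/K

dx = L / n
Tf = np.empty(n + 1)        # fluid bulk temperatures (mass-transport nodes)
Tw = np.empty(n)            # wall temperatures (interface nodes)
Tf[0] = 300.0               # coolant inlet temperature, K

for i in range(n):
    A = w * dx                                 # convective area of element i
    Tf[i + 1] = Tf[i] + q * A / mdot_cp        # fluid energy balance
    Tfm = 0.5 * (Tf[i] + Tf[i + 1])            # element bulk temperature
    Tw[i] = Tfm + q / h                        # surface-convection balance

print(f"outlet fluid T = {Tf[-1]:.1f} K, peak wall T = {Tw.max():.1f} K")
```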

  7. MossWinn—methodological advances in the field of Mössbauer data analysis

    NASA Astrophysics Data System (ADS)

    Klencsár, Zoltán

    2013-04-01

    The methodology of Mössbauer data analysis has been advanced via the development of a novel scientific database system concept and its realization in the field of Mössbauer spectroscopy, as well as by the application of parallel computing techniques for the enhancement of the efficiency of various processes encountered in the practice of Mössbauer data handling and analysis. The present article describes the new database system concept along with details of its realization in the form of the MossWinn Internet Database (MIDB), and illustrates the performance advantage that may be realized on multi-core processor systems by the application of parallel algorithms for the implementation of database system functions.

  8. Analysis of Japanese radionuclide monitoring data of food before and after the Fukushima nuclear accident.

    PubMed

    Merz, Stefan; Shozugawa, Katsumi; Steinhauser, Georg

    2015-03-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima (137)Cs and (90)Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in (137)Cs and vegetarian produce was usually higher in (90)Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of (90)Sr being 10% of the respective (137)Cs concentrations may soon be at risk, as the (90)Sr/(137)Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the (90)Sr content of Japanese foods. PMID:25621976

  9. Analysis of Japanese Radionuclide Monitoring Data of Food Before and After the Fukushima Nuclear Accident

    PubMed Central

    2015-01-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima 137Cs and 90Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in 137Cs and vegetarian produce was usually higher in 90Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of 90Sr being 10% of the respective 137Cs concentrations may soon be at risk, as the 90Sr/137Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the 90Sr content of Japanese foods. PMID:25621976

  10. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to reduce computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results with analogous SAS3D results constitute the qualification of SACO; they are illustrated and discussed.

  11. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  12. Recursive modeling of loss of control in human and organizational processes: a systemic model for accident analysis.

    PubMed

    Kontogiannis, Tom; Malakis, Stathis

    2012-09-01

    A recursive model of accident investigation is proposed by exploiting earlier work in systems thinking. Safety analysts can better understand the underlying causes of decision or action flaws by probing into the patterns of breakdown in the organization of safety. For this deeper analysis, a cybernetic model of organizational factors and a control model of human processes have been integrated in this article (i.e., the viable system model and the extended control model). The joint VSM-ECOM framework has been applied to a case study to help safety practitioners analyze patterns of breakdown in how operators and organizations manage goal conflicts, monitor work progress, recognize weak signals, align goals across teams, and adapt plans on the fly. The recursive accident representation brings together several organizational issues (e.g., the dilemma of autonomy versus compliance, or the interaction between structure and strategy) and addresses how operators adapt to challenges in their environment by adjusting their modes of functioning and recovery. Finally, it facilitates the transfer of knowledge from diverse incidents and near misses within similar domains of practice. PMID:22664695

  13. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
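
    For readers unfamiliar with the time-stratified design singled out above, the sketch below shows the standard referent-selection rule: referent days share the event day's year, month, and day of week. The resulting case-referent sets would then typically be analyzed with conditional logistic regression; the example date is arbitrary.

```python
import datetime as dt

def time_stratified_referents(event_day):
    """Referent days for a time-stratified case-crossover design: all
    days sharing the event's year, month, and day of week, with the
    event day itself excluded from the referent set."""
    days = []
    d = event_day.replace(day=1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            days.append(d)
        d += dt.timedelta(days=1)
    return days

print(time_stratified_referents(dt.date(2010, 6, 15)))
# -> the other Tuesdays of June 2010: Jun 1, 8, 22, 29
```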

  14. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    NASA Astrophysics Data System (ADS)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was to test the robustness of the assessments and to identify possible sources of disagreement among the participating stakeholders, thus providing insights for the subsequent deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling, and a two-dimensional policy region analysis, proved sufficient for the task.
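
    A minimal sketch of the kind of Monte Carlo sensitivity screening described, assuming invented input distributions and stakeholder weights (the paper's actual attributes and utilities are not reproduced here): sample the uncertain inputs, propagate them through a simple multi-attribute utility, and rank inputs by rank correlation with the output.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain inputs for one restoration option:
# cost (M$), residual risk (probability), schedule (years)
cost = rng.triangular(8.0, 12.0, 20.0, n)
risk = rng.uniform(0.01, 0.10, n)
years = rng.normal(5.0, 1.0, n)

# Stakeholder-weighted disutility; weights are purely illustrative
u = -(0.5 * cost / 20.0 + 0.3 * risk / 0.10 + 0.2 * years / 8.0)

def ranks(v):
    """Ranks of a sample (double argsort), for Spearman correlation."""
    return np.argsort(np.argsort(v))

for name, x in [("cost", cost), ("risk", risk), ("schedule", years)]:
    rho = np.corrcoef(ranks(x), ranks(u))[0, 1]
    print(f"{name:9s} Spearman rho vs utility: {rho:+.2f}")
```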

  15. [Application Status of Evaluation Methodology of Electronic Medical Record: Evaluation of Bibliometric Analysis].

    PubMed

    Lin, Dan; Liu, Jialin; Zhang, Rui; Li, Yong; Huang, Tingting

    2015-04-01

    In order to provide a reference and theoretical guidance for the evaluation of electronic medical records (EMR) and the establishment of an evaluation system in China, we applied bibliometric analysis to assess the evaluation methodologies used at home and abroad, and to summarize their advantages and disadvantages. We systematically searched the international medical databases Ovid-MEDLINE, EBSCOhost, EI, EMBASE, PubMed, and IEEE, and China's medical databases CBM and CNKI, between Jan. 1997 and Dec. 2012. We also reviewed the reference lists of relevant articles. We selected qualified papers according to pre-established inclusion and exclusion criteria, and extracted and analyzed information from them. In all, 1,736 papers were obtained from the online databases and another 16 articles through manual retrieval. Thirty-five articles met the inclusion and exclusion criteria and were assessed. In the evaluation of EMR, the US led with 54.28% of studies, while Canada and Japan tied for second with 8.58% each. Among the evaluation methodologies applied, the Information System Success Model, the Technology Acceptance Model (TAM), the Innovation Diffusion Model, and the Cost-Benefit Access Model were the most widely used, at 25%, 20%, 12.5%, and 10%, respectively. In this paper, we summarize our study of the application of EMR evaluation methodologies, which can provide a reference for EMR evaluation in China. PMID:26211253

  16. A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans

    PubMed Central

    Ramírez-Villegas, Julián; Khoury, Colin; Jarvis, Andy; Debouck, Daniel Gabriel; Guarino, Luigi

    2010-01-01

    Background: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to lack of representation, or under-representation, in genebanks; 17 taxa are given medium priority for collecting; 15 low priority; and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding further prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources. PMID:20976009
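
    A schematic of the prioritization step, with invented scales and thresholds rather than the published calibration: each taxon receives sampling, geographic, and environmental gap scores, which are combined into a composite collecting priority.

```python
def collecting_priority(sampling_gap, geographic_gap, environmental_gap):
    """Combine three gap scores (each scaled here to 0..10, with 10 the
    largest gap) into a composite collecting priority. Scales, weights,
    and thresholds are illustrative, not the published calibration."""
    score = (sampling_gap + geographic_gap + environmental_gap) / 3.0
    if score >= 7:
        band = "high priority for collecting"
    elif score >= 5:
        band = "medium priority"
    elif score >= 3:
        band = "low priority"
    else:
        band = "adequately represented ex situ"
    return score, band

print(collecting_priority(9.0, 8.0, 6.5))   # under-collected, wide gaps
print(collecting_priority(2.0, 1.5, 2.5))   # well represented
```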

  17. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    PubMed Central

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using classic, inverse, and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%. PMID:26504638
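
    The k-law nonlinear filtering mentioned above can be sketched compactly. The example below applies a k-law filter in the Fourier domain and reduces the filtered spectrum to a scalar index; the index definition, k value, and input image are illustrative stand-ins, not the published method's calibrated quantities.

```python
import numpy as np

def k_law_filter(image, k=0.3):
    """Apply a k-law nonlinear filter in the Fourier domain: keep the
    phase, raise the magnitude to the power k (0 < k < 1 compresses
    dynamic range and emphasizes structure)."""
    F = np.fft.fft2(image)
    return np.abs(F) ** k * np.exp(1j * np.angle(F))

def spectral_index(image, k=0.3):
    """A scalar summary of the filtered spectrum. This definition
    (normalized mean energy of the k-law spectrum) is invented for
    illustration; the paper defines its own calibrated index."""
    mag = np.abs(np.fft.fftshift(k_law_filter(image, k)))
    return float(mag.mean() / mag.max())

spot = np.random.rand(128, 128)          # stand-in for a lesion image
print(f"index = {spectral_index(spot):.4f}")
```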

  18. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    PubMed

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using classic, inverse, and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%. PMID:26504638

  19. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    PubMed

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are major issues for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition). PMID:24760596

  20. Establishing equivalence: methodological progress in group-matching design and analysis.

    PubMed

    Kover, Sara T; Atwood, Amy K

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios. PMID:23301899
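
    The two descriptive indices recommended above are simple to compute. The sketch below returns the standardized mean difference (Cohen's d with a pooled standard deviation) and the variance ratio for two groups; the example scores are invented.

```python
import numpy as np

def matching_indices(group_a, group_b):
    """Descriptive indices for group matching: standardized mean
    difference (Cohen's d, pooled SD) and the variance ratio."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = a.size, b.size
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    d = (a.mean() - b.mean()) / pooled_sd
    return d, a.var(ddof=1) / b.var(ddof=1)

# Two groups matched on a hypothetical nonverbal-ability measure
group1 = [98, 102, 95, 101, 99, 103, 97, 100]
group2 = [99, 101, 96, 102, 98, 104, 97, 99]
d, vr = matching_indices(group1, group2)
print(f"Cohen's d = {d:+.3f}, variance ratio = {vr:.3f}")
```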

  1. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    SciTech Connect

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein.

  2. A Feasibility Analysis Methodology for Decentralized Wastewater Systems - Energy-Efficiency and Cost.

    PubMed

    Naik, Kartiki S; Stenstrom, Michael K

    2016-01-01

    Centralized wastewater treatment, widely practiced in developed areas, involves transporting wastewater from large urban areas to a large-capacity plant using a single network of sewers, whereas decentralization is the concept of wastewater collection, treatment, and reuse at or near its point of generation. Smaller decentralized plants can achieve extensive reclamation and wastewater management with energy-efficient reclaimed-water pumping, modularized expansion, and lower capital investment. We devised a methodology to preliminarily assess these alternatives using local constraints and conducted a feasibility analysis for each option. It addressed various scenarios using pump-back energy consumption and the costs of sewer and treatment plant construction and capacity expansion. We demonstrated this methodology by applying it to the Hollywood vicinity (California). In this study, the decentralized configuration was more economical and energy-efficient than the centralized system. The pump-back energy consumption was about 50% of the aeration energy consumption for the centralized option. PMID:26730575

  3. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    SciTech Connect

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-08-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  4. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    SciTech Connect

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  5. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-04-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  6. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    SciTech Connect

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  7. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  8. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  9. Methodology for spatial and temporal analysis of drought using large-scale gridded data

    NASA Astrophysics Data System (ADS)

    Corzo P, Gerald A.; van Huijgevoort, Marjolein H. J.; van Lanen, Henny A. J.

    2010-05-01

    In recent years, there has been an increased understanding of the importance of drought, in particular due to global change. For a good understanding of historic droughts, and to evaluate the impact of future global change scenarios, more advanced techniques to account for spatio-temporal variability are required. So far, methodologies to characterize spatio-temporal patterns of large-scale drought (e.g. global scale) are still limited. This explorative work presents methodological processes developed to analyze a gridded dataset of forcing data compiled through the EU-FP6 WATCH project (0.5°, daily, 1958-2001). Two new methodologies are proposed: the Standardized Clustered Precipitation Index (SCPI), which quantifies monthly precipitation changes, and the Cluster Precipitation Distributions (CPDs), which consider the spatial reduction of continuous periods without daily rain. Both methods are used to characterize meteorological drought. The SCPI methodology is an extension of the Standardized Precipitation Index (SPI) that incorporates a multivariate clustering analysis to determine the spatial changes of the index and the rate of change. The SPIs are calculated from a monthly moving average of a specified length (e.g. 30 days), and their variability is calculated over k years to identify a change in SPI levels. To determine this k-year change, a monthly spatial pattern of severity is calculated in the time frame defined in the calculation of the SPI. The second method (CPD) is presented to prepare for the analysis of hydrological drought in a next phase of this research. CPD identifies spatial regions where probabilities of longer periods without precipitation are present. These probability distributions, however, do not consider geographical positioning, which may affect drought analysis of a particular region. Therefore, the probabilities of non-precipitation events are grouped using a clustering technique that allows for geo-referenced information. The two methodologies provide important information and principles that can be used to develop methods to evaluate meteorological and subsequently hydrological drought from different types of large-scale grid-based models (e.g. RCMs, LSHMs, GHMs).
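
    As background for the SCPI extension described above, the sketch below computes a plain SPI for a daily series: accumulate precipitation over a moving window, fit a gamma distribution, and map the gamma CDF to standard normal quantiles. Zeros, seasonality, and per-calendar-month fitting, which a production SPI would handle, are ignored for brevity; the synthetic data are invented.

```python
import numpy as np
from scipy import stats

def spi(precip, window=30):
    """Standardized Precipitation Index for a daily series: accumulate
    over `window` days, fit a gamma distribution to the accumulations,
    and map their CDF to standard normal quantiles."""
    acc = np.convolve(precip, np.ones(window), mode="valid")
    shape, loc, scale = stats.gamma.fit(acc, floc=0)
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

rng = np.random.default_rng(0)
daily = rng.gamma(0.8, 4.0, 2 * 365)   # synthetic daily rainfall, mm
print(spi(daily)[:5])                  # negative values flag dry spells
```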

  10. Hazards and accident analyses, an integrated approach, for the Plutonium Facility at Los Alamos National Laboratory

    SciTech Connect

    Pan, P.Y.; Goen, L.K.; Letellier, B.C.; Sasser, M.K.

    1995-07-01

    This paper describes an integrated approach to performing hazards and accident analyses for the Plutonium Facility at Los Alamos National Laboratory. A comprehensive hazards analysis methodology was developed that extends the scope of the preliminary/process hazard analysis methods described in the AIChE Guidelines for Hazard Evaluations. Results from the semi-quantitative approach cover a full spectrum of hazards. Each accident scenario identified is assigned a bin for event likelihood and for consequence severity. In addition, each accident scenario is analyzed for four possible sectors (workers, on-site personnel, public, and environment). A screening process was developed to link the hazard analysis to the accident analysis. Specifically, the 840 accident scenarios were screened down to about 15 for a more thorough deterministic analysis to define the operational safety envelope. The mechanics of the screening process and the selection of final scenarios for each representative accident category (fire, explosion, criticality, and spill) are described.
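
    A schematic of the likelihood-consequence binning and screening step, with invented bin labels and an invented cutoff (the facility's actual screening criteria are not reproduced here): each scenario is scored from its likelihood bin and its worst consequence bin across the four sectors, and only high-scoring scenarios proceed to deterministic analysis.

```python
# Ordinal bin labels; the numeric scores and the cutoff are illustrative.
LIKELIHOOD = {"anticipated": 3, "unlikely": 2, "extremely unlikely": 1}
SEVERITY = {"high": 3, "moderate": 2, "low": 1}

def screen(scenarios, cutoff=5):
    """Keep scenarios whose combined bin score meets the cutoff, using
    the worst severity across the four receptor sectors."""
    kept = []
    for name, likelihood, severity_by_sector in scenarios:
        worst = max(SEVERITY[s] for s in severity_by_sector.values())
        if LIKELIHOOD[likelihood] + worst >= cutoff:
            kept.append(name)
    return kept

scenarios = [
    ("glovebox fire", "anticipated",
     {"workers": "high", "on-site": "moderate",
      "public": "low", "environment": "low"}),
    ("small spill", "unlikely",
     {"workers": "low", "on-site": "low",
      "public": "low", "environment": "low"}),
]
print(screen(scenarios))   # -> ['glovebox fire']
```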

  11. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S.; Abrahmson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II (1989) report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP, and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects are considered: the hematopoietic, pulmonary, and gastrointestinal syndromes. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults: leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases are also considered: dominant, X-linked, aneuploidy, unbalanced translocations, and multifactorial diseases. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
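
    For concreteness, the sketch below evaluates a Weibull dose-response function of the hazard form commonly used for early and continuing effects, risk = 1 - exp(-ln 2 * (dose/D50)^shape), which yields 50% risk at the median effective dose D50. The parameter values are placeholders, not the report's recommended values.

```python
import math

def weibull_early_effect_risk(dose, d50, shape):
    """Weibull dose-response: risk = 1 - exp(-ln2 * (dose/d50)**shape),
    so risk = 0.5 exactly at dose = d50. d50 and shape below are
    placeholders, not the report's recommended parameters."""
    if dose <= 0:
        return 0.0
    return 1.0 - math.exp(-math.log(2) * (dose / d50) ** shape)

for d in (1.0, 3.0, 5.0):   # illustrative acute doses, Gy
    print(d, round(weibull_early_effect_risk(d, d50=3.0, shape=5.0), 4))
```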

  12. Three years of the OCRA methodology in Brazil: critical analysis and results.

    PubMed

    Facci, Ruddy; Marcatto, Eduardo; Santino, Edoardo

    2012-01-01

    The authors make a detailed analysis of the introduction of the OCRA Methodology in Brazil, which started in August 2008 with the launch of the "OCRA Book" translated into Portuguese. They evaluate the importance of assessing upper-limb exposure to risk from repetitive movements and efforts, according to national and international legislation, and demonstrate the interconnection of the OCRA Methodology with the Regulating Norms of the Ministry of Labor and Work (NRs - MTE), especially NR-17 and its Application Manual. They discuss the new paradigms of the OCRA Method in relation to the classic paradigms of ergonomic knowledge. They indicate the OCRA Method as the tool to be used for confirming or rejecting the new epidemiologic nexus of the Brazilian social security system (NTEP/FAP). The authors present conclusions based on the practical results that practitioners certified in the OCRA Methodology achieved when applying it to different work activities in diverse economic segments, showing reduced risk and improved company productivity. PMID:22316774

  13. Analysis of 303 Road Traffic Accident Victims Seen Dead on Arrival at Emergency Room-Assir Central Hospital

    PubMed Central

    Batouk, Abdul N.; Abu-Eisheh, Nader; Abu-Eshy, Saeed; Al-Shehri, Mohammad; Al-Naami, Mohammad; Jastaniah, Suleiman

    1996-01-01

    Background: Although Road Traffic Accident (RTA) is a noticeably common cause of death in Saudi Arabia, there are no published data showing the relative frequency of RTA as a cause of death. Aim of the study: This study attempted to determine the relative frequency of RTA as a cause of death, to identify age groups at risk, and to make some inferences from the different types of injuries seen. Methodology: Over a period of four and a half years, 574 patients were seen dead on arrival at the Emergency Department of Assir Central Hospital, Abha, Saudi Arabia. Of these, 303 (52.8%) were victims of RTA. Results: The 303 victims showed a male-to-female ratio of 14:1, 69% Saudi nationals, and an age range of 3 months to 85 years (mean = 34.25 years). The peak age group was between 21 and 49 years, and the peak period of presentation at the Emergency Department was between 12:00 noon and 18:00 hours. The tenth month of the Hegira calendar represented the peak period; significant (P<0.05) seasonal variation was also seen, with summer the highest. Clinical assessment of the victims revealed that head and neck injuries were the commonest, followed by chest injuries. Conclusion: RTA is the primary cause of death among dead-on-arrival cases, affecting the most active and productive age group. The study recommended the implementation of a pre-hospital emergency medical system. PMID:23008545

  14. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  15. Analysis of sertraline in postmortem fluids and tissues in 11 aviation accident victims.

    PubMed

    Lewis, Russell J; Angier, Mike K; Williamson, Kelly S; Johnson, Robert D

    2013-05-01

    Sertraline (Zoloft) is a selective serotonin reuptake inhibitor commonly prescribed for the treatment of depression, obsessive-compulsive disorder, panic disorder, social anxiety disorder, premenstrual dysphoric disorder, and post-traumatic stress disorder. Although the use of sertraline is relatively safe, certain side effects may negatively affect a pilot's performance and become a factor in an aviation accident. The authors' laboratory investigated the distribution of sertraline and its primary metabolite, desmethylsertraline, in various postmortem tissues and fluids obtained from 11 fatal aviation accident cases between 2001 and 2004. Eleven specimen types were analyzed for each case, including blood, urine, vitreous humor, liver, lung, kidney, spleen, muscle, brain, heart and bile. Human specimens were processed utilizing solid-phase extraction, followed by characterization and quantitation employing gas chromatography-mass spectrometry. Whole blood sertraline concentrations obtained from these 11 cases ranged from 0.005 to 0.392 µg/mL. The distribution coefficients of sertraline, expressed as specimen/blood ratio, were as follows: urine, 0.47 ± 0.39 (n = 6); vitreous humor, 0.02 ± 0.01 (n = 4); liver, 74 ± 59 (n = 11); lung, 67 ± 45 (n = 11); kidney, 7.4 ± 5 (n = 11); spleen, 46 ± 45 (n = 10); muscle, 2.1 ± 1.3 (n = 8); brain, 22 ± 14 (n = 10); heart, 9 ± 7 (n = 11); and bile, 36 ± 26 (n = 8). Postmortem distribution coefficients obtained for sertraline had coefficients of variation ranging from 47% to 99%. This study suggests that sertraline likely undergoes significant postmortem redistribution. PMID:23511306

  16. TRANSIENT ACCIDENT ANALYSIS OF THE GLOVEBOX SYSTEM IN A LARGE PROCESS ROOM

    SciTech Connect

    Lee, S

    2008-01-11

    Local transient hydrogen concentrations were evaluated inside a large process room for three postulated accident scenarios in which hydrogen gas was released through process tank leakage and fire leading to a loss of gas confinement. The three cases considered in this work were fire in the room, loss of confinement from a process tank, and loss of confinement coupled with a fire event. Based on these accident scenarios in a large, unventilated process room, modeling calculations of hydrogen migration were performed to estimate local transient hydrogen concentrations due to sudden leakage and release from a glovebox system associated with the process tank. The modeling domain represented the major features of the process room, including the principal release or leakage source of the gas storage system. The model was benchmarked against literature results for key phenomena such as natural convection, turbulent behavior, gas mixing due to jet entrainment, and radiation cooling, because these phenomena are closely related to the gas driving mechanisms within the large air space of the process room. The modeling results showed that at the corner of the process room, the gas concentrations produced by the Case 2 and Case 3 scenarios reached the set-point value of the high-activity alarm in about 13 seconds, while the Case 1 scenario took about 90 seconds to reach that concentration. The modeling results were used to estimate transient radioactive gas migration in an enclosed process room equipped with a high-activity alarm monitor when the postulated leakage scenarios are initiated without room ventilation.

  17. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  18. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, alternative to the standard event analysis inherited from high energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  19. Total materials consumption; an estimation methodology and example using lead; a materials flow analysis

    USGS Publications Warehouse

    Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.

    1999-01-01

    Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
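
    The accounting distinction drawn above reduces to simple arithmetic. The sketch below contrasts apparent consumption (raw-material flows only) with total consumption (adding the material contained in traded manufactured products); all quantities are invented for illustration.

```python
def total_consumption(primary, secondary, imports_raw, exports_raw,
                      imports_in_products, exports_in_products):
    """Apparent vs. total materials consumption, all in one mass unit.
    Apparent consumption counts only raw-material flows; total
    consumption adds material contained in traded manufactured
    products. The figures below are made up for illustration."""
    apparent = primary + secondary + imports_raw - exports_raw
    total = apparent + imports_in_products - exports_in_products
    return apparent, total

apparent, total = total_consumption(primary=400, secondary=1100,
                                    imports_raw=250, exports_raw=50,
                                    imports_in_products=180,
                                    exports_in_products=60)
print(f"apparent = {apparent} kt, total = {total} kt")
```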

  20. Spectral Multi-peak Analysis Methodology for Eliminating Effects of NaI Temperature Drift

    SciTech Connect

    Uckan, Taner; March-Leuba, Jose A; Brukiewa, Patrick D; Upadhyaya, Belle R

    2010-01-01

    This paper presents a methodology of spectral multi-peak analysis for eliminating temperature drift effects in NaI scintillation gamma-ray detectors. The properties and cost of NaI make it an excellent choice for low-energy (<1 MeV) spectroscopic studies. However, the light output of NaI scintillators is highly temperature dependent, and the common practice of selecting a region of interest (ROI) for the peak of study in the measured spectrum requires that NaI temperature drift be monitored and corrected, typically by gain control of the measurement electronics.
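
    One simple software-side alternative to hardware gain control, sketched below with synthetic data: track the centroid of a known reference peak and derive a multiplicative gain factor for re-scaling the energy calibration. The paper's multi-peak method tracks several peaks simultaneously; this single-peak version is only illustrative.

```python
import numpy as np

def gain_correction(spectrum, ref_channel_nominal, search=20):
    """Estimate a multiplicative gain factor from the drift of a known
    reference peak: locate the peak centroid (weighted mean) in a
    window around its nominal channel, then rescale."""
    lo = ref_channel_nominal - search
    hi = ref_channel_nominal + search
    ch = np.arange(lo, hi)
    w = spectrum[lo:hi].astype(float)
    w -= w.min()                        # crude flat-background removal
    centroid = (ch * w).sum() / w.sum()
    return ref_channel_nominal / centroid

# Synthetic spectrum whose 662-keV peak drifted from ch 662 to ch 650
chan = np.arange(1024)
spec = 50 + 1000 * np.exp(-0.5 * ((chan - 650) / 5.0) ** 2)
print(f"gain factor = {gain_correction(spec, 662):.4f}")  # ~662/650 ~ 1.018
```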

  1. Designing an experimenters database using the Nijssen Information Analysis Methodology (NIAM)

    SciTech Connect

    Eaton, M.J.

    1990-01-01

    This paper presents a discussion of the use of the Nijssen Information Analysis Methodology (NIAM) in the design of an experimenters database. This database is used by physicists and technicians to describe the configuration and diagnostic systems used on Sandia National Laboratories' Particle Beam Fusion Accelerator II (PBFA II). The design of this database presented some unique challenges because of the large degree of flexibility required to enable timely response to changing experimental configurations. The NIAM user-oriented technique proved invaluable in translating experimenters' requirements into an information model and then into a normalized relational design.

  2. Transient Analysis for Evaluating the Potential Boiling in the High Elevation Emergency Cooling Units of PWR Following a Hypothetical Loss of Coolant Accident (LOCA) and Subsequent Water Hammer Due to Pump Restart

    SciTech Connect

    Husaini, S. Mahmood; Qashu, Riyad K.

    2004-07-01

    Generic Letter GL-96-06, issued by the U.S. Nuclear Regulatory Commission (NRC), required utilities to evaluate the potential for voiding in their Containment Emergency Cooling Units (ECUs) due to a hypothetical Loss Of Coolant Accident (LOCA) or a Main Steam Line Break (MSLB) accompanied by the Loss Of Offsite Power (LOOP). When offsite power is restored, the Component Cooling Water (CCW) pumps restart, causing water hammer due to cavity closure. Recently, EPRI (Electric Power Research Institute) performed a research study that recommended a methodology to mitigate the water hammer due to cavity closure. The EPRI methodology allows for the cushioning effects of hot steam and released air, which are not considered in conventional water-column-separation analysis. The EPRI study was limited in scope to the evaluation of water hammer only and did not provide any guidance for evaluating the occurrence of boiling and the extent of voiding in the ECU piping. This paper presents a complete methodology, based on first principles, to evaluate the onset of boiling. Also presented is a methodology for evaluating the extent of voiding and the water hammer resulting from cavity closure, using an existing generalized computer program based on the Method of Characteristics. The EPRI methodology is then used to mitigate the predicted water hammer, thus overcoming the inherent complications and difficulties involved in performing hand calculations for water hammer. The heat transfer analysis provides an alternative to very cumbersome CFD (computational fluid dynamics) modeling. (authors)
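
    For orientation, the classic Joukowsky relation gives the uncushioned upper-bound pressure rise at cavity closure, dP = rho * a * dV; the EPRI methodology discussed above reduces this bound by crediting steam and air cushioning. The sketch below uses invented values.

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Classic Joukowsky estimate of the water-hammer pressure rise
    when a vapor cavity closes: dP = rho * a * dV. This is the
    uncushioned upper bound, not the EPRI cushioned result."""
    return rho * wave_speed * delta_v

# Hot water, a stiff steel pipe, 2 m/s closing velocity (all made up)
dp = joukowsky_surge(rho=965.0, wave_speed=1200.0, delta_v=2.0)
print(f"dP ~ {dp / 1e6:.2f} MPa")   # ~2.3 MPa
```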

  3. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  4. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  5. Analyzing facility accidents in DOE's Programmatic Environmental Impact Statement

    SciTech Connect

    Roglans-Ribas, J.; Mueller, C.J.

    1995-12-31

    This paper summarizes the methodology used in performing the facility accident analysis for events initiated by natural phenomena in the U.S. Department of Energy (DOE) programmatic environmental impact statements (PEIS). DOE has been conducting a PEIS for its environmental management program.

  6. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    NASA Astrophysics Data System (ADS)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding, since it requires proper treatment of both slow, long transients (on time scales of hours and days) and fast, short transients (on time scales of minutes and seconds). Limited operation and experimental data are available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept.
    The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. Feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate, with proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed, creating a system that is efficient and stable over transient calculations that last several tens of hours.
    Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated, and an efficient kinetics treatment for the PBMR was developed that accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect studied and optimized was the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Coupled-code convergence was achieved, supplemented by methods to accelerate it. Finally, the modeling of all feedback phenomena in PBMRs was investigated, and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
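
    The coupling strategy described in the thesis can be illustrated at toy scale. The sketch below runs a Picard (fixed-point) iteration between a stylized power update with Doppler feedback and a lumped steady-state fuel-temperature update, with under-relaxation to aid convergence; every constant is invented and bears no relation to the PBMR-400 benchmark data.

```python
import numpy as np

ALPHA_DOPPLER = -3.0e-5   # reactivity per K of fuel temperature (invented)
H = 1.0                   # lumped fuel-to-coolant conductance, MW/K (invented)
T_COOL = 900.0            # coolant temperature, K
P0, T0 = 100.0, 1200.0    # nominal power (MW) and fuel temperature (K)

def coupled_steady_state(rho_ext, tol=1e-9, max_iter=200):
    """Picard iteration: alternate a thermal-hydraulics step and a
    stylized neutronics power update until both fields converge."""
    p, t = P0, T0
    for i in range(1, max_iter + 1):
        t_new = T_COOL + p / H                        # TH step: energy balance
        rho = rho_ext + ALPHA_DOPPLER * (t_new - T0)  # feedback reactivity
        p_new = P0 * np.exp(50.0 * rho)               # stylized power response
        if abs(p_new - p) < tol and abs(t_new - t) < tol:
            return p_new, t_new, i
        p, t = 0.5 * (p + p_new), t_new               # under-relaxation
    raise RuntimeError("Picard iteration did not converge")

power, t_fuel, iters = coupled_steady_state(rho_ext=0.001)
print(f"P = {power:.1f} MW, T_fuel = {t_fuel:.1f} K after {iters} iterations")
```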

  7. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On 13 May 2014, a fire-related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. This has been the largest coal mine accident in Turkey, and in the OECD country group, so far. This study investigated whether such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database was used to extract accident data for the period 1970-2014. Four different cases were analyzed, i.e., OECD, OECD w/o Turkey, Turkey, and USA. Analysis of temporal trends in annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for the USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey, and the USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e., it cannot be considered an extremely rare event based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to move closer to the rest of the OECD. PMID:26687539
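
    The expectation analysis referred to above amounts to an empirical frequency-consequence estimate. The sketch below computes the observed frequency of events at or above a fatality threshold from a historical record; the accident list is invented, not ENSAD data.

```python
import numpy as np

def exceedance_frequency(fatalities_per_event, years_observed, threshold):
    """Empirical frequency (events/yr) of accidents with at least
    `threshold` fatalities, from a historical accident list."""
    events = np.asarray(fatalities_per_event)
    return (events >= threshold).sum() / years_observed

# Hypothetical severe-accident record for one country, 45 years
fatalities = [5, 9, 12, 30, 78, 263, 301]
f = exceedance_frequency(fatalities, years_observed=45, threshold=300)
print(f"frequency of >=300-fatality events: {f:.4f}/yr, "
      f"return period ~{1 / f:.0f} yr")
```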

  8. Methodology issues concerning the accuracy of kinematic data collection and analysis using the Ariel Performance Analysis System

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human-body and mechanical-system kinematic analyses using the Ariel Performance Analysis System (APAS). The APAS supports both the analysis of analog signals (e.g., force plate data) and the digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant-angular-velocity movements under several operating conditions. Two-dimensional and three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, the accuracy impact of a single-axis camera offset was evaluated. Segment length and positional data exhibited errors within 3 percent with three-dimensional analysis and within 8 percent with two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent with three-dimensional analysis and within 12 percent with two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations, their impacts on the methodology issues of kinematic data collection and analysis, and the accuracy levels observed are presented in detail.

  9. Modeling and analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor

    SciTech Connect

    Taleyarkhan, R.P.; Kim, S.H.; Slater, C.O.; Moses, D.L.; Simpson, D.B.; Georgevich, V.

    1993-05-01

    This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO V.A-SCALE system is described including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO V.A-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a recriticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

  10. DEFORM-4: fuel pin characterization and transient response in the SAS4A accident analysis code system

    SciTech Connect

    Miles, K.J.; Hill, D.J.

    1986-01-01

    The DEFORM-4 module is the segment of the SAS4A Accident Analysis Code System that calculates the fuel pin characterization in response to a steady state irradiation history, thereby providing the initial conditions for the transient calculation. The various phenomena considered include fuel porosity migration, fission gas bubble induced swelling, fuel cracking and healing, fission gas release, cladding swelling, and the thermal-mechanical state of the fuel and cladding. In the transient state, the module continues the thermal-mechanical response calculation, including fuel melting and central cavity pressurization, until cladding failure is predicted and one of the failed fuel modules is initiated. Comparisons with experimental data have demonstrated the validity of the modeling approach.

  11. Contributive factors to aviation accidents.

    PubMed

    Fajer, Marcia; Almeida, Ildeberto Muniz de; Fischer, Frida Marina

    2011-04-01

    The objective of the study was to compare the results of aviation accident analyses performed by the Center for Investigation and Prevention of Aviation Accidents (CENIPA) with those of the Human Factors Analysis and Classification System (HFACS) method. The final reports of thirty-six general aviation accidents occurring between 2000 and 2005 in the State of São Paulo, Southeastern Brazil, were analyzed and compared. CENIPA reports mentioned 163 contributive factors, while HFACS identified 370 factors. It was concluded that CENIPA reports did not address the organizational factors associated with aviation accidents. PMID:21344127

  12. FRETmatrix: a general methodology for the simulation and analysis of FRET in nucleic acids.

    PubMed

    Preus, Søren; Kilså, Kristine; Miannay, Francois-Alexandre; Albinsson, Bo; Wilhelmsson, L Marcus

    2013-01-01

    Förster resonance energy transfer (FRET) is a technique commonly used to unravel the structure and conformational changes of biomolecules being vital for all living organisms. Typically, FRET is performed using dyes attached externally to nucleic acids through a linker that complicates quantitative interpretation of experiments because of dye diffusion and reorientation. Here, we report a versatile, general methodology for the simulation and analysis of FRET in nucleic acids, and demonstrate its particular power for modelling FRET between probes possessing limited diffusional and rotational freedom, such as our recently developed nucleobase analogue FRET pairs (base-base FRET). These probes are positioned inside the DNA/RNA structures as a replacement for one of the natural bases, thus, providing unique control of their position and orientation and the advantage of reporting from inside sites of interest. In demonstration studies, not requiring molecular dynamics modelling, we obtain previously inaccessible insight into the orientation and nanosecond dynamics of the bases inside double-stranded DNA, and we reconstruct high resolution 3D structures of kinked DNA. The reported methodology is accompanied by a freely available software package, FRETmatrix, for the design and analysis of FRET in nucleic acid containing systems. PMID:22977181
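
    For orientation, the sketch below shows only the two textbook relations any such FRET analysis builds on: the Förster distance dependence of the transfer efficiency, and the orientation factor κ² that becomes decisive for rigidly held base analogues (where the isotropic value 2/3 no longer applies). This is not the FRETmatrix package itself, and the numerical values are illustrative.

    ```python
    import numpy as np

    def fret_efficiency(R, R0):
        """Classic Foerster relation: efficiency vs. donor-acceptor
        distance R for a Foerster radius R0 (same units)."""
        return 1.0 / (1.0 + (R / R0) ** 6)

    def kappa_squared(d_hat, a_hat, r_hat):
        """Orientation factor kappa^2 from unit transition-dipole vectors
        of donor (d_hat) and acceptor (a_hat) and the unit separation
        vector r_hat between them."""
        kappa = (np.dot(d_hat, a_hat)
                 - 3.0 * np.dot(d_hat, r_hat) * np.dot(a_hat, r_hat))
        return kappa ** 2

    R0 = 5.0   # nm, illustrative Foerster radius
    for R in (3.0, 5.0, 7.0):
        print(f"R = {R:.1f} nm -> E = {fret_efficiency(R, R0):.3f}")

    d = np.array([1.0, 0.0, 0.0])
    a = np.array([0.0, 1.0, 0.0])
    r = np.array([0.0, 0.0, 1.0])
    print(f"kappa^2 = {kappa_squared(d, a, r):.2f}")  # mutually perpendicular -> 0
    ```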

  13. Cardiovascular risk analysis by means of pulse morphology and clustering methodologies.

    PubMed

    Almeida, Vânia G; Borba, J; Pereira, H Catarina; Pereira, Tânia; Correia, Carlos; Pêgo, Mariano; Cardoso, João

    2014-11-01

    The purpose of this study was the development of a clustering methodology to deal with arterial pressure waveform (APW) parameters to be used in cardiovascular risk assessment. One hundred sixteen subjects were monitored and divided into two groups. The first one (23 hypertensive subjects) was analyzed using APW and biochemical parameters, while the remaining 93 healthy subjects were evaluated only through APW parameters. The expectation maximization (EM) and k-means algorithms were used in the cluster analysis, and the risk scores commonly used in clinical practice (the Framingham Risk Score (FRS), the Systematic COronary Risk Evaluation (SCORE) project, the Assessing cardiovascular risk using Scottish Intercollegiate Guidelines Network (ASSIGN) score and the PROspective Cardiovascular Münster (PROCAM) score) were selected for cluster risk validation. The results from the clustering risk analysis showed a very significant correlation with ASSIGN (r=0.582, p<0.01) and a significant correlation with FRS (r=0.458, p<0.05). The results from the comparison of both groups also allowed identification of the cluster with higher cardiovascular risk in the healthy group. These results give new insights to explore this methodology in future scoring trials. PMID:25023535
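
    A minimal sketch of the study's general recipe, clustering subjects on waveform-derived features and validating cluster membership against an external risk score, using scikit-learn's k-means; the synthetic features and mock score below are assumptions standing in for the APW parameters and the clinical scores.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for APW-derived features of 93 subjects; the real
    # features (e.g., augmentation index, pulse transit time) differ.
    X = rng.normal(size=(93, 4))
    # Mock external risk score loosely driven by the first feature.
    risk_score = 0.6 * X[:, 0] + rng.normal(scale=0.5, size=93)

    # Two-cluster k-means, one of the algorithms used in the study.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    # Validate cluster membership against the external score
    # (cf. the reported r = 0.582 versus ASSIGN).
    r, p = pearsonr(labels, risk_score)
    print(f"cluster-vs-score correlation: r = {r:.3f}, p = {p:.3g}")
    ```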

  14. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from other performance measurement systems (PMSs) is the Strategy Map, with its unidirectional causality feature. Despite the BSC's apparent popularity, this causality assumption has been rigorously criticized by earlier researchers. In seeking empirical evidence of causality, propositions based on service profit chain theory were developed and tested econometrically with Granger causality tests on 45 data points. However, the causality models proved insufficiently established, as only 40% of the causal linkages were supported by the data. Expert knowledge was therefore used to compensate for the insufficient historical data: the Delphi method was applied to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods indicated the existence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. Accordingly, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality, and its robustness and validity as a causality-analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, an area where very limited work has been done.
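
    For the econometric step, a sketch of a Granger causality test on two short BSC-style series using statsmodels is shown below; the synthetic series, the lag order, and the omission of stationarity pre-processing are all simplifying assumptions, not features of the study's data.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for two Strategy Map measures over 45 periods,
    # constructed so that 'employee' leads 'customer' by one lag.
    n = 45
    employee = rng.normal(size=n)
    customer = (0.8 * np.concatenate(([0.0], employee[:-1]))
                + rng.normal(scale=0.5, size=n))

    data = pd.DataFrame({"customer": customer, "employee": employee})
    # Null hypothesis: the second column does NOT Granger-cause the first.
    grangercausalitytests(data[["customer", "employee"]], maxlag=2)
    ```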

  15. Final Report, NERI Project: ''An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model''

    SciTech Connect

    Dmitriy Y. Anistratov; Marvin L. Adams; Todd S. Palmer; Kord S. Smith; Kevin Clarno; Hikaru Hiruta; Razvan Nes

    2003-08-04

    The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called ''group constants'') in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations.

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model, along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
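
    A toy version of the validation loop described, sampling from assumed uncertainty distributions on dispersion parameters and propagating them through a simple Gaussian plume model, is sketched below; the distributions and release parameters are invented for illustration and are not the elicited ones.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def gaussian_plume_ground(Q, u, y, sigma_y, sigma_z, H):
        """Ground-level concentration from the standard Gaussian plume model
        (total ground reflection) for release rate Q [Bq/s], wind speed u
        [m/s], and effective release height H [m]."""
        return (Q / (2.0 * np.pi * u * sigma_y * sigma_z)
                * np.exp(-0.5 * (y / sigma_y) ** 2)
                * 2.0 * np.exp(-0.5 * (H / sigma_z) ** 2))

    # Assumed (illustrative) uncertainty distributions on dispersion inputs,
    # standing in for the study's expert-elicited distributions.
    n = 10_000
    sigma_y = rng.lognormal(np.log(200.0), 0.4, n)   # m, crosswind spread
    sigma_z = rng.lognormal(np.log(80.0), 0.5, n)    # m, vertical spread
    u = rng.lognormal(np.log(4.0), 0.3, n)           # m/s, wind speed

    c = gaussian_plume_ground(Q=1e9, u=u, y=0.0,
                              sigma_y=sigma_y, sigma_z=sigma_z, H=50.0)
    print(f"median = {np.median(c):.3e} Bq/m^3, "
          f"95th pct = {np.percentile(c, 95):.3e} Bq/m^3")
    ```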

  17. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    SciTech Connect

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    2008-09-30

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.

  18. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight, ultra-thin membrane inflatable antenna concepts are fast evolving into state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop a structural analysis methodology for predicting the static and dynamic response characteristics of inflatable antenna concepts. This research is focused on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The various aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses to structural loads such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. The several intrinsic aspects studied have provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamic scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and as normal mode shapes with associated frequencies. Wrinkling patterns are presented to show how surface wrinkles progress with increasing tension loads. Antenna reflector surface accuracies were found to depend strongly on the type and size of the antenna, the reflector surface curvature, the reflector membrane supports in terms of the spacing of catenaries, and the amount of applied load.

  19. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be capable of forecasting earthquakes but never explicitly explained to posterity by the geophysicist from Faenza. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, emerged in previous years through social alarm over predicted earthquakes that never happened but were widely spread by the media, following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive literature search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify 'definitively' the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  20. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    PubMed

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  1. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    PubMed Central

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  2. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    NASA Astrophysics Data System (ADS)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  3. Methodology for cost analysis of film-based and filmless portable chest systems

    NASA Astrophysics Data System (ADS)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
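
    A minimal sketch of the cost-per-exam roll-up implied by the component list (labor, equipment, materials, storage); the function structure and all dollar figures are hypothetical, not the institution's data.

    ```python
    def cost_per_exam(exams_per_year,
                      labor_fte, fte_cost,
                      equipment_price, life_years, maintenance_per_year,
                      materials_per_exam, storage_per_exam):
        """Direct cost per exam: annual fixed costs (labor plus straight-line
        equipment depreciation and maintenance) spread over annual volume,
        plus the per-exam variable costs."""
        labor = labor_fte * fte_cost
        equipment = equipment_price / life_years + maintenance_per_year
        annual_fixed = labor + equipment
        return annual_fixed / exams_per_year + materials_per_exam + storage_per_exam

    # Hypothetical film-based vs. filmless scenarios for one ICU.
    film = cost_per_exam(9000, 1.5, 55_000, 120_000, 7, 10_000, 4.50, 0.80)
    filmless = cost_per_exam(9000, 0.8, 55_000, 300_000, 7, 25_000, 0.30, 1.20)
    print(f"film: ${film:.2f}/exam, filmless: ${filmless:.2f}/exam")
    ```

    Sensitivity analysis of the kind the abstract describes then amounts to re-running this roll-up while reducing individual labor, equipment, or archiving inputs.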

  4. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    NASA Technical Reports Server (NTRS)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks; it represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: (1) What's been included? (2) What's been left out? (3) How has it been validated? (4) Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

  5. Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.

    PubMed

    Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

    2013-03-01

    Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the model performance to simulate the deposition patterns of radiocesium ((137)Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings, including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (~500 km) model better reproduced the observed (137)Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (~50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed (137)Cs deposition rates in high-deposition areas (>10 kBq m(-2)) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028
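
    For contrast with a process-based module, the scavenging-coefficient approach that the study found most uncertain is easy to state: a rain-rate-dependent removal rate Λ = aR^b applied exponentially over each time step. The sketch below uses one commonly quoted but here purely illustrative (a, b) pair.

    ```python
    import numpy as np

    def scavenging_depletion(c_air, rain_rate, dt, a=8.4e-5, b=0.79):
        """Empirical scavenging-coefficient wet removal: Lambda = a * R^b
        [1/s] for rain rate R [mm/h]; a and b are the empirical parameters
        whose spread drives the uncertainty noted in the study."""
        lam = a * rain_rate ** b
        removed = c_air * (1.0 - np.exp(-lam * dt))
        return c_air - removed, removed   # remaining airborne, deposited

    c, dep = scavenging_depletion(c_air=1.0e12, rain_rate=5.0, dt=3600.0)
    print(f"remaining: {c:.3e} Bq, deposited this hour: {dep:.3e} Bq")
    ```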

  6. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. . School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and ''other''. The category ''other'' cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
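
    A minimal sketch of the two recommended model families, a Weibull dose-response for early effects and a linear-quadratic form for cancer risk; the parameter values below are illustrative placeholders, not the report's fitted values.

    ```python
    import numpy as np

    def weibull_risk(dose, d50, shape):
        """Weibull dose-response for early effects: the ln(2) scaling puts
        50% risk at the median-effect dose d50 [Gy]."""
        return 1.0 - np.exp(-np.log(2.0) * (dose / d50) ** shape)

    def lq_cancer_risk(dose, alpha, beta):
        """Linear-quadratic excess cancer risk, the other recommended model
        family; alpha and beta here are illustrative, not the report's."""
        return alpha * dose + beta * dose ** 2

    for d in (1.0, 3.0, 5.0):   # Gy
        print(f"dose {d:.0f} Gy: Weibull early-effect risk "
              f"{weibull_risk(d, d50=3.8, shape=5.0):.3f}, "
              f"LQ excess cancer risk {lq_cancer_risk(d, 5e-2, 5e-3):.3f}")
    ```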

  7. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Building on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, with the same numerical grids used for both scaled experiments and real-plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, these issues remain. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: numerical errors are treated as special sensitivities alongside other physical uncertainties, and only parameters whose uncertainties strongly affect the design criteria are considered; (3) greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial mesh sizes and (b) using gradient information (sensitivity results) to reduce the number of samples; (4) allowing grid independence for scaled integral effect test (IET) simulations and real-plant applications, which (a) eliminates numerical uncertainty in scaling, (b) reduces experimental cost by allowing smaller-scale IETs, and (c) eliminates user effects. This paper will review the issues related to the current CSAU, introduce FSA, discuss a potential Q-PIRT process, and show simple examples of performing FSA. Finally, the general research directions and requirements for using FSA in a system analysis code will be discussed.
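
    FSA is easiest to see on a scalar problem: differentiating dy/dt = -ky with respect to k yields an auxiliary ODE for the sensitivity s = ∂y/∂k, which is solved alongside the physical equation. The sketch below (an illustrative toy, not the paper's system code) includes an analytic cross-check.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, state, k):
        """Augmented system: the physical solution y and its forward
        sensitivity s = dy/dk. Differentiating dy/dt = -k*y with respect
        to k gives ds/dt = -y - k*s."""
        y, s = state
        return [-k * y, -y - k * s]

    k = 0.5
    sol = solve_ivp(rhs, [0.0, 4.0], [1.0, 0.0], args=(k,),
                    rtol=1e-10, atol=1e-12)
    y_end, s_end = sol.y[:, -1]

    # Analytic check: y = exp(-k t) implies dy/dk = -t exp(-k t).
    t_end = sol.t[-1]
    print(f"FSA sensitivity: {s_end:.6f}, "
          f"analytic: {-t_end * np.exp(-k * t_end):.6f}")
    ```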

  8. Use of Kalman filter methods in analysis of in-pile LMFBR accident simulations

    SciTech Connect

    Meek, C.C.; Doerner, R.C.

    1983-01-01

    Kalman filter methodology has been applied to in-pile liquid-metal fast breeder reactor simulation experiments to obtain estimates of the fuel-clad thermal gap conductance. A transient lumped-parameter model of the experiment is developed. An optimal estimate of the state vector chosen to characterize the experiment is obtained through the use of the Kalman filter. From this estimate, the fuel-clad thermal gap conductance is calculated as a function of time into the test and axial position along the length of the fuel pin.
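
    A scalar sketch of the estimation idea: if the state is taken as the gap resistance x = 1/h, and the measured fuel-clad temperature difference obeys ΔT = qx for a known heat flux q, a linear Kalman filter with a random-walk state model recovers the conductance from noisy data. All numbers are illustrative, and the paper's lumped-parameter model is certainly richer than this.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # "True" (illustrative) gap resistance x = 1/h; measurements are noisy
    # fuel-clad temperature differences dT = q * x during a power transient.
    x_true = 1.0 / 5000.0            # m^2 K / W  (h = 5000 W/m^2 K)
    q = np.linspace(2e5, 8e5, 60)    # known surface heat flux, W/m^2
    z = q * x_true + rng.normal(scale=2.0, size=q.size)   # measured dT, K

    # Scalar Kalman filter with a random-walk model for the resistance.
    x_hat, P = 1.0 / 3000.0, 1e-8    # initial estimate and variance
    Qp, R = 1e-14, 4.0               # process and measurement noise variances
    for qi, zi in zip(q, z):
        P = P + Qp                               # predict (random walk)
        K = P * qi / (qi * P * qi + R)           # gain, with H = qi
        x_hat = x_hat + K * (zi - qi * x_hat)    # measurement update
        P = (1.0 - K * qi) * P

    print(f"estimated h = {1.0 / x_hat:.0f} W/m^2 K (true value 5000)")
    ```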

  9. Bedding control on landslides: a methodological approach for computer-aided mapping analysis

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Revellino, P.; Donnarumma, A.; Guadagno, F. M.

    2011-05-01

    Litho-structural control on the spatial and temporal evolution of landslides is one of the major typical aspects on slopes constituted of structurally complex sequences. Mainly focused on instabilities of the earth flow type, a semi-quantitative analysis has been developed with the purpose of identifying and characterizing litho-structural control exerted by bedding on slopes and its effects on landsliding. In quantitative terms, a technique for azimuth data interpolation, Non-continuous Azimuth Distribution Methodological Approach (NADIA), is presented by means of a GIS software application. In addition, processed by NADIA, two indexes have been determined: (i) Δ, aimed at defining the relationship between the orientation of geological bedding planes and slope aspect, and (ii) C, which recognizes localized slope sectors in which the stony component of structurally complex formations is abundant and therefore operates an evolutive control of landslide masses. Furthermore, some Litho-Structural Models (LSMs) of slopes are proposed aiming at characterizing recurrent forms of structural control in the source, channel and deposition areas of gravitational movements. In order to elaborate evolutive models controlling landslide scenarios, LSMs were qualitatively related and compared with Δ and C quantitative indexes. The methodological procedure has been applied to a lithostructurally complex area of Southern Italy where data about azimuth measurements and landslide mapping were known. It was found that the proposed methodology enables the recognition of typical control conditions on landslides in relation to the LSMs. Different control patterns on landslide shape and on style and distribution of the activity resulted for each LSM. This provides the possibility for first-order identification to be made of the spatial evolution of landslide bodies.
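
    One low-level ingredient of any azimuth-interpolation scheme such as NADIA is that directional data must be averaged as vectors rather than as raw angles, and a Δ-like structure/aspect index is then a smallest-angle difference. A sketch of both, under assumptions that greatly simplify the paper's actual indexes:

    ```python
    import numpy as np

    def circular_mean_deg(azimuths_deg):
        """Vector-average azimuths; a naive arithmetic mean fails across
        the 0/360 discontinuity (the mean of 350 and 10 is 0, not 180)."""
        a = np.radians(azimuths_deg)
        return np.degrees(np.arctan2(np.sin(a).mean(),
                                     np.cos(a).mean())) % 360.0

    def delta_index(bedding_dip_dir, slope_aspect):
        """Smallest angle between bedding dip direction and slope aspect,
        a simple stand-in for the Delta index relating structure to
        topography (cataclinal slopes near 0 deg, anaclinal near 180)."""
        d = abs(bedding_dip_dir - slope_aspect) % 360.0
        return min(d, 360.0 - d)

    print(circular_mean_deg([350.0, 10.0]))   # -> 0.0
    print(delta_index(120.0, 300.0))          # -> 180.0 (anaclinal)
    ```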

  10. A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium

    SciTech Connect

    1999-08-31

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium disposition that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity analysis. These steps are described below.
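
    In its additive form, the MAU evaluation (step 4) reduces to weighting normalized value functions and checking rank stability as the weights vary; a toy sketch with invented alternatives, weights, and scores follows.

    ```python
    import numpy as np

    # Additive multiattribute utility: each alternative is scored on each
    # objective via a value function normalized to [0, 1]; weights express
    # the trade-offs. All names and numbers below are illustrative.
    objectives = ["cost", "schedule", "nonproliferation", "environment"]
    weights = np.array([0.25, 0.15, 0.40, 0.20])     # sums to 1

    # Rows: hypothetical alternatives; columns: normalized objective values.
    values = np.array([
        [0.6, 0.7, 0.9, 0.5],   # alternative A
        [0.8, 0.5, 0.6, 0.7],   # alternative B
    ])
    for name, u in zip(["alt-A", "alt-B"], values @ weights):
        print(f"{name}: U = {u:.3f}")

    # Sensitivity analysis: does the ranking survive weight changes?
    for w_np in (0.2, 0.4, 0.6):
        w = np.array([0.25, 0.15, w_np, 0.20])
        w = w / w.sum()
        print(f"nonproliferation weight {w_np}: U = {(values @ w).round(3)}")
    ```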

  11. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of ''soft'' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset ''attractiveness'' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  12. ADAPT (Analysis of Dynamic Accident Progression Trees) Beta Version 0.9

    Energy Science and Technology Software Center (ESTSC)

    2010-01-07

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DETs) using a user-specified simulator. ADAPT can utilize any simulation tool that meets a minimal set of requirements. ADAPT is based on the concept of DETs, which use explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system evolution, along with stochastic modeling. When DETs are used to model different aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at different times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g., biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.

  13. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

    2014-01-01

    The atmospheric transport and ground deposition of radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well, but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs, while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations, while for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
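
    A decay term of the kind added to the solver is typically applied by operator splitting: after each transport step, each radionuclide field is multiplied by exp(-λΔt). A sketch under that assumption (this is not the WRF/Chem code itself):

    ```python
    import numpy as np

    # Half-lives in seconds (131I: 8.02 d; 137Cs: 30.17 y).
    HALF_LIVES = {"I131": 8.02 * 86400.0,
                  "Cs137": 30.17 * 365.25 * 86400.0}

    def apply_decay(conc, species, dt):
        """Operator-split radioactive decay applied after a transport step."""
        lam = np.log(2.0) / HALF_LIVES[species]
        return conc * np.exp(-lam * dt)

    c = np.full((4, 4), 1.0e3)          # mock gridded 131I field, Bq/m^3
    for _ in range(24):                 # one day of hourly steps
        c = apply_decay(c, "I131", 3600.0)
    print(f"131I remaining after 1 day: {c[0, 0] / 1.0e3:.4f} of initial")
    ```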

  14. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Astrophysics Data System (ADS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-09-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
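
    The maximum tangential stress criterion referenced here has a closed-form kink angle for the membrane stress-intensity pair; the sketch below implements that standard formula only (the paper's methodology also carries the two bending factors, which this toy omits).

    ```python
    import numpy as np

    def mts_kink_angle(K1, K2):
        """Crack-kink angle from the maximum tangential stress criterion:
        theta_c = 2 * atan[(1 - sqrt(1 + 8 r^2)) / (4 r)], r = K_II / K_I.
        Positive K_II drives a negative (clockwise) kink; pure mode II
        gives the classical +/-70.5 degree limit."""
        if K2 == 0.0:
            return 0.0                    # pure mode I: straight growth
        r = K2 / K1
        return 2.0 * np.arctan((1.0 - np.sqrt(1.0 + 8.0 * r ** 2)) / (4.0 * r))

    for K2 in (0.0, 0.5, 1.0, 10.0):
        theta = np.degrees(mts_kink_angle(1.0, K2))
        print(f"K_II/K_I = {K2:4.1f}: kink angle = {theta:7.2f} deg")
    ```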

  15. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Astrophysics Data System (ADS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1995-05-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modelling strategy. The structural response for each cracked configuration is obtained using a geometrically non-linear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology, and its applicability to performing practical analyses of realistic structures, is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  16. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    NASA Astrophysics Data System (ADS)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2015-10-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation system (SBAS) (i.e. European Geostationary Navigation Overlay System (EGNOS) and wide area augmentation system (WAAS)), a number of post-process global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcast in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.
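
    The separation of model-minus-measurement differences into per-day receiver and satellite constants is a rank-deficient linear problem, solvable for instance by least squares with one constant fixed to zero. A toy sketch on synthetic data (the sizes, noise levels, and units are assumptions, not the study's set-up):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_rec, n_sat, n_obs = 5, 8, 400

    # Mock one day of (model - carrier-phase) slant-delay differences,
    # built as receiver bias + satellite bias + residual model error.
    rec = rng.integers(0, n_rec, n_obs)
    sat = rng.integers(0, n_sat, n_obs)
    rec_bias = rng.normal(scale=2.0, size=n_rec)
    sat_bias = rng.normal(scale=1.0, size=n_sat)
    d = rec_bias[rec] + sat_bias[sat] + rng.normal(scale=0.3, size=n_obs)

    # Design matrix: one column per receiver constant and per satellite
    # constant; satellite 0 is fixed to zero to remove the rank deficiency.
    A = np.zeros((n_obs, n_rec + n_sat - 1))
    A[np.arange(n_obs), rec] = 1.0
    mask = sat > 0
    A[np.arange(n_obs)[mask], n_rec + sat[mask] - 1] = 1.0

    est, *_ = np.linalg.lstsq(A, d, rcond=None)
    resid = d - A @ est
    # The post-fit residual scatter is the model-accuracy proxy,
    # in whatever delay units are used (assumed TECU here).
    print(f"post-fit residual RMS: {resid.std():.3f}")
    ```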

  17. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    NASA Astrophysics Data System (ADS)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2016-03-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation system (SBAS) (i.e. European Geostationary Navigation Overlay System (EGNOS) and wide area augmentation system (WAAS)), a number of post-process global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcast in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.

  18. The U-tube sampling methodology and real-time analysis of geofluids

    SciTech Connect

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-03-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood [1973], provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO{sub 2} storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO{sub 2} from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Amargosa Valley, Nevada, (2) acquiring fluid samples beneath permafrost in Nunavut Territory, Canada, and (3) at a CO{sub 2} storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines by naturally occurring waxes or by freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust and, with careful consideration of the constraints and limitations, can provide high-quality geochemical samples.

  19. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  20. Probabilistic Climate Forecasting: Methodological issues arising from analysis in climateprediction.net

    NASA Astrophysics Data System (ADS)

    Rowlands, D. J.; Frame, D. J.; Meinshausen, N.; Aina, T.; Jewson, S.; Allen, M. R.

    2009-12-01

    One of the chief goals of climate research is to produce meaningful probabilistic forecasts that can be used in the formation of future policy and adaptation strategies. The current range of methodologies presented in the scientific literature shows that this is not an easy task, especially with the various philosophical interpretations of how to combine the information contained in Perturbed-Physics and Multi-Model Ensembles (PPE and MME). The focus of this research is to present some of the methodological issues that have arisen in the statistical analysis of the latest climateprediction.net experiment, a large PPE of transient simulations using HadCM3L. First, we consider model evaluation and propose a method for calculating the likelihood of each ensemble member based on a transient constraint involving regional temperature changes. We argue that this approach is more meaningful for future climate change projections than climatology-based constraints. A further question we consider is which observations to include in our likelihood function: should we care how well a model simulates the climate of Europe if we are producing a forecast for South Africa? The second issue deals with how to combine multiple models from such an ensemble into a probabilistic forecast. Much has been said about the Bayesian methodology, given the sensitivity of forecasts to prior assumptions. For simple models of the climate, with inputs such as climate sensitivity, there may be strong prior information, but for complex climate models whose parameters correspond to non-observable quantities this is not so straightforward, and we may have no reason to believe that a parameter has a uniform distribution or an inverse-uniform distribution. We therefore propose two competing methodologies for dealing with this problem, namely likelihood profiling and the Jeffreys prior, the latter an approach typically known as objective Bayesian statistics, where the word "objective" simply implies that the prior is generated using a rule rather than from expert opinion. We present novel results using a simple climate model as an illustrative example, with a view to applying these techniques to the full climateprediction.net ensemble.
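
    A minimal sketch of likelihood weighting of ensemble members under a transient constraint, the first issue the abstract raises; the Gaussian likelihood form, mock trends, and mock projection below are illustrative assumptions, not climateprediction.net output.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Mock ensemble: each member's simulated regional temperature trend
    # (K/decade) versus an observed trend with known observational error.
    n_members = 2000
    sim_trend = rng.normal(0.20, 0.08, n_members)
    obs_trend, obs_sigma = 0.17, 0.03

    # Gaussian likelihood of each member given the transient constraint,
    # normalized into weights.
    log_like = -0.5 * ((sim_trend - obs_trend) / obs_sigma) ** 2
    w = np.exp(log_like - log_like.max())
    w /= w.sum()

    # Likelihood-weighted forecast of a projected quantity carried by each
    # member (here a mock future warming correlated with the trend).
    proj = 1.5 + 4.0 * sim_trend + rng.normal(scale=0.2, size=n_members)
    mean = np.sum(w * proj)

    # Weighted 5-95% range via the weighted empirical CDF.
    order = np.argsort(proj)
    cdf = np.cumsum(w[order])
    idx = np.minimum(np.searchsorted(cdf, [0.05, 0.95]), n_members - 1)
    lo, hi = proj[order][idx]
    print(f"weighted mean {mean:.2f} K, 5-95% range [{lo:.2f}, {hi:.2f}] K")
    ```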